How International Tests Fail to Inform Policy: The Mystery of the Australian Case

Education experts Tatiana Khavenson and Martin Carnoy of IOE and Leila Morsy of the University of New South Wales (Australia) have analyzed Australia’s PISA and TIMSS data over the decade and a half through 2015 to test the validity of the most popular explanations that Australian education officials have offered in recent years for students’ steadily declining performance on PISA tests. The study has been published in Elsevier’s International Journal of Educational Development.

Between 2000 and 2015, Australian secondary schools saw a persistent decline in students’ performance on the OECD’s Programme for International Student Assessment (PISA) tests, particularly in mathematics. The decline was universal across the country’s states and territories, socioeconomic groups, and institutional types (i.e., government vs. private schools).

In and of itself, a steady downward movement in national PISA scores is not unusual and has occurred in various participating countries, including the Netherlands, Finland, Canada, New Zealand, and the United States. However, the underlying causes of such declines have thus far received little systematic investigation, so there is no holistic picture of what is actually going on in school environments and how policy action, or the lack of it, has contributed to these trends.

Australia’s deteriorating PISA results over the past decade and a half have prompted growing concern among education experts, policy strategists and other officials about the socioeconomic costs this situation may entail unless a positive turnaround in student achievement is secured. Commentators have offered multiple possible explanations, though largely anecdotal in nature, for why Australia has been faltering on PISA, pointing to factors such as teacher quality, patterns in the math curriculum, socioeconomic shifts, and policymaking. Some have argued that students’ worsening math performance is primarily due to a decline in teacher quality. Others have identified less classroom time spent on math, and the prevalence of problem-based learning in mathematics over the teaching of foundational facts and concepts, as the key reasons. Still others have pointed to the increased privatization of Australian education, which has arguably propelled the migration of high-achieving students from public schools to Catholic and independent private schools, pulling down learning outcomes at government institutions and in the sector at large. Finally, some have attributed the persistent failures on PISA to equity gaps and excessive digitization.

Confronted with this multiplicity of expert opinions, the study authors set out to systematically review these competing viewpoints on what has been driving down Australian students’ PISA scores, and to test the validity of each major hypothesis against the empirical evidence. While the research team does not claim to provide a definitive answer to what exactly lies behind the declining performance, or definitive guidance on policies to reverse it, the analysis singles out at least some clues that may help build a more accurate picture of the nature of this persistent setback.

To assess experts’ conjectures about the causes of Australia’s declining performance in math, the study authors devised a four-stage comparative approach relying on data from the country’s 2000–2015 PISA tests and 1999–2015 TIMSS tests:

  • In the first stage, the researchers compared convergent and divergent score trends on PISA versus TIMSS over the analyzed timeframe. PISA scores were also benchmarked for groups of students with different levels of family academic resources (proxied by the number of books in the household), and scores for public and private schools were compared after adjusting for student and school socioeconomic differences.
  • In the second stage, the authors analyzed how students from different states performed on PISA over time, adjusting test scores for individual students’ socioeconomic background and other attributes, as well as for average school characteristics, including school type (government, Catholic, and independent); see the sketch after this list.
  • In the third stage, the adjusted PISA performance trends were compared across Australian states and school types.
  • In the fourth stage, the findings were used to test the widely held expert assumptions about the key triggers of the PISA decline, including claims about the adverse effects of fewer math curriculum minutes and a shortage of appropriately trained teachers.
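
To make the adjustment logic of the second stage concrete, below is a minimal sketch, in Python with pandas and statsmodels, of how a regression-adjusted score trend of this kind might be estimated. This is not the authors’ code: the file name and all column names are hypothetical placeholders, and the actual study works with the official PISA microdata and a richer set of controls.

```python
# A minimal sketch (not the authors' code) of a regression-adjusted
# PISA score trend: year effects net of student socioeconomic
# background and school sector. All column names (math_score, year,
# escs, books_at_home, school_type, school_id) are hypothetical
# placeholders for fields in the PISA microdata.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pisa_australia_2000_2015.csv")  # hypothetical extract

# Unadjusted trend: mean math score in each PISA cycle
print(df.groupby("year")["math_score"].mean())

# Adjusted trend: year effects after controlling for the PISA ESCS
# index, books at home (family academic resources), and school sector
# (government / Catholic / independent), with errors clustered by school
model = smf.ols(
    "math_score ~ C(year) + escs + C(books_at_home) + C(school_type)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

# The year coefficients trace the score decline net of the controls
print(model.params.filter(like="C(year)"))
```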

Based on this empirical approach, the authors arrived at the following major conclusions regarding the most popular expert explanations for Australia’s persistent decline in PISA mathematics performance:

  • Declines in adjusted PISA scores are observed for students in all Australian states, for both higher-status and less advantaged students, and in both government and private schools, although the trends vary from state to state and by school type.
  • The assumption that student performance in government schools has declined more than in private schools, and that these government school failures are pulling down the national average, is not warranted. While some politicians suggest cutting funding for government schools, the study results may argue for precisely the opposite policy, i.e., increasing their funding.
  • The hypothesized factors of declining teacher quality and less time spent on math were assessed as modestly and considerably relevant, respectively. Taken together, these factors suggest that the quality of math teaching may have declined (the ‘less time on math’ effect), though only to a small extent, and in at least some states teaching quality may even have improved.
  • As suggested by the differing score declines between two largely similar states (Victoria vs. New South Wales), the way teachers deliver mathematics may noticeably affect performance trends. The approach in Victoria, where smaller score declines have been recorded, appears to emphasize math as a system for understanding patterns in the world and to foster strategic and conceptual thinking that actively engages students; in New South Wales, where deeper declines have occurred, the approach appears to engage students more passively.
  • Australia’s Catholic schools, where the greatest declines in math performance are observed, may be enrolling more low-achieving students across social classes, or may suffer from less effective teaching.
  • Australia’s government strategy of encouraging parental choice and competition between the state and private sectors may have created a situation in which public schooling has persistently lacked due regulatory attention, resulting in a shortage of sound policy action on such aspects as teacher quality, accountability, and funding.

Read full paper