Have Kids Stopped Trying on PISA and NAPLAN?

Sep 15, 2018

A much-ignored aspect of school results in Australia over the past decade or more is the sharp contrast between declining or stagnating scores on international and national tests for Years 9 and 10 and solid improvements in Year 12 results. How is it that trends in school outcomes only two or three year levels apart are so different?

Australia’s reading, mathematics and science scores in the OECD’s Programme for International Student Assessment (PISA) for 15-year-old students (who are mostly in Year 10) have declined significantly over the last 10-15 years. The declines are amongst the largest in the OECD and are equivalent to 6-12 months of learning. In addition, Year 9 NAPLAN results in reading, writing and numeracy have stagnated since the tests began.

In contrast, Year 12 results have improved significantly. The Year 12 completion rate increased from 68% in 2001 to 76% in 2016, and the percentage of the Year 12 population achieving an ATAR of 50 or above increased from 25% in 2006 to 42% in 2015.

In addition, a larger proportion of disadvantaged students now complete Year 12. For example, the percentage of low socio-economic status (SES) students who completed Year 12 increased from 62% in 2001 to 73% in 2016. Retention and completion rates for Indigenous and remote area students have improved as well.

These are indicators of an improving education system, not a deteriorating one. 

One possible explanation for the contrasting results is that Year 12 standards have declined, but there is no strong evidence of this. Instead, there is credible evidence that students do not try as hard on PISA and NAPLAN tests as they do in Year 12 assessments.  

A new study published by the US National Bureau of Economic Research (NBER) shows that a large proportion of Australian students do not take the PISA tests seriously. PISA is seen as a low-stakes test because it has no personal consequences for students (they don’t even get their results), in contrast to the high stakes of Year 12, which has a major influence on students’ future careers and lives.

Using data from PISA 2015, the study estimated that 22.5% of Australian students did not take the science test seriously. The proportion of non-serious students varied enormously by country, from 13.6% in Korea to 67% in Brazil. The percentages in other high-achieving countries were lower than in Australia: Finland – 15.8%, Japan – 18.1%, Taiwan – 14.2% and Singapore – 17%.

The study found that high SES, low performing and male students tended to take the test less seriously than other students. Countries in which students reported sitting for more “high stakes” exams had a higher proportion of non-serious students.

There are other indications of low effort by Australian students on PISA tests. Data from PISA 2012 shows that the effort made on the mathematics test by low-performing Australian students was amongst the lowest of the participating countries: they ranked equal 9th lowest of the 30 countries ranked on the PISA “effort thermometer”.

There is also anecdotal evidence of low effort in PISA. For example, a student who participated in PISA 2015 said: “My peers who took part in this test were unanimous in that they did not, to the best of their ability, attempt the exam.”  

Many overseas studies over the past 20 years have shown that students put in less effort on low-stakes tests, which leads to lower results. For example, one recent study examined differences in student effort in PISA 2009 and found that between 32% and 38% of the cross-country variation in PISA scores was explained by differing levels of student effort across countries.

Another recent NBER study found that US students increased their effort in a low-stakes mathematics test only when offered financial incentives. A Swedish study found that when students perceive a test to be unimportant, they put in less effort and achieve lower results.

If nearly one-quarter of Australia’s 15-year-old students do not take the PISA tests seriously, this is likely to account for at least part of the gap between PISA and Year 12 results. There is some evidence from PISA 2003 and 2012 that student effort in PISA has declined. There was also a significant decline in student engagement at school, which may have contributed to declining effort.

Student motivation and effort should also be considered in assessing NAPLAN results. There is research evidence of low effort in NAPLAN by some students and there are widespread anecdotes about teenagers not taking it seriously. 

NAPLAN carries higher stakes than PISA in that students may see their results through the reports sent to parents. NAPLAN is also a big event in many schools because results are posted on the My School website and affect school reputations, so schools devote more time and resources to spurring students to achieve good results than they do for PISA.

Greater student effort in NAPLAN than in PISA may help explain why PISA results have declined while Year 9 NAPLAN results have merely stagnated. However, Year 9 NAPLAN still has fewer personal consequences for students than Year 12, so students may make less effort than they do in Year 12, contributing to the gap between the Year 9 and Year 12 results.

The studies also call into question the worth of international and national league tables. Country and school rankings are affected by the differing proportions of students, across countries and schools, who do not take the tests seriously. Rankings can move up or down depending on student effort. The new NBER study estimated that Australia’s PISA rank would rise several places with greater student effort.

The important point from the NBER study and other studies is that national and international test results are affected by student motivation and effort. The results could be as much a measure of student effort as a measure of student learning. Therefore, caution is needed in interpreting the results from PISA and NAPLAN and drawing policy conclusions. 

Trevor Cobbold is National Convenor of Save Our Schools. This article is a summary of a new Education Research Brief published by SOS.
