Nearly three-quarters of Australian students didn’t fully try on the OECD’s PISA 2018 tests, new figures show.
The data, published by the Australian Council for Educational Research (ACER), show that the majority of Australian students (73%) indicated they would have invested more effort if the PISA test had counted towards their marks.
The research also found that only 56% of students who participated in PISA 2018 reported putting in high effort, while 40% reported medium effort and 5% reported low effort (figures don’t add to 100% due to rounding).
However, a significant 91% of students surveyed said they would put in high effort if the tests counted towards their school marks.
In a policy brief responding to the data, Save Our Schools’ national convenor, Trevor Cobbold, called the figures “a remarkable revelation”.
“How is it possible to accept the PISA results as an accurate measure of Australia’s education performance if three-quarters of students didn’t fully try in the tests?” Cobbold said.
“These results suggest that PISA is not the accurate, reliable, and valid measure of educational quality it claims to be.”
‘A castle built on sand’
Cobbold said that while PISA is seen as the gold standard for assessing the performance of education systems, “it is a castle built on sand”.
“There is also high variability between countries in the proportion of students not fully trying,” he said.
“This variation calls into question the validity of league tables of countries based on PISA results, which attract so much publicity and commentary.”
Cobbold said the important conclusion from the ACER study and other research on student motivation and effort is that the PISA results could be as much a measure of student effort as of student learning.
“Therefore, they are not as reliable as many assume, and much caution is needed in interpreting the results and drawing strong policy conclusions,” he said.
“The new results also raise the question as to the extent to which NAPLAN test results might also be affected by varying levels of effort and motivation by different groups of students.”
Cobbold pointed out that to date, no such research has been conducted.
“It should be on the research agenda for ACER and the new Australian Education Research Organisation to better inform the public and schools about the accuracy and reliability of NAPLAN.”
However, ACER Deputy CEO (Research) Dr Sue Thomson said there should be “no question mark” over the accuracy and reliability of PISA tests.
“The tests are not designed to be a fine-grained examination of student achievement, but rather a broad examination of the overall output of an education system. To draw the conclusion that PISA is a ‘castle built on sand’ is drawing rather a long bow,” Dr Thomson told The Educator.
“There certainly should be no questions about the accuracy or reliability of the data – ACER’s data handling/analysis techniques are world-class and validated by the OECD.”
Dr Thomson said that if the concern is whether or not PISA reflects all students’ best efforts, then, according to ACER’s research, “the answer is no”.
“Students self-report not trying their hardest at what is a low-stakes test. However, I don't think that anyone who has been a teacher would find these to be ‘remarkable revelations’,” she said.
“The first question many students ask when presented with an assessment is ‘does it count?’, and effort applied is proportional to the answer. Australia is not unusual in this, as research from many other countries shows.”
Dr Thomson said another way to explore students’ effort is by observing their engagement with the PISA assessment.
“Students who are disengaged would be less likely to respond to questions, particularly the open-ended questions, which require students to provide a written response. The data shows that there were very few students who did not respond to all questions,” she said.
“The real question is what are PISA data good for? They are a point-in-time ‘temperature check’ of student performance that helps education systems identify potential problem areas and investigate further.”
Dr Thomson said the rankings themselves are not particularly useful, as the number of countries participating is not static.
“PISA data are best used to make internal comparisons, benchmarking against our previous performance as well as with the best in the world. PISA is therefore a crucial driver of best practice, which is the reason that the number of countries participating continues to grow,” she said.
“As our latest Snapshots report demonstrates, PISA can reveal a great deal of information that is incredibly valuable in helping policymakers make improvements. Like many other countries, this includes whether or not our students are giving their all on these tests.”