In February, it was announced that NAPLAN would undergo sweeping changes after Australia’s education ministers unanimously agreed on a new proficiency standard aimed at improving students’ literacy and numeracy outcomes.
The new proficiency standard, with four levels of achievement (Exceeding, Strong, Developing and Needs additional support), will replace the previous 10-band structure and the old national minimum standard set in 2008.
Another important change to the national assessment is that from this year, it will be conducted entirely online and held in March instead of May, with parents and schools to receive individual results in July.
The Australian Curriculum, Assessment and Reporting Authority (ACARA) said the overhaul “will provide a clearer picture of student learning progress and put data into the hands of teachers and parents sooner.”
How meaningful will the changes be?
Professor Sam Sellar, Dean of Research: UniSA Education Futures and Professor of Education Policy at the University of South Australia, said the changes address two longstanding concerns about NAPLAN: the amount of time spent preparing for the test and the lack of time in which to use the results.
“Moving the test from May to March reduces the amount of time in which schools and teachers may feel the need to focus on test preparation,” Professor Sellar told The Educator.
“Teachers can feel a sense of responsibility to prepare students to succeed in NAPLAN, but time dedicated to test preparation crowds out other learning opportunities and can create anxiety for students.”
Professor Sellar said he hopes the earlier testing window will reduce this negative effect of NAPLAN, rather than lead to even more preparation in February and March or push test preparation back into the previous school year.
“Changes to the timing and reporting of results may enable more effective use of the data that NAPLAN generates. With the move to conducting the assessments online, it is now possible to provide more personalised results, and student reports should be available in July,” he said.
“This encourages greater use of the results to inform teaching and support for students, rather than simply to create league tables of school performance.”
However, Professor Sellar said the suitability of the test as a diagnostic tool is still uncertain.
“NAPLAN was established to provide parents with information about schools and while the recent changes may improve its value as a diagnostic tool, the question remains whether NAPLAN is the right tool for this purpose.”
‘A more mature debate about NAPLAN is needed’
In 2017, Professor Sellar, along with Professors Bob Lingard and Greg Thompson, wrote a report in which the trio called for a shift in the NAPLAN debate “to tease out a range of issues associated with national testing” and “invite broader publics into debates about the relationship between measurement and value in education”.
“Since we wrote that line, we have seen some shifts in the public debate about NAPLAN. For example, a few years ago questions were raised about the value of NAPLAN, prompted by concern that we had not seen much change in the results over time,” Professor Sellar said.
“It is important to have these public conversations about the costs and benefits of large and resource-intensive assessment programs that have significant impact on our schools.”
Professor Sellar said the recent changes to NAPLAN are also designed to address problems that have been longstanding matters of public debate.
“However, debates about NAPLAN often remain framed by the belief that we must measure learning in order to improve it. While we can usefully measure some dimensions of learning, other very important dimensions are not measurable,” he pointed out.
“When we give great emphasis to testing, we risk valuing what can be measured rather than measuring what is valuable. The skills that are most valuable to individuals and society also change over time.”
Professor Sellar said NAPLAN directs significant attention towards teaching and testing basic skills that, as leading economists have long warned, will become less important as they can be replaced by artificial intelligence.
“This is becoming increasingly evident as we watch new developments like ChatGPT quickly change the landscape of education and employment,” he said.
“We still need a more mature debate about the limitations of measurement in education and a more forward-looking view on what the most important skills will be for the next generation.”