
In February, the NSW Government announced a plan to lift Year 5 and Year 9 students’ NAPLAN scores by requiring schools to “strive for ambitious new goals”, such as strong gains across literacy and numeracy.
The latest NAPLAN data shows that more than 29.5% of NSW students are below the national standard, an increase from 28.65% the previous year, with younger students struggling the most.
However, some experts argue that these targets must be supported by appropriate resourcing, such as easing staffing shortages in schools and supporting teacher professional development.
Dr Jennifer Dove from WSU’s School of Education is also concerned that undermining teachers’ expertise risks harming students’ long-term learning and critical thinking skills.
“Student equity and pedagogy is at risk. Where students have the appropriate cultural capital to meet NAPLAN’s standardised testing, creativity and student choice in learning is possible,” Dr Dove told The Educator.
“Otherwise, usually in lower SES areas, the emphasis is on learning/teaching standardised forms, which restrict development of students’ writing.”
Dr Dove said students need something to say, some kind of learning in conjunction with learning how to write, just as they need to read for meaning rather than merely spotting grammatical features.
“Student thinking is at risk when they are required to conform to particular writing structures and encouraged to choose inflated language. Writing and reading for pleasure are not emphasised, yet these can improve student engagement and outcomes,” she said.
“Pressure to increase results makes it less likely that teachers will adopt culturally and linguistically responsive pedagogies [CLRP] appropriate for diverse student groups.”
Dr Dove pointed to an August 2024 statement by ACARA CEO Stephen Gniel that the “data shows that while there were small increases and decreases across domains and year levels, overall the results were broadly stable”.
“He noted that this stability was in the context of the earlier date for NAPLAN, and I’d note that students returned later themselves this year, decreasing the possibility of any large jump in results off the back of such a short period of teaching,” she said.
“Comparing year by year is not reasonable because the same students are not being tested. Small changes in scale scores have been overemphasised in the media [Larsen, 2022]. Participation rates affect reliability of the test data [Thompson, Adie, & Klenowski, 2018], and student attendance continues to decline.”