by Troy Martin, VP APAC, Instructure
NAPLAN, Australia's annual national assessment for students, has resulted in no general improvement in maths or English skills since it was introduced ten years ago. Used by governments, education authorities, schools, teachers and parents, the tests were designed to measure students' knowledge and to provide a standard metric for comparing the performance of schools.
The questions raised over NAPLAN highlight the need to re-evaluate the way progress is measured in education. Instead of standardised examinations, which test only at a single point at the end of the year, ongoing assessment must appraise research skills, applied knowledge and practical ability – all vital in paving the way to employment and beyond.
Technology now enables education providers to make more sophisticated use of analytics. Harnessed in the right way, data can help educators understand students' progress, their behaviours and the areas where they need to improve. Teachers will be able to personalise learning journeys rather than teaching to the crowd.
The limitations of data in education
Data is seen by some as a blunt tool that has turned teachers into data managers and schools into audit factories. Used in the wrong way, data analysis can be inflexible and, by itself, is unable to capture what really happens in the classroom. Systems such as NAPLAN can place greater emphasis on the system and the school than on helping to improve individual student performance.
This approach has led to the suggestion that there is good data and bad data. Bad data is labelled "national accounting" – designed to report to politicians how the system as a whole is performing and to contrast good practice with bad. Good data offers the nuanced ability to understand an individual student's strengths and weaknesses. Critics say the good is eclipsed by the bad – and exercises to pass or fail a school win the day.
But for us, it is a good thing that this debate is happening at all. It is a clear call for the education industry to reassess how it measures performance and tracks progress – and a reminder that doing so will benefit students and teachers alike.
Calling for a new approach
Following the recent report by David Gonski, the debate over measurement has gained significant momentum. The report claims that too many children are failing to reach their potential because of the restrictive nature of year-level progression, and directly criticises NAPLAN.
If the report’s recommendations are taken on board, there will be a concerted push to move towards individualised learning for all. Schools will benefit from the country-wide implementation of a new online assessment tool that teachers can use to diagnose the exact level of literacy and numeracy a child has achieved.
Teachers can then use the data to create individual learning plans for students that aren't tied to what year group they are in, but are instead designed to help them learn in a way – and at a pace – that suits them.
This commitment to the continuous measurement of student progress is crucial to making data analysis work. It is only when this data is used effectively that it can play a progressive role in education. Turning static information that determines the success or failure of schools into actionable insights that fuel change is the real key to making data analysis useful.
An ongoing, skills-focused approach to learning and assessment
This approach will also help in the ongoing quest to build skills capable of morphing to suit a changing workplace. Rather than encouraging the development of narrow skill sets that can be commoditised, educators know they need to lay the groundwork for a polymath mindset.
In this environment, passing tests and learning rote material – the traditional measures of success – become less important, and the ability to apply knowledge, learn quickly and work collaboratively becomes crucial.
This is, of course, more difficult to measure than the pass/fail tests, and calls for broader, ongoing, and real-time assessment.
The amount of learning that goes on outside the classroom makes quantifying what students have learned very difficult. A more sophisticated, skills-focused approach to learning and assessment has a critical role to play in ensuring the success of the entire employment ecosystem – from individuals' careers to the prosperity of businesses, industries and economies.
But for data-driven education to work, and to address fears of blunt and inflexible processes that don't affect outcomes, a focus on continuous assessment and the ability to act on data must be front of mind for everyone.