Bridget Forster, a Victorian teacher with more than 20 years of experience, has seen her fair share of changes in education, but nothing quite like the rise of Artificial Intelligence.
Forster, who is the Head of Kerferd Library and a VCE Literature Teacher at Mentone Girls Grammar, has been keen to explore approaches to AI Large Language Models in the English classroom, covering issues of identity, creativity, and ethics.
Among the questions being explored are how teachers can identify cultural bias and ethical issues in the use of AI in the English classroom, what the copyright implications of AI are, and how students can be taught to be ethical users in this new and evolving context.
On Tuesday, Forster became the fifth educator to be named the Copyright Agency’s 2023 Reading Australia Fellow for Teachers of English and Literacy and teacher librarians – an accolade that comes with a $15,000 grant to take her research project to the next level.
Forster said the wide adoption of AI Large Language Models such as ChatGPT in Australia “changes the tone, rhythm, and language of our communications, diluting our rich and distinct Australian voice.”
“Aside from the well-voiced concerns regarding accuracy, plagiarism and academic integrity, we should be concerned about issues of identity and culture,” Forster told The Educator.
“Ideally, we want to offer English students access to a wide variety of Australian voices and perspectives throughout their education.”
Will AI ‘dumb down’ learners?
Forster pointed to a warning published in The Atlantic by writer Adrienne LaFrance against relying on technologies that “dull the wisdom of our own intellect and aesthetic”.
“Obviously, the datasets that power Large Language Models such as ChatGPT do not have a particularly Australian aesthetic. Even prompting chatbots to employ an Australian voice yields mixed results,” Forster said.
“It would be sad if students had their own distinct voices compromised by the overuse of this technology – in both consuming and producing texts. Ultimately, we want to equip students to become critical and ethical users of these new tools, and I believe that English teachers and teacher-librarians play a key role in addressing this challenge.”
AI as an enabler, not inhibitor, of human voice and creativity
Forster said her project aims to pilot an online collaboration allowing students from Scotland, Canada, and Australia to share literature from their countries and critique the extent to which it reflects their own lived experience and cultural identity.
“Following their reading, students would use AI-generated fictional texts set in each country to explore what makes literature from their region unique,” she said.
“Principals can support the ethical use of AI by ensuring that students have an understanding of how the technology works and its limitations. A great place to start would be to examine the datasets that power the technology and how cultural and gender bias can be embedded in these.”
Forster said students also need to learn skills in evaluating information, including lateral reading and tracking back to original sources.
“AI chatbots can ‘hallucinate’ and provide inaccurate information, so these information literacy skills are imperative to their responsible use. Finally, we must value human creativity by encouraging students to write in their own voices and avoid relinquishing their unique perspectives to a chatbot.”
Forster will share her research and findings broadly with colleagues next year.