Out-of-control AI ... grieving robots ... underwater cities. These were just a few of the thought-provoking concepts raised by school children to panellists during a tantalising Q&A at the Game Changer Challenge launch at Google on Tuesday.
Sixteen teams from schools across NSW were flown to Game Changer HQ in Sydney during Education Week to compete in a three-day design thinking program where, working alongside leading industry professionals, they will go head-to-head to consider this year’s question: ‘How might we humanise technology?’
The Game Changer Challenge’s Q&A panel – moderated by NSW Department of Education Secretary Mark Scott – included Melanie Silva, managing director of Google Australia and New Zealand; University of Technology Sydney (UTS) Professors Toby Walsh and Mary-Anne Williams; ethicist Dr Matt Beard; entrepreneur Jillian Kilby; and Microsoft national technology officer Lee Hickin.
Responding to a question from a Lake Munmorah High School student about whether AI should be seen as scary or exciting, Professor Walsh said he wasn’t too worried, adding the “fantastical” portrayal of AI in Hollywood movies often hyped the topic up.
“Don’t believe what you see in Hollywood. When you go and see a James Bond movie, you know it’s not real, and the same could be said of AI,” he said.
“It’s actually a much more useful, down-to-earth aspect of our lives than what Hollywood would have you believe.”
His colleague Professor Williams said that while this might be true today, society is seeing “glimpses of tomorrow” and the wide availability of AI has the potential to pose serious ethical challenges moving forward.
“AI is a general-purpose technology, just like electricity, and it can be used for a wide variety of applications by a very wide variety of people,” Professor Williams cautioned.
“Google has made their software available for everybody, so it can be used in many different ways.”
She said society needs to pay greater attention to the laws and economic drivers that will shape AI.
“Ethics encapsulate our values, but it will be the laws that constrain and enforce AI, and the economic drivers will help drive the innovation that we need, and help humans build trust with AI,” Professor Williams said.
One student from Elizabeth Macarthur High School asked ethicist Dr Matt Beard if it would be ethical to program robots to suffer in a physical, emotional or existential sense, and feel empathy.
Dr Beard responded that there is nothing inherently wrong with suffering, or with creating the potential for suffering.
“What is wrong is inflicting suffering on someone or being careless about the fact that someone else is suffering,” he said.
“The big question when it comes to wanting machines or robots who are able to experience something like the emotions and states we feel with regards to suffering is why we would want them to feel that.”
Dr Beard pointed to work being done in the field of robotic care partners for those who are lonely.
“You might want a care partner to experience some kind of empathy and empathetic suffering. If there is someone who has just lost a loved one and the robot is unable to experience that suffering alongside them, that might lead that person to feel even more lonely than they already did,” he said.
“There is nothing lonelier than being alone in suffering, so there might be reasons why you would want that machine to suffer alongside someone else.”
At the conclusion of the panel discussion, Silva – who was interviewed by Plumpton High School students – said the most important skill students needed for the future was curiosity.
“The number one thing is curiosity,” Silva said.
Silva said that if something is taken as a given without further investigation, the opportunity for important questions – and answers – is missed.
“My dream for my kids when they are at school is to think about problems and how they can solve them and be resilient enough to keep trying, even if you don’t solve it the first time,” she said.