Humans are better at dealing with the complexity around us than AI. Unless you can describe it, you can't automate it.
ChatGPT has shaken up many people in healthcare, with possibilities for clinical application that are both tempting and mildly unsettling.
Fortunately, Australia has homegrown AI safety experts, like Professor Farah Magrabi, who are spearheading the push for regulatory and ethical frameworks to maximise the opportunities of AI while mitigating the risks.
Professor Magrabi did a PhD in AI years before ChatGPT made it a hot topic. She is now professor of biomedical and health informatics at the Australian Institute of Health Innovation at Macquarie University and is instrumental in the Australian Alliance for Artificial Intelligence in Healthcare.
Professor Magrabi says there is vast opportunity for AI to improve healthcare but that the use of consumer-grade technology, like ChatGPT, is a problem.
What's been keeping you busy?
Last month I co-chaired MedInfo 2023, the World Congress for Medical Informatics. It's the biggest international meeting for our discipline. We had 2500 delegates from 63 countries. It was a fantastic meeting, a great mix of the health service industry and academia as well as the tech sector, and Australia did a great job of hosting it. It was also a bit of a party for five days, so I'm still coming down from the high of that.
The other thing keeping me busy is my work for the Australian Alliance for Artificial Intelligence in Healthcare (AAAiH). We're a group of about a hundred stakeholders, all interested in how we might best ensure the safe and responsible use of AI in healthcare and in coming up with a national agenda for that.
We're working out the most important policies we need to have in place to make sure we can use this technology, but also trying to understand and mitigate some of its side effects.
Hasn't the AAAiH previously submitted a proposal like that to government?
In part. In December 2021 the AAAiH put together a Roadmap for AI in Healthcare for Australia. There’s a bunch of recommendations there and safety, quality and ethics were top of that agenda. However, the roadmap did not include specific policy recommendations around who needs to do what.
Did the government pick up on the roadmap in 2021?
I don't think it was very much on people's radars. It's taken ChatGPT to really create action. The Department of Industry, Science and Resources this year put out a consultation on supporting responsible AI in Australia. (Submissions have now closed.)
Definitely the government has a role, but then health services will have a role as well. The AAAiH is meeting this month to really unpack the roadmap and work out what policy we need, and what else is needed to ensure that we’re ready to take advantage of all the benefits of this technology.
What are the key issues for health with generative AI like ChatGPT?
ChatGPT, and similar technologies, might be highly effective at, for example, generating a clinical summary after a GP consultation, but nobody’s really tested it for that purpose.
Nobody has really systematically studied it for clinical purposes. That's where we need to be careful. When consumer-grade systems are starting to be used for clinical purposes, what we're saying is just to pause a little bit there. Let's think about what's really going on.
If you're using a transcription program, what you put in is what you get out. However, the minute you start using AI to summarise information, the system may look really accurate on the surface, but you don't know if it's been tested, for example, for rare conditions.
What keeps you awake at night?
It’s things like climate change and what’s going to happen with the Voice to Parliament.
I ask myself, "What can I do about these big and small issues so that I can sleep better at night?" and the answer is making sure that I take action where I can.
For example, I sit on the editorial board of the Journal of the American Medical Informatics Association. About two years ago, the editor wanted to put together a special issue on climate change. They were looking around the room and I thought, well, somebody needs to do it and I think it's really important. So I said, "Let me have a go. I think I know somebody who can help me. Let me make sure we do something." That's my approach, making sure I do what I can do.
Professor Nick Talley, emeritus editor of the Medical Journal of Australia, has been doing this for a very long time. What he often says is, "What will you say when the next generation asks you, 'What did you do about climate change?'"
To quote another researcher, hopefully we will say that we acted with foresight, we acted with courage, and some haste as well. Because we all need to be doing our little bit. This is not a problem for someone else to solve.
It's reassuring that you're not kept awake at night by the threat of AI.
Machines are really smart, but I think I put my money on the humans. Things in AI go wrong when humans are out of the loop, and I think there's a lot of complexity in this world. Humans are better at dealing with the complexity around us. Unless you can describe it, you can't automate it.
If you were Mark Butler for a day what would you do?
Make sure that the health system puts sustainability on the agenda and is serious about tackling climate change. Make sure that sustainability is built in by design. The world is changing but we're not adapting what we do to what's happening around us.
Wild card question: what's your favourite season?
It has to be this time where we’re coming to the end of winter and starting to get into summer. I love Sydney at this time of the year.
It sort of seems unfair to brag to people living in cold cities about the beautiful sunshine in Sydney. But this is the time when we get the magnolia and cherry blossoms. Then in a few weeks we'll get into wisteria and jacaranda. It's just lovely to get out there with all the colours as the days get a little bit longer, a little bit warmer.
Professor Magrabi will be speaking at Wild Health Summit in Sydney on Monday 11 September, on the implications of large-scale generative AI on healthcare and the clinical workforce.