Rejecting the use of AI to improve patient care carries downsides of its own.
A leading expert in digital health has challenged Australians to balance their fears about sharing data and using AI in healthcare against the risks of not sharing it.
Professor Clair Sullivan, Director of the Queensland Digital Health Centre at the University of Queensland, told an audience of software vendors in Brisbane earlier this week that the conversation around the importance of data sharing for optimal healthcare “lacked maturity”.
“As a society, we have a relatively immature and unbalanced discussion about the risks of data sharing, and have not had one about not sharing,” Professor Sullivan told the Medical Industry Software Association forum.
“So how do you start to have that conversation? We need to talk about the good that sharing data does and the risks of not sharing the data.
“If we’re not sharing data, we can develop inequalities – we have randomised controlled trials with people who want to participate in a randomised clinical trial. We exclude people who are busy, who have social disadvantage, or who have no interest or trust in participating in a [process] that will fail to include multiple types of populations.”
Engendering trust was key to encouraging participation in processes around the sharing of data, particularly health data, Professor Sullivan said.
“We must mitigate risks if we are going to share, and data must be absolutely secure. We must protect privacy. We would like to obtain informed consent wherever possible. If we’re not going to do that, it needs a proper waiver of consent,” she said.
Professor Sullivan began her presentation by positing that the human brain was no longer “fit for purpose” for the delivery of optimal healthcare.
“Research says we can only think of four things at once,” she said.
“If you’re trying to work out an insulin prescription along with remembering that this person also has osteoporosis, menopause, and she’s got depression – if doctors, nurses and allied health practitioners can only think about four things at once, and more than half their patients in the next decades are going to [be increasingly complex], how do you think the human brain is going to care for those people?
“Clinical brains, such as mine, are evolved to think about one or two things at a time, because when we were trained, and when the lexicon of medicine was developed, people had one or two or three major things [wrong with them].
“Now, in the 2020s, people have depression, osteoporosis, diabetes, hypertension, renal failure, hypercholesterolaemia, all at once. And that means that things such as hypertension, cholesterol and anaemia may not get attended to when you’re looking after the renal failure and diabetes, because your brain simply cannot hold that amount of bandwidth.”
When it comes to difficult topics like data sharing and artificial intelligence, Professor Sullivan said, the question now was: what if we don’t use them?
“If we don’t do this – if we don’t take on the dangers and risks associated with AI, what are the risks to our patients?” she said.
“We have to accept [our brains] are no longer fit for purpose. We need to digitally transform healthcare; we need to increase that compute power.”
She said achieving this required effective digital workflows and the creation of new and innovative models of care.
“Digital health is our only way to continue to provide high quality care,” she said.
“There are risks to digital health and AI, but there are also significant risks in not using the data to improve health. I think this is a very exciting time to be in healthcare.”