Tensions over responsibility for AI guard rails

Regulating AI is like trying to control ‘a drug that’s changing every day’. Should it be the TGA or the National AI in Healthcare Council? Good question.

Experts say legislative change is needed to expand the TGA’s remit to include a broader range of AI software and devices, but how responsibility for the various stages of oversight of AI in healthcare should be carved up remains fraught.

Testifying before the Senate inquiry into adopting AI yesterday, several members of the Australian Alliance for Artificial Intelligence in Healthcare (AAAiH) involved in developing the National Policy Roadmap for AI in healthcare were divided on how ongoing evaluation of AI applications should be undertaken after implementation of the roadmap.

According to AMA federal councillor Dr Michael Bonning, while the AMA supported the roadmap in principle, there were also several challenges around the practicalities of implementing its 16 recommendations. 

These include the question of how responsibility for regulatory oversight of AI applications should be divided between the TGA and the National AI in Healthcare Council proposed by the roadmap.

“As is always the case, there is a breadth of issues in the roadmap, most of which are clear and well supported; it gets down to the nuance of how you achieve many of the issues in the roadmap,” Dr Bonning said.

“[The National AI in Healthcare Council] still has to sit alongside the fact that therapeutic goods are and have been for a long time administered by the TGA. 

“It would be a challenge to excise certain things, but not others from the TGA’s role, [particularly as] what is considered software as a medical device, and what is considered an AI solution will become more and more blurry over time. 

“Under the Therapeutic Goods Act, [the TGA] is actually the one empowered to do that. [But given] the resources at the moment, it would be a question for them as to how they would move that role into some other organisation.

“That’s why we say at a principles level, the idea that there is expertise put in place to review, govern and continually reassess AI is important, but how you do it is still a challenge.”

Professor Enrico Coiera, director of the AAAiH, agreed that expanding the TGA’s regulatory scope was vital to address gaps in current frameworks, but said there was also a need for ongoing, post-market evaluation of AI applications, which fell outside the TGA’s remit.

“There is a nuance. The TGA is essentially pre-market. It’ll take the technology as given by a vendor and run it through the hoops [and] once it’s out in the market, you presume it’s going to continue to work as it did before,” Professor Coiera said. 

“But generative AI will keep changing over time, so there is sort of a post-market requirement for surveillance to make sure things are still okay, and that might fall within an accreditation space, like the [Australian] Commission on Safety and Quality in Health Care (ACSQHC).”

As for how the various levels of responsibility for pre- and post-market oversight of AI applications would be negotiated between the TGA, the ACSQHC and the National AI in Healthcare Council, Professor Coiera did not elaborate beyond describing the council’s aim of fostering greater collaboration and communication between stakeholders.

“The thinking was there are so many organisations with partial responsibility and they don’t always talk to each other, and we also don’t know where the gaps necessarily are,” he said. 

“It would just be great, without creating something large and complex, [to bring] all the people in the room that need to be talking to each other.” 

Professor Farah Magrabi from the Australian Institute of Health Innovation also emphasised the need to widen the TGA’s remit to address “grey zones” where AI applications enter the market without any performance assessment.

“The TGA is the first port of call [for regulation],” Professor Magrabi said. 

“At the moment, these software systems are being provided to GPs — they’re available — but there’s no data on how well or how badly they perform.

“They fall through a gap at the moment because software that is just there for record-keeping is not subject to the TGA medical device regulations. 

“They put out a statement in May clarifying that if generative AI is being used in software, it’s subject to medical device regulations, so they’ve kind of pushed as far as the current legislation allows them to, but there is that grey zone where it doesn’t cover record-keeping software.” 
