AI scribes need to be treated carefully.
The technology offers real time-saving advantages, but there are pitfalls doctors need to be aware of.
Health practitioners’ use of AI scribes to assist with note-taking during patient consultations is a hot topic, both in Australia and around the world.
And while there are early adopters who love the technology, and others who feel it is still too early to bring into practice, the reality is that it is here, it is being used, and there are a number of legal obligations to consider.
Anthony Mennillo, MIGA’s Head of Claims and Legal Services, says the medical defence organisation is receiving a lot of calls on the topic from members.
And while he concedes it might still be early days for the technology, he says it has some fantastic potential and the “gates are well and truly already open”.
“Many of the calls about AI scribes are about consent and privacy, primarily about getting the patient’s consent, what that looks like and what are the privacy issues that arise or may arise,” he says.
“It’s important to understand what the AI scribe is doing. Is it just transcribing and summarising the consultation in the right format, or is it generating referral letters? Is it generating prescription medications?
“The practitioner needs to be extra careful about what it’s doing, because one of the things that AI tools generally, not just scribes, can do is hallucinate. They may make up something or interpret something in the summary in a way that is actually not correct.
“There have been some examples reported of medication being prescribed that was contraindicated for the patient that emphasise that need for doctors to be very careful.”
Mr Mennillo points to resources developed by the Royal Australian College of General Practitioners (RACGP) and the Australian Health Practitioner Regulation Agency (AHPRA), which provide detailed and easy-to-understand guidance on the use of AI scribes for note-taking in patient consultations.
In its resource updated in January this year, the RACGP highlights some of the “potential benefits” of using AI scribes, noting that “in most instances, these claims are not yet backed by substantial research evidence”.
“Some have suggested that using an AI scribe might reduce administrative task burden for GPs; allow the GP to focus on the patient during the consultation, instead of a computer; improve patient satisfaction; and reduce doctor burnout,” it says.
However, the RACGP also flags some of the potential problems in this “emerging field”, noting that “there are a number of known issues with AI products that may affect AI scribes”.
“Unforeseen legal problems might also arise as their use increases,” the statement says. These include clinical issues, privacy and security issues, and workflow and practice issues.
In its guidance, AHPRA says “the potential of AI to transform and support innovation in healthcare has been the subject of much media and professional commentary”.
“AHPRA and National Boards support the safe use of AI in healthcare, recognising the significant potential to improve health outcomes and create a more person-centred health system,” it says.
“While the potential of AI to improve health outcomes through improved diagnostics and disease detection has been reported for some time, recent commentary has focussed on the benefits for health practitioners with improved care and patient satisfaction by reducing administrative burdens and health practitioner burnout.”
The regulatory body also includes case studies that show some of the pitfalls in practice. One in particular raises some of the concerns Mr Mennillo says are very important to consider.
The case study tells of a practitioner who uses a health scribing tool to automatically generate clinical notes and a referral letter, which includes an additional plausible diagnosis generated by the tool.
“This scenario raises several concerns. Firstly, this scenario raises similar concerns as case study 1 about privacy and data awareness, and the need for health practitioners to seek informed consent to input confidential patient/client data into an AI tool,” AHPRA says.
“Secondly, the practitioner needs to be aware if they are using a tool that is regulated by the TGA. Some practitioners may assume that the ‘intended use’ of all AI scribing tools is limited to scribing; however, since this tool suggested a diagnosis, it meets the definition of a medical device and therefore should be regulated by the TGA.
“This type of tool must undergo premarket approval and be included in the ARTG prior to supply in Australia unless an exemption applies. If unsure, practitioners can check with the vendor or search the ARTG to check if the tools they are using are registered.”
Regardless of any errors the technology makes, the responsibility rests with the practitioner to ensure the final notes, referrals and any prescriptions are checked and signed off.
Mr Mennillo cautions against the complacency of trusting the technology without checking the transcript for accuracy when it arrives. This, he suggests, is also a good opportunity for the practitioner to make corrections and add further thoughts or notes to the patient record.
“The AI-generated note is still the GP’s note, and the GP is responsible for the note that is produced,” he says. “I think it’s always incumbent on the practitioner to review the notes in their entirety before approving, and they have the ability to amend and to add anything they want to describe.
“What the scribe generates is a working draft. That’s how I like to describe it. And the doctor has the right to and is in fact obliged to review and amend as they see fit.”
Obtaining informed consent from patients to record the consultation is absolutely vital, and while the gold standard is always written consent, verbal consent noted in the patient records is also acceptable, Mr Mennillo says.
And if the patient says they do not consent? That’s a deal breaker, he says.
“Recording without an individual’s consent could well be illegal and I would strongly recommend that the recording be switched off,” he says.
“If it can’t be switched off, then the consultation may not be able to proceed or you go back to paper records.”
Mr Mennillo also says any practice that introduces AI scribes into consultations will need to update its privacy policy to reflect this. The policy should include information about how, where and by whom transcripts are stored, especially if this occurs through a software provider.
Despite all the potential pitfalls, Mr Mennillo is excited about the future for the technology in the healthcare setting.
“I think it’s a great tool to consider implementing into a doctor’s practice,” he says.
“There are certainly things to be cautious about, but there are also things to be really excited about. And the gate has well and truly opened – practitioners are using it, so we need to make sure that everyone who is doing so is fully informed and aware of their responsibilities.”
