A self-described early adopter of AI scribe technology has taken an active role in making sure the technology is safe and reliable, working with Lyrebird Health.
In the middle of 2023, Dr Kim Drever was stressed, burned out and trying to find a better way to balance the administrative and clinical demands the modern healthcare practitioner faces.
“I was thinking about exiting [the profession],” Dr Drever, a Melbourne developmental-behavioural paediatrician, told The Medical Republic.
“After the pandemic my patients faced unprecedented mental health challenges, with greater severity and acuity than ever before.
“Tragically, I lost two patients to suicide, and my secretary, overwhelmed by burnout, left – leaving my practice stretched to its limit.
“I was overwhelmed by the demands of my profession and seeking something, anything, that could help with the endless documentation, the weight of balancing my work and family and the growing frustration with the inefficiencies in our system.
“I wasn’t sure how much longer I could continue in the career that I love.”
Dr Drever was first introduced to the idea of an AI scribe while attending a paediatrics conference in Darwin last year. Over coffee, a friend showed her the Lyrebird app and said: “I hear you’re looking for more admin support, maybe this would help – I haven’t picked up a pen in weeks.”
“We literally did a mock consultation in the middle of this big exhibition hall, where I pretended to have a headache. He (jokingly) offered me Panadol, gin and a good lie down, and the app created a summary within seconds.”
This encounter sparked something in Dr Drever, who ditched the conference and spent the remainder of her time in Darwin researching the app and how it worked, including reaching out to Lyrebird co-founders Kai Van Lieshout and Linus Talacko.
After doing her due diligence, which involved meetings with the Lyrebird team and consulting her MDO and the TGA, Dr Drever reached a point where she felt comfortable asking patients for their consent to use the technology in her clinical practice.
She ran into issues with the scribe almost immediately, however.
“Within a few days of using it I realised it wasn’t picking up what was happening in my consultations,” she said.
“As a developmental paediatrician my consultations are not typically medical. I’m asking about sleep, about friends at school. To someone who doesn’t know what’s happening it looks like I’m just having coffee with a mother, but they’re strategic questions.”
The problems stemmed from how the technology had been refined and used to that point, primarily in general practice, orthopaedic surgery and physiotherapy.
“My work is completely different to those sorts of consultations, which are very procedural and algorithmic in their approach,” she said.
“So, the AI scribe wasn’t picking up the sort of things that I do which are a bit more subtle or non-linear. It would hallucinate and try to make sense of why I would be asking about their food preferences, who their best friend at school was or whether they preferred spelling or maths.”
This led Dr Drever to go back to the Lyrebird founders and plead her case for further refinement of the AI model.
“You need paediatricians,” she told them.
“We have to do complex work that is documentation [and] admin heavy that also sits across frameworks and disciplines.
“If you get this thing working for someone like me, you can unlock our potential across the system. We get to communicate with teams that are desperate to hear from one another, but our impact is limited by the timeliness and the quality of the communication between team members.
“Yet doctors are often the rate-limiting step because we have to sign off on everything.”
The Lyrebird founders bought into it.
“It aligned with their purpose,” said Dr Drever.
“Kai originally wanted to build this kind of technology to help his own GP, who had supported him through a difficult time in his life.”
Dr Drever spent the next six weeks working with the Lyrebird team, reviewing the AI-generated consultation notes at the end of each day to identify, understand and fix problematic areas of the summaries.
This period of working as a bridge between the end users and the developers was an insightful and rewarding one for Dr Drever.
“[Knowing] it is possible to work with a company that’s building something new for us [and] not building something for themselves… was a really powerful experience for me. It’s rare to see,” she told TMR.
A key turning point in Dr Drever's understanding of why an open, trusting relationship between clinicians and technology companies matters came earlier this year, three or four months into her use of the AI scribe, when a child she was treating disclosed they had been the victim of sexual abuse.
“The ambient scribe sitting in the background was trained to censor potentially harmful content, so it flagged and removed that disclosure from my notes,” she said.
The deletion disappointed Dr Drever, who had valued the way the scribe freed her from being tethered to the EMR while listening to patients. It also meant she would have to formally document the disclosure at the end of the day, after seeing other families, relying on memory to record the child's story accurately.
Disclosures of sexual assault trigger the system to act protectively, which means it is likely that Dr Drever will be required to appear in court to provide evidence on what happened to the child. This process can have significant long-term consequences for the child and the family, so it is critical that Dr Drever has complete and accurate records.
But this encounter provided Dr Drever with an opportunity.
“I picked up the phone, I called the team and explained what happened,” she said.
“Within hours they had spoken to the people at Microsoft and refined the AI model so that these sorts of disclosures, in my context, are not censored.”
While it was not possible to get that particular recording back, Dr Drever is pleased to know that clinical and technological expertise were able to work together so quickly to make sure this situation didn’t happen again.
“This is a model for how we can navigate this brave new world together,” Dr Drever said.
“We are at an inflection point here with digital health. We have a chance to align technology with the timeless values of healthcare. [But] we need to create these tools together.
“It’s not just about technology. It’s about people. It’s about relationships, trust – creating a system where clinicians can thrive, [and] patients can trust the tools that we use whilst they’re in a relationship with us as we care for them.
“I believe that AI and healthcare makes us more human, not less.”
AI.Care 2024 was held in Melbourne on 27 and 28 November.