Think of the savings!
Healthcare and medicine have been prominent victims of the spiteful and recklessly destructive forces now sweeping through US institutions under the banner of cost-cutting.
The USAID shutdown will, among other horrors, leave thousands of clinical trial participants stranded, while the cuts to NIH grant funding (temporarily blocked by the courts) threaten to bring the world’s largest medical research machine to a standstill.
Then there are the cats with bird flu.
And who knows what havoc awaits when this guy, as seems increasingly likely, wins his confirmation.
So here’s a fun tidbit you may have missed.
A bill introduced into the House of Representatives by Arizona Republican David Schweikert late last week seeks to amend the Federal Food, Drug, and Cosmetic Act “to clarify that artificial intelligence and machine learning technologies can qualify as a practitioner eligible to prescribe drugs”.
This merry amendment, if passed, will be known as the “Healthy Technology Act of 2025” and will add this clause to section 503(b) of the Federal Food, Drug, and Cosmetic Act (21 U.S.C. 353(b)):
“(6) In this subsection, the term ‘practitioner licensed by law to administer such drug’ includes artificial intelligence and machine learning technology that are—
“(A) authorized pursuant to a statute of the State involved to prescribe the drug involved; and
“(B) approved, cleared, or authorized under section 510(k), 513, 515, or 564.”.
These sections refer to FDA premarket notification and approval requirements for device manufacturers and authorisation for emergency use of medical products.
So Dr Botface would have to pass the rigorous vetting processes of the not-at-all-independence-challenged FDA, which is surely able to withstand the influence of Big Tech just as easily as it does that of Big Pharma.
Mr Schweikert’s government webpage, by way of explanation, reprints a story from policymed.com that starts with this cute scenario:
“The year is 2030, you have been in the doctor’s office for two hours, answering what seems like hundreds of questions on their augmented reality computer. You are excited about meeting your new doctor. Finally, the office staff robot announces, ‘the computer will see you now.’ Your initial reaction is, ‘What!?!’ The private equity firm that owns the practice has apparently just bought the recently-released HAL 35 computers to replace your retiring physician.
“How did we get here? … ”
Reassuring.
The folks at MedPage Today spoke to Adam Rodman, director of AI programs at Beth Israel Deaconess Medical Center and assistant professor at Harvard Medical School, who sanguinely told them the tech was not ready for its close-up, but that the legislation “reflects enthusiasm about the technology” and that such a provision might help address the “huge care gaps” when people can’t see a doctor.
Note the absence of the doctor in that scenario – per Professor Rodman the AI would not be engaging in admin or aiding clinical decision making, but would be there to prescribe drugs when a doctor couldn’t.
And you thought pharmacist prescribing was a bit dicey.
The American College of Physicians put out a position statement on AI last year whose top, first, No 1 recommendation was that “AI-enabled technologies should complement and not supplant the logic and decision making of physicians and other clinicians”.
Professor Stephen Fihn, executive deputy editor of JAMA Network Open, gave a more appropriately alarmed comment to MedPage, saying “huge sets of regulations” would be needed – which in the current climate is like saying huge swarms of parasitic wasps would be needed.
The bill’s application of AI and machine-learning technologies “seems premature”, Professor Fihn said, as it appears to enable the “actual prescribing of drugs, some of which are very low risk, and some of which are very high risk”.
But AIs are less prone to error than humans, right?
As Futurism notes: “AI has already fumbled in healthcare repeatedly – like the time an OpenAI-powered medical record tool was caught fabricating patients’ medical histories, or when a Microsoft diagnostic tool confidently asserted that the average hospital was haunted by numerous ghosts, or when an eating disorder helpline’s AI Chatbot went off the rails and started encouraging users to engage in disordered eating.”
The best that can be said is that in a health system so far up the spout it makes ours look like Denmark’s, maybe the bots can’t make it much worse.
Send numerous ghosts to penny@medicalrepublic.com.au.