A constant frustration for many in healthcare is the cognitive dissonance between the elegant, highly anticipated promise of technology solutions and the messy, lived complexity of clinical practice.
In this context, I was fascinated by, and feel compelled to share, this unexpectedly revealing excerpt from a recent (and, as always, captivating) a16z podcast, featuring a conversation between a16z founder Marc Andreessen and board partner Balaji Srinivasan, recorded at Stanford.
Following an extensive conversation about the factors associated with startup success and venture capital (VC) success, as well as about emerging (or re-, re-emerging) trends such as artificial intelligence (AI), an audience member asked whether AI might not select investments better than actual VCs, a VC version of the "will computers replace doctors?" gauntlet that tech VCs have thrown down before the medical establishment.
Andreessen's response (at around the 40-minute mark) speaks for itself, but also, I'd argue, for most in healthcare (emphasis added):
The computer scientist in me and engineer in me would like to believe this is possible, and I'd like to be able to figure this out; frankly, I'd like us to figure it out.
The thing I keep running up against, the cognitive dissonance in my head I keep struggling with, is what I keep seeing in practice (and talk about in theory vs. in practice). Like in theory, you should be able to get the signals: founder background, progress against goals, customer satisfaction, whatever; you should be able to measure all these things.
What we just find is that what we just deal with every day is not numbers, is nothing you can quantify; it's idiosyncrasies of people, and under the pressure of a startup, idiosyncrasies of people get magnified out to like a thousandfold. People will become like the most extreme versions of themselves under the pressure they get under at a startup, and then that's either to the good or to the bad or both.
People have their own issues, have interpersonal conflicts between people, so the day job is so much dealing with people that you'd have to have an AI bot that could, like, sit down and do founder therapy.
My guess is weâre still a ways off.
Who knew that developing data-driven tech solutions could be challenging in a profession that at its core is focused on human idiosyncrasies, especially under conditions of stress?
Dr David Shaywitz is the chief medical officer at DNAnexus. This was first published on The Health Care Blog.