Failing gracefully – what we can learn from Toy Story



Medicine's complex universe of regulation and technology makes the idea that “failure is not an option” unrealistic


In the penultimate scene of the animated children’s epic Toy Story, space ranger Buzz Lightyear and Woody, desperate to rejoin their friends, light a rocket strapped to Buzz’s back. The rocket launches the pair skywards, and Buzz manages to glide his way to the family car that had left them behind during the house move.

Buzz couldn’t actually fly. He had learnt, unceremoniously, that he was just a toy and not a flying, laser-firing space ranger, after watching a TV commercial about himself, trying to fly down some stairs, and falling to the ground, breaking his arm.

Buzz, initially inconsolable that he didn’t have the power of flight, adapted, but only after failing. His friend Woody had earlier told him, prophetically: “That’s not flying, that’s falling … with style”.

Is it possible to fail in medicine, with “style”?

It’s not only possible, it’s an acceptable and practical way of dealing with what is now an impossibly complex day-to-day job, says Dr Kevin Fong, a consultant anaesthetist at UCL Hospitals and anaesthetic lead for the patient emergency helicopter response team in Kent and Surrey in England. That is especially true given the vast and often dysfunctional behemoth of systems and regulation that the typical modern western medical ecosystem represents.

Speaking at last month’s DAS SMACC (Social Media and Critical Care) conference in Berlin, Dr Fong, who has studied failure in systems with NASA and the US military and emergency care services, said the modern doctor faced complexity and failure at unprecedented levels, and that learning to manage failure amid such complexity should be incorporated into all medical learning.

“I think we aren’t the doctors who we grew up thinking we would be. We aren’t the single, full-spectrum doctor who is hyper-competent, who, even on their best day, can get through everything flawlessly and not make mistakes. That’s not what we are, and the systems we’re in don’t really support us to be that person anyway,” Dr Fong said.

“Though much of the issue is system based, we don’t have to tear the system down and start again, but we do need to start with a recognition from everybody – ourselves, politicians and society – that the landscape within which we operate is what it is.

“You will always fail at some point in your working career. Probably you will fail at some point in your working day. The idea that death and failure can always be avoided in a complex system is a myth.”

Dr Fong said that modern doctors faced a particularly difficult journey because society’s expectations about risk and failure were increasingly departing from the realistic ability of doctors to manage risk in complex systems.

The idea that sociopathic tendencies can be advantageous in life-and-death situations is rubbish

A key issue is what he terms “failure porn”.

“We [as a society] love characters within stories that can do everything better than anyone else. And within that there is an aspiration, and we think that a lack of failure, or protecting ourselves completely from failure, is the way to go.”

Referring to infallible movie characters such as James Bond and Jason Bourne, Dr Fong said: “These are the archetypal individuals who tell us that this is what you want to aspire to. You want to be able to drive the tank, drive the car, fly the plane and be an expert marksman. And yes, you being a slightly unbalanced character is OK, because you’re competent at everything else.

“The problem is that this is just not true. These people don’t exist. People who can do everything substantially without failing don’t exist, even though we like to kid ourselves that they can.”

The other myth Dr Fong finds deeply misleading is the idea that high-performing people often have sociopathic tendencies, and that this is OK.

Charles Morgan, a psychiatrist at Yale University who did extensive profiling of US special forces soldiers, found that candidates with even mild sociopathic tendencies were screened out very early on.

“Being a psychopath or sociopath doesn’t make you well suited for high-performing tasks,” Dr Fong said.

“Why would it? It doesn’t even make you suitable for being a trained killer. So, it’s not going to help you in your practice of medicine.”

So what does help?

Patient advocate, leadership trainer and pilot for a major UK airline, Martin Bromley, who lost his wife to an anaesthetic accident during a relatively simple procedure in 2009, told the SMACC Berlin conference that, of all the skills a doctor needed to develop to help manage complexity and failure, the most important was what he termed “confident humility”.

Confident humility, according to Bromley, “is having the confidence that you have the skills that you have practised and that you are as good as anybody at those, but you are humble enough to know that you just might be wrong”.

“[In complex systems] there are so many unintended consequences of the things we do, and what we need more than ever at the moment is people who are prepared to ask questions and listen,” Bromley said.

Although Bromley lost his wife to a tragic series of errors, he very quickly forgave the anaesthetist and surgical team who failed. After researching the circumstances of her death – a situation in which she could not be intubated but urgently needed to be ventilated – he recognised that what seemed simple was in fact complex. The anaesthetist, to this day, suffers trauma from the event.

Bromley has spent the last eight years researching and presenting on medical failure and forgiveness, hoping to show as many people as he can how the increasing complexity of medical services is creating more failure situations, not fewer, and that we need to recognise the precarious position we are putting doctors of all persuasions in.

“The problem is you are at the tip of the triangle all the time. We give you all this responsibility and information and we say, ‘be careful, double check, don’t get it wrong’. We put you in the error-prone situation and then we expect you to act as the last line of defence, when the system should not put you in that position.”

One issue Bromley has identified is the gap between the reality of systems and the way they are built and designed by people without experience at the coalface – that is, regulators and system designers who sit upstream of the system’s day-to-day workings.

“We need to reduce that gap and this comes down to listening really,” Bromley said.

Changes can be effective over time, however, according to Bromley, who cited the death of Ayrton Senna in Formula One racing nearly 25 years ago. Until Senna’s accident, fatal accidents in Formula One had averaged one driver per year over the previous 25 years.

Dr Sid Watkins, who was the chief medical officer of Formula One at the time and close friend of Senna’s, determined to change that situation.

“Sid didn’t go around to the drivers and say: ‘Hey, just take it a bit easy out there … slow down a bit, maybe.’  He knew that wouldn’t work. He took his coalface knowledge of the workings of the race and worked on subtle changes to the overall system that could make a difference and support the drivers. Changes to track design, changes to the cars, small rule changes, and standardising medical facilities.”

Nearly 25 years on, there has been only one Formula One death since Senna’s fatal crash.

“That’s a pretty good lesson in systems safety, from a doctor,” said Bromley.

But if the systems are too complex, and doctors are just cogs in the machine, what design principles should regulators and clinician leaders aim for to avoid the most catastrophic medical accidents, and what sort of oversight body can be used to monitor and improve our medical systems?

Bromley and Dr Fong both point to the aviation industry and its exhaustive methods of investigation, especially when a catastrophic accident occurs, as an example of the sort of methodology and process required to advance systems safety for doctors.

The UK has recently established the Healthcare Safety Investigation Branch (HSIB), based largely on the principles of its long-standing aviation counterpart, the Air Accidents Investigation Branch. A key part of the board’s make-up is not only coalface clinicians, but representatives of patients harmed in accidents, or their relatives. The latter had been particularly helpful, said Dr Fong, in understanding how to manage accidents when, inevitably, they did happen.

Of all the modes of failure Dr Fong studied, including those listed in the table above, he found the most applicable model for healthcare while working with the Mars lander team at NASA’s Jet Propulsion Laboratory in the US.

Studying hundreds of systems risk-procedure documents, he was intrigued by the following protocol in one of the Mars Lander’s power architecture processes: “This will be a single string architecture, with graceful degradation”.

When he asked one of the engineers what graceful degradation was, the engineer said to him: “Remember in the late 80s and 90s, when you’d been using a Windows or DOS computer for a couple of hours and were getting tired, and then you got a message something like ‘Abort, Retry, Fail?’ and you lost everything?”

Dr Fong remembered.

“Well, graceful degradation is the opposite of that,” the engineer told him.

Graceful degradation is the property of a complex system that allows it to maintain some, if not all, of its function in the face of discrete failures.

In the context of your iPhone, this might mean the iTunes app bugs out, but that doesn’t bring down the whole phone. You simply restart the app and continue.

In the context of medicine, it means that your inevitable failure as a doctor is rarely, if ever, catastrophic in a systems context, avoiding as far as possible the most severe consequences of total failure.

It’s a medical paradigm that aims not to place a doctor as the very last line of defence on something that is easy to get wrong.
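The idea is easy to sketch in code. The example below is a minimal illustration only, not any real medical or spacecraft system; every name in it (`Subsystem`, `run_cycle`, the music and phone tasks) is invented for the example. A fault in one component is caught and contained, that component is marked as degraded, and the rest of the system carries on, mirroring the iPhone example above.

```python
class Subsystem:
    """One component of a larger system; its failures are contained, not fatal."""

    def __init__(self, name, task):
        self.name = name
        self.task = task      # callable that performs the subsystem's work
        self.healthy = True

    def run(self):
        try:
            return self.task()
        except Exception:
            # Contain the fault: mark this subsystem as degraded instead of
            # letting the exception propagate and crash the whole system.
            self.healthy = False
            return f"{self.name}: degraded"


def run_cycle(subsystems):
    """Run one cycle; degraded subsystems are skipped, the rest carry on."""
    return [s.run() for s in subsystems if s.healthy]


def music_task():
    raise RuntimeError("playback error")   # this component fails...

def phone_task():
    return "phone: ok"                     # ...this one keeps working

music = Subsystem("music", music_task)
phone = Subsystem("phone", phone_task)

print(run_cycle([music, phone]))  # ['music: degraded', 'phone: ok']
print(run_cycle([music, phone]))  # ['phone: ok'] -- the failure stayed contained
```

The design choice mirrors the Mars lander protocol quoted above: rather than pretending components never fail, the system accepts that individual parts will fail and concentrates on containing each failure so the mission, or the patient, survives it.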

Dr Fong refers to the 17th-century discovery that the heart functions as a pump to make a point about metaphor that system designers need to recognise in adopting the concept of graceful degradation.

“How is it that we’ve slaughtered animals since antiquity, but no one worked out that the heart was a pump until the 17th century?” he asked. “The reason is that it wasn’t until the middle of the 17th century that mechanical pumps became ubiquitous in society. The reason we couldn’t understand the function of the heart was that there wasn’t a technology metaphor for us to understand it by.”

Dr Fong said the highly complex systems of NASA’s Mars Landing missions provided healthcare with the appropriate metaphor for medical-systems safety, in their graceful degradation protocols.

Another key NASA principle for medicine arose out of the Challenger missions, Dr Fong said. Although those missions could have used the latest and most sophisticated chips for launch and landing procedures, they used only relatively simple Intel 386 chips to manage the whole process – a chip that was, at the time, in most desktop computers around the world.

The reason?

According to NASA engineers, the reliability and robustness of that processor mattered far more to the safety of the mission than its performance.

The lesson for medicine was: “Reliability is more important than performance.”

Both Bromley and Dr Fong emphasised in their talks that in a system this complex, where failure is inevitable, serious thought has to go into how patients and their relatives are included in the analysis of failure, so that there can eventually be understanding and forgiveness. Because failure is inevitable, it must be managed as much as prevented.

Bromley said the stories of patients and their relatives were vital to handling future accidents better.

“Generally, we have a greater desire to see that the same thing doesn’t happen to someone else, than to blame people,” Bromley said.

A key part of Bromley’s forgiveness of his wife’s surgical team was that the medical staff included him in all the major meetings about his wife from the time of the accident.

Dr Fong, who was involved in establishing the HSIB in the UK, said that during the discussions among various groups on how to set up the body, he was humbled when one key patient advocate member made the following contribution:

“I think healthcare is more about love than most things. At its core, it is human beings who have agreed to be in a relationship where one is trying to relieve the suffering of the other. This is love.”

To which one of the clinicians in the process added: “And if there is love in the system, there has to be forgiveness.”

Dr Fong’s take on the situation: “Forgiveness is me giving up my right to hurt you for hurting me.” (He is quoting Beyoncé!)

“As individuals, when we fail we need to understand that failure came about as a result of love.

“Love that comes from our profession as doctors, and possibly most importantly, we must forgive ourselves for those failures,” Dr Fong said.

