Rep. Murtha’s Contribution to the Checklist and System Safety

11 02 2010

This was a headline earlier this week.  Read the first two paragraphs.

Normally I’d just gloss over the details of Rep. Murtha’s passing and accept the insinuation that operating-room mishaps like this are not the norm.  But I just finished reading The Checklist Manifesto by Dr. Atul Gawande, and all of a sudden the term ‘complications’ means so much more.

A routine procedure like the laparoscopy Rep. Murtha was having is fallible precisely because it is routine, and because the task at hand is so minimally invasive and trite compared to, say, an emergency lobotomy.  It’s simple.  Or, at least, straightforward enough not to give surgeons stage fright in the operating room.  So why did this happen?

When we say that there were complications, we admit that the problem is one of complexity.  Complexity refers not only to there being many players, which is true – the proper tools, personnel, and preparation need to be in place – but also to the way they must interact for there to be a reliably successful outcome.  With people, this interaction is teamwork and the ability to manage all available resources cohesively and quickly.

A startling example Gawande gives of this is that one item on the medical checklists being used in many institutions around the world makes sure – yes, makes sure – that everybody in the operating room knows everybody else’s names.  Introductions.  Apparently, formalities like this were routinely ignored until something happened, at which point communication amounted to little more than stumbling commands to nameless faces.

James Reason’s 1997 Managing the Risks of Organizational Accidents reminds us to ask not why the failure of the surgeon who “hit his intestines” happened, but how it failed to be corrected – especially when we already know that human nature will make good on any opportunity to err under pressure to succeed with only one chance, regardless of skill, know-how, or determination.  It happens that checklists allow an “activation phenomenon” to occur when each doctor, nurse, anesthesiologist, or resident is allowed to contribute.  People will begin to feel valuable and important to the cause (the patient on the operating table) and will be more inclined to speak up if they see something wrong.  If sharp-end operators are going into the workplace with the mind-set that they are completely independent and fully capable of performing alone, then they are more ignorant of the critical dependencies in the medical system than I originally thought.

Perhaps the hairy eyeball directed at things like checklists and communications comes from the fact that power, or at least the feeling of it, can be lost.  Which is true, granted.  Reason features this decentralization of authority when he describes a “flexible culture” as a requirement for an organizational safety culture.  When emergencies arise, the hierarchy needs to collapse, and front-line workers need to be autonomous and trusted to handle the situation promptly.  Seeking approval for quick corrective action wastes valuable time and can have dire consequences.  Gawande discusses this at length when he talks about the federal government and the Hurricane Katrina fiasco.

Also, many industries and professions become so obsessed with improving individual components or addressing specific concerns that they can’t see the forest for the trees; they can’t see latent, systemic threats.  Or worse, they can, but they’ve got a bureaucratic wedgie big enough to keep them from doing anything about it in any effective sense.  An army of external distractions and chaotic variances has been encroaching on the safe and simple practices people and organizations have learned to take for granted.  And that’s why we can no longer afford not to sweat the stupid stuff.

How the nick on Rep. Murtha’s intestines failed to be corrected is probably a result of too little being done too late.  Reactive safety procedures are effective only in pacifying the devil-and-angel team on our shoulders who scream together, “Well, we tried!”  When the patient is gushing blood is not the time to start thinking about what to do.  This is where Gawande makes his case for the checklist as a tool that can “instill a discipline of higher performance” consistent with predicting failure and preparing for the worst.  Had the surgeon preempted a slip of his hand (which I understand is a common surgical occurrence) with a plan for coordination and a briefing of his staff, we would have seen each player act as part of a collaborative unit instead of as just one of a collection of players.

“Man is fallible, but maybe men are less so,” says Dr. G.

This is all to say that we could do with less professional arrogance (to be blunt) and fewer people who believe that “our jobs are too complicated to reduce to a checklist.”  Again, Gawande says of checklists: “They are quick and simple tools aimed to buttress the skills of expert professionals,” not to belittle or replace them.

I have a grand sum of zero experience with medicine, and yet this post was fairly easy for me to think about.  I just pretended that everything had to do with aviation.

It appears that, like gall-bladder surgery, flying an airplane is simple, too.  The acting-as-a-crew part, or the focusing-in-an-emergency part, is what’s difficult.  That’s why the first item on the emergency checklist for an engine failure in any single-engine aircraft is so stupidly obvious: FLY THE AIRPLANE.

Rest in peace, Representative Murtha.

Brian





Fledgling and chaotic, what’s it all mean?

3 01 2010

It may strike some as odd that a blog about safety and security in aviation would shoot off references to chaos and disorder.

This blog’s reference to chaos is more scientific and quantifiable than naive and presumptive.  In the 1960s, Edward N. Lorenz coined the term “butterfly effect” to convey the gist of a theory called ‘chaos’.  Chaos theory takes a stab at explaining how the outcome of a system that changes with time – like a flight – can vary wildly if even tiny changes are made to the original circumstances surrounding that system.

The original connection was with the idea that the flap of a butterfly’s wings can forever change the course of the weather (that should clear some things up), but for the purposes of this blog, you have to view aviation operations for what they are: a system that is dynamic, pulled on by many external attractors, and neither linear nor inherently prim and proper.  As meticulously orchestrated, precisely executed, and well-intentioned as every flight may appear to be, even that 45-minute jump from NYC to Boston is more than just sequential pulling and pushing on the yoke.  An “unsafe” event can still occur if connecting passengers are late and the crew is excessively interrupted during their preflight cockpit flows.  Or if a dispatcher calls in sick at the last minute.  Or if a butterfly skips a beat in Argentina.
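For the curious, here is a minimal sketch of that sensitivity – my own illustration in Python, not anything taken from Lorenz’s papers.  It integrates Lorenz’s three famous equations twice, starting the two runs a hundred-millionth apart, and prints how quickly they drift into completely different states.

    # Toy illustration of sensitive dependence on initial conditions using
    # Lorenz's 1963 system, integrated with a crude Euler step.
    def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        return (x + dx * dt, y + dy * dt, z + dz * dt)

    a = (1.0, 1.0, 1.0)           # the flight as planned
    b = (1.0, 1.0, 1.0 + 1e-8)    # the same flight, nudged by a butterfly

    for step in range(5001):
        if step % 1000 == 0:
            gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
            print(f"t = {step * 0.01:4.0f}   separation = {gap:.8f}")
        a, b = lorenz_step(a), lorenz_step(b)

The separation starts at 0.00000001 and, by the end of the run, is roughly the size of the attractor itself – the two “flights” no longer have anything to do with each other.  That’s the whole point: the system is perfectly deterministic and still, in practice, unpredictable.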

[Photo: Emirates A340-500 A6-ERC – the JFK runway 31L departure overspeed incident]

Many of the bumps with baggage carts, misinterpreted clearances, runway incursions, technical malfunctions, and sudden cushionless meetings of stone and metal (thanks, Mr. Gann) have palpable causes, an unsurprising chain of events, and conceivable mitigation strategies that are well within the realm of feasibility for aircraft operators but were never implemented or otherwise acted on.  A focus on human factors and risk management has never been far from the industry and there is no shortage of acronyms to describe the work that has already been done.  After ASAP, FOQA, IEP, CRM, and SMS, chaos can expand on the famed Swiss Cheese model by making us aware of the way imperfections in all of our safeguards interact and work together.
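To make that interaction a little more concrete, here is a toy sketch of my own – in Python, with made-up hole probabilities, nothing from Reason or Gawande.  Each layer of defense occasionally has a hole, and harm only gets through on the rare flight where the holes in every layer happen to line up.

    import random

    # Toy Swiss Cheese model: each defensive layer independently "has a hole"
    # with some small probability; an accident needs every hole to line up.
    layers = {
        "dispatch":        0.02,   # illustrative probabilities, not real data
        "preflight_check": 0.03,
        "crew_monitoring": 0.05,
        "atc_backstop":    0.01,
    }

    def holes_line_up(hole_probabilities):
        return all(random.random() < p for p in hole_probabilities.values())

    flights = 1_000_000
    accidents = sum(holes_line_up(layers) for _ in range(flights))
    print(f"accidents in {flights:,} flights: {accidents}")

With the layers independent, the expected rate is just the product of the four probabilities – about 0.3 accidents per million flights here.  The quiet assumption is the independence: if the same bad night of distractions, fatigue, or late connections opens holes in several layers at once, the layers stop covering for each other and the rate climbs fast.  That correlated, interacting kind of failure is exactly what the chaos view keeps pointing at.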

The aviation industry is not greater than the sum of its parts – it IS the sum of its parts.  As any poster in a flight school will say, safety is not an accident, and it most definitely will not happen by any sort of ‘strength in numbers’ logic.  My effort here is to recognize the inherent complexity of aviation systems and their vulnerability to shortcomings in any and all areas of a given operation.

Aviation resists being made safe precisely because of how intricate it is.

Please join me with your thoughts, comments, and criticisms.  I’m really looking forward to discussing and learning.

Brian Futterman