If the healthcare industry were to adopt the characteristics and methodologies of other dangerous industries, could reliability be improved? Dr Dan Cohen from Datix offers a personal perspective.
The healthcare industry is defined by continuous change. However, continuous change does not necessarily mean continuous improvement. Emerging technologies hold great promise for advancing diagnostic and therapeutic options, yet the growing number and complexity of those options increases the risk of latent system failures that can harm patients.
Every day thousands of patients are harmed, and hundreds die, in modern, well-equipped hospitals staffed by highly trained individuals who have devoted themselves to careers as helpers. Benevolent intentions do not necessarily translate into safety, however, and some of the reasons for this are known, some unknown. No matter how you choose to cut the pie, the healthcare industry is dangerous for patients and also for the staff working inside it. The challenge that remains is to understand how so many things can go wrong when the intention is to achieve quality outcomes.
High reliability organizations (HROs) are those that function in industries that are inherently dangerous and that have established cultures and supporting processes designed to dramatically reduce the likelihood of human error and harm. HROs recognize that, in the interactions between humans and technologies, it is the humans who represent the most substantial sources of risk. For example, in the aviation industry airplanes are so well designed, with multiply redundant, highly engineered systems, that the risks arise primarily from the aircrew. Human beings and human factors are the sources of most risk.
It has been argued that if the healthcare industry were simply to adopt the characteristics and methodologies of other dangerous industries, if it adopted the principles of HROs, it would raise the bar for quality and safety and would improve outcomes. If this is true, why is the healthcare system burdened by an inertia that plagues improvement strategies?
My perspective is that the healthcare industry possesses a variety of complex and unique characteristics. Even though principles that work in HROs can be transferred or translated to healthcare in specific circumstances, other factors encumber healthcare processes in ways that do not apply to other industries.
Authors Karl Weick and Kathleen Sutcliffe have summarized the operating characteristics of an HRO as:
• Sensitivity to operations—a constant awareness by leaders and staff to risks and prevention, a mindfulness of the complexities of systems in which they work and on which they rely;
• Reluctance to simplify—avoidance of overly simplistic explanations for risks or failures and a commitment to delve deeply to understand sources of risk and vulnerabilities within systems;
• Preoccupation with failure—a focus on predicting and eliminating catastrophes rather than reacting to them; a ‘collective mindfulness’ that things will go wrong and that near-misses are opportunities to learn;
• Deference to expertise—leaders and supervisors listening to and seeking advice from front-line staff who know how processes really work and where risks arise; and
• Resilience—leaders and staff trained and prepared to respond when systems fail, working effectively as teams to overcome urgent challenges.
In my view, if the healthcare industry is a forest of complexities, then two huge trees set it apart from other high-risk industries. These giant coastal redwoods are (1) the sheer frequency of human-to-human interactions, which compounds complexity because of the inherent challenges of communication; and (2) a highly appropriate world view that envisions patients not just as passive recipients of healthcare services but as essential participants in a system properly focused on sustaining health and achieving optimal healthcare outcomes. Communication, and partnering with patients for active engagement—two trees standing high above the forest floor; it's pretty complicated.
A PERSONAL VIGNETTE
Professor James Reason’s Swiss cheese metaphor for accident causation is a highly regarded model of how multiple failures often align to cause harm, and of how barriers for prevention and avoidance exist in most circumstances. I have learned this lesson well in my career as a medical doctor.
A 10-month-old child was admitted at the weekend for evaluation of a renal mass, likely a Wilms’ tumor. The institutional protocol required the oncology registrar to administer actinomycin D intravenously as soon as the renal vein had been clamped. I wrote the orders correctly and legibly using our standard double-check process.
In addition to covering the inpatient oncology service I had weekend obligations for the outpatient clinic and the bone marrow transplant unit, located in two different, though adjacent, hospitals.
Usually this multiple coverage obligation was not a problem but this weekend two children with leukaemia were to receive outpatient l-asparaginase, and I had to be present because of the risk of anaphylaxis. Recognizing this dilemma, I arranged for the anaesthesiologist to administer the actinomycin D and briefed her thoroughly regarding the dosage, even providing a copy of the prescription.
An emergent cardiac surgery case occurred on the same weekend so the anaesthesiologist had to take responsibility for that case. She briefed a substitute anaesthesiologist and felt that the situation was well in hand.
However, the pharmacist made a decimal point error and instead of preparing a dose of 97 micrograms of actinomycin D, he sent up a syringe containing 970 micrograms; the substitute anaesthesiologist did not recognize this error.
Several hours later the error was identified. The syringe that had contained the medication was attached to the medical record (a standard procedure at that time). I was shocked. Though the overdose was not immediately toxic, its effect on this child’s bone marrow would be profound, beginning a week after administration, and I was reasonably certain that we had killed this child and that I was ultimately responsible.
I called my consultant immediately and, after calming me down, he said some things that really resonated. “Dan, we do not know that this child is going to die. We can expect that she will encounter severe bone marrow suppression, but we do not know the outcome of that, and we need to be factual and honest when we meet with the parents.”
The following morning I carefully explained to them that their daughter had received a higher than desired dose of medication and that we were very concerned about this. I apologised, explained that we would investigate this further, outlined the steps we would take to protect their child and promised to correct any discrepancies in care we identified—in other words, full disclosure.
This is what we identified:
Elements of unreliability identified—system problems:
• Protocol for intra-operative chemotherapy not evidence-based, i.e., experimental; informed consent for this not obtained preoperatively;
• Single oncologist responsible for coverage in multiple hospitals/settings;
• Cultural barrier discouraged calling for back-up except in dire emergencies;
• Not all anaesthesiologists qualified for all procedures; and
• Pharmacy double-check for chemotherapy not established.
Elements of unreliability identified—personnel accountability issues:
• Anaesthesiologist did not inform oncologist regarding coverage issues;
• Pharmacist error in preparation;
• Substitute anaesthesiologist administered an unfamiliar drug without self-identifying the need to verify dose or side-effects; and
• I did not call for qualified back-up.
So, what happened to this little girl? To make a long story short, though she encountered profound bone marrow failure and spent three weeks in isolation, with much procedural pain and fear, she came through her experience wonderfully and was cured of her Wilms’ tumor. The parents were incredibly grateful for my honesty and support and hugged me tightly when their daughter finally went home. Though personally devastated at the time of this incident, I felt restored and validated by the parents’ heartfelt gratitude.
If the healthcare industry is truly to function like an HRO, then the kinds of challenges and variances portrayed above must be anticipated in advance, so that appropriate fail-safe mechanisms can be established to provide for all contingencies.
Oh, by the way: the higher dosage of actinomycin D was subsequently adopted for the treatment of resistant sarcomas.