Delving deeply

02-05-2014

What can healthcare organizations learn from HROs? Dr Dan Cohen, international medical director of Datix, shares a personal experience to illustrate how high reliability principles can enhance integrity and patient safety.

It has been argued that if the healthcare industry were to adopt the principles of High Reliability Organizations (HROs), it would move the bar for quality and safety higher and therefore improve outcomes. HROs are those that function in industries that are inherently dangerous but that have established cultures and supporting processes designed to dramatically reduce the likelihood of human error and harm. 

HROs recognize that in the interactions between humans and technologies, it is the humans who represent the most substantial sources of risk. For example, in the aviation industry the airplanes are so well designed, with multi-redundant, highly engineered systems, that the risks arise primarily from the aircrew. Human beings and human factors are the sources of most risk. So what can we learn from HROs?

One important characteristic that defines HROs is the reluctance to simplify—the avoidance of simplistic explanations for risks or failures. There is a commitment to delve deeply to understand the vulnerabilities, especially when they involve correctable human factors. 

Errors in diagnosis occur for a variety of reasons, including deficiencies in education and training, cognitive impairments, over-reliance on heuristics, and important parameters of behavior elucidated by Jerome Groopman in How Doctors Think, published in 2007, such as confirmation bias, attribution errors and diagnostic momentum.

In addition, human factors such as workload pressures, task saturation, personality issues, and even family and social pressures all work to impair diagnostic capabilities, often resulting in errors of omission or commission, patient harm and the potential for malpractice claims.

Professor James Reason’s ‘Swiss cheese model’ of accident causality, outlined in Human Error (1990), requires us to think beyond the holes in the first or second slices of cheese and to meander through the substance of the Emmental, probing for the weak spots, the tiny holes or the ‘about-to-become’ holes, in order to identify the numerous root causes of harm.

Case report

A while back I was asked to review the findings of a root cause analysis investigation involving the death of a patient following a routine, non-urgent cholecystectomy. The patient was morbidly obese, had diabetes mellitus and had a longstanding history of psychological problems and hypochondriacal complaints.

What struck me about this investigation was the range of factors that contributed to this catastrophic outcome, and the fact that, had basic principles learned in medical school been applied, the death might have been prevented. I was taught, first and foremost, to ask the same question every day, every time, for every patient interaction: “what do the vital signs tell us?”

In this case, shortly after the procedure had begun, the surgeon was informed that a suture he preferred, one that was part of his normal setup, was not available. The surgeon adapted a different suture to suit his requirements, a work-around that others might also have made. Although planned as a laparoscopic procedure, the surgeon converted it to a more invasive open procedure because of access problems related to the patient’s morbid obesity. At one point a small duodenal puncture was suspected but not confirmed.

Postoperatively, the patient complained of abdominal pain “beyond normal expectations” (nursing note in chart), nausea and anorexia, which the surgeon attributed to her psychological constitution and hypochondriacal personality.

After three days she was discharged to her home for self-care. Her examination at discharge revealed mild abdominal tenderness, attributed to incisional pain. There was no record of intestinal sounds in the surgeon’s discharge note. The patient was advised to call if there were any problems and to return to the surgeon’s outpatient clinic in one week.

Thirty-six hours after discharge, the patient suddenly became unresponsive after having developed a distended abdomen over several hours. She was readmitted urgently with a diagnosis of bowel perforation and abdominal sepsis. Despite operative intervention and intensive care support, she died 48 hours after readmission.

The hospital performed a root cause analysis investigation and the findings concluded that this “unfortunate incident” was primarily related to insufficiencies in the hospital supply system resulting in the absence of the surgeon’s choice of suture.

A pre-op surgical protocol including ‘time out’ had been performed but had failed to note the absence of the suture. The surgeon had acted in good faith in adapting another suture, and bowel perforation was a recognized complication of surgery that the patient had been advised about when providing informed consent. Recommendations were made to improve efficiencies in the supply chain, the surgical setup and the pre-op checklist protocol processes.

This “unfortunate incident” resulted in the death of a mother who would never see her first grandchild. Anyone troubled yet? I began my analysis by using the fundamentals I learned early in medical school: “what do the vital signs tell us?” Postoperatively, the patient’s vital signs revealed a slowly yet consistently increasing temperature (eventually reaching 38°C), a steadily increasing resting pulse and a slowly decreasing blood pressure.

The patient’s blood pressure was still within the normal range but considerably lower than on admission or in the outpatient setting. This combination of findings, together with continual complaints of abdominal pain “beyond normal expectations” (pain itself often elevates blood pressure), nausea and anorexia, should clearly and unconditionally have suggested abdominal sepsis related to bowel perforation.

Additionally, the two bowel movements this patient had after surgery had both contained streaks of blood, which had been reported in the nursing notes but not communicated verbally to the surgeon.

The patient’s bowel perforation was a treatable condition and her death from sepsis was most likely preventable. The patient died because the surgeon failed to pay attention to the objective signs of evolving infection; these were fundamentals he was taught in medical school.

The patient might have lived to see her first grandchild had things been handled differently. Inattention to detail and failure to respond to evidence in a timely fashion were the cause of this patient’s death: not a supply problem, not a surgical setup problem, not a pre-op checklist protocol problem, and not even a failure of nursing communication.

Physicians are obliged to read nursing notes and to ask their nurse colleagues to share their concerns!

This root cause analysis lacked integrity. It was not intended to delve deeply to identify the real cause of this woman’s death, i.e. the very compelling failures in professional performance and the factors that might have contributed to them. It was a whitewash, one that revealed a systemic institutional and cultural problem; it would serve no purpose, lead to no learning and possibly contribute to harm in other patients.

Unless we learn from mistakes, those mistakes are perpetuated. By extension, the hospital lacked integrity in its processes and was not highly reliable.

If a person or process has integrity, that’s what matters most; if a person or process lacks integrity, that’s all that matters.

Dr Dan Cohen is international medical director of Datix. He can be contacted at: dcohen@datix.co.uk