In 2005, Elaine Bromiley died as a direct result of medical errors during a routine operation.
The case review identified that a well-equipped operating theatre and a team of technically skilled clinicians had failed to respond appropriately to an unanticipated emergency. They had failed to use equipment properly, follow protocols, maintain situational awareness, or make decisions appropriately. Leadership was confused, and those with contextual authority ignored attempts by the team to speak up.
Elaine’s husband Martin was an airline pilot, and because of his training and the culture he worked in, the issues were obvious. He could see that the staff in the operating theatre caring for his wife had not been properly briefed, had not rehearsed protocols around equipment use, and had no common understanding as a team of how to maintain situational awareness. Leaders had no understanding of the dangers of fixation, and - most critically - open communication was explicitly suppressed at the moment of the emergency.
These are known as ‘human factors’, and a great deal of time and attention is paid to them in both the training and practice of the airline industry. Basically, systems - routines, processes, structures, culture, etc - are designed in ways that minimise the possibility of human error. So you’ll see even the most experienced airline pilot running through a checklist with a co-pilot; you’ll hear junior engineering staff asked - as a matter of routine - for their perspective on a component check; you’ll see aircrew run through the same pre-flight safety protocol whether there are three people on a flight or 300.
It was clear to Martin that it wasn’t the clinicians who had failed in his wife’s case, but that the system and training had failed them.
He could only see this because he was outside the system, looking in from another, and he decided to do some research. What he found was that this wasn’t an isolated problem. It wasn’t just that team, or that department, or that hospital, or that trust. It was an issue across the whole health service. But he also saw that, whilst there were pockets of good practice and interest, the system was not going to change by itself. It needed a catalyst from the outside to provoke the shift that was needed. So despite having to bring up two young kids on his own whilst coping with the loss of his wife, he decided that others should be able to learn from what had happened, and that human factors should be both better understood and better implemented in clinical practice.
In 2007, working with a group of like-minded experts and clinicians and inspired by the Royal Aeronautical Society, Martin launched the Clinical Human Factors Group, aiming to promote understanding of human factors in healthcare from Board to Ward and beyond. They knew that the health system was very different to the aviation industry. But they were surprised to find that many of the differences - for example, the weakness of centralised regulation; the siloed working practices and tribalism; and the defensive responses to calls for change, particularly from 'the outside' - were not just defended but were points of pride for some, even though it was clear to everyone that they limited the effectiveness of both systems and practice and therefore put lives at risk. Even though there was a political and strategic intention to focus on maintaining and improving safety, the reality was a service that was "the result of hundreds of years of inertia, denial and vested interest [and] more akin to the teaching professions of old or the church". Critically, there was nowhere near enough focus on the patient or the clinician.
CHFG's vision is of a healthcare system that places an understanding of human factors at the heart of improving clinical, managerial and organisational practice, leading to significant improvements in safety, efficiency and effectiveness. It works to achieve this through helping healthcare organisations build high reliability into operations, encouraging the development of cultures that seek to learn from mistakes as well as successes, and promoting human factors science through education and training. And in the 15 years since it started, CHFG has put human factors on the political, regulatory and strategic agenda; launched an independent 'no-blame' investigations branch using the model from the transport industry; and built a large bank of resources and seminars for everyone from clinicians and patients to policy makers and CEOs. Human factors training is now being incorporated into professional standards, and human factors is now central to national projects around patient safety and investigations.
And all because someone from outside the system with no clinical experience got involved. I wonder if clinicians are still arrogant enough to ask when Martin Bromiley last operated on someone.
There are several videos about the Bromiley case which are used in training programmes, including this: