Showing posts with label High reliability organization. Show all posts
Wednesday, May 23, 2012
HROs
Tweeted by @peterlachman, who is attending Risky Business: a link to a very useful resource about high reliability organizations. See here.
Tuesday, October 18, 2011
Escape Fire and high reliability
I have been rereading two books recently: Managing the Unexpected (a fantastic read) and Berwick's Escape Fire. Berwick spoke about the Mann Gulch fire disaster in Montana, about which Weick has written. Although at first pass one wonders what lessons a forest fire in Montana holds for delivering safe care in healthcare, there are many deep lessons to be learned. This is a paper well worth reading. A few examples:
To be wise is not to know particular facts but to know without excessive confidence or excessive cautiousness. Wisdom is thus not a belief, a value, a set of facts, a corpus of knowledge or information in some specialized area, or a set of special abilities or skills. Wisdom is an attitude taken by persons toward the beliefs, values, knowledge, information, abilities, and skills that are held, a tendency to doubt that these are necessarily true or valid and to doubt that they are an exhaustive set of those things that could be known.
In a fluid world, wise people know that they don't fully understand what is happening right now, because they have never seen precisely this event before. Extreme confidence and extreme caution both can destroy what organizations most need in changing times, namely, curiosity, openness, and complex sensing. The overconfident shun curiosity because they feel they know most of what there is to know. The overcautious shun curiosity for fear it will only deepen their uncertainties. Both the cautious and the confident are closed-minded, which means neither makes good judgments. It is this sense in which wisdom, which avoids extremes, improves adaptability.
Partners and partnership are critical.
A partner makes social construction easier. A partner is a second source of ideas. A partner strengthens independent judgment in the face of a majority. And a partner enlarges the pool of data that are considered. Partnerships that endure are likely to be those that adhere to Campbell's three imperatives for social life, based on a reanalysis of Asch's (1952) conformity experiment:
(1) Respect the reports of others and be willing to base beliefs and actions on them (trust);
(2) Report honestly so that others may use your observations in coming to valid beliefs (honesty); and,
(3) Respect your own perceptions and beliefs and seek to integrate them with the reports of others without deprecating them or yourselves (self-respect).
Thursday, September 29, 2011
Normalisation of Deviance
One of my favourite terms is the “Normalisation of Deviance”. I think it captures perfectly what happens every day in healthcare. We work around problems, pat ourselves on the back for finding a (temporary) solution, and ignore the very profound message that the system is sending us; this message essentially is “you don't understand your system and your processes, and if you don't work out a way to fix this problem, there will be serious consequences”. In Reason's Swiss Cheese model, all those little holes are problems that usually were recognised well before a serious event occurred, but were most often ignored and certainly never consciously addressed and removed.
As Scott Snook, a lecturer from Harvard puts it:
Each uneventful day that passes reinforces a steadily growing false sense of confidence that everything is all right – that I, we, my group must be OK because the way we did things today resulted in no adverse consequences.
A paper that describes the consequences of this thinking, really a form of magical thinking, is available on the NASA website; it is a section of the report into the Columbia Space Shuttle disaster. Available here.
Monday, April 4, 2011
Simulation
There is increasing interest in the benefits of simulation for improving outcomes in healthcare. A piece in the Economist piqued my interest in this area. It describes a story I had never heard before, involving the role of former US President Jimmy Carter in dealing with a reactor meltdown in Canada in 1952. Carter led a 23-man team sent to disassemble a reactor near Ottawa that had partially melted down. Such was the radiation exposure that each person was limited to 90 seconds at the core. To ensure that the process went as smoothly as possible in such adverse conditions,
"The team built an exact replica of the reactor on a nearby tennis court, and had cameras monitor the actual damage in the reactor's core. "When it was our time to work, a team of three of us practised several times on the mock-up, to be sure we had the correct tools and knew exactly how to use them. Finally, outfitted with white protective clothes, we descended into the reactor and worked frantically for our allotted time," he wrote. "Each time our men managed to remove a bolt or fitting from the core, the equivalent piece was removed on the mock-up."
The take-home message is no surprise: to be successful, simulation must be as realistic as possible, with real-time feedback. A paper published in January in PCCM reports that with increasing use of simulation of cardiac arrest in a children's hospital, survival post cardiac arrest increased to 50%, substantially above the national average.
Wednesday, December 8, 2010
High Reliability Organizations
There has been a great deal of discussion recently about healthcare adopting the same approaches that have enabled high reliability organisations to achieve exceptional levels of safety despite operating in high-risk, high-consequence environments. Examples include aircraft carriers, nuclear submarines, the nuclear power industry and aviation.
What are the features of a high reliability organisation? Are these concepts translatable to healthcare? Are there any examples of HROs in healthcare?
There are, according to one expert in the field, five characteristics of HROs:
•Preoccupation with failure rather than success; this is self-explanatory. The HRO almost celebrates failure and actively seeks it out, recognising that only by recognising the defects within its systems can it seek to rectify those defects.
•Reluctance to simplify interpretations; always seeking the explanation, especially the explanation that defines the cause of a possible future mistake.
•Sensitivity to operations; to be sensitive to operations, we must monitor a messy reality and respond to the unexpected.
•Commitment to resilience; HROs recognise that not every risk can be mitigated, but they anticipate failure and ensure that redundancy is built into the system.
•Deference to expertise; instead of hierarchical structures determining responses, decision making in an HRO migrates to the persons with the most expertise in that area.
The key difference between HROs and other organisations is that they respond differently to what others would consider signals of no significance. Some have described this aspect as mindfulness: the capability to detect weak signals and respond strongly to mitigate their potential adverse consequences. An example in healthcare might be a delayed result for a routine test: of no significance in itself, but a warning that the system is prone to error and that a time-critical result may also be delayed. The HRO responds immediately to address this failure; the Low Reliability Organisation (LRO), effectively all of healthcare, is unlikely to take any action. HROs are constantly looking for evidence of failure or potential failure.
Clearly these concepts can be applied to healthcare, though the details are likely to differ. However, the only organisations that will successfully make this transition are likely to be those in which the culture is receptive, indeed greedy, for this change, and in which the leadership sees becoming an HRO as the number one priority of the organisation. This is such a fundamental shift that very few organisations are likely to succeed in their attempts to become HROs.
I recently asked two physicians, world experts in safety who lead the safety and quality efforts in their hospital, probably the most advanced hospital in the world in this field, where their institution stood on a 1-10 scale in safety. About a 3-4 on a good day, they replied. That is the characteristic of a hospital that is striving to be the best and safest in the world, but recognises that, despite being the best, it has a long journey ahead.
This book is probably one of the seminal works describing HROs, and I recommend it highly. Weick & Sutcliffe