
Proceedings of the Fourth Resilience Engineering Symposium

Erik Hollnagel, Éric Rigaud, Denis Besnard (eds.)

Patient Safety: A Wake-Up Call and Resilient Response

Samuel B. Sheps, Karen Cardiff and Rob Robson

Abstract

Despite a decade of intense effort, the problem of unintentional (or preventable) patient harm in health care facilities remains a challenge. Recent research from hospitals that have actively engaged in patient safety initiatives reports rates of adverse events similar to, and in some cases ten times higher than, those estimated in the 1999 Institute of Medicine report, "To Err is Human". We believe the difficulty in making significant headway on the patient safety agenda is due in part to the fact that a) it was always going to be a long (indeed never ending) struggle – aviation, for example, took almost 60 years to become ultra-safe – and b) to a misunderstanding of the nature of the dynamics involved in the generation of adverse events in risk critical industries. In this paper we reflect on the safety initiatives that health care has tended to focus on, which have not sufficiently taken note of central concepts of safety science and resilience, and on deeply embedded features of the health care system itself that have, in our view, impeded progress on enhancing patient safety.

Author's note

Note on author 1

Note on author 2

Note on author 3

Full text

Patient safety: A wake-up call

The recent publication in the New England Journal of Medicine estimating the rate of adverse events (AEs) in 10 North Carolina hospitals (Landrigan et al., 2010) is a startling (but not surprising) demonstration of the limitations of the patient safety movement since the Institute of Medicine (IOM) report (Kohn et al., 1999). Rates of adverse events are essentially unchanged (indeed, they may be much higher; Classen et al., 2011) in hospitals that have undertaken significant safety initiatives over the last decade (Institute for Healthcare Improvement, 2011; Canadian Patient Safety Institute, 2011), as measured with the same methods (nurse screening and physician review) used in studies in Canada, the UK, New Zealand and Denmark. This suggests that current thinking about, and approaches to, adverse events in health care need serious reconsideration. This paper discusses key issues, central to creating safety and grounded in system safety science, that health care leadership has failed to understand or, worse, has ignored.

Public events vs private events

In most risk critical industries adverse events (i.e. "accidents") are public phenomena. There is obvious wreckage, blood and/or significant environmental damage: they are hard to ignore. To remain viable, such industries must emphasize safety as a key business objective (equal to productivity), articulated at the highest governance levels. It might seem puzzling that this is not so in health care. However, in health care adverse events are "cloistered" within institutions (e.g. acute care hospitals) and are fairly common, but sporadic. Such events are unlikely to come to public attention unless reported in the press, which is rare given their frequency. Moreover, patients in hospital are sick, many with complex conditions; thus it is easy to ascribe a death, for example, even if not anticipated, to complications. This response, a defensive reaction, stifles thought about the dynamic factors that must be addressed proactively and collectively to reduce the likelihood of harm. Unlike other risky industries, hospitals do not go bankrupt, regardless of harm.

Quality is not safety, and safety is not quality

Another reason, supported by empirical data and expert opinion (Dekker, 2006a; Hollnagel, 2009; Cardiff and Sheps, 2010), is the conflation of quality and safety, which limits the development and scope of relevant solutions to reduce harm. Chassin and Loeb's (2011) recent paper is the latest example of commentators making this fundamental conceptual error. Quality efforts are fine, but there should be no illusion that providing higher quality care will create something conceptually and practically different: safer care (Dekker, 2007a). Quality is an inherent characteristic or property of a person, object, or process that enhances care and/or fulfills ISO 9000 requirements. Accreditation activities seek to achieve similar, rather linear, goals. Safety, however, is an emergent manifestation of organizational activity, achieved through anticipating the interplay of the biological, physical, social, financial, political, emotional and occupational factors that create harm arising from care (Rasmussen, 1997; Grote, 2004; Hollnagel, 2009). Health care systems that rely primarily on more (or improved) rules, procedures, guidelines and training will not achieve the desired outcome. In short, until there is a fundamental cognitive shift in thinking about the sources of failure in complex, dynamic systems (i.e. "sources" that look a great deal like the routine activities of normal work), improvements to patient safety will be limited, as, sadly, Chassin and Loeb (2011) illustrate so well.

Normal accidents: Old view vs new view

A variety of theories and conceptual models have been developed to help understand what accidents actually are, how they happen and how they can be avoided or mitigated (Sheps and Cardiff, 2008; Hollnagel, 2009). An initial focus on technical issues as sources of failure was followed by the recognition that human performance was a critical factor. More recently, organizational influences were understood as equally important in human and technical performance. Finally, it has become clear that all three elements actually interact, in often surprising ways, to create harm.

Essentially, there are two major views regarding the origins of adverse events. The first, called the "old view" (Dekker, 2006b), maintained that "human error" was the primary cause, given improving technical reliability; that is, the system was inherently safe. The chief threat to safety was the fundamental unreliability of people, interacting with each other or with increasingly complex technology. Thus safety was thought to be about protecting the system from humans through proceduralization, automation, training and discipline, mirroring the quality paradigm in its reliance on standards, guidelines and "best practices". Regulatory, structured and prescriptive approaches have been instrumental in making all risk critical industries, over many decades, ultra-safe, despite the fact that these industries do not conflate quality and safety. The means to this achievement are, however, insufficient to maintain it (Hollnagel et al., 2008).

In contrast to the "old view", the "new view" of safety, based on Normal Accident Theory (Perrow, 1984) and Rasmussen's work on managing risk (Rasmussen, 1997), considers "human error" a symptom of deeper problems within the system; safety is NOT an inherent, fixed characteristic of systems. Risk critical systems, like health care, undertake hazardous activities; thus people have to juggle multiple demands and make trade-offs while at the same time being the central system component that creates safety. Moreover, safety is a feature of people's tools, tasks and work environments; thus safety comes from understanding and influencing the dynamics between these elements of work (Dekker, 2006b; Hollnagel, 2009). Finally, in health care, perhaps more than in other risk critical industries, human beings are in close proximity to events: people provide care, and their decisions and actions attract immediate attention. However, "human error" is NOT an explanation, but rather demands an explanation (Dekker, 2006b).

System resilience (Hollnagel et al., 2006) is the fundamental basis of the "new view". Despite this deeper understanding, health care organizations in general remain fixated on narrowly targeted, empirically driven "solutions" embedded in a "diagnose and treat" mindset (Cook, 2010).

Organizational/professional self-image

Adverse events are "surprising" as well as deeply disturbing (Dekker, 2006b). Hospitals perceive themselves as providing good care and their professionals as highly skilled, competent and caring – both perceptions are true. Thus patient harms, when they occur, "represent" egregious, and largely personal, failures. The self-image of both the organizations and the professionals involved is seriously (and painfully) questioned. Given that professionals are implicitly taught (and deeply believe) that they must be perfect, patient harm gives rise to finger pointing and guilt.

Adverse events as surprises, however, need to be understood as "normal" in risk critical, dynamic settings. A recent paper (Wachter and Pronovost, 2009) has suggested that physicians (in particular) must be fully accountable for their decision making, the default position of the "old view" that humans are the critical weakness in an otherwise smooth running health care system. This re-emphasis on personal accountability in complex, dynamic and risky work environments is worrisome and, even more disturbingly, will not prevent patient harm. Woods (2005) has fully explored this unhelpful thinking about accountability in contemporary health care, particularly the accountability-responsibility gap, and its negative impact on organizational and individual learning.

Communication between senior management and front line staff

A significant issue emerging from almost all adverse event investigations is poor communication, both among front line staff and between the blunt (governance and management) and sharp (clinical) ends of health care organizations. Front line miscommunication, whether written or oral, arises in part from the interaction between hierarchical/professional disconnects, time pressures and a lack of sense making within professional teams. Transitions in care (emergency room to ward, ward to surgical suite, shift changes, etc.) are opportunities for miscommunication deeply embedded in the daily work of health care providers (Cook et al., 1998).

Miscommunication between the blunt and sharp ends of care occurs in many risk critical industries: blunt end decision making sets up efficiency-thoroughness trade-offs (Hollnagel, 2009); there is insufficient understanding at the blunt end of what sharp end workers do on a day-to-day basis to permit the organization to function (Dekker, 2006a; Dekker, 2006b); work is inherently underspecified; and decision making processes are hierarchical. Executive walk-arounds intended to address blunt-end-sharp-end disconnects, for example, are scripted and thus fail to address the fact that work as imagined by senior executives/board members seldom reflects actual work.

Moreover, institutional learning, touted by the safety movement, is not possible through reporting systems that systematically remove emotional and contextual narratives as reports go up the hierarchy (Waring, 2009). In addition, front line staff who dutifully report rarely get responses from senior levels, contributing to the creation of second victims (Wu, 2000).

While there is much talk in health care about a safety culture, few people in most organizations, if any, really have a clear understanding of what this would actually look like with regard to organizational behaviour, day-to-day operations or responses to adverse events (Cook, 2010). Indeed, most "safety culture" surveys emphasize fear of discipline rather than engaging staff in conversations about how they might create safe care, often when at the margins of safety, i.e. how to practice resilience.

Risk management or learning from critical incidents

Risk management, as health care's legal defense mechanism, exists to demonstrate compliance with rules and regulations, mitigating (usually in financial terms) institutional liability. Until recently, risk management had little engagement with current thinking about creating safety in complex, dynamic organizations. Indeed, risk management is fundamentally hampered in understanding safety precisely because traditional legalistic thinking requires a victim and a suspect (Dekker, 2007b). Health care is still prone to frame problems of liability in terms of linear cause and effect ("root cause") relationships, de-contextualization and hindsight bias; hence the perceived need for new rules and procedural guidelines, education or discipline, or all of these. Risk management has not traditionally asked: why did the actions of those involved make sense to them at the time? What is "known", and becomes the focus of attention, is the harm, and harm requires a remedy or else the organization is viewed as lacking "proper accountability". Human error becomes an explanation when in fact it requires an explanation (Dekker, 2006b). Useful explanations, however, consider the complex array of human-human and human-technology interactions, inherently flawed decision making in the face of multiple (often conflicting and stressful) physical and cognitive demands, poor functional design, fatigue and other human factors inherent in complex socio-technical systems (Woods and Cook, 2002; Hollnagel, 2004; Dekker, 2007a; Hollnagel, 2009).

Isolation and siloing of current efforts

Safety initiatives tend to be highly targeted (siloed) within health care organizations. "What happens in surgery stays in surgery"; thus those not directly involved are often thought unable to benefit or learn from the problems that arise in the surgical suite, the ICU, infection control, the pharmacy, etc. There is little, if any, opportunity for cross learning to prevent harm, despite similar challenges presented by technology, human factors and organizational constraints. This is partly a professional isolation issue, and partly the result of the lack of any organization-wide structured locus for learning. Attempts at team building (e.g. briefing-debriefing, safety huddles, etc.), while effective, remain largely local.

Interestingly, health care organizations seem to take a much more structural and functional approach to financial and human resource issues than to safety. Hospitals generally have substantial, well staffed Finance and Human Resources Departments to ensure that the organization does not incur financial liabilities and manages staff effectively. It is odd that the prevention and mitigation of patient (and staff) harm is not thought to need a similar structural and functional presence within organizations: one that integrates all safety activities, acts as a clearinghouse for innovative approaches to adverse events, coordinates and learns from investigations, and provides leadership for senior management and Boards as well as front line staff. Significant infrastructure for patient safety is advocated by Pronovost et al. (2008).

Conclusion

The provision of health care is an intensely risk critical and highly dynamic process; it is fraught with potential for harm and dire consequences. Amazingly, harm in any one institution is generally infrequent, though in aggregate it must command our attention. Initiatives intended to mitigate this harm, while useful, are increasingly limited. While a firm bedrock of standards, policies and procedures, as well as intensive clinical training, is essential (as it has been in other risk critical industries), it is, by itself, insufficient to address the dynamic, interactive nature of the factors that can lead to patient harm. A new perspective, resilience engineering, is needed, as has been recognized by many other risk critical industries.

Failure can be a great teacher if every level of the organization is willing to learn from it (Madsen and Desai, 2010). It is critical to understand that safety emerges both from the daily work of health care professionals and management and from constant reflection on how that work can be safely accomplished. Blame destroys learning. A reconsideration of how health care has addressed the patient safety problem is required to secure the gains already made in some areas (e.g. ICUs, proper use of surgical checklists) and to move clinical and organizational thinking toward the more cognitively relevant areas of resilience thinking. Resilience accepts that harm is potentially inherent in all facets of care and seeks to support those whose work creates safety (and sometimes harm) to anticipate the unexpected, to be sensitive to small signals suggesting that problems are imminent, and to openly communicate these concerns in a process of sense making at front line and governance levels. This is not an easy task, but health care can become safer if there is a willingness to support learning the lessons presented on a daily basis about what might go wrong, and to be open to listening and working together (leadership and staff). Landrigan et al. (2010) and Classen et al. (2011), like Miss Clavel (Bemelmans, 1955), have "turned on the light", but it is still incumbent on health care to openly acknowledge and concretely admit "that something is not right."

Acknowledgement

The authors wish to acknowledge that this paper was originally published in Clinical Governance: An International Journal, Volume 16, Issue 2, 2011. The paper published in the 2011 Resilience Symposium Proceedings is a revised and significantly amended version of that paper.


References

Bemelmans, L. (1955), Madeline in America, Scholastic Press, New York, NY.

Canadian Patient Safety Institute (2011), "Safer Healthcare Now!", available at: http://www.patientsafetyinstitute.ca/English/Initiatives/shn/Pages/default.aspx (accessed January 10, 2011).

Cardiff, K. and Sheps, S. (2010), "If you think quality and safety are the same...think again", paper presented at the System Safety Society Canadian Chapter Spring Symposium, June 10, 2010, Ottawa, Ontario, available at: http://canada.system-safety.org/meetings/past_presentations.php (accessed January 10, 2011).

Chassin, M.R. and Loeb, J.M. (2011), "The ongoing quality improvement journey: Next stop, high reliability", Health Affairs, Vol. 30, No. 4, pp. 559-568.

Classen, D.C., Resar, R., Griffin, F., Federico, F., Frankel, T., Kimmel, N., Whittington, J.C., Frankel, A., Seger, A. and James, B.C. (2011), "'Global Trigger Tool' shows that adverse events in hospitals may be ten times greater than previously measured", Health Affairs, Vol. 30, No. 4, pp. 581-589.

Cook, R.I., Woods, D.D. and Miller, C. (1998), A Tale of Two Stories: Contrasting Views on Patient Safety, National Patient Safety Foundation, Chicago, IL.

Cook, R. (2010), “The complexity of the healthcare system”, paper presented at the 1st Nordic Patient Safety Conference, May 20-21, 2010, Stockholm, Sweden.

Davis, P., Lay-Yee, R., Briant, R., Scott, A. and Schug, S. (2002), “Adverse events in New Zealand public hospitals I: Occurrence and impact”, The New Zealand Medical Journal, Vol. 115, No. 1167, pp. 1167-1172.

Dekker, S. (2006a), “Past the edge of chaos”, Technical Report, working paper, Lund University School of Aviation, Lund University, Ljungbyhed, Sweden, March 2006.

Dekker, S. (2006b), The Field Guide to Understanding Human Error, Ashgate Publishing Limited, Aldershot, England.

Dekker, S. (2007a), “Safety is not a dividend of quality”, paper presented at the Winnipeg Regional Health Authority, October 31, 2007, Winnipeg, Manitoba.

Dekker, S. (2007b), Just Culture: Balancing Safety and Accountability, Ashgate Publishing Limited, Aldershot, England.

Grote, G. (2004), “Uncertainty management at the core of system design”, Annual Reviews in Control, Vol. 28, pp. 267-274.

Hollnagel, E. (2004), Barrier Analysis and Accident Prevention, Ashgate Publishing Limited, Aldershot, England.

Hollnagel, E., Woods, D.D. and Leveson, N. (2006), Resilience Engineering: Concepts and Precepts, Ashgate Publishing Limited, Aldershot, England.

Hollnagel, E., Nemeth, C.P., and Dekker, S. (2008), Resilience Engineering Perspectives, Volume 1: Remaining Sensitive to the Possibility of Failure, Ashgate Publishing Limited, Aldershot, England.

Hollnagel, E. (2009), The ETTO Principle: Efficiency-Thoroughness Trade-Off: Why Things That Go Right Sometimes Go Wrong, Ashgate Publishing Limited, Aldershot, England.

Institute for Healthcare Improvement (2011), “Protecting 5 Million Lives from Harm”, available at: http://www.ihi.org/IHI/Programs/Campaign/Campaign.htm?TabId=1 (accessed January 10, 2011).

Kohn, L.T., Corrigan, J.M. and Donaldson, M.S. (1999), "To err is human: Building a safer health system", available at: http://www.providersedge.com/ehdocs/ehr_articles/To_Err_Is_Human_%20Building_a_Safer_Health_System-exec_summary.pdf (accessed January 10, 2011).

Landrigan, C.P., Parry, G.J., Bones, C.B., Hackbarth, A.D., Goldmann, D.A. and Sharek, P.J. (2010), “Temporal trends in rates of patient harm resulting from medical care”, The New England Journal of Medicine, Vol. 363, No. 22, pp. 2124-2134.

Madsen, P.M. and Desai, V. (2010), “Failing to learn? The effects of failure and success on organizational learning in the global orbital launch vehicle industry”, Academy of Management Journal, Vol.53, No. 3, pp. 451-476.

Perrow, C.B. (1984), Normal Accidents: Living with High-Risk Technologies, Princeton University Press, Princeton, New Jersey.

Pronovost, P.J., Rosenstein, B.J., Paine, L., Miller, M.R., Haller, K., Davis, R., Demski, R. and Garrett, M.R. (2008), "Paying the piper: Investing in infrastructure for patient safety", The Joint Commission Journal on Quality and Patient Safety, Vol. 34, No. 6, pp. 342-348.

Rasmussen, J. (1997), “Risk management in a dynamic society: A modeling problem”, Safety Science, Vol. 27, No. 2, pp. 183-213.

Sheps, S. and Cardiff, K. (2008), “Creating high reliability organizations in the Canadian healthcare system”, working paper, School of Population and Public Health, University of British Columbia, Vancouver, British Columbia, June 17, 2008.

Sheps, S. and Cardiff, K. (2009), “Resilience engineering: A necessary shift in thinking and practice to improve the management of patient safety”, working paper, School of Population and Public Health, University of British Columbia, Vancouver, British Columbia, August 31, 2009.

Wachter, R.M. and Pronovost, P.J. (2009), "Balancing 'no blame' with accountability in patient safety", The New England Journal of Medicine, Vol. 361, pp. 1401-1406.

Waring, J.J. (2009), "Constructing and re-constructing narratives of patient safety", Social Science and Medicine, Vol. 69, No. 12, pp. 1722-1731.

Woods, D.D. and Cook, R.I. (2002), “Nine steps to move forward from error”, Cognition, Technology & Work, Vol. 4, pp. 137–144.

Woods, D.D. (2005), "Conflicts between learning and accountability in patient safety", DePaul Law Review, Vol. 54, pp. 485-502.

Wu, A.W. (2000), “Medical error: the second victim”, British Medical Journal, Vol. 320, pp. 726-727.

Notes

1 School of Population and Public Health, University of British Columbia, 2206 East Mall, Vancouver, B.C. Canada V6T 1Z3 sam.sheps@ubc.ca

2 School of Population and Public Health, University of British Columbia, 2206 East Mall, Vancouver, B.C. Canada V6T 1Z3 karen.cardiff@ubc.ca

3 Healthcare System Safety and Accountability PO Box 238, Elora, Ontario, Canada N0B 1S0 rrobson@hssa.ca
