Improving Patient Care

Five System Barriers to Achieving Ultrasafe Health Care

René Amalberti, MD, PhD; Yves Auroy, MD; Don Berwick, MD, MPP; and Paul Barach, MD, MPH

From the Cognitive Science Department, Brétigny-sur-Orge, France; Percy Military Hospital, Paris-Clamart, France; Institute for Healthcare Improvement, Cambridge, Massachusetts; and Jackson Memorial Hospital and Miami Center for Patient Safety, University of Miami Medical School, Miami, Florida.


Acknowledgments: The authors thank Frank Davidoff, Jane Rossner, and Marshall Gilula for editorial review and helpful suggestions.

Potential Financial Conflicts of Interest: None disclosed.

Requests for Single Reprints: Paul Barach, MD, MPH, University of Miami Medical School, North Wing 109, 1611 Northwest 12th Avenue, Miami, FL 33136; e-mail, pbarach@med.miami.edu.

Current Author Addresses: Dr. Amalberti: Département sciences cognitives, IMASSA, BP 73, 91223 Brétigny-sur-Orge, France.

Dr. Auroy: Hôpital Percy, Service d'Anesthésie-réanimation, 101 Avenue Henry Barbusse, 92141 Clamart Cedex, France.

Dr. Berwick: Institute for Healthcare Improvement, 20 University Road, 7th Floor, Cambridge, MA 02138.

Dr. Barach: University of Miami Medical School, North Wing 109, 1611 Northwest 12th Avenue, Miami, FL 33136.


Ann Intern Med. 2005;142(9):756-764. doi:10.7326/0003-4819-142-9-200505030-00012
Key Summary Points

In health care, the premium placed on autonomy, the drive for productivity, and the economics of the system may severely constrain safety and lead to adverse medical events.

Several key building blocks must be addressed before other solutions to the problem of unsafe medical care can be considered. Among these building blocks are controlling maximum production, adopting the equivalent actor principle, and standardizing practices.

Safety in health care depends more on dynamic harmony among actors than on reaching an optimum level of excellence at each separate organizational level.

Open dialogue and explicit team training among health care professionals are key factors in establishing a shared culture of safety in health care.

The notion of a 2-tiered system of medicine may evolve logically by distinguishing between health care sectors in which ultrasafety is achievable and sectors that are characterized by ambition, audacity, and aggressive efforts to rescue patients, in which greater risk is inherent in the goals.

More than 5 years ago, the Institute of Medicine report “To Err Is Human: Building a Safer Health System” highlighted the need to make patient safety a major priority for health care authorities (1). Since then, the pressure to increase patient safety has continuously grown in western countries. Priority has focused on identifying and reducing preventable events. Important changes have already been made to accident and incident reporting systems and the associated techniques of analysis (2–6). However, the upper limit of harm prevention is unclear (7). Many investigators have proposed that adapting the success strategies and tools of ultrasafe systems, such as those used in the aviation and nuclear power industries, will lead to comparable successes and safety outcomes in health care (8, 9). The reality is probably more complicated. Many complex industries—for example, the chemical industry or road safety—have adapted the safety tools of advanced systems and made important gains in the past 2 decades. However, the safety results from most of these efforts top out well before the level reached by the civil aviation and nuclear power industries (10). This limit does not seem to be due to insufficient tools, low competence among workers, or naive safety strategies. For the most part, it seems to be the consequence of a conscious tradeoff among safety goals, performance goals, and the organization of the specific profession. Becoming ultrasafe may require health care to abandon traditions and autonomy that some professionals erroneously believe are necessary to make their work effective, profitable, and pleasant.

A comparative analysis of industry behavior demonstrates that becoming an ultrasafe provider requires acceptance of 5 overall types of constraints on activity. This analysis is based on a screening of various sociotechnical domains, such as the aviation, nuclear power, chemical, and food industries; road transportation; and health care. The benchmark analysis aims to associate specific traits of these industries with their safety performance. We then describe 5 high-level organizational dimensions derived from the general literature on risk and safety (11–13), each of which is associated with a range of values: type of expected performance (from standardized, repetitive daily routine work to highly innovative work), interface of health care providers with patients (from full autonomy to full supervision), type of regulation (from a few recommendations to full specification of regulations at an international level), pressure for justice after an accident (from little judicial scrutiny to routine lawsuits against people and systems), and public and media supervision and transparency of the activity (from little concern to a high demand for national supervision).

We consider the value of a given dimension to become a barrier when it is present in all work situations that entail equal or lower safety and absent from all work situations that entail greater safety. The barriers can be ranked along a safety axis by considering the average safety level of the work situations that cannot cross each of them. A barrier may be under partial control and therefore overlap other barriers that are also under partial control, making the relative effect of each barrier on the observed final level of safety more complex.

We consider one barrier to be more constraining than another when the maximum safety performance achievable while it remains uncontrolled is lower. The barriers to safety that we discuss are fundamental, or root, barriers. Addressing each root barrier demands a substantial change in practice that entails considerable economic, political, and performance tradeoffs.
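To make the ranking idea concrete, here is a minimal sketch in Python. The barrier names follow the 5 barriers discussed below, but the rate ceilings are placeholder values we assume for demonstration, not measured values from this analysis.

```python
# Illustrative sketch: ordering barriers along a safety axis.
# The ceilings are assumed placeholders, not data from this article;
# each one stands for the best accident rate per exposure attainable
# while that barrier remains uncrossed.

barriers = {
    "limitation of maximum performance": 1e-2,
    "abandonment of professional autonomy": 1e-3,
    "equivalent actor mindset": 1e-4,
    "system-level arbitration": 1e-5,
    "simplification of rules": 1e-6,
}

# The higher the residual accident rate while a barrier is uncrossed,
# the more constraining the barrier and the earlier it must be faced.
for name, ceiling in sorted(barriers.items(), key=lambda kv: -kv[1]):
    print(f"{name:40s} caps safety near {ceiling:.0e} events per exposure")
```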

The overall safety profile of an industrial system is measured by reporting the number of adverse events over a time interval (for example, an annual rate). The figures are generally weighted according to the volume of activity (such as the number of miles traveled per year). The variable that best specifies the volume of activity—the denominator in a safety calculation—is industry specific and is therefore poorly standardized across industries. For example, civil aviation uses 1 million departures as the relevant value to calculate volume of activity, whereas military aviation uses the number of flight hours. Reliable measurement of health care and patient safety outcomes is the first challenge for health care benchmarking (14, 15). In health care, the ethically compelling numerator is preventable harm.

In many industries, the weighting process reflects how comfortable the organization or industry is with its risk exposure. For example, the risk for fatal accidents in road traffic, which is 1 of the top 3 causes of death in western countries, is often weighted by convenience of travel and the mileage traveled (16). Use of this denominator may lead to the perception that road transportation is a very safe domain compared with the alternatives. The unwary consumer of such risk reports may therefore erroneously interpret road travel as much safer than modes of travel for which risk is calculated on the basis of a much larger denominator, such as that used in aviation. In fact, air travel is far safer than road travel. We use the rate of catastrophic events per exposure among industrial and human endeavors as an anchor to allow comparison of accident rates across industries with those in health care (Figure 1). Viewed through this lens, accident rates in health care currently range from 10−1 to 10−7 events per exposure. This ratio is the most accessible and allows easier comparisons across industries.
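To illustrate how the choice of denominator shapes perceived risk, the short sketch below computes two rates for the same hypothetical fatality count; all figures are invented for demonstration and are not data from this article.

```python
# Illustrative only: the same activity looks "safer" or "riskier"
# depending on the denominator chosen. All numbers are hypothetical.

fatalities = 400          # assumed annual deaths in some transport mode
passenger_miles = 1.0e10  # assumed passenger-miles traveled per year
trips = 5.0e8             # assumed trips (exposures) per year

rate_per_mile = fatalities / passenger_miles  # 4.0e-08
rate_per_trip = fatalities / trips            # 8.0e-07

print(f"deaths per passenger-mile: {rate_per_mile:.1e}")  # looks reassuringly tiny
print(f"deaths per trip/exposure:  {rate_per_trip:.1e}")  # the anchor used here

# With these assumed figures, the per-mile rate is 20-fold smaller than the
# per-exposure rate, which is why a common per-exposure anchor makes
# cross-industry comparison more honest.
```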

Figure 1. Average rate per exposure of catastrophes and associated deaths in various industries and human activities.

The size of the box represents the range of risk in which a given barrier is active. Reduction of risk beyond the maximum range of a barrier presupposes crossing this barrier. Shaded boxes represent the 5 system barriers. ASA = American Society of Anesthesiologists.

In the civil aviation, nuclear power, and railway transport industries in Europe, the rate of catastrophic accidents per exposure, such as complete failure of an airplane engine leading to loss of the aircraft, is better than 1 × 10−6. That is, the rate of death in these industries is less than 1 per million exposures. The rate of fatal adverse events among hospital patients is much greater but also varies by domain (1). In obstetrics, anesthesiology, or blood transfusion, the risk for fatal adverse events per exposure is less than 10−5 (17). In contrast, surgery has a total rate of fatal adverse events of almost 10−4 (14). Numerous investigators present this 10−4 risk for accident as an extrapolated average value in health care (18, 19). However, not all statistics have the same validity, because of differences in definitions and in the comprehensiveness of monitoring methods (20). Some statistics derive from large databases with objective assessment, whereas others derive from local estimates. The latter is particularly true for health care. The rates of adverse events are most likely reasonably convergent in the published literature, but some investigators have pointed out the importance of an accurate numerator and denominator in the calculation (21, 22). For our purposes, however, we believe that these variations do not deeply alter the proposed safety framework. We aim to reason in terms of relative ranking rather than precise safety values. Moreover, our working hypothesis on the stability of the relative ranking is all the more reasonable because the industries from which we draw inferences were chosen because their safety levels are separated by several orders of magnitude.

Bearing in mind the caveats regarding calculation of risk, the risk for catastrophic events differs greatly across industries. Some sectors continue to have a low safety level (for example, transport by microlight aircraft or helicopter, and emergency surgery), some are stuck at an average safety level (for example, road safety and occupational accidents in trade or fishing), some are at good levels (for example, the petrochemical industry and anesthesiology), and the best have achieved safety levels beyond 10−6 (for example, the nuclear power and civil aviation industries). Five successive systemic barriers seem to characterize limitations in safety improvement.

Barrier 1: Acceptance of Limitations on Maximum Performance

The first barrier involves regulations that limit the level of risk allowed. This level is dictated in situations in which high levels of production and performance are also sought. When limits do not exist—that is, the prevailing attitude is “attain a specified high level of production, no matter what it takes”—the system in question is very unsafe. When the maximum performance is unlimited and individuals or systems are allowed to make autonomous decisions without regulation or constraints, the risk for fatal events nears 1 × 10−2 per exposure. For example, mountain climbers who attain more than three 8000-meter Himalayan peaks have a risk for death that exceeds 1 in 30 (23). This figure has been consistent for more than 50 years. Similar figures are observed for complex, audacious surgical interventions in medicine, such as repair of complex pediatric heart anomalies (24). This level of risk also characterizes amateur and pioneering systems. Of note, the professionals who act under these conditions are often highly competent. Low safety levels do not arise from incompetence. The greater risks in complex domains are incurred by experts who challenge the boundaries of their own maximum performance. The more audacious the expert, the more risky the adopted strategies and the more frequent the adverse outcomes.

Fortunately, most industrial systems have passed beyond the pioneering phase and limit their maximum potential production and performance through comprehensive regulations and self-imposed guidelines. These systems deny even experts absolute discretion. However, overregulation also poses a risk. Flu vaccination and blood transfusion policies offer good examples of the unintended consequences of overregulation. In the past 2 decades, an impressive series of safety restrictions on blood acquisition has reduced the risks associated with transmission of HIV and the hepatitis viruses 10-fold (25). These restrictions have led, however, to a severe reduction in the number of accepted blood donors. This result demonstrates a classic tradeoff between ultrasafety and productivity: The limitations on the circumstances under which blood may be donated greatly reduce transmission of serious diseases, but they create the risk that blood will not be readily available when needed, for example, to treat traumatic shock. This tradeoff will affect all patients until reliable synthetic alternatives to blood become available.

Barrier 2: Abandonment of Professional Autonomy

The second barrier involves restriction of the autonomy of health care professionals. In road safety, “traffic” is a collection of drivers, each of whom is pursuing a personal goal (such as destination or timing). For each individual, the other actors (such as other drivers or pedestrians) are in some sense barriers to these personal goals. Road safety requires that drivers not act purely autonomously; rather, each must subordinate his or her own goals given that others share the road. In the highly charged political and legal environment of the nuclear power industry, we have seen a gradual reduction in autonomy, with improved safety. The disaster at Three Mile Island led to the emergence of norms throughout this industry (26). The dread of even 1 potential catastrophe and its implications for all industry members outweighed any objections to creating a robust reporting system for near misses and reducing the autonomy of plant operators. Backed by communal pressure, local proactive safety methods became institutionalized and effective across the industry. The intensified approach to process improvement through a focus on safety led to financial gains through more efficient power production (fewer outages, fewer shutdowns, and fewer reductions in capacity) (9). In the aviation industry, comprehensive risk management programs developed in the past 30 years (including crew resource management) have reduced the authority of pilots and have made aviation very safe. The decades-long effort to improve safety in aviation through system monitoring and feedback holds many important lessons for health care (27).

A growing movement toward educating health care professionals in teamwork, along with stricter regulation, has reduced the autonomy of health care professionals and thereby improved safety in health care (28). But the barrier of too much autonomy cannot be overcome completely when teamwork must extend across departments or geographic areas, such as among hospital wards or departments. For example, unforeseen personal or technical circumstances sometimes cause a surgery to start and end well beyond schedule. The operating room may be organized in teams to cope with such a change in plan, but the ward awaiting the patient's return is not part of the team and may be unprepared. The surgeon and the anesthesiologist must therefore adopt a much broader representation of the system, one that includes anticipation of problems for others and moderation of goals, among other factors. Systemic thinking and anticipation of the consequences of processes across departments remain a major challenge.

Barrier 3: Transition from the Mindset of Craftsman to That of an Equivalent Actor

The third barrier appears when systems have already accepted limitations on individual discretion (after the first barrier has been crossed successfully) and can work well at team levels (after the second barrier has been crossed). We believe that to achieve the next increase in safety levels, health care professionals must face a very difficult transition: abandoning their status and self-image as craftsmen and instead adopting a position that values equivalence among their ranks. For example, a commercial airline passenger usually neither knows nor cares who the pilot or the copilot flying their plane is; a last-minute change of captain is not a concern to passengers, as people have grown accustomed to the notion that all pilots are, to an excellent approximation, equivalent to one another in their skills. Patients have a similar attitude toward anesthesiologists when they face surgery. In both cases, the practice is highly standardized, and the professionals involved have, in essence, renounced their individuality in the service of a reliable standard of excellent care. They sell a service instead of an individual identity. As a consequence, the risk for catastrophic death in healthy patients (American Society of Anesthesiologists risk category 1 or 2) undergoing anesthesia is very low—close to 1 × 10−6 per anesthetic episode (29).

Conversely, most patients specifically request and can recall the name of their surgeon. Often, the patient has chosen the surgeon and believes that the result of surgery could vary according to that choice. This view is typical of a craftsman market. Safety outcomes for surgeons are much worse than for anesthesiologists, nearer to 1 × 10−4 than to 1 × 10−6 (30). This rate of risk is also seen in many small industries, such as chemical companies, charter airlines, or farm-produce industries (in companies with <50 employees). In France, the risk for catastrophe in such small companies is worse than 1 × 10−4, whereas larger chemical companies have an average risk closer to 1 × 10−5 and large aviation companies and food manufacturers are even safer, with a risk near 1 × 10−6 (31). The difference between small companies and big companies lies in the “à la carte” production of products, such as cheese or fireworks. These products belong to protected commercial niches that have suffered over time from large variations in quality and product safety.

Ultrastandardization and the equivalent actor principle require stable conditions for activity. These conditions are reached in some sectors of health care, such as pharmacy, radiology, and nonemergency anesthesiology. They are less common in intensive care units and emergency surgery, in which unstable conditions, such as nonpermanent staff, variation in patient acuity, and little control of planning, are the norm.

Barrier 4: Need for System-Level Arbitration To Optimize Safety Strategies

The increase in pressure from medical malpractice liability and media scrutiny has created a need for system-level arbitration. These factors are the paradoxical consequences of systems that have reached very safe levels of performance (32). The safer a system is, the more likely it is that society will seek to hold people accountable or seek legal recourse when injuries occur. In France, for example, the rate of patients' official complaints per 1000 beds increased from 11.5% in 1999 to 16.7% in 2004 (33). The consequence of this paradox is that the patient injuries that do occur tend to be very expensive in terms of patient compensation and help to fuel the liability crisis. Accidents become politically and financially intolerable because of their consequences and cost rather than because of their frequency and severity. The public and the media can censure companies or hospitals, leading to sweeping new policies, firing of individuals, and sometimes destruction of industries. The recent passage in Florida of a constitutional amendment that makes all quality assurance data available to the public reflects the public's desire to scrutinize health care providers more closely (34). This amendment has already led to a decrease in reporting of adverse events in Florida. Of note, the most recent safety problems can bias objective assessment of risk, because a lower value is assigned to individual deaths, even if numerous, and a higher value to a concentration of deaths, even if rare. This biased view is seen, for example, with respect to aircraft accidents. In such conditions, health care professionals fear for their own position and behave accordingly. The liability crisis in Florida illustrates how central this barrier becomes to creating safer conditions when physicians feel vulnerable.

The fourth barrier results from the tendency of professionals and their unions to overprotect themselves in the face of legal pressures and threats of litigation. This overprotection occurs when insufficient consideration is given to the unintended consequences for the rest of the staff and the system. This barrier echoes the second barrier, excessive autonomy, except that it is much more subtle and perverse. The actors are not ignoring the goals and constraints of their colleagues, as in the second barrier. Health care professionals and executives assert their willingness to improve safety when confronting the fourth barrier, but they act by imposing additional constraints on their colleagues. The perverse effect is that their safety decisions primarily absolve them of responsibility, with little recognition of the impact of those decisions on others.

An additional perverse effect of top-down safety reinforcement is the potential difference in the perception of patient safety at the various levels of the organization. Top management views safety in terms of mitigating the consequences of a crisis, so as to avoid jeopardizing the total organization (35). To them, patient safety is just another source of risk among others with similar consequences for the organization, such as troubled industrial relations or inadequate cash flow. Chairs of clinical departments traditionally approach safety by confusing it with quality and focus on production-line issues. Individual clinicians are more aware of patient safety issues because of the risks that may damage their own self-image, reputation, or financial wherewithal. Societal censure, including personal or social confrontation with one's own errors or failures, is difficult for clinicians to accept (36, 37).

The blood transfusion crises of the 1980s provide an example of the fourth barrier, with consequent overregulation and conflicts among the 3 levels. Many patients were infected with HIV because systematic testing for HIV infection was not done during blood donation in France. It was commonly believed that the public health authorities delayed the introduction of HIV testing to avoid losing money and national reputation by refusing to use a testing kit from the United States (38). In fact, the relevant medical authorities had recommended immediate action. As the crisis became public, blood donations became scarce because of a loss of donor confidence. Furthermore, the increased controls and paperwork made donation more demanding and time-consuming. The result was that many physicians voluntarily reduced the use of blood. Today, 20 years after the crisis and after implementation of dozens of additional controls, the risk for transfusion-transmitted viral infection is intrinsically much lower (39). But an unintended consequence of these events in France has been a resurgence of severe anemia. Thus, a disease that had essentially disappeared in western countries is reemerging (40).

Barrier 5: The Need To Simplify Professional Rules and Regulations

The fifth barrier typically derives from the perverse effect of excellence. It is generated by the accumulation of layers that are intended to improve safety but make the system overly complex, burdensome, and ultraprotected. As accidents become rare, reporting loses relevance and people forget to report. The visibility of risk becomes small, and decisions are made without clear proof of their benefit, sometimes introducing contradictions among regulations and policies (41). New safety solutions implemented at this point have unintended effects. For example, the rate of production of new guidance materials and rules by the European Joint Aviation Regulators is increasing substantially. More than 200 new policies, guidance documents, and rules are created each year even though the safety of global commercial aviation has been at a plateau of 1 × 10−6 for years. Because little is known about which rules or new guidelines are truly linked to safety levels, the system is purely additive: Old rules and guidance material are never discarded or removed. Regulations become difficult to apply, and pilots violate more rules in reaction to increasing legal pressure.

Some areas of medicine follow this same pattern. For example, regulations to protect against fire in hospitals in France have been revised 5 times in the past 8 years, despite few data on whether these regulations have improved fire safety as opposed to simply making the system more unwieldy. In health care, arcane vocabulary and the complexity and opaqueness of processes cause risk to be poorly visible to practitioners.

The most recent and most sophisticated safety solutions for automated aircraft are electronic cocoons and flight envelopes that protect the aircraft against pilot errors, such as excessive commands or too much or too little speed. Paradoxically, these protections have become an important cause of several recent near misses and accidents in glass cockpits because of crew misunderstanding of the software logic (42, 43). We can extrapolate from the unintended consequences of automation and complexity in aviation to health care, pointing to the comparable negative side effects of technical complexity in medicine. When risks to patients become less observable, the best move is to simplify the system, eliminate nonproductive regulations, and give clinicians more latitude in decision making.

Health care faces many of the same barriers that other industries have faced in striving toward ultrasafety. However, health care must accommodate at least 3 additional industry-specific factors. First, risks in health care are not homogeneous. In many clinical domains, such as trauma surgery, the rate of serious complications is 1 × 10−2, but not all complications are related to medical errors (44). Rather, the risks are inherent in the clinical circumstances. In contrast, some health care sectors, such as gastroenterologic endoscopy, are inherently very safe, with a risk for serious adverse events of less than 1 × 10−5 per exposure. Second, the magnitude and impact of human error are unclear in medicine. Fundamentally, 3 risks are combined in health care: that of the disease itself, that entailed by the medical decision, and that of implementing the selected therapy. These 3 risks generally do not move in the same direction. This complexity makes error prevention harder to predict and grasp. The prognosis for a terminally ill patient may change because of an audacious surgical strategy. However, the most audacious strategies are less evenly distributed in the profession, are the most demanding technically, and are the most prone to errors. Finally, the risk for personal harm, such as becoming infected with HIV, weighs on the clinical staff in a unique way.

The unusual degree of stress that health care workers experience derives from at least 4 factors. First, health care is one of the few risk-prone areas in which public demand considerably constricts the application of common-sense safety-enhancing solutions, such as limiting the flow and choice of incoming patients. This demand is a direct threat to overcoming the first barrier (limitation of performance). Second, health care is also one of the few risk-prone areas in which the system is extensively supported by novices, such as students, interns, and residents. Several attempts have been made to reduce the risks associated with care delivered by beginners through improved supervision and restricted work hours (45). A study from the U.S. Department of Veterans Affairs demonstrates that the risks of surgery remain higher in teaching hospitals, even after adjustment for case severity (46). Third, health care is one of the few risk-prone areas in which so many obvious sources of human error exist in the system, yet little has been done to reduce them. Sources of error include excessive fatigue on the job, systematic working of overtime, overloaded work schedules, and chronic shortage of staff (47–49). Finally, an endemic source of errors in medicine is the shifting of more clinical care and technology to the ambulatory setting: for example, performance of liposuction in the physician's office rather than in a hospital. A recent report suggested that the risk for death associated with liposuction combined with tummy tucks is 10-fold greater when these procedures are performed in clinics rather than in hospitals (50).

Health care is constantly improving in effectiveness and safety, but the industry has yet to traverse fully the 5 barriers to ultrasafety. Quality improvement programs, including a set of monitoring tools (such as reporting procedures and standardized protocols), cannot alone overcome these barriers. To paraphrase Reason, speaking of the aviation community in the 1980s, the current efforts to improve patient safety are akin to “fighting mosquitoes, but not draining the swamp [of error]” (13). The aviation industry made considerable efforts in the 1980s and 1990s to overcome the first 3 barriers and is now focusing on the fourth and fifth barriers. Health care has yet to master even the first barrier to safe performance. Consequently, efforts to emulate the aviation industry are misplaced if health care focuses on the same barriers that aviation is now addressing. Mastering the 5 barriers will be a challenge and will require accepting limitations on performance by reducing professional discretion and pressures for productivity. Reduction of errors may also constrain the professional latitude that health care providers currently enjoy: specialization constrains performance, rather than treating each physician as an expert of unlimited capability and discretion.

Ultrasafe systems have a definite tendency toward constraint. For example, civil aviation restricts pilots in terms of the type of plane they may fly, limits operations on the basis of traffic and weather conditions, and maintains a list of the minimum equipment required before an aircraft can fly. Line pilots are not allowed to exceed these limits even when they are trained and competent. Hence, the flight (product) offered to the client is safe, but it is also often delayed, rerouted, or cancelled. Would health care and patients be willing to follow this trend and reject a surgical procedure under circumstances in which the risks are outside the boundaries of safety? Physicians already accept individual limits on the scope of their maximum performance in the privileging process, but societal demand, workforce strategies, and competing demands on leadership may undermine this goal. A hard-line policy may also conflict with ethical guidelines that recommend trying all possible measures to save individual patients. A further question is how the equivalent actor model can accommodate the patient's need for a relationship with his or her physician.

Another important lesson from other industries is the move from training, regulating, and assessing individuals to doing the same for teams of health care providers. Given the interdisciplinary nature of health care and the need for cooperation among those who deliver it, teamwork is critical to ensuring patient safety and to recovery from and mitigation of error. Teams make fewer mistakes than do individuals, especially when each team member knows his or her own responsibilities as well as those of other team members. Teams elevate the importance of nonphysician input and reduce physician autonomy. However, simply installing a team structure does not ensure effective operation: teamwork is not an automatic consequence of collocating people, and it depends on a willingness to cooperate for the sake of a shared goal (51). Medical educators are focusing on creating explicit training that targets team-based competencies.

An improved vision by leadership of the safety and dangers of health care is needed to optimize the risk-benefit ratio. Stratification could lead to 2 tiers or “speeds” of medical care, each with its own type and level of safety goals. This 2-tier system could distinguish between medical domains that are stable enough to meet the criteria for ultrasafety and those that will always deal with unstable conditions and are therefore inevitably less safe. For the latter sector of medicine, high-reliability organizations may offer a sound safety model (52). High-reliability organizations are those that have consistently reduced the number of expected or “normal” accidents (according to normal accident theory) through such means as cultural change and technologic advances, despite an inherently high-stress, fast-paced environment (53) (Figure 2).

Figure 2. A strategic view of safety in health care.

A 2-tiered system of medicine may result from the distinction between a limited number of clinical domains that can achieve ultrasafety and sectors in which a certain level of risk is inherent. HRO = high-reliability organization.

High-reliability health care organizations will be unable to constrain their production—they must inherently be innovative and require flexible risk arbitration and adaptation rather than strict limits (54). The Table shows a detailed comparison of these 2 possible tiers of health care. Physician training would have to accommodate this 2-tiered approach, and patients would have to understand that aggressive treatment of high-risk disease may require acceptance of greater risk and of more medical errors during clinical treatment.

Table. A Two-Tiered System of Medicine


References

1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Committee on Quality of Health Care in America, Institute of Medicine. Washington, DC: National Academy Pr; 1999.
2. Vincent C, Taylor-Adams S, Stanhope N. Framework for analyzing risk and safety in clinical medicine. BMJ. 1998;316:1154-7.
3. Bates DW, Cullen DJ, Laird N, Petersen LA, Small SD, Servi D, et al. Incidence of adverse drug events and potential adverse drug events. Implications for prevention. ADE Prevention Study Group. JAMA. 1995;274:29-34.
4. Battles JB, Shea CE. A system of analyzing medical errors to improve GME curricula and programs. Acad Med. 2001;76:125-33.
5. Gaba DM. Anaesthesiology as a model for patient safety in health care. BMJ. 2000;320:785-8.
6. Edmondson AC. Learning from mistakes is easier said than done: group and organizational influences on the detection and correction of human error. J Appl Behav Sci. 1996;32:5-28.
7. Leape LL, Berwick DM. Safe health care: are we up to it? [Editorial]. BMJ. 2000;320:725-6.
8. Barach P, Small SD. Reporting and preventing medical mishaps: lessons from non-medical near miss reporting systems. BMJ. 2000;320:759-63.
9. Apostolakis G, Barach P. Lessons learned from nuclear power. In: Hatlie M, Tavill K, eds. Patient Safety: International Textbook. New York: Aspen; 2003:205-25.
10. Lucas DA. Human performance data collection in industrial systems. In: Human Reliability in Nuclear Power. London: IBC Technical Services; 1987.
11. Hollnagel E. Barriers and Accident Prevention. Aldershot, United Kingdom: Ashgate Avebury; 2004.
12. Rasmussen J, Svedung I. Proactive Risk Management in a Dynamic Society. Karlstad, Sweden: Swedish Rescue Services Agency (Räddningsverket); 2000.
13. Reason J. Managing the Risk of Organizational Accidents. Aldershot, United Kingdom: Ashgate Avebury; 1997.
14. Thomas EJ, Studdert DM, Burstin HR, Orav EJ, Zeena T, Williams EJ, et al. Incidence and types of adverse events and negligent care in Utah and Colorado. Med Care. 2000;38:261-71.
15. Michel P, Quenon JL, de Sarasqueta AM, Scemama O. Comparison of three methods for estimating rates of adverse events and rates of preventable adverse events in acute care hospitals. BMJ. 2004;328:199.
16. Richter ED, Barach P, Friedman L, Krikler S, Israeli A. Raised speed limits, speed spillover, case-fatality rates, and road deaths in Israel: a 5-year follow-up. Am J Public Health. 2004;94:568-74.
17. Joseph VA, Lagasse RS. Monitoring and analysis of outcome studies. Int Anesthesiol Clin. 2004;42:113-30.
18. Leape LL. Error in medicine. JAMA. 1994;272:1851-7.
19. Gaba D. Structural and organizational issues in patient safety. California Management Review. 2000;43:83-102.
20. Nebeker JR, Barach P, Samore MH. Clarifying adverse drug events: a clinician's guide to terminology, documentation, and reporting. Ann Intern Med. 2004;140:795-801.
21. Hayward RA, Hofer TP. Estimating hospital deaths due to medical errors: preventability is in the eye of the reviewer. JAMA. 2001;286:415-20.
22. Murff HJ, Patel VL, Hripcsak G, Bates DW. Detecting adverse events for patient safety research: a review of current methodologies. J Biomed Inform. 2003;36:131-43.
23. Oelz O. Risk assessment and risk management in high altitude climbing. In: Proceedings of the 19th Myron B. Laver International Postgraduate Course on Risk Management in Anaesthesia, University of Basel, Basel, Switzerland, 26-27 March 1999. Basel: Univ of Basel; 1999:5-9.
24. de Leval MR, Carthey J, Wright DJ, Farewell VT, Reason JT. Human factors and cardiac surgery: a multicenter study. J Thorac Cardiovasc Surg. 2000;119:661-72.
25. Anderegg K, Kiss J. The risk of viral infection from a blood transfusion in the tri-state region. Transfusion Medicine Update. Accessed at http://www.itxm.org/TMU1995/tmu2-95.htm on 20 December 2004.
26. Rees JV. Hostages to Each Other: The Transformation of Nuclear Safety since Three Mile Island. Chicago: Univ of Chicago Pr; 1994.
27. Billings CE. Some hopes and concerns regarding medical event-reporting systems: lessons from the NASA Aviation Safety Reporting System [Editorial]. Arch Pathol Lab Med. 1998;122:214-5.
28. Salas E, Burke CS, Stagl KC. Developing teams and team leaders: strategies and principles. In: Demaree RG, Zaccaro SJ, Halpin SM, eds. Leader Development for Transforming Organizations. Mahwah, NJ: Lawrence Erlbaum; 2004.
29. Arbous MS, Grobbee DE, van Kleef JW, de Lange JJ, Spoormans HH, Touw P, et al. Mortality associated with anaesthesia: a qualitative analysis to identify risk factors. Anaesthesia. 2001;56:1141-53.
30. Baker GR, Norton PG, Flintoft V, Blais R, Brown A, Cox J, et al. The Canadian Adverse Events Study: the incidence of adverse events among hospital patients in Canada. CMAJ. 2004;170:1678-86.
31. Saugues O, Gonnot FM. 40 propositions pour améliorer la sécurité du transport aérien de voyageurs, n°1717 [Forty propositions to enhance travelers' air-transport safety, number 1717]. Paris: Assemblée nationale; 2004.
32. Amalberti R. Revisiting safety and human factors paradigms to meet the safety challenges of ultra complex and safe systems. In: Willpert B, Falhbruch B, eds. System Safety: Challenges and Pitfalls of Interventions. Amsterdam: Elsevier; 2002:265-76.
33. Personal communication. Information from MACSF, French medical insurance. Accessed at http://www.lexpress.fr/info/sciences/dossier/sante/consultation on 17 August 2004. Also available as “Evolution de la fréquence RCH (corporel) en France” [Evolution of the rate of patients' official complaints in French medical establishments]. Accessed at http://www.sham.fr on 8 March 2005.
34. LaMendola B. Doctors, lawyers take fight to court. Tampa Sun-Sentinel. Accessed at http://www.miami.edu/UMH/CDA/UMH_Main/1,1770,32072-3,00.html on 4 November 2004.
35. Carroll D. Chronic obstruction of major pulmonary arteries. Am J Med. 1950;9:175-85.
36. Wu AW, Folkman S, McPhee SJ, Lo B. Do house officers learn from their mistakes? JAMA. 1991;265:2089-94.
37. Vohra P, Mohr J, Daugherty C, Ming C, Barach P. Attitudes of medical students and housestaff towards medical errors and adverse events [Abstract]. Anesth Analg. 2004. Abstract A1342.
38. Gremy F. [Knowledge and communication in public health. Evolution of scientific knowledge during the dramatic moment of blood contamination. 1—Debates concerning the origin and nature of AIDS]. Sante Publique. 2000;12:91-108.
39. Pillonel S, Laperche S. Evolution du risque transfusionnel de transmission des infections virales entre 1992 et 2003 en France et impact du test de dépistage génomique viral [Trends in the risk of transfusion-transmitted viral infections in France between 1992 and 2003 and the impact of viral genome screening]. L'Etablissement Français du Sang. Eurosurveillance. 2005;10:5-6. Accessed at http://www.eurosurveillance.org/em/v10n02/1002-123.asp on 8 March 2005.
40. Ozier Y. Evaluation des risques liés au retard à la transfusion et à la non transfusion [Evaluation of risks associated with delays in blood transfusion or non-transfusion]. In: Proceedings, Symposium Hémovigilance, XXI Congrès de la SFTS, Saint-Etienne, France, 16-19 June 2003. Paris: Société Française d'Hémovigilance et de Thérapeutique Transfusionnelle; 2003.
41. Amalberti R. The paradoxes of almost totally safe transportation systems. Saf Sci. 2001;37:109-26.
42. Abbott K, Slotte S, Stimson D. The Interfaces between Flight Crews and Modern Flight Deck Systems. Washington, DC: Federal Aviation Administration; 1996.
43. Cook RI, Woods DD. Implications of automation surprises in aviation for the future of total intravenous anesthesia (TIVA). J Clin Anesth. 1996;8:29S-37S.
44. Khuri SF, Daley J, Henderson W, Hur K, Demakis J, Aust JB, et al. The Department of Veterans Affairs' NSQIP: the first national, validated, outcome-based, risk-adjusted, and peer-controlled program for the measurement and enhancement of the quality of surgical care. National VA Surgical Quality Improvement Program. Ann Surg. 1998;228:491-507.
45. Philibert I, Barach P. Residents' hours of work [Editorial]. BMJ. 2002;325:1184-5.
46. Khuri SF, Najjar SF, Daley J, Krasnicka B, Hossain M, Henderson WG, et al. Comparison of surgical outcomes between teaching and nonteaching hospitals in the Department of Veterans Affairs. Ann Surg. 2001;234:370-82.
47. Gander PH, Merry A, Millar MM, Weller J. Hours of work and fatigue-related error: a survey of New Zealand anaesthetists. Anaesth Intensive Care. 2000;28:178-83.
48. Taffinder NJ, McManus IC, Gul Y, Russell RC, Darzi A. Effect of sleep deprivation on surgeons' dexterity on laparoscopy simulator [Letter]. Lancet. 1998;352:1191.
49. Gaba DM, Howard SK. Patient safety: fatigue among clinicians and the safety of patients. N Engl J Med. 2002;347:1249-55.
50. Vila H Jr, Soto R, Cantor AB, Mackey D. Comparative outcomes analysis of procedures performed in physician offices and ambulatory surgery centers. Arch Surg. 2003;138:991-5.
51. Baker DP, Gustafson S, Beaubien JM, Salas E, Barach P. Medical Teamwork and Patient Safety: The Evidence-Based Relation. Washington, DC: American Institutes for Research; 2003.
52. Weick K, Sutcliffe K. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco: Jossey-Bass; 2001.
53. Perrow C. Normal Accidents: Living with High-Risk Technologies. New York: Basic Books; 1984.
54. Roberts KH. Some characteristics of high reliability organizations. Organization Science. 1990;1:160-77.
 


Comments

Sixth System Barrier to Ultrasafe Healthcare
Posted on May 4, 2005
W. Harry Horner
No Affiliation
Conflict of Interest: None Declared

Over the past forty years, organized medicine and many practitioners have been co-opted by third-party payers through the grinding, day-in and day-out negotiation that goes on between medicine, business, and government. Because the arrangements arrived at by this triumvirate have been made largely in secret, beyond the control of patients, and to their disadvantage, this relationship can accurately, and unfortunately, be described as the practice of medicine by conspiracy. This conspiracy has resulted in chronically inadequate (1) and inefficient (2) medical care and permitted the government, insurance companies, and corporations to dominate the practice of medicine with both frustrating and tragic results.

The relevance of this conspiracy, with regard to patient safety, is that individual physicians, nurses, and other health care workers who behave ethically, by reporting inadequate, unsafe patient care, may find themselves not only ignored, but retaliated against (3,4), rather than being positively acknowledged for their professionalism and patient advocacy.

Such unethical and inappropriate attacks against physicians - and obstruction of improvements in patient care - are made possible by other physicians who initiate or cooperate - either actively or passively - with retaliation for a variety of reasons. They may see targeted physicians as competitors; they may seek to gain favor with regulators and/or financiers, i.e., power; or they may simply identify with the aggressor in an understandable, but misguided, attempt to avoid the fate of the targeted physician. In any case, successful retaliation of this sort against one staff member sends a clear message to others that it is not in their best interest to voice concerns regarding issues such as inadequate training, supervision, staffing, equipment, medications, etc.

I suggest that professional ethics are a sixth and major barrier to ultrasafe healthcare. Unfortunately, it is clear that individual sacrifice on the altar of patient safety is not enough (3). Organized medicine must make it clear through actions, as well as words, to its members, hospitals, and lawmakers, that patient advocacy is a professional responsibility and that retaliation against professionals who speak out ("profession" comes from the Latin for "speaking forth") in order to prevent patient injury and assure adequate care will not be tolerated. Until that happens, patient safety will continue to be merely an aspiration. No matter how well designed a system - any system - is, ultimately it is only as good as its operators.

1. Mathers CD, Sadana R, Salomon JA, Murray CJL, Lopez AD. Healthy life expectancy in 191 countries, 1999. Lancet. 2001;357:1685-91.

2. Evans DB, Tandon A, Murray CJL, Lauer JA. The comparative efficiency of national health systems in producing health: an analysis of 191 countries. GPE Discussion Paper Series, No. 29. Geneva: World Health Organization; 2000.

3. Twedt S. The cost of courage: how the tables turn on doctors. Pittsburgh Post-Gazette. October 26-29, 2003. Accessed at http://www.post-gazette.com/pg/03299/234499.stm on 3 May 2005.

4. Plantz SH, Kreplick LW, Panacek EA, Mehta T, Adler J, McNamara RM. A national survey of board-certified emergency physicians: quality of care and practice structure issues. Am J Emerg Med. 1998;16:1-4.


No Title
Posted on May 4, 2005
Lucius F Wright
The Jackson Clinic, P. A., Jackson, Tenn., 38301
Conflict of Interest: None Declared

Amalberti and colleagues' thoughtful analysis of cultural barriers to maximizing patient safety in healthcare is provocative. I was struck by the similarities to the issues I have encountered in dealing with purchasers of healthcare services. I think we are fundamentally split. When we are not too sick, we want a safe system, do not particularly care about the identity of the provider, and want the service provided as a commodity, just as we expect Wal-Mart to deliver everyday low prices, on-demand availability, and guaranteed results. On the other hand, if we are really sick, and think there is a good chance we might die, then we want "the best," however defined, and cost is no object. I call this the Neiman Marcus approach to health care.

Clearly this schizophrenia causes problems for physicians and consumers, formerly known as patients. A physician who decides his or her service is an interchangeable commodity - one of the first goals for attaining a culture of safety identified by this article - is unlikely to exert himself or herself to be Marcus Welby, M.D., on call all the time, with infinite time and patience to spend on one patient. Good old Marcus, though, is likely to be labeled "unsafe," because he clearly took his craftsman's role seriously.

The authors note that airline safety processes may not be fully interchangeable with the healthcare system. We all agree that catastrophic loss of an airplane is worth trying to avoid, but in healthcare we confront the fact that we are going to lose the patient, albeit hopefully one at a time. This is not an argument against trying to improve our safety record; rather, it is a plea to recognize that avoidable deaths are sometimes hard to separate from inevitable ones.

The authors suggest that safety goals need to be "tiered." I agree. If the parallel with the service demands of our patients and purchasers is correct, then I suggest as much as 70% of the care is, in fact, a commodity. We can, therefore, construct a workforce and a care delivery system to meet the "on-demand" needs of patients at low prices, with a 10−6 safety level. Twenty percent of the care is high-intensity but short-term care traditionally associated with hospitals. Perhaps we should settle for a 10−3 or 10−4 level of safety here. The other 10% is the complex chronic care that constitutes a good part of the outpatient care for many internists and medical specialists. In many of these cases, we need a surrogate for death as a safety measure, since the death of the patient is assured.

I am not optimistic, though, that we as a society can conduct a meaningful dialog on what we want from our healthcare system, how much we are willing to spend on it, and how safe we want it to be. Perhaps the authors' next paper could focus on the barriers to safety that reflect our cultural preferences, our dietary habits, and our persistent belief in magic potions.


