Normal Accidents

Normal Accidents: Living with High Risk Technologies (Updated)

CHARLES PERROW
Copyright Date: 1999
Pages: 386
https://www.jstor.org/stable/j.ctt7srgf
    Book Description:

    Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them.
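
    The interaction/coupling framework in the description above lends itself to a small worked illustration. The Python sketch below is not from the book; it simply tags a system with the two dimensions Perrow names (complex versus linear interactions, tight versus loose coupling) and flags the complex-and-tight combination in which he locates system accidents. The example placements are assumptions made for illustration, except that the dam entry follows the Chapter 7 description further down (tightly coupled but very linear).

        from dataclasses import dataclass
        from enum import Enum


        class Interaction(Enum):
            LINEAR = "linear"
            COMPLEX = "complex"


        class Coupling(Enum):
            LOOSE = "loose"
            TIGHT = "tight"


        @dataclass
        class RiskySystem:
            name: str
            interaction: Interaction
            coupling: Coupling

            def prone_to_system_accidents(self) -> bool:
                # The book's thesis: "normal" (system) accidents arise where
                # complex interactions coincide with tight coupling.
                return (self.interaction is Interaction.COMPLEX
                        and self.coupling is Coupling.TIGHT)


        # Illustrative placements only: the dam entry follows the Chapter 7
        # description below; the nuclear plant restates the book's central example.
        examples = [
            RiskySystem("nuclear power plant", Interaction.COMPLEX, Coupling.TIGHT),
            RiskySystem("dam", Interaction.LINEAR, Coupling.TIGHT),
        ]

        for s in examples:
            verdict = ("candidate for system accidents"
                       if s.prone_to_system_accidents()
                       else "accidents expected, but not system accidents")
            print(f"{s.name}: {s.interaction.value} interactions, "
                  f"{s.coupling.value} coupling -> {verdict}")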

    The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.

    eISBN: 978-1-4008-2849-4
    Subjects: Technology

Table of Contents

  1. Front Matter
    (pp. i-iv)
  2. Table of Contents
    (pp. v-vi)
  3. ABNORMAL BLESSINGS
    (pp. vii-2)
  4. INTRODUCTION
    (pp. 3-14)

    Welcome to the world of high-risk technologies. You may have noticed that they seem to be multiplying, and it is true. As our technology expands, as our wars multiply, and as we invade more and more of nature, we create systems—organizations, and the organization of organizations—that increase the risks for the operators, passengers, innocent bystanders, and for future generations. In this book we will review some of these systems—nuclear power plants, chemical plants, aircraft and air traffic control, ships, dams, nuclear weapons, space missions, and genetic engineering. Most of these risky enterprises have catastrophic potential, the ability...

  5. CHAPTER 1 Normal Accident at Three Mile Island
    (pp. 15-31)

    Our first example of the accident potential of complex systems is the accident at the Three Mile Island Unit 2 nuclear plant near Harrisburg, Pennsylvania, on March 28, 1979. I have simplified the technical details a great deal and have not tried to define all of the terms. It is not necessary to understand the technology in any depth. What I wish to convey is the interconnectedness of the system, and the occasions for baffling interactions. This will be the most demanding technological account in the book, but even a general sense of the complexity will suffice if one wishes...

  6. CHAPTER 2 Nuclear Power as a High-Risk System: Why We Have Not Had More TMIs-But Will Soon
    (pp. 32-61)

    Why haven’t we had more Three Mile Islands? If nuclear power is so risky, why has no one been killed by radiation exposure as a result of a nuclear power plant accident? If the safety systems have worked so far, nearly twenty years into the nuclear power age, why call this a high-risk system? One answer is that the “defense in depth” safety systems have worked, limiting the course of accidents. We shall examine these safety systems briefly. But a more accurate and less reassuring answer is that we simply have not given the nuclear power system a reasonable amount...

  7. CHAPTER 3 Complexity, Coupling, and Catastrophe
    (pp. 62-100)

    To make a systematic examination of the world of high-risk systems, and to address problems of reorganization of systems, risk analysis, and public participation, we need to carefully define our terms. Not everything untoward that happens should be called an accident; to exclude many minor failures, we need an exact definition of an accident. Our key term, system accident or normal accident, needs to be defined as precisely as possible, and distinguished from more commonplace accidents. We will define it with the aid of two concepts used loosely so far, which now require definition and illustration: complexity and coupling. We...

  8. CHAPTER 4 Petrochemical Plants
    (pp. 101-122)

    Petrochemical plants produce far fewer headlines and much less controversy than nuclear power plants. They have been around for about one hundred years, so ample engineering experience exists, and the public is familiar with the sight of their gangly towers and squat storage tanks. Fires rage uncontrolled in them at times, and a few operators might be killed, but we do not have protest marches, a Chemical Regulatory Commission, scientific panels and conferences that are covered by the media, or a search for alternatives to our plastics and gasoline. It is a low-profile industry—deliberately, as we shall see. It...

  9. CHAPTER 5 Aircraft and Airways
    (pp. 123-169)

    This chapter will take us closer than any other to our personal experiences, for almost all of us have flown on commercial airline flights. Having survived, we know it is at least fairly safe. Yet aircraft and the airways have system accidents, as we shall see, and there are many examples of production pressures, malfeasance, incompetent designs, and regulatory inaction. Why did you fly without giving much thought to any danger, then? There are some unique structural conditions in this industry that promote safety, and despite complexity and coupling, technological fixes can work in some areas. Yet we continue to...

  10. CHAPTER 6 Marine Accidents
    (pp. 170-231)

    Marine accidents bring us to a wider system than we have had to encounter so far. It is a fascinating country, and in our tour, we will encounter a frying pan that destroys a luxury liner in hours, captains playing “chicken” in sea lanes with forty ships about, “radar assisted collisions,” monumental storms, tugboats blocking radio channels by playing Johnny Cash music, and tankers over a city block long negotiating channels only two feet deeper than they are. In the midst of these calamities are owners egging their captains on and insurance companies that fail to inspect the ships but shout “stop...

  11. CHAPTER 7 Earthbound Systems: Dams, Quakes, Mines, and Lakes
    (pp. 232-255)

    In this chapter we will deal with the movement of large quantities of earth or water, whether deliberately or by accident. The production systems are primitive, compared to nuclear and chemical plants, and there are few unanticipated interactions in mines and virtually none in dams. Why, then, be concerned with these systems? First, it is useful to have some contrast to complex, tightly coupled systems to explicate and illustrate the value of our basic concepts. Dams have catastrophic potential, a matter of interest to us, but dam failures are not system accidents. The system is tightly coupled but very linear....

  12. CHAPTER 8 Exotics: Space, Weapons, and DNA
    (pp. 256-303)

    This final data chapter deals with three very high-tech systems, and that is about all that links them together. One has almost no catastrophic potential (space missions); one has the ultimate catastrophic potential (nuclear weapons); and the third, recombinant DNA research and production, or DNA for short, has hardly begun, but could well develop in a direction that would be second in catastrophic potential only to nuclear war. Each system contributes to our argument in different ways, providing further evidence that system accidents are inevitable in complex, tightly coupled systems.

    The first section, on space missions, will elaborate in a...

  13. CHAPTER 9 Living with High-Risk Systems
    (pp. 304-352)

    A crucial question may have been at the back of your mind as you have read this book: What is to be done? After having looked at all these systems, what do I propose as a solution? I have a most modest proposal, but even though modest and, I think, realistic, it is not likely to be followed. I propose using our analysis to partition the high-risk systems into three categories. The first would be systems that are hopeless and should be abandoned because the inevitable risks outweigh any reasonable benefits (nuclear weapons and nuclear power); the second, systems that...

  14. AFTERWORD
    (pp. 353-387)

    Much has happened in the world of accidents in large systems since publication of Normal Accidents in 1984. The least important should be explained first. In the process of consolidating empires and insisting on maximum profits, Rupert Murdoch decreed that Basic Books should eliminate the books from its backlist that sold only a modest amount each year. (In fact, shortly thereafter he eliminated Basic Books entirely, though it has emerged again under new owners.) Princeton University Press thoughtfully and wisely agreed to a photocopy reprinting and encouraged me to add an afterword.

    Of vastly greater importance, we have had a...

  15. POSTSCRIPT: THE Y2K PROBLEM
    (pp. 388-412)

    Introduction and a prediction. When I first started looking at Y2K in September of 1998, the pessimists won me over. Researching accidents led me to emphasize the crazy things that can go wrong and, as an organizational theorist, I am familiar with all the problems that can occur and the mistakes that can be made, and very skeptical of official reassurances to the contrary. Reflection on the difficulty of killing thousands with one blow—the Union Carbide effect—however, suggested that it might be hard to line up all the pins so that major disruptions would occur, and further reflection...

  16. LIST OF ACRONYMS
    (pp. 413-414)
  17. NOTES
    (pp. 415-425)
  18. BIBLIOGRAPHY
    (pp. 426-440)
  19. INDEX
    (pp. 441-451)