Product Description

Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because system complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them. The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.

Amazon.com Review

Hang a curtain too close to a fireplace and you run the risk of setting your house ablaze. Drive a car on a pitch-black night without headlights, and you dramatically increase the odds of smacking into a tree. These are matters of common sense, applied to simple questions of cause and effect. But what happens, asks systems-behavior expert Charles Perrow, when common sense runs up against the complex systems, electrical and mechanical, with which we have surrounded ourselves? Plenty of mayhem can ensue, he replies.
The Chernobyl nuclear accident, to name one recent disaster, was partially brought about by the failure of a safety system that was being brought on line, a failure that touched off an unforeseeable and irreversible chain of disruptions. The less severe but still frightening accident at Three Mile Island, similarly, came about as the result of small errors that, taken by themselves, were insignificant but that snowballed to near-catastrophic effect. Only through such failures, Perrow suggests, can designers improve the safety of complex systems. But, he adds, those improvements may introduce new opportunities for disaster. Looking at an array of real and potential technological mishaps--including the Bhopal chemical-plant accident of 1984, the Challenger explosion of 1986, and the possible disruptions of Y2K and genetic engineering--Perrow concludes that as our technologies become more complex, the odds of tragic results increase. His treatise makes for sobering and provocative reading. --Gregory McNamee

Review

"[Normal Accidents is] a penetrating study of catastrophes and near catastrophes in several high-risk industries. Mr. Perrow ... writes lucidly and makes it clear that 'normal' accidents are the inevitable consequences of the way we launch industrial ventures.... An outstanding analysis of organizational complexity." --John Pfeiffer, The New York Times

"[Perrow's] research undermines promises that 'better management' and 'more operator training' can eliminate catastrophic accidents. In doing so, he challenges us to ponder what could happen to justice, community, liberty, and hope in a society where such events are normal." --Deborah A. Stone, Technology Review

"Normal Accidents is a testament to the value of rigorous thinking when applied to a critical problem." --Nick Pidgeon, Nature

About the Author

Charles Perrow is Professor of Sociology at Yale University.
His other books include The Radical Attack on Business, Organizational Analysis: A Sociological View, Complex Organizations: A Critical Essay, and The AIDS Disaster: The Failure of Organizations in New York and the Nation.