In some cases, this is incontrovertible: a nuclear reactor has millisecond relationships between pressure, temperature, and activity in the core, all controlled by a plumber's nightmare of coolant pipes, and there is little operators can do in an emergency that doesn't potentially vent radioactive material to the environment. This is definitely the more academic version, though; I'd consider Meltdown to be the pop version. While there is certainly good primary research in evidence, and the discussion of coupling in complex systems would have been valuable at the time of initial publication (it scans a bit trite now), the author clearly has an inexplicably Luddite agenda. Roughly, you can think of interactions as occurring within the system and coupling as centered on external interfaces. Accidents are usually attributed to design failures or human error, but the nature of complex technologies with closely coupled parts makes catastrophic failure so likely that it is a normal occurrence, hence a "normal accident." Patient safety is a global concern and one of the most important domains of health-care quality.
You might consider some of it normal stupidity. Normal Accidents' growing influence since 1984 on social science scholarship and across academic, business, and governmental disciplines was not accidental. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. I began this book in July and finally finished it today. This book covers the concept that complex technologies will be subject to normal accidents. His treatise makes for sobering and provocative reading.
I also enjoyed the numerous case studies presented in the book, because it is in the specific cases that we can glean ideas about prevention. Can we avoid all accidents with enough time, enough information, enough practice running the system, or enough process around it? Perrow's book, though presented as a narrow study of the functioning of technological systems, is also a study of the psychology of human error, which could be fatal even in low-tech systems and is much more dangerous today given the speed, size, and clout of modern technology. Normal Accidents offers some useful, if frequently impractical, advice for creating systems that are not dangerous, but more often it tends to encourage apathy and complacency. At the heart of Perrow's analysis is a two-dimensional analytical framework combining complex vs. linear interactions with tight vs. loose coupling. In analyzing the near-meltdown at the Three Mile Island nuclear plant in 1979, Perrow noted that the equipment vendor and the system operators blamed each other. Systems that are linear or loosely coupled, by contrast, tend not to suffer systemic accidents.
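For readers who find the framework abstract, the four quadrants can be sketched with the kinds of examples Perrow's well-known chart uses. The snippet below is my own minimal illustration; the placements are a commonly cited subset from the book's chart, not an exhaustive reproduction:

```python
# Perrow's two dimensions: interaction type (complex/linear)
# and coupling strength (tight/loose). Example placements follow
# the commonly cited quadrants from the book's chart.
quadrants = {
    ("complex", "tight"): ["nuclear plants", "chemical plants", "aircraft"],
    ("complex", "loose"): ["universities", "R&D firms"],
    ("linear", "tight"): ["dams", "rail transport", "power grids"],
    ("linear", "loose"): ["assembly-line manufacturing", "post offices"],
}

# Only the complex-and-tight quadrant is prone to "normal accidents":
# surprises propagate in unexpected ways and there is no slack to recover.
for (interaction, coupling), examples in quadrants.items():
    print(f"{interaction:>7} / {coupling:<5}: {', '.join(examples)}")
```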
In high-risk systems, no matter how effective safety devices are, some types of accidents are inevitable because the system's complexity leads to multiple and unexpected interactions. Consistency isn't that hard, people. Because of its density and its datedness, I give this three of five stars. Perrow defines complexity as the ability of a single component in a system to affect many other components, and tight coupling as a characteristic of having close and rapid associations between changes in one part of the system and changes in another part. So it was with Charles Perrow's influential book Normal Accidents. Long experience and deliberate effort to create perfectly functioning systems, which fail even so, tend to make one dubious about the possibility of pulling it off in practice.
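To make those two definitions concrete, here is a toy simulation (my own sketch, not Perrow's; the component count and probabilities are invented for illustration) showing how the same single failure that stays contained under loose coupling cascades into total collapse when coupling is tight:

```python
import random

def collapse_rate(coupling_prob: float, n_components: int = 20,
                  trials: int = 5000) -> float:
    """Estimate how often one random component failure cascades into
    system-wide collapse, where coupling_prob is the chance a failed
    component immediately drags down any given neighbor."""
    collapses = 0
    for _ in range(trials):
        failed = {random.randrange(n_components)}  # one initial failure
        frontier = set(failed)
        while frontier:
            newly_failed = set()
            for _ in frontier:  # each failing part stresses every neighbor
                for neighbor in range(n_components):
                    if neighbor not in failed and random.random() < coupling_prob:
                        newly_failed.add(neighbor)
            failed |= newly_failed
            frontier = newly_failed
        if len(failed) == n_components:
            collapses += 1
    return collapses / trials

# Tightly coupled: the initial failure almost always cascades.
print(f"tight (p=0.20): {collapse_rate(0.20):.1%} total collapse")
# Loosely coupled: the same initial failure usually stays contained.
print(f"loose (p=0.01): {collapse_rate(0.01):.1%} total collapse")
```

The point of the sketch is the one Perrow makes in prose: under tight coupling there is no time or slack between the first failure and the next, which is exactly the window operators would need to intervene.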
The book is quite interesting, but a little dense and not quite as easy to read as other pop-science books I've read recently. It's hard to put labels on this book. One should read this book just because, once you have read it, it will make you look at systems, complex ones in particular, in a fundamentally different way, especially their interrelationships, safety design errors, human factors, and so on. Taken on balance, the book is a solid introduction to concepts most of us don't give any thought to at all: what's involved in the dangerous technologies that allow us to go about our daily lives, and whether or not we can ever properly safeguard the systems we create to run those technologies. It's dense, it's dated (he's talking about disasters of the late '70s and early '80s most of the time), and it's difficult to penetrate.
Universities, for example, are interactively complex but only loosely coupled: decisions are often influenced by unanticipated factors, but effects are felt slowly. Plus, I'm looking back from the other side of over 30 years of exponential growth in computing power. I was aware this book was written by a sociologist, but I thought it might be even better for that reason; it could be good. Man, was this book ever a slog. There are a lot of them. Perrow explains how human reliance on technology and over-design will inevitably lead to failure precisely because of the inherent safety design. He was invited to provide a background paper for the President's Commission on the Accident at Three Mile Island, which inquired into the 1979 nuclear incident near Harrisburg, Pennsylvania.
This article provides an introduction to basic risk management and error theory principles and examines ways in which they can be applied to reduce and mitigate the inevitable human errors that accompany high-risk systems. Among the most influential are Scott Sagan's The Limits of Safety (1993) and Dietrich Dörner's The Logic of Failure (1996). The addition of more oversight and more safety devices merely adds new failure modes and encourages the operation of existing systems with thinner margins for error. One case involves the loss of the two-square-mile Lake Peigneur in Louisiana. It was certainly a tightly coupled system, one in which the failure of a single component caused the entire system to collapse. Perrow rightly rails against the habit of blaming "operator error," the most frequently cited cause of accidents.
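Perrow's point above, that each added safety device brings failure modes of its own, is easy to see with back-of-the-envelope arithmetic. The numbers below are invented for the sketch, not taken from the book; the device only pays off if the accidents it prevents outnumber the ones it introduces:

```python
def net_accident_rate(base: float, caught: float, induced: float) -> float:
    """Accident rate after adding a safety device.

    base    -- underlying accident probability per year
    caught  -- fraction of those accidents the device prevents
    induced -- yearly probability the device itself triggers a new
               accident (spurious trips, masked alarms, added complexity)
    """
    return base * (1 - caught) + induced

before = 0.010  # 1% chance of an accident per year with no device
after = net_accident_rate(base=0.010, caught=0.50, induced=0.008)
print(f"without device: {before:.1%}   with device: {after:.1%}")
# 0.010 * 0.5 + 0.008 = 1.3%: the device catches half of all accidents
# yet still raises overall risk, because its own failure modes outweigh
# the failures it prevents.
```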
For me, it was a struggle. The Exxon Valdez was definitely a tightly coupled system. A related problem is the extremely authoritarian command structure used at sea, in which first mates are much less comfortable questioning their captains than copilots are in the air. Perrow introduces the idea of the "normal accident": within complex and tightly coupled sociotechnical systems, catastrophe is inevitable.