My essay, published in my "Influencer" column on LinkedIn
_Does human error cause accidents? Yes, but we need to know what led to the error: in the majority of instances it is inappropriate design of equipment or procedures. It is time to launch a revolution, time to insist on a people-centered approach to technology._
On April 8, 2014, the New York Times ran a story about a U.S. National Transportation Safety Board (NTSB) report on the tragic incident of January 2013, when a ferry rammed a pier, injuring 80 people, four of them seriously. The headline read "Captain and Design Faulted in East River Ferry Crash." The headline was wrong. The NTSB got it right. Here is what Deborah Hersman, chair of the board, wrote:
Yes, our report identified that the final error was made by the Captain on the day of the accident, but the first vulnerabilities were designed into the system years before. Accidents, like a fraying rope, are always a series of missed opportunities, but the blame typically falls on the final strand in a rope that breaks - often it is the human being. (From the prepared remarks by Deborah Hersman, chair of NTSB, April 8, 2014.)
Hersman got it right with her elegant final statement. Bad design and bad procedures lead to breakdowns in which, eventually, the last link is a person, who then gets blamed and punished.
It's an oft-repeated story. A tragic accident occurs in which people are injured or killed. An investigation starts, determined to find the cause. Eventually it discovers the culprit: a nurse misprogrammed an infusion pump, a technician entered the wrong dosage, a pilot entered the wrong code into the navigation equipment, or a ship's captain failed to take the correct action. Aha! Human error. Punish the culprit. Later, when a similar problem happens, another person will be blamed.
Over 90% of industrial accidents are blamed on human error. You know, if it were 5%, we might believe it. But when it is virtually always, shouldn't we realize that something else is at work?
Multiple issues contributed to the ferry accident, but one of the most critical was poor design of the controls, which led to the well-known and avoidable condition called "mode error." Couple this with the NTSB's finding that the ferry's management lacked proper procedures, most especially a Safety Management System, and the accident was almost guaranteed to happen (indeed, a similar accident had occurred a few years earlier).
The ferry captain assumed that the controls were in their normal mode, the mode appropriate for docking the ship. But they weren't: earlier, he had switched them to a different mode to deal with an unexpected vibration. The captain was faulted for not remembering that he had set this non-normal mode, but misremembering and misinterpreting modes is a common cause of failure: it has led to numerous problems with computer systems and to the crash of an Airbus airplane. It led to the grounding of a large cruise ship, the Royal Majesty, off Cape Cod, Massachusetts, because the navigation system was in "dead reckoning" mode rather than using GPS signals. In all of these cases, the equipment signaled the mode in such a subtle manner that none of the highly trained crews noticed.
Mode errors are so common that human factors guidelines recommend against the use of modes. In many cases, modes are not necessary, and a human-centered design effort could eliminate them. When they are necessary, it isn't difficult to make the mode status clear and obvious: the difficult part seems to be the realization that such a signal is necessary. This is a common result when design is done by engineers who assume logical, orderly operation and have not themselves studied the complexities of real operational behavior, with multiple people, continual interruptions, and unexpected situations. The human-centered design field has multiple procedures for revealing the critical information people need in just these situations, but these practices seldom get incorporated into the design of safety-critical systems. It seems that each industry has to learn their value the hard way. How many more years must we injure and kill people because of poor design?
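As a purely illustrative sketch (in Python, with hypothetical names such as ControlSystem, Mode, and begin_docking that have nothing to do with the ferry's actual equipment), here is one way software can make a non-normal mode hard to overlook: the mode appears in every status display, and a mode-sensitive action is blocked until the operator explicitly acknowledges which mode the controls are in.

```python
from enum import Enum


class Mode(Enum):
    NORMAL = "NORMAL"
    BACKUP = "BACKUP"  # e.g., engaged earlier to work around a vibration


class ControlSystem:
    """Hypothetical control panel that keeps the mode status visible."""

    def __init__(self) -> None:
        self.mode = Mode.NORMAL
        self._acknowledged = True  # operator has confirmed the current mode

    def set_mode(self, mode: Mode) -> None:
        # Any change away from normal invalidates the previous acknowledgment.
        self.mode = mode
        self._acknowledged = (mode == Mode.NORMAL)

    def acknowledge_mode(self) -> None:
        """Operator explicitly confirms awareness of the current mode."""
        self._acknowledged = True

    def status_banner(self) -> str:
        # The mode is part of every status display, not a subtle indicator.
        flag = "" if self._acknowledged else "  ** UNACKNOWLEDGED **"
        return f"MODE: {self.mode.value}{flag}"

    def begin_docking(self) -> None:
        # A mode-sensitive action is blocked until the crew has confirmed
        # which mode the controls are actually in.
        if not self._acknowledged:
            raise RuntimeError(
                f"Cannot dock: controls are in {self.mode.value} mode "
                "and the mode has not been acknowledged."
            )
        print(f"Docking sequence started in {self.mode.value} mode.")


if __name__ == "__main__":
    panel = ControlSystem()
    panel.set_mode(Mode.BACKUP)      # set earlier in the voyage
    print(panel.status_banner())     # MODE: BACKUP  ** UNACKNOWLEDGED **
    try:
        panel.begin_docking()        # blocked: mode not acknowledged
    except RuntimeError as err:
        print(err)
    panel.acknowledge_mode()
    panel.begin_docking()            # proceeds once the mode is confirmed
```

The specific mechanism matters less than the design stance it illustrates: the system, not the operator's memory, carries the burden of keeping the mode visible.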
When there are serious accidents, the first reaction is often to claim "human error." That is why the problems persist: we do not remedy the underlying causes. We won't solve these problems until we recognize that bad design of equipment and procedures is most often the culprit. Does human error cause accidents? Yes, but we need to know what caused the error: in the majority of instances human error is the result of inappropriate design of equipment or procedures.
It is time to launch a revolution, time to insist on a people-centered approach to technology. This isn't a new cry. The human-systems integration board of the U.S. National Research Council (the research arm of the National Academies), along with human factors and ergonomics communities around the world, has been studying and issuing reports on this problem for decades. Some industries do understand the need to design for the way people actually work: examples include commercial aviation, major software vendors, and the military. Medicine is just barely starting to understand and change its practices.
We need to instill a people-centered approach in the training of engineers and technologists. It is time to stop blaming people and instead to design for people. We need to discover and fix the real, underlying problems in our equipment and procedures.
References
The NTSB chair's closing statement at the hearing on the New York ferry incident:
https://www.ntsb.gov/news/speeches/hersman/daph140408c.html
Abstract of the draft NTSB report on the New York ferry incident:
https://www.ntsb.gov/news/events/2014/nyferry/2014AbstractSeastreakaccident.pdf
New York Times article on the NTSB hearing about the New York ferry incident:
http://www.nytimes.com/2014/04/09/nyregion/captain-and-design-faulted-in-east-river-ferry-crash.html
The Author
Don Norman wears many hats: cofounder of the Nielsen Norman Group; professor (Harvard, UC San Diego, Northwestern, KAIST, Tongji); business executive (former VP at Apple, executive at HP, and now cofounder of a startup); company board member and advisor; and author of best-selling books on design, including _Emotional Design_, _Living with Complexity_, and _The Design of Everyday Things_. He has extensive experience studying and advising on human error in the nuclear power, aviation, computer, and automobile industries. He covers error extensively in the 54-page Chapter 5 of the newly revised edition of _The Design of Everyday Things_. Learn more about Norman at jnd.org.