People-Centered (Not Tech-Driven) Design
Published in: Norman, D. (2018). People-centered (not tech-driven) design. In T. Pappas (Ed.), Encyclopaedia Britannica, Anniversary Edition (pp. 640-641). Chicago: Encyclopaedia Britannica.
How did we reach the point where our technology is more important than people? And, most important, how can we reverse this trend and ensure that our technologies are designed with people in mind: more humane, more collaborative, and more beneficial to the needs of people, societies, and humanity? To me, these are some of the foremost issues facing the world.
We are in a period of major technological change, affecting almost all areas of human life. Increases in computational and communication power, the advent of tiny sensors, new ways of making physical parts, new materials, and powerful new software tools (including, of course, artificial intelligence) are changing education, work, healthcare, transportation, industry, manufacturing, and entertainment.
The impact of these changes upon people and society is both positive and negative. Although the positive impacts are celebrated, the negative impacts are often treated as unfortunate but unavoidable side effects. Suppose instead we adopt the view that these negative side effects are so severe that we need a different framework for designing our world.
Today, much of our technology is designed through a technology-centered approach. Basically, the technologists--and technology companies--invent and design what they can, but then leave to people many tasks that machines could do, thereby forcing us to work on the technology’s terms. As a result, workers are often required to do things that people are known to be bad at. And then, when they do these jobs badly, they are blamed--“Human error” is the verdict. No, this is not human error: it is inappropriate design.
Want some examples? Consider any boring, repetitive task such as working on an assembly line, entering numbers into a table, or driving a motor vehicle for long periods. Each of these activities requires continual attention to detail, high accuracy, and precision--all things that people are particularly poor at. Machines are well equipped for these activities. Alas, these tasks are required of us because of the way technology has been designed. People are forced to make up for the deficiencies in the technology, and so they end up serving the requirements of machines.
The result? Human error is blamed for over 90 percent of industrial and automobile accidents. It is the leading cause of aviation accidents, and medical error is reported to be the third-leading cause of death in the United States. Horrifying? Yes, but why do we label it “human error”? It is design error.
If human error were responsible for 5 percent of fatalities, I would believe it. But when it is said to be 90 percent, clearly something else must be wrong. Accident review committees often stop prematurely when they find that someone took some inappropriate action. The review stops there, satisfied that the cause has been discovered. Unfortunately, that misses the real cause: Why did the person make the error in the first place? Invariably, if the investigation continues, there are multiple underlying causes, almost always the result of poor design of the equipment, the training, or the procedures.
There has to be a better way. And there is: We must stop being so technology-centered and become human-centered. Alas, this is easier said than done. Technology so dominates our lives that it is very difficult to reverse this deeply ingrained, historical outlook.
I practice what is called people-centered design, where the work starts with understanding people’s needs and capabilities. The goal is to devise solutions for those needs, making sure that the end results are understandable, affordable, and, most of all, effective. The design process involves continual interaction with the people who will use the results, making sure their true needs are being addressed, and then continually testing through multiple iterations, starting with crude but informative prototypes, refining them, and eventually ending up with a satisfactory solution.
Human-centered design has enhanced the ability of people to understand and use many complex devices. Early airplane cockpits had numerous displays and controls, often so poorly thought out that they contributed to error--and in some cases, deaths. Through the application of human-centered design approaches, today’s cockpits now do an excellent job of matching the display of critical information and the positioning and choice of controls to human capabilities. In addition, the procedures followed by pilots and crew, air-traffic controllers, and ground staff have also been revised to better match human requirements. As a result, the accident rate has decreased to the point where commercial aviation incidents are rare. In similar fashion, early computers were controlled through complex command languages that required considerable training to use, and when errors occurred, they were blamed on the operators.
Today’s computer systems are designed with much greater appreciation of human needs and capabilities. The results are graphical displays and control through simple mouse clicks, hand gestures, or voice commands that match the way people think and behave, so that learning is easy and direct.
The goal is to change the way we consider our technology. Instead of having people do the parts of a task that machines are bad at, let’s reverse the process and have machines do the parts that people are bad at. Instead of requiring people to work on technology’s terms, require the machines to work on human terms. People and technology would then become partners. This approach could result in systems where the combination of people + technology can be smarter, better, and more creative than either people or technology alone. A person plus a calculator is a perfect example of such a complementary match.
What do I hope for in the future? A symbiotic relationship between people and technology, where design starts by understanding human needs and capabilities and uses only those technologies that are appropriate for empowering people. One goal is collaboration, where teams of people and technology do even better than people could do unaided, with more pleasure and satisfaction. There are many situations where autonomous, intelligent technology should be deployed, often in areas characterized by the “three D’s”: dull, dirty, and dangerous. But in most situations, collaboration--where people guide the overall goals and activities while technology executes the lower-level requirements of the task that demand consistency, accuracy, precision, and sustained attention over long periods without distraction or deviance--leads to better, more enjoyable results for everyone. To get there, however, we need to replace the technology-centered design approach with a human-centered one, where we start by building upon human skills, which are then enhanced through the capabilities of technology.
Don Norman, a frequent speaker, author, and corporate advisor, is Professor and Director of the Design Lab at the University of California, San Diego, and co-founder and principal of Nielsen Norman Group. His formal education is in electrical engineering and psychology. He has served as a faculty member at Harvard, the University of California, San Diego, Northwestern, and KAIST (South Korea). He has also worked in industry as a vice president at Apple, an executive at Hewlett-Packard, and at a startup. Today Norman's emphasis is on helping technology companies structure their product lines and businesses, concentrating on design thinking to help drive both incremental and radical innovation. His books include The Design of Everyday Things, Living with Complexity, Emotional Design: Why We Love (or Hate) Everyday Things, and The Design of Future Things, among many others.