Donald Norman

American academic

Donald Arthur Norman (born December 25, 1935) is a professor emeritus of cognitive science at the University of California, San Diego, and a professor of computer science at Northwestern University.

Quotes

  • People Propose, Science Studies, Technology Conforms.
    • Things That Make Us Smart (1993), Epilogue.
  • Academics get paid for being clever, not for being right.
    • 27th annual conference of the Travel and Tourism Research Association, June 1996, Las Vegas, p. 143.
  • Although I firmly believe that there is no such thing as a stupid question, there can indeed be stupid answers. 42 is an example. Not only is this a poor ripoff of Doug Adams' Hitchhiker's Guide, but it isn't even a prime number. Everyone surely knows that numerical answers to profound questions are always prime. (The correct answer is 37.)

The Design of Everyday Things (1988, 2002)

Originally published as The Psychology of Everyday Things. Page numbers refer to the 2002 Basic Books edition, ISBN 0465067107.
  • Serious accidents are frequently blamed on "human error." Yet careful analysis of such situations shows that the design or installation of the equipment has contributed significantly to the problems. The design team or installers did not pay sufficient attention to the needs of those who would be using the equipment, so confusion or error was almost unavoidable.
    • Introduction to the 2002 Edition, p. ix.
  • When you have trouble with things—whether it's figuring out whether to push or pull a door or the arbitrary vagaries of the modern computer and electronics industries—it's not your fault. Don't blame yourself: blame the designer.
    • Introduction to the 2002 Edition, p. x.
  • Good design is also an act of communication between the designer and the user, except that all the communication has to come about by the appearance of the device itself. The device must explain itself.
    • Introduction to the 2002 Edition, p. xi.
  • Technologists are not noted for learning from the errors of the past. They look forward, not behind, so they repeat the same problems over and over again. [...] As each new technology matures, customers are no longer happy with the flashy promises of the technology but instead demand understandable and workable designs. Slowly the manufacturers relearn the same basic principles and apply them to their products. The most egregious failures always come from the developers of the most recent technologies.
    • Introduction to the 2002 Edition, p. xv.
  • If people keep buying poorly designed products, manufacturers and designers will think they are doing the right thing and continue as usual.
    • Ch. 1, p. 8.
  • Usability is not often thought of as a criterion during the purchasing process. Moreover, unless you actually test a number of units in a realistic environment doing typical tasks, you are not likely to notice the ease or difficulty of use. [...] Do it right there in the store. Do not be afraid to make mistakes or ask stupid questions. Remember, any problems you have are probably the design's fault, not yours.
    • Ch. 3, p. 78.
  • When a device as simple as a door has to come with an instruction manual—even a one-word manual—then it is a failure, poorly designed.
    • Ch. 4, p. 87; regarding doors labeled "Push" and "Pull".
  • The principle of visibility is violated over and over again in everyday things. In numerous designs crucial parts are carefully hidden away. Handles on cabinets distract from some design aesthetics, and so they are deliberately made invisible or left out. The cracks that signify the existence of a door can also distract from the pure lines of the design, so these significant cues are also minimized or eliminated. The result can be a smooth expanse of gleaming material, with no sign of doors or drawers, let alone of how those doors and drawers might be operated.
    • Ch. 4, p. 100.
  • Even though principles of rationality seem as often violated as followed, we still cling to the notion that human thought should be rational, logical, and orderly. Much of law is based upon the concept of rational thought and behavior. Much of economic theory is based upon the model of the rational human who attempts to optimize personal benefit, utility, or comfort. Many scientists who study artificial intelligence use the mathematics of formal logic—the predicate calculus—as their major tool to simulate thought. [...] Human thought is not like logic; it is fundamentally different in kind and spirit. The difference is neither worse nor better. But it is the difference that leads to creative discovery and to great robustness of behavior.
    • Ch. 5, pp. 114–115.
  • Change the attitude toward errors. Think of an object's user as attempting to do a task, getting there by imperfect approximations. Don't think of the user as making errors; think of the actions as approximations of what is desired.
    • Ch. 5, p. 131.
  • "It probably won a prize" is a disparaging remark in this book. Why? Because prizes tend to be given for some aspects of design, to the neglect of all others—usually including usability.
    • Ch. 6, p. 152.
  • In their work, designers often become expert with the device they are designing. Users are often expert at the task they are trying to perform with the device. [...] Professional designers are usually aware of the pitfalls. But most design is not done by professional designers, it is done by engineers, programmers, and managers.
    • Ch. 6, p. 156.
  • Innocence lost is not easily regained. The designer simply cannot predict the problems people will have, the misinterpretations that will arise, and the errors that will get made.
    • Ch. 6, p. 157.
  • Creeping featurism is a disease, fatal if not treated promptly. There are some cures, but, as usual, the best approach is to practice preventative medicine.
    • Ch. 6, p. 173.
  • Computer scientists have so far worked on developing powerful programming languages that make it possible to solve the technical problems of computation. Little effort has gone toward devising the languages of interaction.
    • Ch. 6, p. 180.
  • When I use a direct manipulation system—whether for text editing, drawing pictures, or creating and playing games—I do think of myself not as using a computer but as doing the particular task. The computer is, in effect, invisible. The point cannot be overstressed: make the computer system invisible.
    • Ch. 6, p. 185.
  • Hypertext makes a virtue out of lack of organization, allowing ideas and thoughts to be juxtaposed at will. [...] The advent of hypertext is apt to make writing much more difficult, not easier. Good writing, that is.
    • Ch. 7, pp. 212–213.
  • In the consumer economy taste is not the criterion in the marketing of expensive soft drinks, usability is not the primary criterion in the marketing of home and office appliances. We are surrounded with objects of desire, not objects of use.
    • Ch. 7, p. 216.

The Invisible Computer (1998)

  • As the technology matures, it becomes less and less relevant. The technology is taken for granted. Now, new customers enter the marketplace, customers who are not captivated by technology, but who instead want reliability, convenience, no fuss or bother, and low cost.
    • Ch. 10.
  • Go to the bookstore and look at how many bookshelves are filled with books trying to explain how to work the devices. We don't see shelves of books on how to use television sets, telephones, refrigerators or washing machines. Why should we for computer-based applications?
    • Ch. 10.
  • We are victims of our own success. We have let technology lead the way, pushing ever faster to newer, faster, and more powerful systems, with nary a moment to rest, contemplate, and to reflect upon why, how, and for whom all this energy has been expended.
    • Ch. 10.
  • The major problems facing the development of products that are safer, less prone to error, and easier to use and understand are not technological: they are social and organizational.
    • Ch. 10.