Two women are undergoing in vitro fertilization. The embryologist whose job it is to sort and prepare the embryos for implantation mixes them up. Nine months later one of the women gives birth to the other woman’s baby. Is it human error?
Responding to a 911 call, a fire department dispatcher sends a fire-fighting unit to the 3100 block of 9th Street. There’s no fire, so the unit returns to the station. Meanwhile, a raging inferno on the 3100 block of 9th Avenue takes the lives of a mother and her three children. Is it human error?
Yes, of course it’s human error, but each case is also an example of a flawed system. The author, Steven Casey, president and principal scientist at Ergonomic Systems Design, Inc., presents twenty true stories of human error induced by technology and design. His previous collection of such accounts, Set Phasers on Stun, won critical acclaim from science and design publications. In this volume, the stories focus on the plight of the user when faced with a design that is incompatible with the way people perceive, think, and act. Representing different classes and types of human error in a variety of settings—hospitals, airports, amusement parks—the stories are seductive and have the potential to serve as provocative and instructive tools.
Many of the stories are horribly tragic: a hospital’s cavalier attitude toward safety results in a boy’s death while he’s undergoing an MRI; a misleading gas gauge and an awkward cockpit design cause singer John Denver to crash his new airplane off the coast of California; in the title story, a Japanese chemist recklessly stirs up a batch of enriched uranium-235 in a bucket. Other stories are anxiously amusing: on Thanksgiving evening, a bank’s electronically controlled locking system traps a man overnight inside an ATM booth; despite having no luggage, a one-way ticket paid for with cash, and a clumsily doctored passport, Richard Reid, the “shoe bomber,” is determined by airline security not to be a threat.
In most of the twenty situations, the fatal flaw in the system emerges when the user interacts with technology and design. The airplane may be structurally sound, the hospital equipment may be cutting-edge, the army’s GPS may be state of the art, but if the pilot, the doctor, or the soldier is distracted, casual, or confused, design-induced human error can and will happen.
Some commentary and analysis, perhaps a leading question or two, at the end of each story would have been helpful; nevertheless, Casey’s effort is commendable and may provide an effective catalyst for organization-wide review and discussion.