Many institutional processes are designed to eliminate variance by fine-tuning techniques. We hope to perfect repetitive tasks like manufacturing to "six sigma" levels, eliminating all variability of outcome through a perfectly honed production process. Even where risks are necessarily present, we seek to routinize risk assessment and risk management through techniques such as cost-benefit analysis. All of these techniques are valuable, but they have their limits and can sometimes have counterproductive side effects.
In a world characterized by complexity, we have to begin by admitting the impossibility of perfecting what we are doing. Perfection requires stability -- the notes in a Beethoven score never change, so it is possible to aspire to a perfect performance. But many of the problems we encounter are more like a jazz improvisation, with unpredictable changes in the players, instruments, and styles. Rather than perfecting the playing of each individual note, we need to be alert for new information that may change old answers; we need to realize that planning must be flexible; and we must avoid locking ourselves into decisions that may later prove misguided. We also need to be able to enter into shifting partnerships with other organizations as the scope and dimensions of the problem -- and the need for expertise and resources -- shift.
There are some successful examples of such institutions, such as the nuclear Navy built by Admiral Hyman Rickover. Organizations that cannot afford failure cannot limit themselves to routine risks, or even to risks that have materialized somewhere in the past. They have to be alert to uncertainties, to surprising events that may shed light on future risks, and to smaller mistakes that signal the need to reengineer human and technological systems.
There's an interesting body of social science research about these "high reliability organizations." (Here's a presentation by one of my colleagues at the Haas Business School on the subject.) If you want a quick sense of what an HRO looks like, think about FEMA or the Army Corps of Engineers or the pre-9/11 national security establishment -- and then imagine their diametric opposites!
See the accompanying reading list on complexity theory and high reliability organizations.