In my mind, the ongoing oil spill in the Gulf of Mexico—like other technological disasters, such as those involving the Three Mile Island nuclear power plant and the space shuttle Challenger—represents a management failure more than an engineering failure. Anything implying that the engineering profession as a whole somehow bears the blame for these regrettable events puzzles me, including ENR’s editorial on the subject in its June 7, 2010, issue. To understand why, it is worth noting what others have said about the nature of engineering practice and its place in our culture.
Steven Goldman, a professor of the humanities at Lehigh University, has described the “social captivity” of engineering. Technology and innovation generally are dominated by market-driven value judgments rather than technical knowledge. Even when decision-makers are engineers by training, their choices generally are governed by organizational priorities, not necessarily by the capabilities and limitations of those who work under them. As a result, engineering tends to be instrumental: non-engineers selectively exploit it to achieve their own objectives.
David Goldberg, an engineering professor at the University of Illinois, has pointed out that this reality has significant ethical implications for engineers. Unfortunately, serving the interests of an employer or client will not always align naturally and consistently with serving the interests of society as a whole. Engineering can be used for malevolent purposes; even when this is not the case, the uncertainties involved are such that good intentions can still lead to undesirable consequences, especially when the boundaries of current technology are being stretched—which is what we see happening today.
Billy Koen, a retired engineering professor at the University of Texas, characterizes the engineering method as the use of state-of-the-art heuristics to create the best change in a poorly understood situation using the available resources. Roughly speaking, a heuristic is any plausible aid or direction in the solution of a problem that is, in the final analysis, unjustified, incapable of justification and potentially fallible. While heuristics cannot be proven in the absolute sense—and thus carry no guarantees—their use is legitimately warranted. After all, engineering is not deterministic; it routinely involves selecting a way forward from among multiple options when there is no one “right” answer.
Consider the example of a bridge. Is there a single optimal span for a particular location? You can make an argument for the one across the Golden Gate, but the reality is that there is a staggering array of variables that contribute to what will be constructed. The challenge engineers routinely face is that no rigid and inerrant formula exists to dictate the “correct” outcome. This is why engineering is a profession, not a mere technical avocation. Governments and corporations give us their problems and expect us to solve them, even though there are no objective solutions. They depend on engineers to exercise good judgment.
Henry Petroski, a professor of engineering and history at Duke University, has written extensively about the role of failure in successful design. Heuristics have limits of applicability that are not always explicit; major failures often occur because a project pushed them just a bit too far, revealing in the process the range over which the heuristics employed in its design were valid. This suggests a paradox: Anticipating failure breeds success, because it exposes where the line is that should not be crossed; repeated success leads to failure, because the line’s location remains unknown.
As long as engineers have no choice but to act with incomplete information and finite funding—that is, all the time—it is inevitable that some risks will go unrecognized or must be grudgingly accepted. Insisting on absolute safety would preclude all but the most rudimentary endeavors, and even those would become prohibitively expensive. The price of progress is that significant failures occasionally occur; it is to the credit of the engineering profession that they are very much the exception rather than the norm. When failures do happen, the important thing is not to point fingers but to learn the lessons they can teach.
I have no doubt that my fellow engineers will do exactly that in the wake of this latest tragedy.