The Betterment of the Human Condition?

In the second lecture, we examined one definition of engineering: ``The Systematic Application of Scientific Knowledge to the Betterment of the Human Condition."

We've seen that the `science' part of this definition is questionable. Now I'd like to look at a second dubious phrase, ``the betterment of the human condition".

I have remarked before that engineers, unlike physicists, must take an exam in ethics before they can become professionals. This is because engineers share with surgeons an unusual capacity for benefitting or for harming others. In this lecture I want to review some of the harm that engineers are doing.

The first category of harm is `unexpected and unintended harm'. As an illustration of this category, let us consider the case of the Therac-25, as presented in the accompanying video.

A detailed analysis of the Therac-25 accidents is available at the URL: http://courses.cs.vt.edu/~cs3604/lib/Therac_25/Therac_1.html

A second example in this category is the DC-10 jetliner. There were several errors in the design of the fuselage for this plane: the rear cargo door had to be secured from the outside, and there was no way for those inside the plane to check that this had been done correctly. If it was not done correctly, the door could blow out during flight. Once this happened, the cargo hold would depressurise. The passenger compartment would remain pressurised, so there would be an immense differential pressure across the passenger floor. The floor would collapse, rupturing the hydraulic control lines running to the rear engine and the control surfaces on the tail. The plane would then most probably crash.
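To get a sense of the loads involved, here is a rough calculation; the true figures depend on the pressurisation schedule, so the numbers below are illustrative assumptions, not data from the accident reports. Suppose the passenger compartment is held near sea-level pressure (about 101 kPa) while the cargo hold drops to the ambient pressure at 12,000 feet (about 64 kPa). The differential across the floor is then

\[ \Delta P \approx 101\,\mathrm{kPa} - 64\,\mathrm{kPa} = 37\,\mathrm{kPa} \approx 5.4\ \mathrm{psi}, \]

that is, a load of about 37,000 newtons -- the weight of nearly four tonnes -- on every square metre of floor. Over a floor tens of square metres in area, the total load runs to hundreds of tonnes, far beyond what a cabin floor is built to carry.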

We know this sequence of events so well because exactly this occurred at 12,000 feet over Ontario on June 12, 1972. Fortunately the pilot, Captain McCormick, had practised in a simulator how to handle the loss of control of the rear engine and tail surfaces, and he was able to land the plane safely in Detroit.

It was now clear that there was a problem. Daniel Applegate, the director of product engineering at Convair, the company that designed the fuselage, wrote a memo to his supervisors, saying ``It seems to me inevitable that, in the twenty years ahead of us, DC-10 cargo doors will come open, and I expect this to usually result in the loss of the plane."

The only change made to the aircraft was the installation of a 1-inch peephole overlooking the locking pins. It would have been possible to provide further protection, for example by installing vents between the cargo hold and the passenger compartment so that the pressures would equalise before the floor collapsed, but this would have involved major design changes.

On March 3, 1974, another DC-10 took off from Paris. At 12,000 feet the rear cargo door blew out and all 346 people on board were killed.

Why were the changes needed to ensure safety not made earlier? McDonnell Douglas, the manufacturer of the aircraft, was in a time-critical situation: it needed to get the DC-10 onto the market before its rival, the Lockheed TriStar. Convair, the engineering company that McDonnell Douglas had subcontracted to do the fuselage design, was unwilling to argue too strongly that the problem needed to be fixed, since it would almost certainly be held to blame for the existence of the problem -- and could expect to be stuck with the million-dollar costs of the design changes.

(See the National Geographic video of the DC-10 case.)

In this case, as in the Therac case, we see a design flaw leading to a series of disasters. The first disaster in each series fits our category of `unexpected and unintended'. And there's not much we can do or say about that first disaster. People do make mistakes. But can we say anything about the second disaster in the series, the disaster that happens after a problem has come to light?

One observation we can make from the film, based on the testimony of the AECL safety officer, is that a particular system of values seems to have held sway within AECL, one in which harm to the public mattered less than embarrassment to the corporation. Feynman's investigations into the Challenger disaster, also reported in the resource files, suggest that similar values were widely held within NASA and Morton Thiokol. My own experience within General Motors tends to confirm that this is the rule in many corporations.

Even the most individualistic of us may be influenced by the value systems of our peers. This may happen most easily when those values are not stated explicitly, but form an unspoken set of assumptions underlying all activities within the corporation. One of the ways in which engineers have tried to counteract this is by forming professional organizations, so that the engineer has as a reference not only the value system of his fellow-employees, but also that of his professional peers, formally stated as a code of professional conduct. (The code of conduct for B.C. professional engineers is in the resource files.)

Engineers who draw attention to problems against the wishes of their superiors are known as `whistleblowers'. You can expect that being a whistleblower will have serious consequences for your job, and perhaps for the rest of your career -- even when the problem you're drawing attention to is real and important. Consider the case of Carl Houston, a welding supervisor for the engineering consulting firm Stone and Webster. This company was contracted by the Virginia Electric and Power Company (`VEPCO') to build a nuclear power plant. Houston inspected welding operations on the plant site, and found many substandard welds: improper electrodes were sometimes used, some electrodes were not oven-dried as they should have been, and, most seriously, many of the welders had not been properly trained. He reported this to his manager. Nothing was done, and Houston was told to take the matter no further. Instead, he reported the problem to the head office of Stone and Webster. Again, nothing was done, except that Houston was forced to resign. Out of a job, but still concerned about public safety, he notified VEPCO and the Atomic Energy Commission. They took no action. He notified the office of the Governor of Virginia and the Virginia Department of Labor. Nothing was done. Eventually he was able to convince the two senators from Tennessee to bring the case to the attention of the AEC. This time the AEC made an investigation, and found that all he said was true. The AEC required that the problems be corrected. However, Houston remained out of a job.

It is tempting to conclude from these examples that the responsible engineer should always resist economic and political pressures to compromise on safety; that we should never accept anything less than 100% safety. But this is quite useless as a guideline; there is no such thing as 100% safety. This is particularly clear in the field of software design: every large piece of software contains bugs, even when that software has been in use for decades and lives depend on it working correctly. IBM has conducted studies on removing bugs, and has found that the cost of removing each successive bug rises, so that eventually the software engineer is spending weeks searching for bugs that produce a failure only once every 5,000 years. At some point in this process, it becomes probable that changing the code to fix a bug will introduce new bugs. In principle there are techniques for writing provably-correct software, referred to in passing by Dr Parnas in the Therac video, but I know of no case in which these techniques have been used to generate programs of more than a few hundred lines.
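To see why bugs this rare are beyond the reach of testing, consider a rough calculation (an illustrative figure, not one from the IBM studies). A failure that occurs on average once every 5,000 years corresponds to a mean time to failure of

\[ 5000\ \mathrm{years} \times 3.15 \times 10^{7}\ \mathrm{s/year} \approx 1.6 \times 10^{11}\ \mathrm{seconds}. \]

On average, then, you would need 5,000 machine-years of continuous execution to see the failure even once; a thousand machines testing in parallel would still take about five years. Testing alone cannot certify reliability at this level, which is why proof-based techniques are of interest at all.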

(I suspect the same is true of any other complex piece of engineering, and that we are aware of the problem in software only because our debugging tools are powerful enough to reveal some of the possible failure modes. It's clear, for example, that the space shuttle has unanticipated failure modes, even after two decades of use.)


Suggested Topics for Essays or Discussion
