

Lecture 8: Military Engineering: A Debate



John Bird's Position

[This is a summary of John's position as I remember it; I may have missed some points.] We must begin by recognising some simple facts about human nature. In general, people are motivated by self-interest. So it would be quite unrealistic to rely on the good nature or the good intentions of foreign heads of state for Canada's security. There are groups which would like to encroach on our natural resources, for example, our fishing grounds. Without the Canadian Navy to protect them, our fishing fleets would have to stand and watch while other nations depleted our fish stocks. And in some conflicts, much more is at stake. Members of our parents' generation know very well that, had the Allies not proved militarily stronger than the Axis powers, we would be living in an entirely different world.

A potential aggressor can be deterred, not by persuasive argument or by invoking ethical standards, but only by an appeal to his self-interest: that attacking us will lead to more trouble than he can handle. To convince him of this, we need to conduct military research on two levels: firstly, we need open research, at universities, so that a potential aggressor will know we aren't defenceless; secondly, we need secret research, so that the aggressor will not know all the details of our defences.

In addition to the need for military research as a deterrent, we also need it to be prepared to deal with weapons that may be used against us. For example, we need to study landmines in order to find better ways of detecting and defusing them. We need to study chemical and biological weapons so that we can devise protective clothing and anti-toxins to protect ourselves against them.

We should also note that military research can lead to benefits for civilian life. One example would be the use of the Global Positioning System to aid in search and rescue operations. It would often be impossible to justify funding the research that leads to these benefits without the argument of military necessity.

In conclusion: as long as we live in a world ruled by self-interest, any nation that does not research its own defence can expect others to decide its future.


John Jones's Position

I would like to begin with two disclaimers. I am not a pacifist: I would, for example, be prepared to serve in the armed forces if Canada were to be invaded; and I have no rules for deciding whether military research is right or wrong. I have made decisions about doing military research in the past, and I would like to tell you the thoughts that led me to those decisions.

When I was a student, the Americans were fighting the Vietnamese. I came across several pieces of scientific and engineering research done at that time which profoundly impressed me. The first was a team effort; social scientists, studying the psychology of the Vietnamese population, had come up with a simple observation: to sap the resistance of a bombed population, it is much more effective to wound than to kill: a seriously wounded person will not produce anything himself, and will moreover consume resources that could otherwise feed the able-bodied. So the problem is now an engineering one: how can we most effectively wound without killing? Mechanical engineers fired projectiles of different shapes and sizes into pigs for weeks, collecting raw data. The answer they came up with was a flechette -- a little barbed arrowhead. And the engineers borrowed an idea from nature. A porcupine quill, caught in the flesh, will work its way further in of its own accord. Under the microscope, we see it has a pattern of overlapping scales that act like tiny barbs. Similar scales were etched onto the flechettes.

These flechettes were loaded into cluster bombs and dropped in great numbers. At first they were gratifyingly effective; those wounded might eventually die, but they would linger on in great discomfort for many months before this. Then a problem arose; in some cases, the flechettes were being located by X-ray and removed. But our technology was more than a match for this. Materials engineers were sent for, and they at once recommended a plastic, mechanically strong and transparent to X-ray. Tests on further generations of pigs perfected the design.

These details may seem unpleasant to us. But our soldiers in the front line see it differently. This is a quotation from an American pilot, paying tribute to the industrial chemists at Dow who developed napalm: ``We sure are pleased with those backroom boys at Dow. The original product wasn't so hot -- if the natives were quick they could scrape it off. So the boys started adding polystyrene -- now it sticks like treacle to a blanket. But then if the natives jumped under water it stopped burning, so they started adding white phosphorous so's to make it burn better. It'll even burn underwater now. And just one drop is enough, it'll keep on burning right down to the bone so they die anyway from phosphorous poisoning." I've said that I'm not in a position to offer any general moral guidelines, but at that time it seemed clear to me that if I had to choose, rather than work as a chemical engineer for Dow, it would be better to make a living selling heroin to schoolchildren.

I think that's understandable. If there's no difference between the napalm engineer and a decent human being, then there are probably no ethical differences in the world worth discussing. But as soon as you think about it, it gets fuzzy. How about the plastics engineer who advised on the new flechette material? Perhaps he only provided data on Young's modulus and X-ray reflectivity; maybe he never knew what it was for. How about the electronics engineer who worked on the firing mechanism for the bomb-cluster? Or the one who worked on the radar that guides the bomber?

If you try to trace the web of responsibility all the way back, it must include all of us. Yet that conclusion is no use, because there is a difference between us and the napalm engineer; any account that does not preserve that difference is worthless. We have to cut through the web of responsibility at some point, and say: up to this point I will take responsibility for the consequences of my actions; beyond that, it's a stochastic process. And it seems reasonable to assume responsibility for the consequences of your actions just as far as you can predict and control those consequences. For example, if you design a bridge, and someone leaps from it to their death, you should not blame yourself; whereas if you design a scaffold, and someone is hanged upon it, they might tend to blame you.

When I was a student, I wanted to be a physicist. One of my heroes at that time was Einstein -- he still is, I suppose. Einstein made a remark towards the end of his life that puzzled me: ``If I had only known, I would have been a plumber." It was clear that he said this seriously, sadly. He saw his work on relativity and quantum physics as having led to Hiroshima, to Nagasaki, and to the immense stockpiles of nuclear weapons that are still, now, in place. Yet he hadn't known. No reasonable person blames him -- but how sad it must have made him, to see that the consequences of his life's work were out of his control, and casting a shadow over the world.

It seemed to me that this would always be the case in physics; any significant advance would have very wide consequences, which no-one could foresee. Though since most research in physics was funded by the military, some of the consequences could be guessed. Of course, there was nothing to say I was going to make any significant advances; but one doesn't plan one's career around that assumption.

So I decided to become an engineer. Compared with a physicist, an engineer is much closer to the results of his or her work. You know whether what you're designing is a bridge or a bomber. So you can accept responsibility for the consequences of your work, and feel a legitimate pride if those results are good.

I arrived in North America with a PhD in engineering just fourteen years ago, in the middle of a recession. I spent a lot of time studying the `want' ads in engineering journals. One line kept cropping up at the bottom of the ads: `US Citizenship required', or `Security clearance required'. Since then I've looked up the figures; about 50% of the scientists and engineers in the States work directly on military research; about 80% work in jobs which depend directly or indirectly on military funding.

I wasn't a US citizen, and I didn't have a security clearance, so it took me a long time to find a job. There were several opportunities to take jobs doing work that was military but innocuous -- analysing flow fields in jet engines, for example. The problem with these jobs is always that, once you've accepted the job, who decides what you're going to work on? Your boss decides, and if he tells you the jet engine project is cancelled and it's time to build finite-element models of flechettes, it's too late to tell him about your sensitive conscience. The moment of commitment comes when you take the job; that's the decision that you are responsible for; the decisions after that are your boss's responsibility.

Now I'm very happy to be in Canada rather than the States. Perhaps here a person could work on military research with fewer qualms, since Canada has traditionally played a peace-keeping role. But military technologies developed here may not stay here. Most of the military technologies developed in the West have been sold to whoever can afford them. Many of the fighters of Iraq's air force were designed and built in France; the bunkers which protected them were designed and installed by British civil engineers. So even here, when you work on a weapons system, you don't know whose hands it will end up in. There is one thing you can know about its final use: the products of military R&D are expensive, so expensive that only governments can afford them. So your work will put weapons in the hands of governments; and the governments will use those weapons against the people.

So far I have talked about the consequences of military research for society. Doing military research also has an effect on science, which I would now like to discuss.

Science is said to be value-free, and in a sense this is true. But it is also true that the scientific method itself embodies certain values, and these values are what attract many of us to science. In particular, science is democratic and international. What do I mean by democratic? For example, I could be an Associate Professor at a prestigious Canadian university, and you could be a mere student, but if we publish rival theories of heat transfer, the issue will ultimately be decided by experiment and deduction, not by authority. By `international', I mean that a physicist in China practises the same physics as a physicist in London; an advance for one is an advance for both. This is the ideal; the practice sometimes falls short of the ideal, but the ideal is part of science. And it is this ideal which is responsible for the success of science; for, looked at historically, science does seem to be the one thing that we, humanity, have become good at. And I believe this is because the ground rules of science allow all humanity to cooperate in the creation and transmission of knowledge.

Military research seems to me destructive of this ideal, and in two ways. Firstly, the need for secrecy in military research is entirely antithetical to the spirit of science. Secrecy makes workers in the same field overseas into deadly enemies, rather than colleagues. Their advance becomes our loss.

Secondly, and more seriously, secrecy weakens the peer review mechanism, which is at the heart of science. It is easy, very easy, to deceive yourself about the results of an experiment. Even an experienced scientist, working in a familiar field, can lead herself to see in experimental results the pattern she wants to see. This is why no new knowledge is accepted into science without critical examination and testing by the community of scholars in that field. Remember the case of cold fusion, six years ago. How many people reported successfully repeating the Pons and Fleischmann experiment before the results were shown to be in error? Wishful thinking has tremendous power, even over the trained mind.

Suppose you are working in a classified field. You discover the equivalent of cold fusion -- let's say, you discover how to build an X-ray laser. You write a paper on your results. Only a few dozen people in the country have clearance to see it; some of them are your technical peers, you've often met them at conferences. If your results are correct, billions of dollars will go into X-ray laser research. Everyone involved on the military side of the field -- which includes the reviewers -- can expect much more generous funding. Of course, the reviewers are all scientists, and will try to be objective -- but they're only human.

We don't need to speculate on what would happen in this scenario; it's already happened. Instrumentation artifacts on the first X-ray laser tests were misinterpreted as evidence of success. Lowell Wood and Edward Teller, the director and director emeritus of Star Wars research at Lawrence Livermore National Laboratory, presented the artificial results as genuine, and used their positions to silence scientists inside the Lab who tried to release the truth. How many smaller cases have gone undetected, we have no way of knowing. By way of excuse for the scientists and engineers involved, we may suppose that, after you've spent the day planning for a war that may kill billions, shifting a data point one-third of an inch no longer seems so serious a crime.

Not all research done for military purposes is classified. When funding research at universities, the U.S. Department of Defense suggested the following guideline: it may be unnecessary for all the research to be classified. The professor in charge of the work can get a security clearance, but the students working for him can be assigned to non-sensitive areas. Think about the consequences of this: the professor must ensure that his students don't get a sense of the broader picture; to serve his military clients successfully, he must do the exact reverse of what he should do as an educator. One example: a study of particle-beam damage, ostensibly part of research on travel to Mars, actually part of SDI.
