Vivian Weil, a thinker among techies

MemberCentral, website of the American Association for the Advancement of Science, 11/21/14

By Delia O’Hara

Engineers were at the heart of the recent scandal at General Motors (GM) over a defective ignition switch that may have caused more than 30 deaths. Engineers developed the switch, realized it was faulty, approved it anyway, then apparently replaced it with an upgraded switch without recording the change, making the defective part difficult to track.

Although a number of GM employees have lost their jobs in the scandal, much of the blame for those actions so far has gone to Raymond DeGiorgio, a GM design release engineer; a federal investigation continues. The infamous switch often was not able to keep a car's ignition engaged, which led in some cases to the engine simply shutting off and disabling the car's air bags. Company officials have admitted the problem was discovered in pre-production testing in 2001.

For AAAS Fellow Vivian Weil, the GM case is "a true engineering ethics 'big bad case,'" the latest in a long line of spectacular project failures from the 19th century on, in which cost concerns or deadline pressures trumped human safety: the 1919 Great Molasses Flood in Boston; the 1928 collapse of the St. Francis Dam north of Los Angeles; the Ford Pinto, equipped in the early 1970s with a gas tank that could explode in a rear-end collision. The list goes on. An occupational hazard for engineers is that mistakes in their projects can cost lives and erode public trust.

As a founding member, and now director, of the Center for the Study of Ethics in the Professions at Illinois Institute of Technology, Weil has been a pioneer in academic engineering ethics. She helps young prospective engineers come to terms with a dual reality: they will often play an important role in the projects they work on over the course of their careers ("What they present as options is often what gets carried out," Weil says), but they can come under enormous pressure as well.

Most engineers work in corporate settings, Weil says, and “organizations can be daunting. Engineering students need preparation for dealing with the things that come up. You can teach ethics but, ultimately, a person is alone with his/her ideas about how to do his/her job.”

Engineers often move into management at their companies, where they are asked to behave differently. A senior manager at a NASA contractor famously chided his top engineer in the run-up to the 1986 Challenger space-shuttle disaster, urging him to green-light the launch in temperatures the engineers thought were too cold for the shuttle's O-rings to tolerate: "Take off your engineer's hat and put on your manager's hat."

Says Weil, “Thinking like an engineer means relying on tried-and-true technical standards, and a commitment to making incremental changes. The standards that people have developed for themselves are wonderful starting points for ethics.”

At the Center, young engineers learn to present theoretical ethical problems along with well-articulated solutions, and to consider strategies for getting their concerns noticed, which is no small part of the solution. The trick, Weil says, is to be not only ethical but also effective.

“What do you say to whom?” Weil says. “Notice the kinds of situations that ought to attract your attention, and the ways in which those situations, if unattended to, can cause real problems.”

Here's one famous cautionary tale: In the early 1970s, three engineers working on the new Bay Area Rapid Transit (BART) system in the San Francisco area raised the alarm about lax testing and documentation standards in Westinghouse Corp.'s development of an innovative automated train control system, but management did not seem to take their apprehensions seriously. So the engineers wrote and distributed an anonymous memo about their concerns, and later went over their supervisors' heads to alert the BART board. That angered their managers. All three engineers were fired for insubordination and for initially lying about their whistle-blowing activities; all suffered financially and emotionally as a result. Then, on October 2, 1972, a BART train overran the Fremont station and crashed into an embankment, injuring five people, just the kind of event the three engineers had been trying to warn their managers about.

A philosopher trained at the University of Chicago and the University of Illinois at Chicago, Weil has spent her career as a thinker among the techies. As a philosopher, she is especially interested in human action, in “responsibility, in what actions convey.”

In the 1990s, with grants from the National Science Foundation (NSF), Weil's ethics studies began to move into science, again reflecting mounting concern over a series of high-profile scandals, this time research frauds, including William Summerlin's infamous painted mice. The NSF's support of Weil's work recognized that "the continued (public) support of science requires trust," she says, but the teaching of ethics, where it occurs, is not as "legitimized" as other subjects are.

Ethics needs a starting point, and science, too, has some documents in which ethicists can find "a pretty good consensus," Weil says, like the so-called Belmont Report, a response to callous research with human subjects, chiefly the Tuskegee Syphilis Study. The Belmont Report establishes three principles for such research that Weil says have gained wide acceptance: justice, beneficence, and respect for persons.

The truth is, "we get our ideas about ethics from a lot of places: living, going to school, to synagogues and churches," and on that level there is plenty of agreement. But when benefits have to be weighed against risks, or when new technologies emerge, things can get murky, Weil says.

Take nanotechnology, a topic of Weil’s recent research. “Nanotechnology goes on in nature, but deliberate engineering is new,” she says. Carbon nanotubes, for example, have been touted as a modern miracle for their strength and ability to conduct heat and electricity, among other properties, but some evidence suggests they may be carcinogenic, “the new asbestos. We don’t know, but we have to say something about the pitfalls,” Weil says.

But researchers and engineers don't abandon reasonable work just because it carries some dangers, or even because it has applications a particular researcher might find repugnant (a military use, for example).

“Consequential thinking isn’t everything. When consequences matter a lot, you’d better not disregard them,” Weil says. “But you have to be more open [than that].”
