2 Ethics & Professional Responsibility

Key Themes & Ideas

  • Engineering is a profession, characterized by self-regulation and a commitment to protecting and promoting the public good
  • As professionals, engineers have special moral responsibilities that go above and beyond what the law or common morality require
  • The core moral commitment of engineers and computer scientists is to protect and promote the welfare of the public
  • Professional responsibility is a multifaceted concept incorporating at least 4 different meanings, all of which are important to engineering
  • Professional codes of ethics outline many of the specific professional obligations of engineers and computer scientists, but are not exhaustive
  • Engineers are sometimes understood to have a special concern for safety, which underwrites their obligations to adhere to a standard of care and to sound the alarm if they are aware of potential harm or unethical behavior

This narrative was originally published by Michael Davis in “Thinking Like an Engineer: The Place of a Code of Ethics in the Practice of a Profession”[1]

On the night of 27 January 1986, Robert Lund was worried. The Space Center was counting down for a shuttle launch the next morning. Lund, vice-president for engineering at Morton Thiokol, had earlier presided over a meeting of engineers that unanimously recommended against the launch. He had concurred and informed his boss, Jerald Mason. Mason informed the Space Center. Lund had expected the flight to be postponed. The Center’s safety record was good. It was good because the Center would not allow a launch unless the technical people approved.

Lund had not approved. He had not approved because the temperature at the launch site would be close to freezing at lift-off. The Space Center was worried about the ice already forming in places on the boosters, but Lund’s worry was the “O-rings” sealing the boosters’ segments. They had been a great idea, permitting Thiokol to build the huge rocket in Utah and ship it in pieces to the Space Center two thousand miles away. Building in Utah was so much more efficient than building on-site that Thiokol had been able to underbid the competition. The shuttle contract had earned Thiokol $150 million in profits.

[Image: The Challenger on lift-off]

But, as everyone now knows, the O-rings were not perfect. Data from previous flights indicated that the rings tended to erode in flight, with the worst erosion occurring on the flight with the coldest lift-off temperature. Experimental evidence was sketchy but ominous. Erosion seemed to increase as the rings lost their resiliency, and resiliency decreased with temperature. At a certain temperature, the rings could lose so much resiliency that one could fail to seal properly. If a ring failed in flight, the shuttle could explode.

Unfortunately, almost no testing had been done below 40°F. The engineers’ scarce time had had to be devoted to other problems, forcing them to extrapolate from the little data they had. But, with the lives of seven astronauts at stake, the decision seemed clear enough: Safety first.

Or so it had seemed earlier that day. Now Lund was not so sure. The Space Center had been “surprised,” even “appalled,” by the evidence on which the no-launch recommendation had been based. They wanted to launch. They did not say why, but they did not have to. The shuttle program was increasingly falling behind its ambitious launch schedule. Congress had been grumbling for some time. And, if the launch went as scheduled, the president would be able to announce the first teacher in space as part of his State of the Union message the following evening, very good publicity just when the shuttle program needed some.

The Space Center wanted to launch. But they would not launch without Thiokol’s approval. They urged Mason to reconsider. He reexamined the evidence and decided the rings should hold at the expected temperature. Joseph Kilminster, Thiokol’s vice-president for shuttle programs, was ready to sign a launch approval, but only if Lund approved. Lund was now all that stood in the way of launching. Lund’s first response was to repeat his objections. But then Mason said something that made him think again. Mason asked him to think like a manager rather than an engineer. (The exact words seem to have been, “Take off your engineering hat and put on your management hat.”) Lund did and changed his mind. The next morning the shuttle exploded during lift-off, killing all aboard. An O-ring had failed.

Should Lund have reversed his decision and approved the launch? In retrospect, of course, the answer is obvious: No. But most problems concerning what we should do would hardly be problems at all if we could foresee all the consequences of what we do. Fairness to Lund requires us to ask whether he should have approved the launch given only the information available to him at the time. And since Lund seems to have reversed his decision and approved the launch because he began to think like a manager rather than an engineer, we need to consider whether Lund, an engineer, should have been thinking like a manager rather than an engineer. But, before we can consider that, we need to know what the difference is between thinking like a manager and thinking like an engineer.

Reflecting on the Challenger disaster, and especially Mason’s request that Lund “take off [his] engineering hat and put on [his] management hat”, provides a useful starting point for thinking about the relationship between engineering and ethics. For it is typical, in reviewing the case, to agree that Lund should have stuck to his initial judgment and not approved the launch. That the correct judgment was made with his “engineering hat” on and the incorrect judgment with his “management hat” suggests that engineers think differently from managers. And, more importantly, it suggests that the way in which they think differently has important moral implications; after all, the switch to management thinking cost the lives of 7 astronauts. Thus, in figuring out the difference between thinking like a manager and thinking like an engineer, we can come to understand the professional responsibilities of engineers.

1. Engineering as a Profession

“Engineering is an important and learned profession.” Thus reads the first line of the National Society of Professional Engineers (NSPE) Code of Ethics. While the use of the term “profession” may initially seem unimportant, for in everyday speech we talk about “professional athletes” and “professional attire”, the choice of that term is deliberate and important. For there is a more technical meaning of the term profession. And understanding that meaning of the term, and why engineering is a profession in that sense, will go a long way toward helping us understand how thinking like an engineer is different from thinking like a manager and why ethics is so central to engineering.

Engineering is not the only profession, in the sense we are using the term here. The two most commonly identified professions are medicine and law. Knowing this, and knowing what is special about doctors and lawyers, can help us start to understand what is special about engineering as well. Our health and our freedom are both absolutely vital aspects of our lives. And keeping them in good shape – keeping our minds and bodies healthy and keeping us out of confinement – takes a high level of expertise that not just anyone has. Moreover, because doctoring and lawyering require a high degree of expertise, as a society we expect the practitioners to keep each other in check. Although we, as a society, do impose legal requirements on doctors and lawyers, we also hand over a great deal of power to their associations or societies for establishing rules and regulations that either become a part of the law or go beyond it. Cases of “disbarment”, for lawyers, are cases where a practicing lawyer has been stripped of their ability to practice law by the relevant “bar association”, which is an association of other practitioners, not judges or lawmakers.

This discussion of what is special about doctors and lawyers should help us see what is special about all professions. Drawing from the above examples, it is typical to identify 3 key features of a profession:[2]

  1. Advanced Expertise. Professions require sophisticated skills (“knowing-how”) and theoretical knowledge (“knowing-that”) in exercising judgment that is not entirely routine or susceptible to mechanization. Preparation to engage in the work typically requires extensive formal education, including technical studies in one or more areas of systematic knowledge as well as broader studies in liberal arts (humanities, sciences, arts). Generally, continuing education and updating knowledge are also required.
  2. Self-regulation. Well-established societies of professionals are allowed by the public to play a major role in setting standards for admission to the profession, drafting codes of ethics, enforcing standards of conduct, and representing the profession before the public and the government. Often this is referred to as the ‘autonomy of the profession’, which forms the basis for individual professionals to exercise autonomous professional judgment in their work.
  3. Public Good. The occupation serves some important public good, or aspect of the public good, and it does so by making a concerted effort to maintain high ethical standards throughout the profession. For example, medicine is directed toward promoting health, law toward protecting the public’s legal rights, and engineering toward technological solutions to problems concerning the public’s well-being, safety, and health. The aims and guidelines in serving the public good are detailed in professional codes of ethics, which, in order to ensure the public good is served, need to be taken seriously throughout the profession.

In short, professions involve extensive and uncommon knowledge that is essential to some vital public good. But because the knowledge involved is uncommon, society in general must trust professionals to use their expertise for the good of the public and hold each other accountable for doing so. Without expertise in structural engineering, for instance, I cannot personally determine whether a bridge is safe for me to cross. Instead, I am forced to trust that the engineers who designed the bridge and those who regularly check it for integrity are competent, conscientious, and committed to using their power responsibly. To paraphrase Peter Parker’s (Spider-man’s) uncle Ben, we may say that advanced expertise generates the power, while self-regulation and a commitment to some aspect of the public good establish the responsibility. We will, therefore, focus on those latter 2 features for understanding professional responsibility and engineering ethics.

Now that we know what a profession is and have some idea of what makes engineering a profession (although we’ll detail that more below), we can make a first pass at understanding what changed when Lund “put on his management hat”. In doing so, first, he ceased to exercise autonomous professional judgment. Instead, he allowed his judgment to be controlled by the needs of the company and the pressures of the management team. This is a failure to properly self-regulate. Second, he failed to take seriously his obligation to serve the public good, putting the interests of the company above the well-being of the public (in particular, the astronauts). This, clearly, was a failure to serve the important public good that engineering is supposed to serve. But to understand this further, it’ll be helpful to dig deeper into the relationship between engineering and the public good.

2. Engineering and the Public Good

All professions involve a commitment to serve some aspect of the public good, above and beyond what is required by law or ordinary morality. In this way, professions create their own role-specific ethical code: ethical requirements that apply only to professionals and to them only in their professional capacity. For doctors (and other medical professionals) this is a commitment to the vital public good of health. For lawyers (and other legal professionals) this is a commitment to the vital public good of justice. For engineers and computer scientists (as well as other technological professionals) the commitment is to the vital public good of welfare. Or, as it is typically stated by many of the engineering professional societies, the “safety, health, and welfare of the public”.

But what does it mean to be committed to the welfare of the public? Like medicine and law, this is partially answered by the specific action-oriented obligations that engineers have. These are obligations to do (or not do) specific things under specific circumstances. For instance, medical professionals are ethically required to render medical aid to those in need whereas a member of the general public is not. Engineers, too, have a number of action-oriented obligations and we will examine some of them later in this chapter. But, to best understand a professional’s commitment to the public good, and so engineering’s commitment to the welfare of the public, it is better to focus on how the commitment changes a person’s reasoning and judgment. Because all professions involve special expertise, they all involve a significant amount of high-level reasoning and the exercise of judgment. Many of the issues professionals face aren’t answered by simply consulting an action handbook, but instead require considering a complex set of information to render expert judgment. A core part of this judgment involves bringing the technical expertise to bear on the issue. But the other aspect of this judgment is about what values, interests, or goals are to be privileged above others. For instance, when our doctor is deciding on a treatment plan, we expect her to make that decision on the basis of what is best for our health rather than, say, what is best for her schedule or what will make the hospital the most money.

Thus, on top of the action-oriented obligations, we can say that a profession’s commitment to the public good involves a special reasoning requirement: a requirement to reason in particular ways to privilege certain interests above others in exercising professional judgment. For the doctor, it is privileging the health of the patient; for the lawyer, the best interests of the client. The engineer, then, is required to privilege the welfare of the public in her professional judgments and design decisions. As wrong as it would be for the doctor to privilege the hospital’s finances above her patient’s health, it is similarly wrong for the engineer to privilege the company’s finances above the public’s welfare.

This focus on privileging the public welfare in all professional reasoning and judgment is also captured in the NSPE Code of Ethics (as well as the engineering and computer science codes of ethics produced by other professional societies). As the NSPE puts it: “Engineers, in the fulfillment of their professional duties, shall hold paramount the safety, health, and welfare of the public.” Like the best interests of the patient or client, exactly what it means to hold paramount the public’s welfare is a matter of judgment in particular contexts. And there will certainly be room for disagreement among different engineers. But what must hold constant in those disagreements is putting the primary focus on the welfare of the public rather than personal or business concerns.

Engineering and “Public Welfare”

What do the engineering codes of ethics mean by “public welfare”? Like “profession”, “welfare” is a term that is used in a variety of ways in society but has an important meaning in the context of ethics. For our purposes, welfare is equivalent to well-being and it encompasses a wide variety of conditions, activities, and more that contribute to a life going well. This is why “public good” is a useful equivalence, for in ethics when we think of welfare we are thinking about what is “good for” a person (or potentially a group or system).

So what is “good for” a person? Various theories of well-being exist in the philosophical and psychological literature. Rather than wading into those debates, we can take a step back from the question “what is well-being?” and instead ask “what conditions are necessary for the realization of some of the most commonly recognized elements of well-being?” This is, in part, because although various competing theories of well-being exist, they all still typically identify many of the same elements. Their disagreement is more over what makes something an element of well-being rather than the elements themselves.

So what are those conditions necessary for the realization of well-being? Well, those most relevant to engineering include things like having food, shelter, and water, having satisfying human relationships, having free movement and expression, and having a satisfactory relationship to the natural world. Engineering and technology can affect all of these conditions both positively and negatively. Pollution from engineering projects can make access to (clean) water difficult while technological advancements in water purification can make access easier. Recent technological creations like the internet have greatly altered how we relate to each other, making it both easier and more difficult to have satisfying human relationships. Planes, trains, and automobiles all greatly enhanced freedom of movement while the internet has certainly enhanced peoples’ ability to express themselves (for better and for worse!).

3. Being a Responsible Professional

The concept of responsibility shows up regularly in discussions of ethics, including professional ethics, where it often shows up as professional responsibility. Similarly, engineering ethics discussions often take the form of reviewing past engineering disasters (like the Challenger) and asking “who was responsible?” But the question “who was responsible?” only captures one small portion of the concept of responsibility. And it is likely the least important aspect of all. Thus, to further deepen our understanding of engineering as a profession we can dig into the various dimensions of professional responsibility. While we may say that Lund is responsible for the Challenger disaster, we could mean that in a variety of ways.

3.1. Backward-looking & Forward-looking Responsibility

The first distinction we can make is between Backward-looking Responsibility and Forward-looking Responsibility. Backward-looking Responsibility captures the “who is responsible?” sort of question and is often associated with the idea of blame. We are looking back at something that has already occurred and asking “who is to blame here?” or “who should be held accountable?” To avoid confusion, in fact, it is common to reframe backward-looking responsibility as accountability. So, instead of saying “Lund is responsible for the disaster” we might say “Lund is accountable for the disaster”; similarly, instead of “Lund should be held responsible for the disaster” we might say “Lund should be held accountable for the disaster”. This sort of responsibility is of course important; the self-regulating aspect of a profession means, among other things, that it takes seriously identifying who is accountable for engineering errors (if anyone) and holding such people accountable for their failures. But it would be a mistake to focus on it too much, as it is effectively the “final form” responsibility will take, once all other dimensions have already been flouted. We only have to ask “who should be held responsible” for a failure once that failure has occurred. Other dimensions of responsibility can help us prevent the failure in the first place. Moreover, as hinted above, there is not always someone who should be held accountable for a failure; honest mistakes can happen. But that does not mean professional responsibility does not apply in any sense.

And so we can turn to Forward-looking Responsibility. This is the sort of responsibility we have in mind when we say things like “who is going to take responsibility for getting the job done?” or, more generally, “whose responsibility is it?” These questions are asked prior to carrying out some activity and so prior to any potential failure that may occur. And if we take the issue of forward-looking responsibility seriously then it significantly decreases the chances of failure, or at least of failure that anyone would be accountable for. We should also note the relationship between forward-looking and backward-looking responsibility here: If someone recognizes and accepts that they are responsible (in the forward-looking sense) for some aspect of a project and then they fail to execute (or they execute it poorly) that is typically good evidence that they should be held responsible (in the backward-looking sense) for that failure.

While it is not uncommon to think of responsibility almost exclusively in terms of its backward-looking dimension, professional responsibility is much more about the forward-looking dimension. It is about consciously recognizing what needs to be done, how it needs to be done, and then “taking the responsibility” to do it correctly. When we focus too much on backward-looking responsibility, we tend to focus on how we can avoid responsibility. We don’t want to be “held accountable” and so we may simply not act where action is required. This may help us not feel guilty, but it does little to improve the world, since in that case either the thing that needs to be done simply does not get done or it gets done by people with less expertise. A true expert professional accepts, to repeat the line from Spider-man again, that with great power comes great (forward-looking) responsibility.

3.2. The Responsibility to Think & The Responsibility to Act

Cutting across the forward-looking/backward-looking distinction, we can also distinguish between the responsibility to think and the responsibility to act. This neatly tracks the distinction mentioned previously between action-oriented obligations and reasoning requirements. Sometimes we may say someone “fulfilled their professional responsibilities” or more generally talk about what our responsibilities are. Typically, in this case, what we mean is that they successfully carried out the actions that they were obligated to carry out. This, then, is about the responsibility to act. It is the responsibility to meet your action-oriented obligations. Notice that this can be discussed in both a forward-looking and a backward-looking way. We can ask “what are my professional responsibilities in this situation?” This is effectively asking “what should I do?” and is thus forward-looking. But we can also suggest that someone should be “held responsible” for a failure because they failed to meet their action-oriented obligations. That would, of course, be backward-looking.

But, as we have already suggested, being a professional is not just about doing the right actions in the right circumstances. Much of being a professional involves thinking in the right sorts of ways. And so we can also talk about the responsibility to think, by which we mean considering the right things in the right circumstances when exercising professional judgment. This is especially important for professions given that professionals can be held (legally or professionally) responsible for a failure to properly consider factors in making their judgments, if those judgments lead to harm. And this can be true even if they did not intend to cause harm and could not have foreseen that harm would occur. One useful way of distinguishing the responsibility to think from other uses of the term responsibility is to frame it as being conscientious. Conscientiousness just means successfully considering the right things in the right way in our thinking and decision-making. Thus, we can understand the requirement to hold paramount the welfare of the public as a form of professional conscientiousness.

3.3. Codes of Ethics & Professional Responsibility

The final thing to say about responsibility isn’t so much an additional dimension beyond the four identified above. Rather, it is to note the relationship between Professional Responsibility and the Codes of Ethics produced by various professional societies. These Codes of Ethics are written documents that state (some of) the action-oriented obligations and reasoning requirements of professional engineers. Importantly, although they are put out by various societies that engineers may choose to be a part of, the responsibilities outlined in the codes of ethics apply to all practicing engineers, even those who are not members of the society. These societies are not creating the responsibilities; instead, they are making them explicit and organizing them. This fact partly explains why the codes of ethics of all engineering and computer science societies look very similar, as evidenced by their common language about the commitment to the safety, health, and welfare of the public shown below.

Importantly, while Codes of Ethics can be great initial resources for identifying professional responsibilities, especially action-oriented obligations, they are not exhaustive lists. For one thing, many of the responsibilities outlined are quite vague – they are principles rather than rules – and so require judgment to fully flesh out. Moreover, given how Codes of Ethics are compiled, they represent the bare minimum set of responsibilities that large groups of professionals agree on. Finally, Codes of Ethics are quite slow to change and so do not always keep up with social and professional changes. Given all that, a responsible professional will be conscientious about both the value and the limits of Codes of Ethics and appeal to them accordingly. They will be a resource, but not a holy book; a philosophical text, but not a lawbook.

The Primacy of the Public in Engineering Codes of Ethics

“Fundamental Canon 1” of the National Society of Professional Engineers’ Code of Ethics states that “Engineers, in the fulfillment of their professional duties, shall hold paramount the safety, health, and welfare of the public.”

The first principle of the Software Engineering Code of Ethics and Professional Practice, produced jointly by the Association for Computing Machinery (ACM) and the Computer Society of the Institute of Electrical and Electronics Engineers (IEEE), states that “Software engineers shall act consistently with the public interest.”

The first principle of the ACM’s Code of Ethics and Professional Conduct, applicable to all computing professionals, states that “A computing professional should contribute to society and to human well-being, acknowledging that all people are stakeholders in computing” while the second principle requires that computing professionals “avoid harm”.

The first requirement of the IEEE Code of Ethics similarly states that (electrical) engineers should “hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, to protect the privacy of others, and to disclose promptly factors that might endanger the public or the environment.”

The American Institute of Chemical Engineers’ Code of Ethics states that (chemical) engineers shall “hold paramount the safety, health and welfare of the public and protect the environment in performance of their professional duties.”

Finally, the American Society of Mechanical Engineers’ code of ethics echoes earlier codes of ethics with the first fundamental canon that “engineers shall hold paramount the safety, health and welfare of the public in the performance of their professional duties” while also stating, as the first principle of the code of ethics, that “engineers… [use] their knowledge and skill for the enhancement of human welfare.”

4. A Special Concern for Safety

Recall that the central professional responsibility of all engineers and computer scientists is to hold paramount the safety, health, and welfare of the public. We suggested earlier that that general responsibility is best understood as a reasoning requirement indicating how technology professionals should think and exercise professional judgment. But there are also a number of action-oriented obligations that follow from this responsibility. Many of these are outlined in the various codes of ethics, and we will not canvass them all here, but instead focus on a few core ones.

Many of the core action-oriented obligations of engineers can be captured under the broad heading of A Special Concern for Safety. This is, broadly, just another way of stating the same commitment to public welfare, but it narrows the focus a little bit by emphasizing avoiding harm over (for instance) doing good. When we have a special concern for safety, we are most interested in ensuring our products and projects don’t impose unacceptable risks on the public. This is in contrast to a doing good approach where we may be thinking much more about the potential benefits of our products and projects. Of course, both approaches are important, and we will want to keep them both in mind in general. However, the majority of the action-oriented obligations outlined in codes of ethics are focused on ensuring safety rather than using engineering skills to do good. And so, for the purposes of examining some of those obligations, having the “safety first” mentality in mind can be helpful.

4.1. The Standard of Care

To have a special concern for safety is to treat safety as an especially important consideration in decision-making. One way that this comes about in professions like engineering is through professional standard setting. Recall that, because professions involve a high level of expertise, they are granted a degree of autonomy in regulating their own practitioners. Sometimes these regulations are purely professional and have no legal status. But sometimes they rise to the level of legal requirements. In engineering, this happens with some engineering standards like building codes. But a more general legal standard, rooted in professional self-regulation, that all professions have is known as the standard of care.

The standard of care is defined as “that level or quality of service ordinarily provided by other normally competent practitioners of good standing in that field, contemporaneously providing similar services in the same locality and under the same circumstances.”[3] The first thing to notice is that this is relatively vague as it does not tell us exactly what that level or quality of service is. Instead, it points us to how it gets determined. This is, to be clear, the legal definition of the standard of care. But what it does is offload the determination of whether the standard of care has been violated in a given case to the profession itself. The only way we find out, for sure, whether the standard of care has been violated, is by a court judgment after the court hears from a variety of expert witnesses – namely, those competent practitioners of good standing in the field.

So, although adhering to the standard of care is an action-oriented obligation (it defines a requirement to conduct proper tests, use proper materials, exercise proper oversight, and so on), we cannot say definitively what it requires. And, of course, it changes over time as standards and practices change. This is part of what makes it so important to discuss, as it shows us one clear way in which, even if your only goal was to not break the law and you did not care at all about broader professional ethics, you would still need to be conscientious and exercise responsible judgment. For otherwise you may fail to meet the standard of care.

The standard of care plays an important role in establishing professional responsibility. As indicated earlier, engineers are certainly not accountable for every bad thing that happens as a result of their work. Bad things can happen even if no one is responsible for them happening. But, of course, all people are held accountable for bad things that happen as a result of deliberate attempts to cause bad things as well as reckless activities that result in bad things. And this is true of professionals as well: If they intended to cause harm with their designs, then they are responsible for the harm caused; if they reasonably could have known that their design would result in harm, even if they did not intend for it to happen, then they are still responsible for the resulting harm. But what about harms that were not intended and could not reasonably have been foreseen?

We use the standard of care to answer that question. Legally speaking, the standard of care establishes the line between “honest mistake” resulting in harm and negligent harm. If an engineer is found to have failed to adhere to the standard of care – perhaps they did not run a standard test or failed to consider an aspect of the project that competent professionals would consider – and that resulted in harm, then the engineer is legally and professionally accountable for that harm. However, if they are found to have adhered appropriately to the standard of care then, even though harm resulted, they are not accountable.

In short, the standard of care indicates that engineers must “do their due diligence” in all their work, especially in terms of ensuring safety, since the focus here is on whether the engineer is responsible for resulting harms. This means keeping up to date on the standards and practices of the relevant field, conducting all appropriate tests and using appropriate materials, and doing whatever else the profession decides is part of being a competent member of that profession.

A number of the specific obligations listed in codes of ethics correspond to this idea. Engineers are required to only work in fields in which they are competent, they are required to adhere to all engineering standards, and they are required to continue their education to stay up-to-date in the field.

4.2. Sounding the Alarm

The standard of care and related obligations all speak to what an individual engineer should do or not do in her own work. But a number of the action-oriented obligations that are most commonly associated with engineering are of a different sort. They still emphasize a special concern for safety, but they all in different ways require engineers to take action when others have acted (or failed to act) in particular ways. We can group these related obligations under the heading of Sounding the Alarm, as they all involve speaking out about potential harm or wrongdoing.

Perhaps the most well-known obligation in this category is whistleblowing. In general, whistleblowing involves going outside standard communication channels to inform relevant authorities of potential harm or wrongdoing. Engineering codes of ethics include a number of specific obligations which are all forms of whistleblowing. The first requirement of the IEEE code of ethics states that engineers have an obligation to “disclose promptly factors that might endanger the public or the environment” while one of the NSPE’s “rules of practice” states that “If [an engineer’s] professional judgment is overruled, under circumstances where the safety, health, property, or welfare of the public are endangered, they shall notify their employer or client and such other authority as may be appropriate.” Similarly, another rule of practice states that engineers must “notify the proper authorities and withdraw from further service on the project… if the client or employer insists on… unprofessional conduct.” Finally, the NSPE code of ethics also states that “Engineers having knowledge of any alleged violation of this Code shall report thereon to appropriate professional bodies and, when relevant, also to public authorities, and cooperate with the proper authorities in furnishing such information or assistance as may be required.”

There are a few things to notice about these statements. First, whistleblowing can be either “internal” or “external”, depending on the circumstances. It may involve notifying your boss’s boss or it may involve a call to the federal government. It leaves to the judgment of the engineer what contact is appropriate to responsibly report the issue. Some organizations, including some parts of the US government, have “whistleblower hotlines” that are specifically set up to handle such reports. Other organizations do not and may actively discourage such behavior. Nonetheless, as this makes clear, engineers and computer scientists are professionally (and, as it happens, legally) obligated to blow the whistle under the relevant circumstances. Second, the obligation to blow the whistle applies in cases of potential harm – thus fitting nicely with the special concern for safety – but also in cases of other ethical violations that may not result in harm. Since not all of the NSPE code of ethics directly concerns safety, and engineers are required to report any violation of it, this includes violations that may not risk any harm at all. Finally, an engineer cannot simply blow the whistle and continue to work on a project they have identified as ethically problematic; instead, as the code states, they must resign from the project as well as report the issue. A very strong obligation indeed! One that can, as several historical cases of whistleblowing make clear, risk significant personal cost. Nonetheless, being a responsible professional means putting safety first, even above one’s job. As we learned from the Nuremberg Trials of the Nazis, “I was just following orders” is not a valid defense for wrongdoing.

The other, related, obligation is known as the Duty to Warn. Like whistleblowing, this involves going outside of standard channels to report on potential harm. But, unlike whistleblowing, the duty to warn only applies to potential harm (including property destruction) and it is limited to cases of imminent harm. There is a similar duty for other professionals. If a doctor or lawyer has good reason to believe that her patient or client poses a genuine and imminent threat to another person, then she is professionally and legally obligated to break confidentiality to provide adequate warning. For the engineer, this often comes up in structural inspections. If an engineer has been hired to inspect a building, bridge, or similar structure and finds that it poses an imminent risk, then, regardless of her broader obligation of client confidentiality, she must take the appropriate actions to warn of the harm. That may mean warning legal authorities, but it could mean warning the residents of the building if, for instance, collapse or some other potentially harmful event is imminent.

Whistleblowing and the Duty to Warn are specific action-oriented obligations of professional engineers. But they also illustrate more broadly what it means to have a special concern for safety and to hold paramount the safety, health, and welfare of the public. In both cases, engineers are expected to violate a number of other professional obligations – most notably the obligation of employer/client confidentiality – for the sake of protecting the welfare of the public. Similarly, these obligations illustrate the expectation that a professional engineer will be prepared to sacrifice her personal well-being for the sake of the public’s welfare. Just as medical professionals put themselves at severe risk of infection from COVID-19 to treat patients throughout the pandemic – and a number of them lost their lives in the process – responsible technological professionals should be prepared to put themselves at risk if doing so is necessary to fulfill their obligation to hold paramount the safety, health, and welfare of the public.


Check Your Understanding

After successfully completing this chapter, you should be able to answer all the following questions:

  • What is a profession and what are its key features? How does engineering fill out those key features?
  • What are action-oriented obligations and what are some of the action-oriented obligations of professional engineers?
  • What is a reasoning requirement and what is the central reasoning requirement of engineers?
  • What is backward-looking responsibility and how does it compare to forward-looking responsibility?
  • What is the distinction between responsibility to think and responsibility to act? How does it relate to the distinction between action-oriented obligations and reasoning requirements?
  • What is a code of ethics? Who do codes of ethics apply to?
  • What is the standard of care? What feature(s) of a profession does it relate to and how?
  • What is an engineer’s obligation to sound the alarm? What are the two specific ways an engineer may sound the alarm?

References & Further Reading

Bazerman, M.H. & Tenbrunsel, A.E. (2011). Blind Spots: Why We Fail to Do What’s Right and What to Do About It (Princeton University Press).

Curd, M. & May, L. (1984). Professional Responsibility for Harmful Actions (Kendall/Hunt Publishers).

Davis, M. (1998). Thinking Like an Engineer: Studies in the Ethics of a Profession (Oxford University Press).

Davis, M. (2003). “Whistleblowing,” in The Oxford Handbook of Practical Ethics, Hugh LaFollette (ed.) (Oxford University Press).

DeGeorge, R.T. (1981). “Ethical Responsibilities of Engineers in Large Organizations,” Business and Professional Ethics Journal 1:1, pp. 1-14.

Harris, C. E., Pritchard, M.S., James, R.W., Englehardt, E. E., & Rabins, M.J. (2019). Engineering Ethics: Concepts and Cases, 6th ed. (Cengage Learning).

Janis, I. (1982). Groupthink, 2nd ed. (Houghton Mifflin).

Martin, M. & Schinzinger, R. (2015). Ethics in Engineering, 4th ed. (McGraw Hill Publishers).


  1. Michael Davis (1991), “Thinking Like an Engineer: The Place of a Code of Ethics in the Practice of a Profession,” Philosophy & Public Affairs 20:2, 150-167. He notes that his narrative is derived from testimony contained in The Presidential Commission on the Space Shuttle Challenger Disaster (Washington, D.C.: U.S. Government Printing Office, 1986).
  2. This particular framing of the features comes from Mike W. Martin & Roland Schinzinger (2005). Ethics in Engineering, 4th ed. McGraw Hill Education.
  3. This definition is found in the US court case Paxton v. County of Alameda (1953) 119 C. A. 2d 393, 398, 259 P. 2d 934.

License


The Primacy of the Public by Marcus Schultz-Bergin is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.