2 Ethics & Professional Responsibility

Key Themes & Ideas

  • Engineering is a profession, characterized by a commitment to protecting and promoting the public good
  • As professionals, engineers have special moral responsibilities that go above and beyond what the law or common morality require
  • Professional responsibility is characterized by conscientious decision-making, meeting professional obligations, and holding oneself and other professionals accountable for failures
  • The engineer’s commitment to the public good produces obligations to give special consideration to safety, to adhere to the standard of care, and to be proactive in preventing harm
  • Various psychological and organizational barriers to professional responsibility exist, including the problem of many hands, the existence of blind spots, and groupthink

This narrative was originally published by Michael Davis in “Thinking Like an Engineer: The Place of a Code of Ethics in the Practice of a Profession.”[1]

On the night of 27 January 1986, Robert Lund was worried. The Space Center was counting down for a shuttle launch the next morning. Lund, vice-president for engineering at Morton Thiokol, had earlier presided over a meeting of engineers that unanimously recommended against the launch. He had concurred and informed his boss, Jerald Mason. Mason informed the Space Center. Lund had expected the flight to be postponed. The Center’s safety record was good. It was good because the Center would not allow a launch unless the technical people approved.

Lund had not approved. He had not approved because the temperature at the launch site would be close to freezing at lift-off. The Space Center was worried about the ice already forming in places on the boosters, but Lund’s worry was the “O-rings” sealing the boosters’ segments. They had been a great idea, permitting Thiokol to build the huge rocket in Utah and ship it in pieces to the Space Center two thousand miles away. Building in Utah was so much more efficient than building on-site that Thiokol had been able to underbid the competition. The shuttle contract had earned Thiokol $150 million in profits.

[Image: The Challenger on lift-off]

But, as everyone now knows, the O-rings were not perfect. Data from previous flights indicated that the rings tended to erode in flight, with the worst erosion occurring on the coldest preceding lift-off. Experimental evidence was sketchy but ominous. Erosion seemed to increase as the rings lost their resiliency, and resiliency decreased with temperature. At a certain temperature, the rings could lose so much resiliency that one could fail to seal properly. If a ring failed in flight, the shuttle could explode.

Unfortunately, almost no testing had been done below 40°F. The engineers’ scarce time had had to be devoted to other problems, forcing them to extrapolate from the little data they had. But, with the lives of seven astronauts at stake, the decision seemed clear enough: Safety first.

Or so it had seemed earlier that day. Now Lund was not so sure. The Space Center had been “surprised,” even “appalled,” by the evidence on which the no-launch recommendation had been based. They wanted to launch. They did not say why, but they did not have to. The shuttle program was increasingly falling behind its ambitious launch schedule. Congress had been grumbling for some time. And, if the launch went as scheduled, the president would be able to announce the first teacher in space as part of his State of the Union message the following evening, very good publicity just when the shuttle program needed some.

The Space Center wanted to launch. But they would not launch without Thiokol’s approval. They urged Mason to reconsider. He reexamined the evidence and decided the rings should hold at the expected temperature. Joseph Kilminster, Thiokol’s vice-president for shuttle programs, was ready to sign a launch approval, but only if Lund approved. Lund was now all that stood in the way of launching. Lund’s first response was to repeat his objections. But then Mason said something that made him think again. Mason asked him to think like a manager rather than an engineer. (The exact words seem to have been, “Take off your engineering hat and put on your management hat.”) Lund did and changed his mind. The next morning the shuttle exploded during lift-off, killing all aboard. An O-ring had failed.

Should Lund have reversed his decision and approved the launch? In retrospect, of course, the answer is obvious: No. But most problems concerning what we should do would hardly be problems at all if we could foresee all the consequences of what we do. Fairness to Lund requires us to ask whether he should have approved the launch given only the information available to him at the time. And since Lund seems to have reversed his decision and approved the launch because he began to think like a manager rather than an engineer, we need to consider whether Lund, an engineer, should have been thinking like a manager rather than an engineer. But, before we can consider that, we need to know what the difference is between thinking like a manager and thinking like an engineer.

To understand the difference between thinking like a manager and thinking like an engineer, we can introduce the concept of a profession. This is because, as we will see shortly, engineering is a profession whereas business management is not. But investigating what a profession is, and what makes engineering a profession, will also help us get a better grip on the relevance of ethics to engineering in general.

Although we use the terms ‘profession’ and ‘professional’ in a variety of ways in everyday communication, these concepts have a special meaning for our purposes. In particular, a profession is an occupation characterized by three key features: advanced expertise, self-regulation, and concerted service to the public good. Engineering, it is claimed, has these three key features while a variety of other occupations, including business management, do not. To evaluate that claim, we can dig a little deeper into the three features. Martin & Schinzinger describe them as follows:[2]

  1. Advanced Expertise. Professions require sophisticated skills (“knowing-how”) and theoretical knowledge (“knowing-that”) in exercising judgment that is not entirely routine or susceptible to mechanization. Preparation to engage in the work typically requires extensive formal education, including technical studies in one or more areas of systematic knowledge as well as broader studies in liberal arts (humanities, sciences, arts). Generally, continuing education and updating knowledge are also required.
  2. Self-regulation. Well-established societies of professionals are allowed by the public to play a major role in setting standards for admission to the profession, drafting codes of ethics, enforcing standards of conduct, and representing the profession before the public and the government. Often this is referred to as the “autonomy of the profession,” which forms the basis for individual professionals to exercise autonomous professional judgment in their work.
  3. Public good. The occupation serves some important public good, or aspect of the public good, and it does so by making a concerted effort to maintain high ethical standards throughout the profession. For example, medicine is directed toward promoting health, law toward protecting the public’s legal rights, and engineering toward technological solutions to problems concerning the public’s well-being, safety, and health. The aims and guidelines in serving the public good are detailed in professional codes of ethics, which, in order to ensure the public good is served, need to be taken seriously throughout the profession.

In short, professions involve extensive and uncommon knowledge that is essential to the public good. But because the knowledge involved is uncommon, society in general must trust professionals to use their expertise for the good of the public and to hold each other accountable for doing so. Without expertise in structural engineering, I cannot determine on my own whether the bridge I am about to drive over is safe. Instead, I am forced to trust that the engineers who built the bridge, and who regularly check its integrity, are competent, conscientious, and committed to using their power responsibly.

So, as a first pass, we can suggest that when Lund “put on his manager hat” he both ceased to exercise autonomous professional judgment – instead having his judgment controlled by the needs of the company and pressures of the management team – and failed to take seriously his obligation to serve the public good – instead focusing too narrowly on the good of the company.

1. Engineering and the Public Good

All professions involve a commitment to serve some aspect of the public good, above and beyond what is required by law or ordinary morality. In this way, professions create their own role-specific ethical code: ethical requirements that apply only to professionals, and to them only in their professional capacity. Sometimes these are concrete requirements, as with physicians’ obligation to render medical aid to anyone in need. Other times they are reasoning requirements: requirements to reason about situations in certain ways. For instance, physicians are required to put the needs of their patients above (nearly) all else. Similarly for attorneys: they are required to make decisions with the best interests of their client in mind.

In a later section we will examine some of the concrete responsibilities of engineers, but for now we can emphasize the key reasoning requirement. In doing so, we can further distinguish what it means to “think like an engineer” from what it means to “think like a manager”. While physicians must privilege the needs of their patients and attorneys must privilege the interests of their clients, engineers are professionally required to privilege the welfare of the public. This includes, among other things, privileging the needs and interests of the public above the interests of their employer or client. And so we see, again, how Lund acted unprofessionally in changing his decision: when he put his “management hat” on he ceased privileging the welfare of the public – in particular the astronauts – in exercising his professional judgment.

That engineers, as professionals, must reason in a special way is captured in the various codes of ethics produced by professional engineering societies, as evidenced in the box below. Importantly, codes of ethics are written documents that state the moral responsibilities of members of the profession. While they are produced by various professional societies – for instance, the National Society of Professional Engineers (NSPE) or the Association for Computing Machinery (ACM) – the responsibilities identified are not simply created by these professional societies nor do they only apply to members of those specific societies. The responsibilities outlined apply to all practicing engineers in virtue of their specialized expertise and power within society. This is most clearly captured in the Preamble to the NSPE Code of Ethics (emphasis added):

Engineering is an important and learned profession. As members of this profession, engineers are expected to exhibit the highest standards of honesty and integrity. Engineering has a direct and vital impact on the quality of life for all people. Accordingly, the services provided by engineers require honesty, impartiality, fairness, and equity, and must be dedicated to the protection of the public health, safety, and welfare. Engineers must perform under a standard of professional behavior that requires adherence to the highest principles of ethical conduct.

It is because engineering has a direct and vital impact on the quality of life for all people that engineers must adhere to their professional responsibilities.

The Primacy of the Public in Engineering Codes of Ethics

“Fundamental Canon 1” of the National Society of Professional Engineers’ Code of Ethics states that “Engineers, in the fulfillment of their professional duties, shall hold paramount the safety, health, and welfare of the public.”

The first principle of the Software Engineering Code of Ethics and Professional Practice, produced jointly by the Association for Computing Machinery (ACM) and the IEEE Computer Society, states that “Software engineers shall act consistently with the public interest.”

The first principle of the ACM Code of Ethics and Professional Conduct, applicable to all computing professionals, states that “A computing professional should contribute to society and to human well-being, acknowledging that all people are stakeholders in computing,” while the second principle requires that computing professionals “avoid harm”.

The first requirement of the IEEE Code of Ethics similarly states that (electrical) engineers should “hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, to protect the privacy of others, and to disclose promptly factors that might endanger the public or the environment.”

The American Institute of Chemical Engineers’ Code of Ethics states that (chemical) engineers shall “hold paramount the safety, health and welfare of the public and protect the environment in performance of their professional duties.”

Finally, the American Society of Mechanical Engineers’ code of ethics echoes earlier codes of ethics with the first fundamental canon that “engineers shall hold paramount the safety, health and welfare of the public in the performance of their professional duties” while also stating, as the first principle of the code of ethics, that “engineers… [use] their knowledge and skill for the enhancement of human welfare.”

2. The Multiple Faces of Responsibility

A lot has been said thus far about professionals having special responsibilities. But what, exactly, do we mean by responsibility? As with ‘profession’, we use the term ‘responsibility’ in a variety of ways. And, it turns out, a number of these different understandings of responsibility are relevant to engineering. Consider the following uses of the term:

  • In approving the Challenger launch, Lund failed to meet his professional responsibilities
  • In approving the launch, Lund is responsible for the disaster
  • In approving the launch, Lund failed to act responsibly

Each of these statements makes use of the term responsibility in a different way, as we can see if we reinterpret each of them to remove reference to responsibility entirely:

  • In approving the Challenger launch, Lund failed to meet his professional obligations
  • In approving the launch, Lund is accountable for the disaster
  • In approving the launch, Lund failed to act conscientiously

Each of these uses of responsibility is, of course, related to the others: Lund is accountable for the disaster because he failed to meet his professional obligations. Or, we may go further and say that he is accountable because he was not conscientious about his professional obligations – he not only failed to meet them, he didn’t properly consider them. Either way, we can see that judgments of accountability follow from judgments of meeting (or failing to meet) obligations and/or judgments of conscientiousness (or a lack thereof).

Professional responsibility involves all these uses. A truly responsible professional reasons conscientiously – properly taking into account her obligations and the interests of all those affected by her decisions. And, as a result of that conscientiousness, she is generally successful in meeting her obligations. And when she fails, she does not shirk accountability. Indeed, a conscientious engineer will “take responsibility” – put herself in the position to be accountable for decisions – when necessary to ensure that her basic obligation to the public good is met.

3. Professional Obligations

So far we have suggested that engineers have a professional obligation to reason in a special way – namely, in a way that is appropriately conscientious of the safety, health, and welfare of the public. But now we can start to flesh out that obligation by identifying some more concrete obligations that follow from it.

3.1. A Special Concern for Safety

We begin by reiterating the core obligation previously identified. Every existing engineering and computing code of ethics makes clear the most important responsibility of engineering and computing professionals: the safety of the public. Put a bit more broadly, the paramount responsibility is to the welfare of the public. This commitment tells us two important things: that engineers should always prioritize the interests of the public above those of their employer, client, or the profession itself; and that in prioritizing the interests of the public, the focus should be on “safety first”. This idea is sometimes captured in the slogan that engineers “have a special concern for safety”.

Focusing on this special concern for safety can help us make sense of the difference between thinking like an engineer and thinking like a manager, and of why Lund reversed his decision when asked to “think like a manager”. While a manager may (hopefully) care about safety to some extent, it is not their professional responsibility to have a special concern for it. When Lund was thinking like an engineer, keeping in mind his special concern for safety, he refused to approve the launch out of concern for the safety of the astronauts. But when he was thinking like a manager, that concern for safety was placed alongside concern for the financial well-being of Morton Thiokol as well as, perhaps, his ongoing employment. Put another way, “thinking like an engineer” involves committing to a principle that prioritizes public safety above all else.

Engineering and “Public Welfare”

What do the engineering codes of ethics mean by “public welfare”? Like ‘profession’, ‘welfare’ is a term that is used in a variety of ways in society but has an important meaning in the context of ethics. For our purposes, welfare is equivalent to well-being, and it encompasses the wide variety of conditions and activities that contribute to a life going well. This is why “public good” is a useful equivalent, for in ethics when we think of welfare we are thinking about what is “good for” a person (or potentially a group or system).

So what is “good for” a person? Various theories of well-being exist in the philosophical and psychological literature. Rather than wading into those debates, we can take a step back from the question “what is well-being?” and instead ask “what conditions are necessary for the realization of some of the most commonly recognized elements of well-being?” This is, in part, because although various competing theories of well-being exist, they all still typically identify many of the same elements. Their disagreement is more over what makes something an element of well-being rather than the elements themselves.

So what are those conditions necessary for the realization of well-being? Well, those most relevant to engineering include having food, shelter, and water; having satisfying human relationships; having freedom of movement and expression; and having a satisfactory relationship to the natural world. Engineering and technology can affect all of these conditions, both positively and negatively. Pollution from engineering projects can make access to (clean) water difficult, while technological advancements in water purification can make access easier. Recent technological creations like the internet have greatly altered how we relate to each other, making it both easier and more difficult to have satisfying human relationships. Planes, trains, and automobiles have all greatly enhanced freedom of movement, while the internet has certainly enhanced people’s ability to express themselves (for better and for worse!).

3.2. The Standard of Care

It is common for the standards professionals set for themselves to later become legal requirements as well. For instance, the duty of confidentiality that professionals like physicians and lawyers have is now legally binding in the United States and elsewhere. Sometimes, as with the duty of confidentiality, the standards are enshrined quite explicitly within the law. But for all professions there remains a different type of legal standard known as the standard of care.

The standard of care is “that level or quality of service ordinarily provided by other normally competent practitioners of good standing in that field, contemporaneously providing similar services in the same locality and under the same circumstances.”[3] Importantly, the definition just given is the definition provided in law. It is not further detailed (or, when it is, those aspects become separate legal requirements). Instead, the standard of care intentionally “offloads” determinations of what precisely the standard of care requires, and when it is violated, to the profession. As the profession changes, so does the standard of care. And when a question arises as to whether a professional has violated the standard of care the courts turn to other members of the profession, via expert testimony, to make that determination.

Legally requiring adherence to the standard of care, and holding professionals responsible for harms caused as a result of the failure to adhere, is one of the central ways that society simultaneously grants greater privileges to professionals and holds them to a higher standard. It grants the profession the privilege of determining violations of the law but also requires professionals to meet a standard that non-professionals do not need to meet.

It is important to note the role the standard of care plays in establishing professional responsibility. All people are responsible for harms they cause as a result of deliberate wrongdoing or recklessness. But the standards of “deliberate wrongdoing” and “reckless wrongdoing” both require the person to knowingly act wrongly (or at least to be in a position where they should have known they were acting wrongly). Negligent harm – the sort of harm that results from failure to adhere to the standard of care – has a lower standard: a person can be responsible for the harms they cause even if they neither intended to cause harm nor foresaw that their actions (or inactions) were likely to cause it. And so, once again, we see how professionals are held to higher standards than non-professionals. They are expected to keep up with the standards of their field and to “do their due diligence” in all their professional activities.

3.3. The Duty to Warn

Engineers have special responsibilities to their clients and employers. Among these is a general duty of confidentiality: a duty not to disclose certain types of information without the consent of the client or employer. Importantly, though, this general duty of confidentiality is not absolute. The engineer’s prime responsibility to the public good can override it. And one way this happens is through what is called the duty to warn.

The duty to warn is common among professionals: if a professional has a duty of confidentiality, then they have a duty to warn that can override that duty of confidentiality. Exactly what triggers the duty to warn differs among the professions, but all versions are loosely tied together by a recognition that protecting people from harm is more important than protecting confidentiality. If a patient discloses to a physician that she is considering harming someone else, then the physician has a duty to break physician-patient confidentiality and report the threat. If a client discloses to a lawyer that he plans to harm a witness, then the lawyer has a duty to break attorney-client privilege and report the threat. Similarly, if an engineer has good reason to suspect that an engineering project or product poses a threat to the safety of the public, then she has a duty to break confidentiality and report the threat.

Like the standard of care, the duty to warn is both a professional responsibility and a legal responsibility. While it is most commonly relevant to structural engineers overseeing or otherwise inspecting buildings, bridges, and the like, it applies to all engineers. And, like the standard of care, it shows us another way in which professional responsibility goes beyond ordinary responsibility. Unlike a private citizen, an engineer who fails to warn of the imminent collapse of a building she is inspecting can be held liable for the harms and damage that result from the collapse. This is precisely what happened with FIGG Bridge Engineers and the Florida International University pedestrian bridge collapse. The collapse was the result of miscalculations on their part, but the further problem was that when the engineers saw cracks forming they did not report them as a threat. They therefore failed both in their responsibility to competently exercise their skill and in their responsibility to warn of potential threats to life and property.

3.4. Whistle-blowing

The final professional responsibility we will explore is closely related to the duty to warn but is also probably the most contentious and complex. Whistleblowing, like the duty to warn, involves breaking confidentiality. And it can, like the duty to warn, sometimes be focused on preventing imminent harm or property damage. But the duty to “blow the whistle” extends beyond merely preventing harm.

To begin to understand the duty, it can help to examine what the codes of ethics say about the matter. The first canon of the IEEE Code of Ethics states that engineers have an obligation to “disclose promptly factors that might endanger the public or the environment,” while the very first “Rule of Practice” in the NSPE Code of Ethics states that “if [an engineer’s] professional judgment is overruled, under circumstances where the safety, health, property, or welfare of the public are endangered, they shall notify their employer or client and such other authority as may be appropriate.” Relatedly, that same code of ethics obligates engineers to “notify the proper authorities and withdraw from further service on the project… if the client or employer insists on… unprofessional conduct.” Finally, and perhaps most broadly, the NSPE Code of Ethics also obligates engineers to report violations of the code, regardless of whether any specific harm may result: “Engineers having knowledge of any alleged violation of this Code shall report thereon to appropriate professional bodies and, when relevant, also to public authorities, and cooperate with the proper authorities in furnishing such information or assistance as may be required.”

Our knowledge of the events leading up to the Challenger disaster, including the specific discussion relayed at the start of this chapter, is the result of an engineer taking his whistleblowing responsibility seriously. Roger Boisjoly attempted to prevent the launch at the time, but in the aftermath he was also the major source of information for the Rogers Commission, which investigated the disaster. His testimony was ultimately essential in determining that the disaster was in fact preventable and that Lund, among others, failed to act professionally in approving the launch.

There is much more to say about the obligation to whistleblow. Indeed, there are multiple competing theories of when whistleblowing is justified. But, for our purposes, it is sufficient now to know that engineers have such an obligation. Those interested in further exploration of whistleblowing are encouraged to consult some of the resources provided in the “Further Reading” section below.

4. Impediments to Responsibility

Thus far we have set out what is required of engineers as professionals – that is, what their professional responsibilities are. To conclude this chapter, however, it is worth recognizing the difficulties that can arise in meeting one’s professional responsibilities. The engineer and historian Edwin T. Layton, Jr. captured what is perhaps the most constant threat to meeting one’s responsibilities when he noted that “the engineer’s problem has centered on a conflict between professional independence and bureaucratic loyalty” and that “the role of the engineer represents a patchwork of compromises between professional ideals and business demands.”[4] Put simply, the engineer’s “dual loyalty” – to the public and to their client or employer – often makes it difficult to successfully meet their professional obligations.

But a variety of other obstacles, some resulting from this dual loyalty, also exist, and we briefly explore them below. Importantly, many of these obstacles are the result of human psychological limitations or of complexities that arise from complex projects and complex organizations. As such, they can help us explain why engineers may fail to meet their responsibilities. However, we should not take them as justification for failing to meet those responsibilities – they are not means of letting people off the hook so much as ways of understanding how people can be aware of what is required of them and yet fail to do it.

4.1. The Problem of Many Hands

One common way that individuals attempt to evade personal responsibility for their actions is by pointing out that many individuals had a hand in causing the problem. This problem of many hands is constantly at play in engineering, where nearly all projects involve the input of a number of people. Consider Lund’s responsibility for the Challenger disaster: while he was clearly involved in making the decision to launch, he was certainly not the only one involved in that decision. Thus, he could claim that he is not (wholly) responsible for the disaster that resulted.

The problem of many hands is a general ethical problem with a complex literature. Our main goal here is simply to note the problem, but we can also quickly examine one of the approaches philosophers have taken to responding to the problem. The Principle of Responsibility for Action in Groups states that the degree of responsibility of each member of the group depends on the extent to which the member caused the action by some action reasonably avoidable on their part. This principle captures the fact that sometimes a person may be in a position to take heroic action to prevent some harm, but that we should not really expect such heroic action. On the other hand, if a person is in a position to take a more reasonable action to avoid the harm – for instance by simply saying “no” to the launch of a shuttle – then their responsibility for the result is greater.

4.2. Blind Spots

There is a general human inclination to overestimate how ethical we actually are. Importantly, this is true even of morally decent people who are trying to get it right. And this is because all of us have blind spots. In some cases these are things we are simply not aware of – you cannot see what you do not know, and so if you have no knowledge of a certain type of ethical issue you will be unable to see it when it arises. Similar forms of inattentional blindness can arise even when we do have knowledge, because we are simply incapable of being aware of all things at once. Thus, we may be so focused on seeing a certain thing that we simply miss something else that is otherwise “right before our eyes.” It is, in part, because of these sorts of blind spots that important decisions should be made by committee. The hope, in doing so, is that the blind spots “cancel out” across the group – my lack of awareness of an issue is corrected by someone else’s awareness of it.

A more common issue for those working in massive corporations is willful blindness. If we want to believe ourselves morally decent people who would not be involved in wrongdoing, then we may sometimes simply “look the other way” when wrongdoing is occurring. This is especially true in the corporate sector, where seeing the wrongdoing may force us into a moral conflict between our duties to the public and our duties to our employer. After all, ignorance of vital information is an obvious barrier to responsible action and is, in many cases, a legitimate excuse for not acting responsibly – if I did not have the information necessary to know wrongdoing was occurring, then I cannot be held responsible for that wrongdoing. And, of course, sometimes vital information is simply not available to us, and so we are legitimately excused. But we may also engage in willful avoidance of information – if we suspect we may discover something that would force us to make morally difficult choices, then we may work to avoid discovering it. Importantly, while ignorance is sometimes a legitimate excuse, it is not always, and we typically distinguish between culpable and non-culpable ignorance. Simply consider the phrase “you should have known better” – a statement that admits you may in fact have been lacking some information but nevertheless holds you accountable for that lack.

4.3. Microscopic Vision

The philosopher Michael Davis has identified microscopic vision as another barrier to responsible action. While looking through a microscope helps us see things we otherwise would not see, it also greatly limits what we can see. The same sort of thing happens with every skill. As Davis puts it:

A shoemaker, for example, can tell more about a shoe in a few seconds than I could tell if I had a week to examine it. He can see that the shoe is well or poorly made, that the materials are good or bad, and so on. I can’t see any of that. But the shoemaker’s insight has its price. While he is paying attention to people’s shoes, he may be missing what the people in them are saying or doing.[5]

Engineers are often prone to microscopic vision because of their technical expertise. It can lead them to fail to recognize the common ways non-experts may misuse, or fail to properly use, products, as well as the ways the general public may respond to engineering projects. And this sort of microscopic vision is worse in large organizations, as Davis suggests when applying the idea to NASA, Morton Thiokol, and the Challenger disaster: large organizations tend to promote microscopic vision by specializing individuals and siloing their responsibility for projects.

4.4. Groupthink

Thus far the barriers we have focused on have tended to be barriers each of us, as individuals, may face. The potential solution, then, has typically been to involve others in our thinking. If I have certain blind spots, then I need to reason alongside others who may correct for them. If I have microscopic vision of a certain sort, then I should reason alongside others who see (perhaps still microscopically) what I am ignoring. In these ways, we have been generally suggesting that complex and difficult moral decisions are better made by committee.

While this may generally be true, it is also worth noting that group decision-making can create its own obstacles to responsible action. The most common of these is groupthink, situations in which groups come to agreement at the expense of critical thinking.[6] Groupthink can result, broadly, from groups displaying the very features often prized in organizations: high levels of cohesiveness, solidarity, and loyalty. And there are eight “symptoms” that both evidence groupthink and result from it:

  • An illusion of invulnerability of the group to failure
  • A strong “we-feeling” that views outsiders as adversaries or enemies and encourages shared stereotypes of others
  • Rationalizations that tend to shift responsibility to others
  • An illusion of morality that assumes the inherent morality of the group and thereby discourages careful examination of the moral implications of what the group is doing
  • A tendency of individual members toward self-censorship, resulting from a desire not to “rock the boat”
  • An illusion of unanimity, construing silence of a group member as consent
  • An application of direct pressure on those who show signs of disagreement, often exercised by the group leader who intervenes in an effort to keep the group unified
  • Mindguarding, or protecting the group from dissenting views by preventing their introduction

Many have identified groupthink as the fundamental culprit in a number of disastrous decisions: the Bay of Pigs invasion, the Columbia space shuttle disaster, and the release of the General Motors Corvair automobile in the 1960s.

So, if individual thinking can lead us astray due to blind spots and microscopic vision, but collective thinking can lead us astray due to groupthink, what ought we to do? Here the work on combating groupthink is helpful. A number of the symptoms listed above can be dealt with directly: outsiders can be deliberately invited to be involved in a decision, members may submit concerns anonymously, and organizations can cultivate “cultures of dissent” in which individuals feel empowered to be critical of the group and its decision-making. In these ways, we can hope to correct for common errors in individual decision-making while not falling prey to common errors of collective decision-making.

Check Your Understanding

After successfully completing this chapter, you should be able to answer all the following questions:

  • What is a profession and what are its key features? What makes engineering a profession?
  • What is the fundamental moral commitment of professional engineers?
  • What is the relationship between responsibility as obligation, accountability, and conscientiousness?
  • What is welfare?
  • What is the standard of care?
  • What are the engineers’ duties to warn and whistleblow?
  • What are some of the common barriers to professional responsibility?

References & Further Reading

Bazerman, M.H. & Tenbrunsel, A.E. (2011). Blind Spots: Why We Fail to Do What’s Right and What to Do About It (Princeton University Press).

Curd, M. & May, L. (1984). Professional Responsibility for Harmful Actions (Kendall/Hunt Publishers).

Davis, M. (1998). Thinking Like an Engineer: Studies in the Ethics of a Profession (Oxford University Press).

Davis, M. (2003). “Whistleblowing,” in The Oxford Handbook of Practical Ethics, Hugh LaFollette (ed.) (Oxford University Press).

DeGeorge, R.T. (1981). “Ethical Responsibilities of Engineers in Large Organizations,” Business and Professional Ethics Journal 1:1, pp. 1-14.

Harris, C. E., Pritchard, M.S., James, R.W., Englehardt, E. E., & Rabins, M.J. (2019). Engineering Ethics: Concepts and Cases, 6th ed. (Cengage Learning).

Janis, I. (1982). Groupthink, 2nd ed. (Houghton Mifflin).

Martin, M. & Schinzinger, R. (2005). Ethics in Engineering, 4th ed. (McGraw Hill Publishers).


  1. Michael Davis (1991), “Thinking Like an Engineer: The Place of a Code of Ethics in the Practice of a Profession,” Philosophy & Public Affairs 20:2, pp. 150-167. He notes that his narrative is derived from testimony contained in The Presidential Commission on the Space Shuttle Challenger Disaster (Washington, D.C.: U.S. Government Printing Office, 1986).
  2. Mike W. Martin & Roland Schinzinger (2005). Ethics in Engineering, 4th ed. McGraw Hill Education.
  3. This definition is found in the US court case Paxton v. County of Alameda (1953) 119 Cal.App.2d 393, 398; 259 P.2d 934.
  4. Edwin T. Layton, Jr., The Revolt of the Engineers: Social Responsibility and the American Engineering Profession (Baltimore: Johns Hopkins University Press, 1986), pp. 1, 5.
  5. Michael Davis (1989), "Explaining Wrongdoing," Journal of Social Philosophy, pp. 74-90.
  6. Irving Janis (1982), Groupthink, 2nd ed. (Boston: Houghton Mifflin).

License


The Primacy of the Public by Marcus Schultz-Bergin is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.