1 Ethics as Moral Deliberation

This book is focused on helping you develop your ability to engage in moral deliberation in the context of engineering and technology (and more generally). Importantly, moral deliberation is an activity we can engage in and therefore a skill we can develop through practice; it is not merely a matter of “book learning”. Thus, while this book is aimed at providing various tools to help improve your moral deliberation skills, simply reading the book will not be sufficient to do so; you must take the tools provided in this book and apply them to situations, typically alongside others, to develop your abilities most fully.

While later chapters in this book will focus on more immediately practical tools, in this chapter we want to orient ourselves appropriately by understanding what moral deliberation consists in, including its connection with other fundamental moral concepts. Toward that end, we can begin with a fuller definition of moral deliberation as the active process of considering relevant moral values, moral rules, and moral principles to make a decision or judgment grounded in appropriate moral reasons. Obviously, this definition introduces a variety of other key concepts, and we will detail those more below. For now, it is worth flagging that we call this moral deliberation rather than something like moral reasoning precisely to highlight the fact that the process should aim to consider things from multiple angles, incorporating multiple likely competing values, rules, and/or principles. Deliberation also highlights the fact that complex ethical decisions are often best made collaboratively via deliberation amongst multiple people, although it is also possible (and sometimes necessary) to deliberate with oneself. Finally, deliberation, in contrast to reasoning, suggests that there may not be one singularly correct answer; ethics is a complex matter and so we should not necessarily expect that it can be resolved algorithmically like a problem in mathematics or physics.

1. Moral Deliberation & Moral Perception

Moral deliberation begins with moral perception: the disposition and ability to identify the relevant moral features of a situation. These moral features include the moral values, rules, and principles at play in the situation. And just like visual perception, what you can perceive in a given situation is largely a matter of the concepts you are familiar with. That big leafy green thing is only a ‘tree’ to you if you already have the concept of ‘tree’. Similarly, for a situation to even potentially appear unfair to you, you must be at least somewhat familiar with the concept of fairness. This is why “book learning” still has an important role to play in the development of moral deliberation: it is the means by which we come to familiarize ourselves with those concepts we will later need to perceive and use in deliberation.

The most central element of moral perception is the awareness and identification of relevant moral values: the goals or ends that we believe are important to our lives. Above, I mentioned the concept of fairness, which is a moral value. But so are things like pleasure, honesty, and friendship. There is no fixed list of moral values, and the sorts of things people consider important are quite diverse. For these reasons, we will adopt a perspective known as ethical pluralism. This is a general social outlook that holds that there are many competing but reasonable moral values relevant to human existence. This does not mean anything goes – there are still some putative ‘moral values’ that are simply unreasonable. Valuing the pain of others, for instance, is unreasonable (absent certain other features perhaps). Additionally, adopting the outlook of ethical pluralism doesn’t commit us to the claim that, as a matter of fact, there are many competing values; it may be the case that all true moral values are consistent with one another. But we adopt the outlook of ethical pluralism precisely because we recognize that ethical matters are complex, and so even if there is a ‘correct’ set of moral values, it will be very difficult for us to know what they are. Thus, ethical pluralism is largely a methodological outlook: It provides a guide for our method of ethical deliberation, rather than insisting on any deep truths about reality.

Because we start from the perspective of ethical pluralism, we should immediately recognize the importance of engaging in conversation with others about moral issues and, more generally, of attempting to understand moral situations from multiple diverse perspectives. We should not take our own perspective as veridical – as clearly indicating what is true; rather, we should put it up for interrogation in the same way we interrogate the perspectives of others. Only rarely will we perceive all the relevant moral values at play in a situation. This is partly because different people value different things, and so some value may only be relevant in a situation because some specific person with that value is affected by it. It is also because we all have a general bias toward our own perspective and our conscious awareness is limited, so we simply cannot take in and consider all the relevant data all the time. These are just basic facts of human psychology that we must contend with in developing and deploying our moral perception.

So, to engage in moral deliberation, we must first be able to perceive the relevant situation reasonably fully from a moral perspective. As noted above, this most fundamentally consists in being able to identify the relevant values at play. But it also involves being able to consider any relevant moral rules and moral principles. Both concepts are directly relevant to the process of moral deliberation itself, and so we turn to them now.

2. Moral Rules and Principles in Deliberation

Once we have a decent grasp on the values at play in a situation, we can begin to consciously process those values to determine either what to do (when we need to act) or simply how to evaluate the situation (when we simply need to judge). Here we consider potentially relevant moral rules, which are a type of moral standard that specifies how to respond to certain situations given certain moral values. For instance, it is a general moral rule that you should not murder another human. If we understand ‘murder’ to mean ‘unjustified killing’, then that moral rule is always applicable and clearly gives us a firm guide to action (or inaction). Moral principles are another type of moral standard but provide less definitive guidance than moral rules. Moral principles provide general guides for moral perception and deliberation by directing us in how to consider relevant moral values. But they do not give us a definitive rule to act on. For instance, it is a general moral principle that one ought not cause harm. This directs us to pay attention to whether an action may cause harm, and tells us generally how to respond (avoid causing it), but it does not tell us we can never cause harm. And this makes sense: we know that while in many cases it is wrong to cause harm, in some cases it can be right, as when we administer a painful medical treatment to someone to prevent future pain or other issues.

Moral rules are less universally shared than moral principles. This is largely because moral rules take a firmer stand on how to weigh competing values than moral principles do. That is why you will typically find moral rules shared among members of communities that share a lot else in common as well. For instance, members of a specific religious denomination may all largely share a similar set of values and a similar weighting for those values in comparison with one another. As a result, they will share similar moral rules, since moral rules depend on taking a pretty firm stand on the relative weight of various values. The rule against murder, for instance, can be understood as placing the value of ‘human life’ above (just about) any other value. But since there is a diversity of values across large populations, there will tend to be less shared weighting of values and so fewer shared moral rules. Moral principles, on the other hand, never clearly establish a weighting of values. Instead, they may highlight certain values as important relative to some other values but leave open how the values in that set relate to one another. Thus, a principle telling us not to cause harm tells us causing harm is generally a bad thing, but we can also have a principle telling us to provide for the long-term health of others. Exactly what to do when both principles are operative will depend on the context.

Thus, moral rules and moral principles play a role in moral perception by helping to direct our attention to certain features of the situation. But they also play a role in moral deliberation because they function as standards by which we judge actions and outcomes. If we know that a situation involves an unjustified killing and we know there is a moral rule obligating us to avoid unjustified killing, then we know that we should avoid that situation or judge it negatively (as ‘bad’ or ‘wrong’, for instance).

3. Moral Reasons in Moral Deliberation

But what does moral deliberation actually look like? We have said thus far that it involves processing the relevant values, rules, and principles. But how should we go about processing them? We do that, either alone or together, by providing moral reasons: potential justifications for (in)action or evaluation that appeal to accepted moral values, rules, and principles. The key here is that we are justifying why we should do something or why something should be evaluated as ‘bad’ or ‘wrong’. This is importantly distinct from simply explaining the situation. To see the difference, compare the following two statements:

Justification: Majeet adopted the cat because doing so would provide it a healthier life.

Explanation: Majeet adopted the cat because he saw it on the street, and it made him cry.

In the first case, we are indicating why Majeet should have done what he did by appeal to a moral value (animal welfare, for instance). In the second case, we are simply explaining (part of) the process Majeet went through. Explanations simply tell us why something is as it is while justifications aim to tell us why it should be as it is (or should not be as it is in the case of a negative judgment).

The most common way people confuse justifications and explanations in moral deliberation is by focusing on a person’s psychology: we explain why someone did or will do something by telling a causal story about their thoughts. But a justification for a person’s actions does not depend on that person being aware of that justification when acting. If I jump in a pond to save a drowning child because I thought it was a cool log and I like logs, my action may still be justified because the child was in danger and we should help others who are in danger. Thus, what a person is actually thinking is not necessarily relevant to whether they were justified in what they did (or refrained from doing). Similarly, in deciding what to do in a complex situation, our reasons should not appeal to what we would in fact do in that situation or what any given person is likely to do. These sorts of things may be helpful to consider in trying to understand patterns of behavior and so certainly have their place. But their place is not in moral deliberation, which is focused on reason giving – on justification.

Thus, moral deliberation largely consists in crafting various moral reasons based on the relevant moral qualities of the situation and comparing them. The reason(s) which seem to best capture the values, rules, and principles, and weight them most appropriately relative to each other, are the ones that should carry the day. It should be clear that this is not a precise and algorithmic process, and we should not expect that every individual will always come to the same conclusion with the same information. Nonetheless, by crafting reasons and making them explicit to ourselves and others, we open them up for interrogation and thus substantially increase the likelihood that we will both settle on a correct moral decision and be able to justify why it is correct.



The Primacy of the Public by Marcus Schultz-Bergin is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.