Moral psychology, empirical work in

DOI: 10.4324/9780415249126-L147-1
Version: v1, Published online: 2012
Retrieved April 25, 2024, from https://www.rep.routledge.com/articles/thematic/moral-psychology-empirical-work-in/v-1

5. Moral judgments and intuitions

Ethical theories are often tested against our immediate, pretheoretical judgments about morally significant cases – what we might call ‘moral intuitions’. Consider, for example, the widely shared judgment that slavery is immoral or that Hitler’s campaign of genocide was evil. It counts against a theory to at least some extent if it conflicts with such clear intuitions. But what drives them?

One recent line of empirical research focuses on the role of emotion, as opposed to reasoning, in moral judgment (see Moral sentiments). In particular, Jonathan Haidt and his colleagues have conducted a number of experiments purporting to reveal a starring role for disgust. In one experiment, participants recorded their moral judgments in response to various hypothetical scenarios either at a clean desk or at a disgusting desk (with old food, sticky substances, etc.). Among those at the disgusting desk, participants who scored highly on their ability to perceive changes in their bodily state tended to rate some of the actions as more immoral (Schnall et al. 2008). In another study, participants highly susceptible to hypnotism were made to feel disgust upon hearing a morally neutral word (e.g. ‘often’), and were then presented with a set of hypothetical cases, some of which employed the term. Interestingly, on average participants rated the behaviour in cases containing the disgust-inducing word as more morally wrong than the behaviour in cases that did not include it (Wheatley and Haidt 2005). At this early stage in the research, however, the exact extent of disgust’s role in ordinary moral judgment is unclear (see Mallon and Nichols 2010).

In addition to arguing that emotion drives moral judgment, some have added that reasoning’s role is merely post hoc rationalization. In a series of studies, participants read cases designed to evoke moral outrage but that apparently lacked any paradigm moral transgressions (e.g. harm or violation of rights). One case, for example, describes a brother and sister who once engage in consensual incest with ample protection and without damaging their relationship. In interviews, people fell into a state of moral dumbfounding: they were convinced the action was morally wrong, but were unable to find reasons for this judgment. Haidt (2001) suggests that the reasons offered are largely mere rationalization: moral judgment is at least typically generated by immediate emotional reactions (compare System 1), and reasoning primarily comes in after the fact to defend the intuitive judgment (compare System 2). (For some criticism, see Mallon and Nichols 2010: §2.)

The focus thus far has been on moral judgment generally, but some empirical research suggests that emotion drives only nonutilitarian moral judgments. Most of the relevant studies involve presenting participants with variants on the famous trolley cases, in which (roughly) a protagonist attempts to save five people from being run over by a trolley, but at the cost of one death to a different person. Traditionally, two key cases have been discussed. In the Side-Track case, five workers are tied to the tracks on the trolley’s path, but a switch next to the protagonist can divert the trolley onto a track with only one person strapped to it. Most philosophers believe it is morally permissible to throw the switch, which saves the five but kills the one – only 8 per cent believe it is impermissible (see Bourget and Chalmers 2009). In a key variant, the Footbridge case, five workers are again strapped to the tracks, but one large man is on a footbridge above them, and the protagonist can stop the trolley to save the five only by pushing the man over to his death. Utilitarians and other consequentialists typically maintain that it is morally permissible both to flip the switch in Side-Track and to push the man in Footbridge (see Consequentialism). But deontologists and other nonutilitarians have often argued that pushing in the second case is impermissible, because (for example) it uses a person as a mere means to an end, whereas the death in Side-Track is merely a foreseen but unintended side effect of saving the five. (The cases are tied especially to debates about certain deontological principles; see Deontological ethics; Double effect, principle of; Inviolability, §4.)

Whether philosophers’ intuitions are idiosyncratic, and what drives ordinary intuitions, can be examined empirically. One recent study with thousands of participants indicates that the vast majority believe flipping the switch in Side-Track is permissible, but pushing in Footbridge is not (Hauser et al. 2007). While such results comport well with deontological theories of morality and moral judgment, what appears to drive these intuitions may not. Brain-imaging studies employing fMRI suggest that areas of the brain associated with emotion are more active in generating characteristically deontological judgments (e.g. in Footbridge) than consequentialist ones (e.g. in Side-Track). Studies of patients with brain lesions strengthen the inference from this correlation between affect and deontological judgment to a causal relationship. For example, patients with emotional deficits (e.g. those with damage to the ventromedial (VM) prefrontal cortex) tend to report consequentialist intuitions about cases like Footbridge, which suggests that the missing affect plays a causal role in generating the deontological intuition in normal subjects. These and other data indicate – contrary to a traditional theme in the philosophical literature – that deontological intuitions are driven more by affect, while consequentialist judgments rely more on our distinctive reasoning capabilities (for review, see Cushman et al. 2010).

What do these empirical data on moral judgment tell us about morality? Joshua Greene (2008) has argued that rationalist deontology is implausible, since the intuitions on which it rests are driven by an emotional response to transgressions that are ‘up-close and personal’, which is a ‘contingent, nonmoral feature of our evolutionary history’ (70). Similarly, one might argue that alleged counterexamples to consequentialism are not trustworthy, since they are generated in response to morally irrelevant factors, such as disgust (see Sinnott-Armstrong et al. 2010: §3.3). While such conclusions are controversial (see e.g. Berker 2009), many currently agree that the data support the hypothesis that moral judgment arises from at least two distinct processes: one affective and intuitive, the other conscious and cognitive (in the style of dual-process models). However, much remains unclear at this early stage in the research, including how these processes interact, what their origin is, and whether conclusions about their role in judgments about physical harm (as in the trolley cases) generalize to other facets of morality, such as justice and care (see Cushman et al. 2010). Thus, any conclusions about morality proper based on the empirical data are rather tentative, though the future is promising.

Citing this article:
May, Joshua. Moral judgments and intuitions. Moral psychology, empirical work in, 2012, doi:10.4324/9780415249126-L147-1. Routledge Encyclopedia of Philosophy, Taylor and Francis, https://www.rep.routledge.com/articles/thematic/moral-psychology-empirical-work-in/v-1/sections/moral-judgments-and-intuitions.
Copyright © 1998-2024 Routledge.
