
A Problem with the Leading Theory


Notes

Greene et al’s Dual-Process Theory

Greene et al offer a dual-process theory of ethical cognition:

‘this theory associates controlled cognition with utilitarian (or consequentialist) moral judgment aimed at promoting the “greater good” (Mill, 1861/1998) while associating automatic emotional responses with competing deontological judgments that are naturally justified in terms of rights or duties (Kant, 1785/1959).’ (Greene, 2015, p. 203)

The theory was developed in part to explain otherwise apparently anomalous responses to moral dilemmas. In particular, people have substantially different attitudes to killing one person in order to save several others depending on whether the killing involves pressing a switch (as in the Switch dilemma) or whether it involves dropping someone through a trapdoor into the path of great danger (as in the Footbridge dilemma).[1]

What is the explanation Greene et al’s theory offers?

‘this pattern of judgment [Switch—yes; Footbridge—no] reflects the outputs of distinct and (in some cases) competing neural systems [...] The more “personal” harmful action in the footbridge case, pushing the man off the footbridge, triggers a relatively strong negative emotional response, whereas the relatively impersonal harmful action in the switch case does not.’ (Greene, 2015, pp. 203–204)

Mixed Behavioural Evidence for This Theory

One prediction of the theory is that increasing time pressure should increase the influence of automatic emotional processes relative to the influence of controlled cognition, which in turn should make responses that are characteristically deontological more likely.

This prediction is supported by Suter & Hertwig (2011), among others.[2] But Bago & De Neys (2019) consider what happens when subjects first make a moral judgement under time pressure and extraneous cognitive load and then, just after, make another moral judgement (in answer to the same question) with no time pressure and no extraneous cognitive load. They report:

‘Our critical finding is that although there were some instances in which deliberate correction occurred, these were the exception rather than the rule. Across the studies, results consistently showed that in the vast majority of cases in which people opt for a [consequentialist] response after deliberation, the [consequentialist] response is already given in the initial phase’ (Bago & De Neys, 2019, p. 1794).

Rosas & Aguilar-Pardo (2020) find, contrary to what Greene et al’s theory predicts, that subjects are less likely to give characteristically deontological responses under extreme time pressure.

This contrary finding is not theoretically unmotivated: there are also some theoretical reasons for holding that automatic emotional processes should support characteristically utilitarian responses (Kurzban, DeScioli, & Fein, 2012).

As there is a substantial body of neuropsychological evidence in favour of Greene et al’s theory (reviewed in Greene, 2014), its defenders may be little moved by the mixed behavioural evidence. But there is a reason, not decisive but substantial, to expect mixed evidence more generally ...

Suggestion

While we have not seen decisive evidence against it, we have seen enough to motivate seeking alternatives.

Glossary

characteristically deontological : According to Greene, a judgement is characteristically deontological if it is one in ‘favor of characteristically deontological conclusions (eg, “It’s wrong despite the benefits”)’ (Greene, 2007, p. 39). According to Gawronski, Armstrong, Conway, Friesdorf, & Hütter (2017, p. 365), ‘a given judgment cannot be categorized as deontological without confirming its property of being sensitive to moral norms.’
dual-process theory : Any theory concerning abilities in a particular domain on which those abilities involve two or more processes which are distinct in this sense: the conditions which influence whether one process occurs differ from the conditions which influence whether another occurs.
Footbridge : A dilemma; also known as Drop. A runaway trolley is about to run over and kill five people. You can hit a switch that will release the bottom of a footbridge and one person will fall onto the track. The trolley will hit this person, slow down, and not hit the five people further down the track. Is it okay to hit the switch?
Switch : A dilemma; also known as Trolley. A runaway trolley is about to run over and kill five people. You can hit a switch that will divert the trolley onto a different set of tracks where it will kill only one. Is it okay to hit the switch?

References

Bago, B., & De Neys, W. (2019). The Intuitive Greater Good: Testing the Corrective Dual Process Model of Moral Cognition. Journal of Experimental Psychology: General, 148(10), 1782–1801. https://doi.org/10.1037/xge0000533
Conway, P., & Gawronski, B. (2013). Deontological and utilitarian inclinations in moral decision making: A process dissociation approach. Journal of Personality and Social Psychology, 104(2), 216–235. https://doi.org/10.1037/a0031021
Crockett, M. J. (2013). Models of morality. Trends in Cognitive Sciences, 17(8), 363–366. https://doi.org/10.1016/j.tics.2013.06.005
Gawronski, B., Armstrong, J., Conway, P., Friesdorf, R., & Hütter, M. (2017). Consequences, norms, and generalized inaction in moral dilemmas: The CNI model of moral decision-making. Journal of Personality and Social Psychology, 113(3), 343–376. https://doi.org/10.1037/pspa0000086
Gawronski, B., & Beer, J. S. (2017). What makes moral dilemma judgments “utilitarian” or “deontological”? Social Neuroscience, 12(6), 626–632. https://doi.org/10.1080/17470919.2016.1248787
Gawronski, B., Conway, P., Armstrong, J., Friesdorf, R., & Hütter, M. (2018). Effects of incidental emotions on moral dilemma judgments: An analysis using the CNI model. Emotion, 18(7), 989–1008. https://doi.org/10.1037/emo0000399
Greene, J. D. (2007). The Secret Joke of Kant’s Soul. In W. Sinnott-Armstrong (Ed.), Moral Psychology, Vol. 3 (pp. 35–79). MIT Press.
Greene, J. D. (2014). Beyond Point-and-Shoot Morality: Why Cognitive (Neuro)Science Matters for Ethics. Ethics, 124(4), 695–726. https://doi.org/10.1086/675875
Greene, J. D. (2015). The cognitive neuroscience of moral judgment and decision making. In The moral brain: A multidisciplinary perspective (pp. 197–220). Cambridge, MA, US: MIT Press.
Hrdy, S. B. (1979). Infanticide among animals: A review, classification, and examination of the implications for the reproductive strategies of females. Ethology and Sociobiology, 1(1), 13–40. https://doi.org/10.1016/0162-3095(79)90004-9
Kurzban, R., DeScioli, P., & Fein, D. (2012). Hamilton vs. Kant: Pitting adaptations for altruism against adaptations for moral judgment. Evolution and Human Behavior, 33(4), 323–333.
Rosas, A., & Aguilar-Pardo, D. (2020). Extreme time-pressure reveals utilitarian intuitions in sacrificial dilemmas. Thinking & Reasoning, 26(4), 534–551. https://doi.org/10.1080/13546783.2019.1679665
Suter, R. S., & Hertwig, R. (2011). Time and moral judgment. Cognition, 119(3), 454–458. https://doi.org/10.1016/j.cognition.2011.01.018
Trémolière, B., & Bonnefon, J.-F. (2014). Efficient Kill–Save Ratios Ease Up the Cognitive Demands on Counterintuitive Moral Utilitarianism. Personality and Social Psychology Bulletin, 40(6), 923–930. https://doi.org/10.1177/0146167214530436

Endnotes

  1. See Greene (2015, p. 203): ‘We developed this theory in response to a long-standing philosophical puzzle ... Why do people typically say “yes” to hitting the switch, but “no” to pushing?’ ↩︎

  2. See also Trémolière & Bonnefon (2014) and Conway & Gawronski (2013) (who manipulated cognitive load). ↩︎