The trolley problem is a series of thought experiments in ethics and psychology, involving stylized ethical dilemmas of whether to sacrifice one person to save a larger number. The series usually begins with a scenario in which a runaway tram or trolley is on course to collide with and kill a number of people (traditionally five) down the track, but a driver or bystander can intervene and divert the vehicle to kill just one person on a different track. Then other variations of the runaway vehicle, and analogous life-and-death dilemmas (medical, legal etc.) are posed, each containing the choice to either do nothing and let several people die, or intervene and sacrifice one initially "safe" person to save them.
Opinions on the ethics of each scenario turn out to be sensitive to details of the story that may seem immaterial to the abstract dilemma. The question of formulating a general principle that can account for the differing judgements arising in different variants of the story was raised in a 1967 philosophy paper by Philippa Foot, and dubbed "the trolley problem" by Judith Jarvis Thomson in a 1976 article that catalyzed a large literature. Thus, in this subject the trolley problem refers to the meta-problem of why different judgements are arrived at in particular instances, which are called trolley cases, examples, dilemmas, or scenarios.
The most basic version of the dilemma, known as "Bystander at the Switch" or "Switch", goes:
There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person on the side track. You have two options:
- Do nothing and allow the trolley to kill the five people on the main track.
- Pull the lever, diverting the trolley onto the side track where it will kill one person.
Which is the more ethical option? Or, more simply: What is the right thing to do?
Philippa Foot introduced this genre of decision problems in 1967 as part of an analysis of debates on abortion and the doctrine of double effect. Philosophers Judith Thomson, Frances Kamm, and Peter Unger have also analysed the dilemma extensively. Thomson's 1976 article initiated the literature on the trolley problem as a subject in its own right. Characteristic of this literature are colorful and increasingly absurd alternative scenarios in which the sacrificed man is instead pushed onto the tracks as a weight to stop the trolley, has his organs harvested to save transplant patients, or is killed in more indirect ways that complicate the chain of causation and responsibility.
Earlier forms of individual trolley scenarios antedated Foot's publication. Frank Chapman Sharp included a version in a moral questionnaire given to undergraduates at the University of Wisconsin in 1905. In this variation, the railway's switchman controlled the switch, and the lone individual to be sacrificed (or not) was the switchman's child. German philosopher of law Karl Engisch discussed a similar dilemma in his habilitation thesis in 1930, as did German legal scholar Hans Welzel in a work from 1951. In his commentary on the Talmud, published long before his death in 1953, Avrohom Yeshaya Karelitz considered the question of whether it is ethical to deflect a projectile from a larger crowd toward a smaller one.
Beginning in 2001, the trolley problem and its variants have been used extensively in empirical research on moral psychology. It has been a topic of popular books. Trolley-style scenarios also arise in discussing the ethics of autonomous vehicle design, which may require programming to choose whom or what to strike when a collision appears to be unavoidable.
https://en.wikipedia.org/wiki/Trolley_problem
The Trolley Problem
(a utilitarian or deontological decision?)
The “trolley problem”: Is it OK to push a person off a footbridge and onto the tracks in front of a speeding trolley in order to stop that trolley from killing five people? Most people say “no.” But then there’s this version: Is it OK to hit a switch turning the trolley away from five people but towards one person? Now, most people say “yes.” Why does it seem right to trade one life for five in one case but not the other?
This set of dilemmas reflects the tension between the:
- Deontological perspective, which says morality is fundamentally about rights and duties, and the
- Utilitarian perspective, which says that morality is ultimately about consequences for human well-being...
...So, we did a series of brain imaging experiments, and the results were consistent with our predictions. More “personal” dilemmas, such as the footbridge case, elicit activity in regions of the brain associated with emotion. More “impersonal” dilemmas, such as hitting the switch, elicit activity in regions associated with rule-based reasoning.
In Search of Morality: An Interview with Joshua Greene
https://brainworldmagazine.com/in-search-of-morality-an-interview-with-joshua-greene/
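The contrast between the two perspectives described in the interview can be made concrete with a toy decision procedure. This is purely an illustrative sketch: the function names and the binary framing are inventions for this example, not part of Greene's work or any established model.

```python
# Toy sketch contrasting the two decision rules on the Switch dilemma.
# All names here are illustrative inventions, not an established model.

def utilitarian_choice(deaths_if_nothing, deaths_if_intervene):
    """Pick whichever action minimizes total deaths (outcome-based)."""
    return "intervene" if deaths_if_intervene < deaths_if_nothing else "do nothing"

def deontological_choice(intervention_kills_directly):
    """Refuse any intervention that itself kills, regardless of the tally (rule-based)."""
    return "do nothing" if intervention_kills_directly else "intervene"

# Switch dilemma: five die if we do nothing, one dies if we pull the lever.
print(utilitarian_choice(5, 1))      # "intervene" -- 1 death < 5 deaths
print(deontological_choice(True))    # "do nothing" -- pulling the lever kills
```

The point of the sketch is that the two rules consult different inputs entirely: the utilitarian rule only sees the body count, while the deontological rule only sees whether the act itself violates the prohibition on killing, which is why they can disagree on the same scenario.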
Dual-Process Theory
Greene and colleagues have advanced a dual process theory of moral judgment, suggesting that moral judgments are determined by both automatic, emotional responses and controlled, conscious reasoning. In particular, Greene argues that the "central tension" in ethics between deontology (rights- or duty-based moral theories) and consequentialism (outcome-based theories) reflects the competing influences of these two types of processes:
Characteristically deontological judgments are preferentially supported by automatic emotional responses, while characteristically consequentialist judgments are preferentially supported by conscious reasoning and allied processes of cognitive control.
In one of the first experiments to suggest a moral dual-process model, Greene and colleagues showed that people making judgments about "personal" moral dilemmas (like whether to push one person in front of an oncoming trolley in order to save five others) engaged several brain regions associated with emotion that were not activated by judgments that were more "impersonal" (like whether to pull a switch to redirect a trolley from a track on which it would kill five people onto a track on which it would kill one other person instead). They also found that for the dilemmas involving "personal" moral questions, those who did make the intuitively unappealing choice had longer reaction times than those who made the more emotionally pleasant decision.
A follow-up study compared "easy" personal moral questions to which subjects had fast reaction times against "hard" dilemmas (like the footbridge problem) to which they had slow reaction times. When responding to the hard problems, subjects displayed increased activity in the anterior dorsolateral prefrontal cortex (DLPFC) and inferior parietal lobes—areas associated with cognitive processing—as well as the anterior cingulate cortex, which has been implicated in detecting conflict between two competing inputs, as in the Stroop task. This comparison demonstrated that harder problems activated different brain regions, but it didn't prove differential activity for the same moral problem depending on the answer given. This was done in the second part of the study, in which the authors showed that for a given question, those subjects who made the utilitarian choices did have higher activity in the anterior DLPFC and the right inferior parietal lobe than subjects making non-utilitarian choices.
These two studies were correlational, but others have since suggested a causal impact of emotional vs. cognitive processing on deontological vs. utilitarian judgments. A 2008 study by Greene showed that cognitive load caused subjects to take longer to respond when they made a utilitarian moral judgment but had no effect on response time when they made a non-utilitarian judgment, suggesting that the utilitarian thought processes required extra cognitive effort.
https://en.wikipedia.org/wiki/Joshua_Greene_(psychologist)#Dual-process_theory
Individual Differences in Moral Disgust Do Not Predict Utilitarian Judgments, Sexual and Pathogen Disgust Do
Abstract
The role of emotional disgust and disgust sensitivity in moral judgment and decision-making has been debated intensively for over 20 years. Until very recently, there were two main evolutionary narratives for this rather puzzling association. One model suggests that it developed through some form of group selection mechanism, where the internal norms of the groups were acting as pathogen safety mechanisms. Another model suggested that these mechanisms were developed through hygiene norms, which were piggybacking on pathogen disgust mechanisms. In this study we present another alternative, namely that this mechanism might have evolved through sexual disgust sensitivity. We note that though the role of disgust in moral judgment has been questioned recently, few studies have taken disgust sensitivity into account. We present data from a large sample (N = 1300) where we analyzed the associations between the Three Domain Disgust Scale and the most commonly used 12 moral dilemmas measuring utilitarian/deontological preferences with Structural Equation Modeling. Our results indicate that of the three domains of disgust, only sexual disgust is associated with more deontological moral preferences. We also found that pathogen disgust was associated with more utilitarian preferences. Implications of the findings are discussed.
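The abstract's analysis can be roughly illustrated in code. The authors used Structural Equation Modeling; the sketch below substitutes plain correlations and randomly generated placeholder data, so the variable names, the scoring scheme, and the numbers are assumptions for illustration only, not the study's actual method or data.

```python
# Simplified sketch of the study design: score each subject's utilitarian
# preference across 12 dilemmas, then correlate it with three disgust
# subscales. Placeholder random data; the paper used SEM, not correlations.
import numpy as np

rng = np.random.default_rng(0)
n = 1300                                   # sample size reported in the abstract

# 12 dilemmas, coded 1 = utilitarian choice, 0 = deontological choice
dilemmas = rng.integers(0, 2, size=(n, 12))
utilitarian_score = dilemmas.mean(axis=1)  # per-subject preference in [0, 1]

# Three Domain Disgust Scale subscales (placeholder standardized scores)
disgust = {name: rng.normal(size=n) for name in ("moral", "sexual", "pathogen")}

for name, scores in disgust.items():
    r = np.corrcoef(scores, utilitarian_score)[0, 1]
    print(f"{name} disgust vs. utilitarian score: r = {r:.3f}")
```

On the study's findings, such an analysis would show a negative association for sexual disgust (more deontological preferences) and a positive one for pathogen disgust; with the random placeholder data above, the correlations are of course near zero.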
Discussion
Moral psychological disgust research has been in a slight tumult for a while. A recent meta-analysis concluded that disgust induction effects, which should amplify people’s punitive tendencies or enhance moral condemnation, are either very minimal or non-existent. Another recent review concluded that disgust and moral judgment are somehow related, however it is not clear exactly how. Prior to the meta-analysis, several models had been presented for the evolution of cognitive pathways that could explain how disgust-relevant information was channeled for the use of moral cognition. Nonetheless, all of these models seem to be operationally dependent on individual differences in disgust sensitivity. Given that a large body of relevant disgust research has focused on individual differences in moral judgments, it is slightly surprising that experimental studies have been mostly conducted without taking this into account. There seems to be only a handful of studies that have estimated the associations between disgust sensitivity and judgment of moral violations in experimental settings. Yet, moral preferences, moral cognitive judgments and individual differences in disgust sensitivity appear to be intimately associated in complicated ways.
Higher disgust sensitivity specifically predicts more conservative attitudes towards abortion and gay marriage, while individuals with higher disgust sensitivity have been shown to be more avoidant of moral norm violators and to judge them more harshly. Previous research has also associated sexual and moral disgust with an anti-psychopathic personality trait called Honesty-Humility from the HEXACO personality inventory. In addition, sexual disgust has been associated with conservative political attitudes. In some studies, disgust sensitivity is associated with norm adherence; while in others, it is associated with concerns for ritualistic purity, although this research is still relatively scarce.
Human moral cognition seems to have two major moral evaluation processors that form characteristically deontological and/or utilitarian judgments with respect to moral situations. Utilitarianism is an ethical philosophy that aims to maximize aggregate welfare (“good”) and to minimize suffering (“bad”). It is commonly juxtaposed with the deontological position, which states that moral rules are inviolable and do not fluctuate across situations. For utilitarians, murder can be justified when the costs are outweighed by the benefits, for instance, when killing a dangerous criminal prevents further murders from taking place. Deontologists, on the other hand, think that acts are either right or wrong irrespective of their consequences. According to their position, if a moral rule is violated in one situation, it can be violated in any situation, and thus ceases being a moral rule. “Do not kill” is a typical example of a deontological rule. For a deontologist, murder always violates the fundamental moral principle which states that people (or animals) should not be treated as objects, even if this would save lives. For a utilitarian, the ends justify the means whereas for a deontologist they do not. The notion of characteristically deontological or utilitarian judgments comes from Joshua Greene, highlighting that whether a person subscribes to a certain moral philosophy is dissociated from the qualities of the judgment itself.
Studies examining the effects of disgust induction on utilitarian judgment have produced conflicting results, which could in part be due to the fact that these studies have not taken disgust sensitivity into account. Considering that individual differences in disgust sensitivity are relevant in moral judgment formation, it is surprising that the links between disgust sensitivity and utilitarian preferences have not been extensively investigated. Chapman & Anderson also raise a similar issue and recommend that fundamental work should be conducted by simply evaluating individual differences in disgust sensitivity towards moral cognitive stimuli that have not been intended as disgusting. Additionally, we argue that different types of disgust sensitivity should be taken into consideration. A recent model, based on extensive evolutionary theorizing, differentiates between Moral, Sexual and Pathogen Disgust. The contradictory findings – that disgust sensitivity is linked to moral judgment but disgust primes are not – could be due to the fact that the disgust induction stimuli have led to a type of disgust less relevant for moral judgment. Another possibility is that previous studies might have systematically confounded Sexual and Pathogen Disgust. As far as we are aware, this is the first study to investigate the relationship between individual differences in different components of disgust and utilitarian moral judgments with non-disgust-inducing stimuli. Previous studies, if they have included disgust sensitivity measures in their final analysis, have not investigated all of the domains of disgust, or have not investigated moral dilemmas and utilitarian judgement, but images of moral violations.
Thus, we decided to investigate the links between Pathogen, Sexual and Moral Disgust with respect to the most extensively studied moral cognitive process, that of utilitarian judgment formation. We found the evolutionary theoretical basis of the TDDS to be the best fit for the current study, as the connection between a specific evolved emotion and morality seems to require an evolutionary explanation. We hypothesized that Moral Disgust should be associated with more deontological or less utilitarian moral judgements, since it is operationalized as individual dislike towards moral rule-breaking. Furthermore, based on dual process model theorizing, any emotional sensitivity or emotional induction effects should lead individuals towards more deontological or norm-obedient judgments and behaviors rather than utilitarian ones. We therefore did not expect any other effects to emerge.
We used the most commonly utilized 12 moral dilemmas originally created by Greene et al., since these dilemmas have been shown to have excellent psychometric properties and they reliably measure the same construct. Although several different measures have been used to assess utilitarian moral preferences, this set of 12 dilemmas seems to be the most extensively validated one used to measure utilitarian judgments. To our knowledge, this sort of fundamental analysis between the most theoretically grounded disgust measures and utilitarian preferences has not been done previously, and our research aims to fill in this existing gap (as recommended by Chapman and Anderson).
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5374518/
https://sci-tech-philosophy.blogspot.com/p/impaled-horns-of-dilemma.html