Modelling moral uncertainty
Abstract
When making moral decisions, we are often unsure about what we ought to do, and we often disagree with one another about the right course of action. An influential research area in moral philosophy examines the problem of moral uncertainty using the tools of Rational Decision Theory. I argue that this theoretical framework is particularly well suited to addressing what I define as “genuine” moral uncertainty, that is, uncertainty over ethical theories and principles which may prescribe different courses of action. Disciplines like cognitive science, evolutionary biology, and neuroscience, however, take a different approach to moral reasoning and decision making, focusing on how our moral behaviour results from cognitive patterns, neural correlates, adaptive behaviours, or heuristics, rather than from sophisticated reasoning over complex moral principles. According to these models, moral uncertainty is better understood as the byproduct of biases, reasoning mistakes, and, more generally, of processes and patterns influenced by empirical features of the agent or of the context. I argue that, in many choice problems, “cognitive” and genuine moral uncertainty co-exist but are often difficult to disentangle. Then, using a widely discussed case from the moral philosophy literature, the doing/allowing distinction, I propose a causal model which can serve as a useful tool for uncovering the source of moral uncertainty and, at least in some cases, for disentangling cognitive aspects from genuine ethical issues.
Keywords
- moral uncertainty
- doing/allowing distinction
- moral heuristics
- moral framing