Biases in decision making


When people make decisions, they rarely do so entirely rationally. Instead, they take shortcuts and act on their intuitions. When studied in aggregate, people can be seen to deviate from fully rational decisions in certain predictable ways. These deviations are called cognitive biases and are studied in fields such as psychology and descriptive economics. They are also of considerable interest in ethics, as they are sometimes barriers to clear ethical thinking that need to be overcome (such as the status quo bias), or interesting aspects of human reasoning that may or may not have a genuine moral grounding (such as the bias towards immediate benefits over greater future benefits).

Status quo bias in bioethics: the case for human enhancement
  I wrote this paper with Nick Bostrom in 2005. In it, we examine the pervasive bias that people have towards staying with the status quo. We then present two simple tests that one can use to diagnose and overcome such a bias. These have very general applications, covering prudential judgments as well as moral judgments; however, we chose to illustrate the use of this principle with a particular question: if there were a safe and affordable way to increase human intelligence, would it be good to use it? This paper is very readable, and we believe that the 'reversal tests' that we present would be a welcome addition to the toolset of every moral philosopher (and anyone else for that matter).