Moral uncertainty
There are many competing theories in normative ethics. Some people say the
right thing to do is what would lead to the most happiness. Others say that
it is what the virtuous person would do, or what is done on a maxim that
we could rationally will all others to follow, or what we are told to do
in a particular religious text. There is much disagreement among intelligent
people about what it is right to do, and while we each have our own beliefs
about the matter, it would be foolish to be 100% certain that we are correct
while all these others are not. Until further evidence is in, or until much
wider consensus is reached, the only rational approach is to spread your
degrees of belief between different ethical theories. For example, I think
that the true ethical theory is likely to be some form of utilitarianism,
but I'm not sure which form, and I also accept that there is some possibility
that it will be something quite different.
However, what are we to do in situations where the different theories that
we have some credence in urge us to do different things? In such situations,
we suffer from what is called moral uncertainty. This is not to
be confused with everyday uncertainty in moral situations, such as when
you don't know which charity you should donate to because you don't know
which one will best help the poor. The moral theories themselves should
tell you how to deal with such a situation (for example, by looking at the
expected benefits). Instead, these are cases where even if you had all the
empirical evidence, you would still not know what to do. What you still
need to know are the moral facts of the matter. Perhaps the philosophical
community will eventually work out what these are: normative ethics will
be settled once and for all. But what are you to do now, when you need to
act?
I believe that this is a very important question, and one that has been
overlooked in the study of ethics. There is a lot of focus on what each
ethical theory tells you to do, but almost none on what to do when you are
uncertain of which theory is correct (as we all are, or should rationally
be). I have been working on this topic for a while with Nick Bostrom, and
we have reached a few conclusions. For example, we have shown that the following
two principles of moral uncertainty are both false:
1) Always do what the theory in which you hold the most credence tells you
to do.
2) Always do the act that you most believe is right.
For (1), consider a case in which you have a 2% degree of belief in a theory
which tells you to do A and a 1% degree of belief in 98 other theories which
each tell you to do B (and which say that this is very important). It is
not plausible that the appropriate act must be A.
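The arithmetic behind this counterexample can be made explicit with a small sketch. The theory names and credence values below are just the hypothetical numbers from the example, tallying how much total credence stands behind each act:

```python
# Hypothetical setup from the counterexample to principle (1):
# one theory with 2% credence recommends A; 98 theories with
# 1% credence each recommend B.
credences = {"T0": 0.02}
credences.update({f"T{i}": 0.01 for i in range(1, 99)})
recommends = {"T0": "A"}
recommends.update({f"T{i}": "B" for i in range(1, 99)})

# Total credence behind each recommended act.
support = {}
for theory, p in credences.items():
    act = recommends[theory]
    support[act] = support.get(act, 0.0) + p

# support["A"] is 0.02, while support["B"] is 0.98:
# the plurality theory recommends A, but 98% of your
# credence lies with theories recommending B.
```

Principle (1) picks A because no single rival theory outweighs the 2% theory, even though it is almost certain (by your own lights) that A is the wrong act.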
This type of example makes (2) look very plausible, but consider a case
in which you have a 51% degree of belief in a theory which tells you to
do A but says the benefit is only marginally greater than that of doing B, and
you have a 49% degree of belief in a theory which tells you to do B and
says that this is critically important. It seems that in such a case you
need to morally hedge your bets by doing B. Thus neither of these principles
seems to be correct.
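One natural way to formalize this hedging intuition is to weight each theory's valuation of the acts by your credence in it and choose the act with the highest expected moral value. This is only a sketch of that idea, not the authors' theory, and the specific value numbers are hypothetical; it also quietly assumes that values can be compared across theories, which is itself a contested assumption:

```python
# Hypothetical credences and choice-worthiness values
# (not from the text) illustrating the hedging case:
# T1 says A is marginally better; T2 says B is vastly better.
credence = {"T1": 0.51, "T2": 0.49}
value = {  # value[theory][act]
    "T1": {"A": 10.0, "B": 9.0},    # marginal difference
    "T2": {"A": 0.0,  "B": 100.0},  # critical difference
}

def expected_value(act):
    """Credence-weighted moral value of an act across theories."""
    return sum(credence[t] * value[t][act] for t in credence)

# expected_value("A") = 0.51 * 10            = 5.1
# expected_value("B") = 0.51 * 9 + 0.49 * 100 = 53.59
best = max(["A", "B"], key=expected_value)
# best is "B": hedging beats following the majority theory,
# and beats doing the act you most believe is right.
```

On this way of scoring things, principle (2) fails for the same reason that ignoring stakes fails in ordinary decision-making under uncertainty: a slightly-less-probable theory that regards the choice as critically important can dominate the calculation.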
We are in the process of writing a paper in which we discuss these and other
results concerning this very interesting issue. We will not present a full
theory of moral uncertainty, but have quite a few surprising results and
promising ideas.