Risks of human extinction


When people think of the threat of human extinction, they typically think in terms of dramatic science fiction stories. Responses to the topic are often light-hearted or glib, and very few people take the threat seriously. However, there are a number of sensible questions that can be asked concerning human extinction. For example:

    - How likely is it that humanity will go extinct before 2100?
    - What are the most likely causes of human extinction?
    - What can we do to mitigate these risks?
    - How much should we be prepared to pay in order to mitigate the risks?

These lead on to a group of distinctly ethical questions:

    - How bad is extinction?
    - Is it worse than the sum of all the deaths involved?
    - Should we also count the loss of all future generations?

I am very interested in the ethics of avoiding human extinction: both the theoretical questions of how to appraise the cost of human extinction and the practical questions of what we should actually be doing to mitigate risks.


Probing the Improbable: methodological challenges for risks with low probabilities and high stakes
  I wrote this paper with Anders Sandberg and Rafaela Hillerbrand. It addresses a problem in assessing risks that have very low probabilities and very high stakes. The problem is that when an expert provides a calculation of the probability of a disaster, they are really providing the probability of the disaster occurring given that their argument is watertight. However, their argument may of course be flawed. If the probability estimate given by an argument is dwarfed by the chance that the argument itself is flawed, then the estimate is suspect. We develop this idea formally and provide many examples. We then apply our reasoning to the risks associated with the Large Hadron Collider and show that the risk is much higher than it initially appears.
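
  To make the core point concrete, here is a minimal sketch of the decomposition; the numbers are purely illustrative assumptions for this page, not figures from the paper:

    # Decomposing P(disaster) by whether the expert's argument is sound.
    # All numbers below are illustrative assumptions, not figures from the paper.

    p_flawed = 1e-3        # chance the expert's argument contains a fatal flaw
    p_given_sound = 1e-9   # expert's estimate: P(disaster | argument sound)
    p_given_flawed = 1e-4  # P(disaster | argument flawed): the tiny estimate
                           # no longer applies, so this is far larger

    # Law of total probability over "argument sound" vs "argument flawed":
    p_disaster = (1 - p_flawed) * p_given_sound + p_flawed * p_given_flawed

    print(f"Headline estimate:       {p_given_sound:.1e}")
    print(f"Estimate allowing flaws: {p_disaster:.1e}")
    # The flawed-argument term (1e-3 * 1e-4 = 1e-7) dwarfs the headline
    # figure of 1e-9, so the overall risk is roughly 100 times the
    # expert's stated estimate.

  With these assumed numbers, the overall probability is dominated by the chance that the argument is flawed, which is the sense in which the headline estimate is suspect.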