For those skimming the comments, it’s worth highlighting the passage that the “mere ripples” in the title refers to:
Referring to events like “Chernobyl, Bhopal, volcano eruptions, earthquakes, draughts [sic], World War I, World War II, epidemics of influenza, smallpox, black plague, and AIDS,” Bostrom writes that
these types of disasters have occurred many times and our cultural attitudes towards risk have been shaped by trial-and-error in managing such hazards. But tragic as such events are to the people immediately affected, in the big picture of things—from the perspective of humankind as a whole—even the worst of these catastrophes are mere ripples on the surface of the great sea of life. They haven’t significantly affected the total amount of human suffering or happiness or determined the long-term fate of our species.
Mere ripples! That’s what World War II—including the forced sterilizations mentioned above, the Holocaust that killed 6 million Jews, and the death of some 40 million civilians—is on the Bostromian view. This may sound extremely callous, but there are far more egregious claims of the sort. For example, Bostrom argues that the tiniest reductions in existential risk are morally equivalent to the lives of billions and billions of actual human beings. To illustrate the idea, consider the following forced-choice scenario:
Bostrom’s altruist: Imagine that you’re sitting in front of two red buttons. If you push the first button, 1 billion living, breathing, actual people will not be electrocuted to death. If you push the second button, you will reduce the probability of an existential catastrophe by a teeny-tiny, barely noticeable, almost negligible amount. Which button should you push?
For Bostrom, the answer is absolutely obvious: you should push the second button! The issue isn’t even close to debatable. As Bostrom wrote in 2013, even if there is “a mere 1 per cent chance” that 10^54 conscious beings living in computer simulations come to exist in the future, then “the expected value of reducing existential risk by a mere one billionth of one billionth of one percentage point is worth a hundred billion times as much as a billion human lives.” So, take a billion human lives, multiply that by 100 billion, and what you get is the moral equivalent of reducing existential risk by a mere “one billionth of one billionth of one percentage point,” on the assumption that there is a 1 per cent chance that we run vast simulations in which 10^54 happy people reside. This means that, on Bostrom’s view, you would be a grotesque moral monster not to push the second button. Sacrifice those people! Think of all the value that would be lost if you don’t!
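For anyone who wants to check the numbers, here is a minimal sketch of that expected-value arithmetic using only the figures quoted above; the variable names are my own, not Bostrom’s:

    # Rough sketch of the arithmetic behind the 2013 claim, using the quoted figures.
    potential_beings = 1e54          # hypothesized future simulated people
    credence = 0.01                  # "a mere 1 per cent chance" the estimate is right
    expected_beings = potential_beings * credence            # 1e52

    # "one billionth of one billionth of one percentage point"
    risk_reduction = 1e-9 * 1e-9 * 0.01                       # 1e-20

    expected_lives_saved = expected_beings * risk_reduction   # 1e32
    benchmark = 1e11 * 1e9  # "a hundred billion times as much as a billion human lives" = 1e20

    print(expected_lives_saved >= benchmark)  # True

On these assumptions the tiny risk reduction is worth about 10^32 expected lives, so the quoted “hundred billion times a billion” figure (10^20) is, if anything, a vast understatement of what the argument implies.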