Dylan Matthews: You talk about existential risks in your latest book — big threats that have a chance of wiping out all of humanity. Which of those, if you had to pick one or two, concerns you the most? Is there one where the story of how a disaster would unfold is particularly compelling?
Peter Singer: It’s not just that the disaster story is more compelling, but that there is a reasonably compelling story as to how we can reduce that risk. When it comes to collision with an asteroid, there is a reasonable story about how we could reduce that risk. First we need to discover whether asteroids are on a collision path with Earth, and NASA is already doing that, and then we would need to think about how we could deflect them. So that, I can kind of understand.
Some of the others, it’s hard to know exactly what we could do. Bioterrorism, I guess we can develop ways of making things more secure and making it harder for bioterrorists. But it’s not going to be easy to find exactly what the best strategy is. Things like the singularity — the takeover by artificial intelligence, or something like that — it’s very hard to see exactly, at this stage, anyway, what you could do that would reduce that risk. I don’t know.