I have a minor philosophical nitpick.

No sane person would say, “Well, the risk of a nuclear meltdown at this reactor is only 1 in 1000 …”
There are (checks Wikipedia) 400ish nuclear reactors in the world, which means that if everyone followed this reasoning, the risk of a meltdown happening somewhere would be pretty high.
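To put a rough number on “pretty high” — a back-of-the-envelope sketch, assuming purely for illustration that each reactor independently carries a 1-in-1000 meltdown risk (not a real safety figure):

```python
# Illustrative only: treats each reactor as an independent 1-in-1000 risk,
# an assumption made for the sake of the analogy, not an actual safety estimate.
p_per_reactor = 1 / 1000   # assumed per-reactor meltdown probability
n_reactors = 400           # roughly the number of reactors worldwide

p_no_meltdowns = (1 - p_per_reactor) ** n_reactors
p_at_least_one = 1 - p_no_meltdowns

print(f"P(at least one meltdown) ≈ {p_at_least_one:.0%}")  # ≈ 33%
```

So while each individual reactor looks safe under that reasoning, collectively it licenses roughly a one-in-three chance of a meltdown somewhere, which is why nobody sane accepts it.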
Existential risks with low probabilities don’t add up in the same way: there is only one world, so a small probability can’t be multiplied across hundreds of independent instances. It’s my belief that the magnitude of a risk equals the badness times the probability (which for x-risk comes out to very, very bad), but not everyone will agree with me, and I’m not sure the nuclear reactor example would convince them.