I agree, and though it doesn’t matter from an expected value point of view, I suspect part of what people object to in those risks is not just the probabilities being low but also there being lots of uncertainty around them.
Actually, it could change the expected value calculation too if the uncertainty over the probability is skewed rather than symmetric. For example, one could look at an x-risk and judge most of the probability density to be around 0.001%, feel pretty confident that it's not more than 0.01%, yet be not at all confident that it's not below 0.0001% or even 0.00001%. This makes it different from your examples, which probably have relatively narrow and symmetric distributions, because we have well-grounded base rates for airline accidents and voting and (I believe) robust scientific models of asteroid risk.
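To make the point concrete, here is a toy calculation (the weights below are invented for illustration, not taken from anyone's actual estimates): a credence distribution over the x-risk probability p with its mode around 0.001% (1e-5), a little mass as high as 0.01% (1e-4), and a long spread down to 0.00001% (1e-7).

```python
# Hypothetical credence over the true probability p of the x-risk.
# Weights are made up to match the shape described in the comment.
credence = {
    1e-4: 0.10,  # "pretty confident it's not more than this" -> small mass
    1e-5: 0.60,  # most of the density sits here
    1e-6: 0.15,
    1e-7: 0.15,  # "not at all confident it's not this low"
}

point_estimate = 1e-5  # using the mode alone
mean_p = sum(p * w for p, w in credence.items())  # what EV actually uses

print(f"mode of p = {point_estimate:.2e}")
print(f"mean of p = {mean_p:.2e}")
```

The mean comes out around 1.6e-5, roughly 60% higher than the mode: even though most of the uncertainty extends downward, the thin upper tail dominates the linear-scale average, so the skewed uncertainty genuinely shifts the expected value relative to just plugging in the point estimate.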
Edit: I see that Richard Y Chappell made this point already.