Results are sensitive to the distribution type, but focusing on the far right tail is most relevant for extinction risk.
I guess reasonable distribution types will lead to astronomically low extinction risk as long as one focuses on the rightmost points of the tail of the distribution.
Extraordinary evidence would be needed to justify a meaningfully higher risk estimate.
To clarify:
Extraordinary evidence would be required to move up sufficiently many orders of magnitude for an AI, bio, or nuclear conflict to have a decent chance of causing human extinction. I think underweighting the outside view is a major reason for overly high risk estimates.
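As a rough illustration of the kind of tail extrapolation at stake, here is a minimal sketch comparing the probability that a single conflict's death toll reaches the world population under two common right-tail assumptions. The parameters (tail index, lognormal median and sigma, minimum death toll) are hypothetical, chosen only for illustration, and are not fitted to any actual data.

```python
"""Illustrative sketch only: hypothetical tail fits, not actual fitted parameters."""
import numpy as np
from scipy import stats

WORLD_POPULATION = 8.2e9   # assumed current world population
X_MIN = 1e4                # only conflicts with at least 10 k deaths are modelled

# Hypothetical power-law (Pareto) tail with illustrative index alpha.
alpha = 1.5
p_power_law = (X_MIN / WORLD_POPULATION) ** alpha

# Hypothetical lognormal tail (illustrative median of 1e5 deaths, sigma = 1.5 in log space).
p_lognormal = stats.lognorm.sf(WORLD_POPULATION, s=1.5, scale=1e5)

print(f"P(deaths >= world population | power law): {p_power_law:.1e}")  # ~1e-9
print(f"P(deaths >= world population | lognormal): {p_lognormal:.1e}")  # ~2e-14
```

Even with these made-up numbers, the far-right-tail probability is many orders of magnitude apart across distribution types, yet tiny in both cases, which is the sense in which the results are sensitive to the distribution type while still implying very low extinction risk.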
Thanks, SummaryBot!