By the way, while I still stand by the core claims in the post, COVID-19 has made me update toward being more worried about tail risks from extreme climate change (though I continue to think it’s very overrated in the general public). A lot of the reason I didn’t (and mostly still don’t) think climate is a large direct existential or global catastrophic risk is that when I dive into the numbers and direct statements from climate scientists, the closer you get to the actual modeling, the more sanguine people are.
However, I’ve since gotten more pessimistic, both about how good people in general are at reasoning about extreme probabilities and about which biases they have. Specifically, I’d naively have guessed that climate scientists are biased toward believing their problem is more important, but now I think there are larger effects from respectability, plus selection effects where less alarmist and more certain papers are likelier to get funding.
To a lesser extent, I’ve updated moderately towards civilizational inadequacy being common even in the face of large, slow-moving problems. I don’t want to update too strongly, because a) I risk over-indexing on one example, and b) as much as I want to guard against biases in this direction, I suspect there’s a high chance I would counterfactually be more cavalier about government adequacy if I lived in China, Mongolia, or New Zealand.
I don’t want these verbal arguments to sway people too strongly; see my comments on Metaculus for more probabilistic reasoning.