But I don’t see a case for climate change risk specifically approaching anywhere near those levels, especially on timescales less than 100 years or so.
I think the thing with climate change is that, unlike those other things, it’s not just a vague possibility, it’s a certainty. The uncertainty lies in the precise magnitude of the risk. At the higher end of warming it gets damn dangerous (not to mention it can be the trigger for other crises: imagine India suffering killer heatwaves that add friction with Pakistan, both nuclear powers). So the baseline is merely “a lot of dead people, a lot of lost wealth, a lot of things to somehow fix or repair”, and the tail outcomes are potentially much, much worse. They’re considered unlikely, but of course we may have overlooked a feedback loop or tipping point. I honestly don’t feel confident that climate change isn’t a big risk to our civilization when it’s likely to stress multiple infrastructures at once: food supply, combined with the need to change our energy usage, combined with the need to provide more AC and refrigeration as a matter of survival in some regions, combined with sea level rise eating into valuable land and cities.
I’m often tempted to have views like this. But as my friend roughly puts it, “once you apply the standard of ‘good person’ to the people you interact with, you soon find yourself without any allies, friends, employers, or idols.”
I’m not saying “these people are evil and irredeemable, ignore them”. But I am saying they’re being fundamentally irrational about it: “you can’t reason a person out of a position they didn’t reason themselves into”. In other words, I don’t think it’s worth leaving climate change unmentioned merely for the sake of not alienating them, when the result is alienating many more people on other sides of the spectrum. Besides, those among them who think like you might also go “oh well, these guys are wrong about climate change, but I can’t hold it against them since they had to put together a compromise statement”.

I think many minimizing attitudes towards AI risk are also irrational as of now, but it’s a much newer and more speculative topic, with less evidence behind it. People might still be in the “figuring things out” stage there, while on climate change opinions are very much fossilized, and in some cases determined by things other than a rational evaluation of the evidence. Basically, I think in this specific circumstance there is no way to be neutral: either mentioning or not mentioning climate change gets read as a signal. You can only pick which side of the issue to stand on, and if you think you have a better shot with people who ground their thinking in evidence, the side that accepts climate change is real has more of those.