Thanks for the post! The climate issue is something I keep coming up against when I try to think seriously about other long-term potential x-risks, which seems particularly concerning given the uncertainty of the models you describe here. I'm not across all the studies as well as other commenters here, but it strikes me that if cause prioritisation were going to be externally influenced, climate would be the big-ticket topic we are most likely to be led astray by.
What strikes me about this issue in EA, more than any other, is how much we rely on our own intuitions and political leanings in deciding what level of risk to ascribe to it. That's understandable, given the high level of mainstream media and political coverage the issue has received. I don't want to be provocative, but on this issue more than others I think EA community members need to be especially cognisant of which models (or authors) we tend to agree with, and think critically about where motivated reasoning and other cognitive biases might be at work.