Thanks for this post!
I think it’s really important to examine the underlying assumptions of any long-term EA project, and the movement may not be doing this enough. We tend to take it as given that the social and political climate we’re currently operating in will stay the same. In reality, it could change significantly, whether through climate change (in one direction) or economic growth (in the other).
Thanks a ton for your comment! I’m planning to write a follow-up EA Forum post on cascading and interlinking effects, and I agree with you: EA frameworks often account only for first-order impacts while assuming linearity between cause areas.