Yes, you are correct and thank you for forcing me to further clarify my position (in what follows I leave out WAW since I know absolutely nothing about it):
EA Funds, which I will assume is representative of EA priorities, has these funds: a) “Global Health and Development”; b) “Animal Welfare”; c) “Long-Term Future”; d) “EA Meta”. Let’s leave D aside for the purposes of this discussion.
There is good reason to believe the importance and tractability of specific climate change interventions can equal or even exceed those of A & B. We have not done enough research to determine if this is the case.
The arguments in favor of C being the only area we should be concerned with, or the area we should be most concerned with:
I) are reminiscent of other arguments in the history of thought that compel us (humans) because we do not account for the limits of our own rationality. I could say a lot more about this another time; suffice it to say here that in the end I cautiously accept these arguments and believe x-risk deserves a lot of our attention.
II) are popular within this community for psychological as well as purely rational reasons. There is nothing wrong with that, and it might even be needed to build a dedicated community.
For these reasons I think we are biased towards C, and should employ measures to correct for this bias.
None of these priorities is neglected by the world, but certain interventions or research opportunities within them are. EA has spent an enormous amount of effort finding opportunities for marginal value add in A, B & C.
Climate change should be researched just as much as A & B. One way of correcting for the bias I see towards C is to divert a certain portion of resources to climate change research despite our strongly held beliefs. I simply cannot accept the conclusion that, unless climate change renders our planet uninhabitable before we colonize Mars, we have better things to worry about. That sounds absurd in light of the fact that certain detrimental effects of climate change are already happening, and even the best-case future scenarios include a lot of suffering. It might still be right, but its absurdity means we need to give it more attention.
What surprises me the most from the discussion of this post (and I realize its readers are a tiny sample of the larger community) is that no one has come back with: “we did the research years ago, we could find no marginal value add. Please read this article for all the details”.