Can you say more about why you decided to categorize climate change as neartermist?
The short answer is that we did not categorize climate change as neartermist.
As we explain in this post, we calculated “the average of cause prioritization ratings for biosecurity, nuclear security, AI risk, existential risk, and ‘other longtermism’ to indicate prioritization of ‘Longtermist’ cause areas, and the average of global poverty / global health, mental health, and ‘other neartermism’ to indicate prioritization of ‘Neartermist’ cause areas.”
I imagine the confusion may be due to you looking at our linked post from last year. We note in the text that the previous post also took the approach of “creat[ing] indicators of leanings towards broadly ‘longtermist’ cause areas, relative to more traditional ‘neartermist’/global health and wellbeing cause areas.” But the specific list of causes we used is the one described in this post.
As to why Climate change is categorised among the neartermist causes: as we explain in the linked post, this was based on the results of an exploratory factor analysis, where Climate change loaded on the same factor as the other ‘neartermist’ causes, Global poverty and Mental health. So this is an empirical approach, rather than based on a priori classification.
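To illustrate what this kind of empirical grouping looks like, here is a minimal sketch of an exploratory factor analysis on simulated cause ratings. The data, seed, and two-factor structure are hypothetical assumptions for illustration only, not the survey's actual data or analysis code:

```python
# Hypothetical sketch: how exploratory factor analysis can group cause
# ratings empirically rather than by a priori classification.
# All data below is simulated; it is NOT the survey's actual ratings.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 500

# Assume two latent dispositions drive ratings of six causes.
neartermist = rng.normal(size=n)
longtermist = rng.normal(size=n)
noise = lambda: rng.normal(scale=0.5, size=n)

ratings = np.column_stack([
    neartermist + noise(),  # global poverty
    neartermist + noise(),  # mental health
    neartermist + noise(),  # climate change
    longtermist + noise(),  # AI risk
    longtermist + noise(),  # biosecurity
    longtermist + noise(),  # nuclear security
])

# Fit a two-factor model with varimax rotation for interpretability.
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
fa.fit(ratings)
loadings = fa.components_.T  # shape: (6 causes, 2 factors)

# Causes driven by the same latent variable load heavily on the same
# factor: in this simulation, climate change groups with poverty and
# mental health, mirroring the pattern described above.
```

If a cause's ratings pattern with one cluster of causes, it loads on that cluster's factor, which is the sense in which the classification is data-driven.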
Ironically, we did not include Climate change in the measure of neartermist preferences this year, in part to avoid people complaining on theoretical grounds that Climate change should be a longtermist cause. That said, it still shows a clear positive association with endorsement of neartermist causes and a negative association with the longtermist causes. Perhaps most tellingly, it is also negatively associated with an explicit longtermist statement: “The impact of our actions on the very long-term future is the most important consideration when it comes to doing good.”
Is it something like, the names are not literal (someone who is concerned that AI will kill everyone in the next 15 years seems literally more neartermist than someone who is worried about the effects of climate change over the next century) but instead represent two major clusters of EA cause prioritization with the main factor being existential impact?
Yes, we note in the text:
As we noted in our FTX Community Response Survey report, although we refer to these groups of causes as “longtermist” and “neartermist”, which they are often referred to as in the EA community, we are not committed to the claim that support for these causes is explained by longtermism/neartermism specifically. For example, support for “neartermist” causes might be explained by beliefs about appropriate kinds of evidence rather than beliefs about the value of the future per se.
Thanks! Sorry for commenting without reading more closely!