Executive summary: The EA Survey reveals that Global poverty, AI risk, and Biosecurity are the highest prioritized causes, with longtermist causes being favored especially among highly-engaged EAs, and that key philosophical ideas are associated with cause prioritization.
Key points:
Global poverty, AI risk, and Biosecurity are the top prioritized causes overall, with Climate change polarizing and Mental health lower priority.
Longtermist causes are prioritized by 63.6% of respondents vs. 46.8% for neartermist causes, a gap that widens among highly engaged EAs.
Prioritization of longtermist and existential risk causes has increased over time, while Global poverty has decreased but remains high.
Higher engagement predicts greater support for longtermist over neartermist causes, while higher age predicts the reverse. Other demographic factors have smaller effects.
Philosophical ideas around longtermism, risk neutrality, and digital sentience correlate with longtermist cause prioritization. Belief in ant sentience is substantial.
In an allocation task, respondents assign the most resources to Global health/poverty, then AI risk, then animal welfare, with actual allocations lower on Global poverty and animal welfare than the survey or an earlier survey of EA leaders.
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.
This is not a bad summary overall, but it has some errors/confusions:
Longtermist causes are prioritized by 63.6% of respondents vs. 46.8% for neartermist causes, a gap that widens among highly engaged EAs.
Both parts of this are technically true, but the statistic in the first half of the sentence is different from the one we reported to show the gap between less engaged and highly engaged EAs.
Comparable statistics would be:
Overall: 37.6% of respondents most prioritized only a longtermist cause, and 21.0% most prioritized only a neartermist cause.
Among the highly engaged, 47% most prioritized only a longtermist cause and only 13% most prioritized a neartermist cause.
Among the less engaged, 26% most prioritized only a longtermist cause and 31% most prioritized only a neartermist cause.
In an allocation task, respondents assign the most resources to Global health/poverty, then AI risk, then animal welfare
This is true for farmed animal welfare alone, but a combined animal welfare category would be neck and neck with AI risk (slightly ahead, though not significantly).
with actual allocations lower on Global poverty and animal welfare than the survey or an earlier survey of EA leaders.
Actual allocations to Global Poverty are higher than our survey allocations, and actual allocations to farmed animal welfare (FAW) are lower. I don’t have statistics for actual allocations to wild animal welfare (WAW), but they are likely dramatically lower.