List [not necessarily final!]
1. ClusterFree
2. Center for Reducing Suffering
3. Arthropoda Foundation
4. Shrimp Welfare Project
5. Effective Altruism Infrastructure Fund
6. Forethought Foundation
7. Wild Animal Initiative
8. Center for Wild Animal Welfare
9. Animal Welfare Fund
10. Aquatic Life Institute
11. Longview Philanthropy’s Emerging Challenges Fund
12. Legal Impact for Chickens
13. The Humane League
14. Rethink Priorities
15. Centre for Enabling EA Learning & Research
16. MATS Research
Methodology
Unlike last year, I used AI for advice, specifically Claude-Opus-4.5 and Gemini-3-Pro-Preview. I didn't adopt either model's suggested ranking wholesale, of course, but both gave me a pretty decent starting point; I'm fairly confident I'd endorse either list as a marginal improvement over the vote distribution.
Both were prompted with a list of my values and takes, along with roughly 200k tokens scraped from the relevant posts list.
More substantially, my final list was probably based on a combination of not-really-all-that-amazing-but-better-than-nothing heuristics: what I feel EA at large is under-funding, what I would be most excited to see, what tentatively aligns with my literally endorsed beliefs about the nature of suffering, and which cause areas make theoretical sense to be near the Pareto frontier.
Take
Honestly, I would (will?) almost certainly adjust my list if I looked into it for just a few additional hours.
At some level, implicitly ranking charities (e.g., by donating to one and not another) is kind of an insane thing for an individual to do. Not in an anti-EA way (you can do way better than vibes or guessing randomly), but in a "there must be better mechanisms and institutions for outsourcing donation advice than GiveWell, ACE, and ad hoc posts/tweets, and this is really hard and high-stakes" way.
What I would love is a lineup of 10-100 very highly engaged and informed people (the list could be generated simply by number of endorsements/requests), each of whom lays out their strategy and values in a couple of pages, and then I just defer to them. Does this exist?
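For what it's worth, the deferral scheme above could be mechanized naively: given ranked lists from several trusted people, a simple Borda-style count yields a combined ranking. A minimal sketch, assuming equal weight per advisor (the charity names and rankings here are hypothetical placeholders, not from this post):

```python
from collections import defaultdict

def borda_aggregate(rankings):
    """Combine several ranked lists into one via a Borda count.

    Each ranking is an ordered list of charity names. A charity at
    position i in a list of length n earns (n - i) points from that
    list; charities absent from a list earn nothing from it.
    """
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for position, charity in enumerate(ranking):
            scores[charity] += n - position
    # Sort by descending score; break ties alphabetically for determinism.
    return sorted(scores, key=lambda c: (-scores[c], c))

# Hypothetical example: three advisors' partial rankings.
rankings = [
    ["A", "B", "C"],
    ["B", "A", "D"],
    ["A", "C", "B"],
]
print(borda_aggregate(rankings))  # -> ['A', 'B', 'C', 'D']
```

A real version would want advisor weights and some handling of charities an advisor deliberately declined to rank versus simply didn't consider, but the core aggregation is this simple.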