Ideas EAIF is excited to receive applications for

The EA Infrastructure Fund isn’t currently funding-constrained. Hooray! This means that if you submit a strong application that fits within our “principles-first” effective altruism scope soon, we’d be excited to fund it, and won’t be constrained by a lack of money. We’re open to considering a range of grant sizes, including grants over $500,000 and below $10,000.[1]

In part, we’re writing this post because we spoke to a few people with projects we’d be interested in funding who didn’t know that they could apply to EAIF. If you’re unsure, feel free to ask me questions or just apply!

The rest of this post gives you some tips and ideas for how you could apply, including ideas we’re excited to receive applications for. I (Jamie) wrote this post relatively quickly; EAIF staff might make more such posts if people find them helpful.

🔍 What’s in scope?

  • Research that aids prioritisation across different cause areas.

  • Projects that build communities focused on impartial, scope-sensitive and ambitious altruism.

  • Infrastructure, especially epistemic infrastructure, to support these aims.

  • (More detail in this post and on our website, though the site needs a bit of a revamp. And please err on the side of applying. Your project doesn’t need to be fully ‘principles first’; that’s our strategy, not a requirement for applicants.)

💪 What makes an application strong?

  • A great idea — promising theory of change and expected cost-effectiveness.[2]

  • Evidence suggesting you are likely to execute well on the idea.

  • (I’m simplifying a bit of course. See also Michael Aird’s tips here.)

The second part is straightforward enough; if your project has been ongoing for a while, we’d like to understand the results you’ve achieved so far. If it’s brand new, or you’re pivoting a lot, we’re interested in evidence about your broader achievements and skills that would set you up well to do a good job.

You might already have a great idea. If so, nice one! Please ignore the rest of this post and crack on with an application. If not, I’ll now highlight a few specific topics that we’re especially interested in receiving applications for at the moment.[3]

💡 Consider applying for projects in these areas

Epistemics and integrity

What’s the problem?

  • EA is vulnerable to groupthink, echo chambers, and excessive deference to authority.

  • A bunch of big EA mistakes and failures were perhaps (partly) due to these things.

  • A lot of external criticism of EA stems from these issues.

What could be done?

  • Training programmes and fellowships that help individual participants develop good epistemic habits or integrity directly (e.g. Scout Mindset, Fermi estimates, developing virtues), indirectly (e.g. helping them form their own views on cause prioritisation), or as part of a broader package.

  • Training, tools, or platforms for forecasting and prediction markets.

  • Researching and creating tools that aid structured and informed decision-making.

  • Developing filtering and vetting mechanisms to weed out applicants with low integrity or poor epistemics.

  • New structures or incentives at the community level: integrating external feedback, incentivising red-teaming, or creating better discussion platforms.

What have we funded recently?

  • Elizabeth Van Nostrand and Timothy Telleen-Lawton recorded a discussion about why Elizabeth left EA and why Timothy is instead seeking a ‘renaissance’ of EA. They’re turning this into a broader podcast.

  • EA Netherlands is working with Shoshannah Tekofsky to develop 5-10 unique rationality workshops to be presented to 100-240 Dutch EAs over a 12-month period, aiming to improve their epistemic skills and decision-making processes.

  • André Ferretti launched the “Retrocaster” tool on Clearer Thinking, to enhance users’ forecasting skills. By obscuring data from sources like Our World in Data, Retrocaster invites users to forecast hidden trends.

Harri Besceli, another EAIF Fund Manager, wrote more thoughts on EA epistemics projects here. For-profit projects are beyond EAIF’s scope, but if you have a for-profit idea in this space, feel free to contact me.[4]

EA brand and reputation

What’s the problem?

  • Since FTX, the public perception of EA has become significantly worse.

  • This makes it harder to grow and do community outreach.

  • Organisations and individuals are less willing to associate with EA; this reduces the benefits it provides and further worsens its reputation.

What could be done?

  • Good PR. There’s a whole massive industry out there focused on exactly this, and presumably a bunch of it works. Not all PR work is dishonest.

  • Empirical testing of different messages and frames to see what resonates best with different target audiences.

  • More/better comms and marketing generally for promising organisations.

  • Inwards-focusing interventions that help create a healthier self-identity, culture, and vision, or that systematically boost morale (beyond one-off celebratory posts).

  • Support for high-quality journalism on relevant topics.

What have we funded recently?

  • Yi-Yang Chua is exploring eight community health projects. Some relate to navigating EA identity; others might have knock-on effects for EA’s reputation by mitigating harms and avoiding scandals.

  • Honestly not much. Please send us requests!

I’ve focused here on addressing the symptoms of a poor brand and reputation, but ideally we’d also fix any underlying issues that cause harm and, in turn, damage EA’s reputation. Proposals targeting those root causes (e.g. on epistemics and integrity) are of course welcome too.

Funding diversification

What’s the problem?

  • Many promising projects are bottlenecked by funding, from AI safety to animal welfare.

  • Projects are often dependent on funding from Open Philanthropy, which makes their situation unstable and incentivises deference to OP’s views.

  • There’s less funding in EA than there used to be (especially after the FTX crash) and less than there could be (especially given the historical reliance on OP and FTX).

What could be done?

  • Projects focused on broadly raising funding from outside the EA community.

  • More targeted fundraising, like projects focusing specifically on high-net-worth donors, local donors in priority areas (e.g. India), or specific professions and interest groups (e.g. software engineers, alt protein startup founders, AI lab staff).

  • Regranting projects.

  • Projects focused on democratising decision-making within the EA community.

  • Philanthropic advising, grantmaking, or talent pipelines to help address bottlenecks here.

What have we funded recently?

  • Giv Effektivt hired its first FTE staff member to reach high-net-worth individuals and improve operations, media outreach, and SEO.

  • EA Poland grew and promoted a platform for cost-effective donations to address global poverty, factory farming, and climate change.

  • But we’ve mostly only received applications for broad, national effective giving initiatives; and there are so many more opportunities in this space!

Areas deprioritised by Good Ventures

Good Ventures announced that it would stop supporting certain sub-causes via Open Philanthropy. We expect that programmes focused on rationality or on supporting under-18s (aka ‘high school outreach’) are the affected areas most obviously within EAIF’s scope; you can check this post for other possibilities.

We expect that Good Ventures’ withdrawal here leaves at least some promising projects underfunded, and we’d be excited to help fill (some of) the gap.

✨ This is by no means an exhaustive list!

There are lots of problems in effective altruism, and lots of bottlenecks faced by projects making use of EA principles; if you’ve noticed an issue, let us know how you can help fix it by submitting an application.

For instance, if you’ve been kicking around for a few years — you’ve built up some solid career capital in top orgs, and have a rich understanding of the EA community, warts and all — then there’s a good chance we’d be excited to fund you to make progress on tackling an issue you’ve identified.[5]

And of course, other people have already done some thinking and suggested some ideas. Here are a few longlists of potential projects, if you want to scour for options[6]:

❓ Ask me almost anything

I’m happy to do an informal ‘ask me anything’ here — if anything is unclear or holding you back, I encourage you to ask away in the comments section, and I expect to be able to respond to most/all questions. You can also email me (jamie@effectivealtruismfunds.org) or use my anonymous advice form, but posting your comment here is a public good if you’re up for it, since others might have the same question.

But if you already know everything you need to know…

🚀 Apply

See also: “Don’t think, just apply! (usually)”. By the way, EAIF’s turnaround times are much better than they used to be: typically six weeks or less.

The application form is here. Thanks!

  1. ^

    We don’t have a hard upper bound at the moment. Historically, most of our grants have been between about $10,000 and $200,000. We’d be a bit hesitant evaluating something much higher than $500,000, but we’re open to it. If it were over $1m, we’d likely encourage you to apply elsewhere, e.g. to Open Philanthropy.

  2. ^

    Scalability and high upside value can make an application more promising but are not requirements.

  3. ^

    The first three of these were inspired by calls that Harri Besceli (another EAIF Fund Manager) carried out with people who work in EA community building or who have been engaged in the EA community for a long time. But the bullet points here are my own take; this isn’t a writeup of those findings. I’m not trying to ‘make the case’ for any of these areas’ importance here; it’s fine if you disagree. I’m just flagging that we’d be excited to see applications in these areas.

  4. ^

    I also work at Polaris Ventures, which makes investments and might be interested.

  5. ^

    Of course, this isn’t guaranteed; it still needs to be an in-scope, strong application. And we sometimes receive strong applications from people who are newer to effective altruism, too.

  6. ^

    Caveats:

    • With the exception of my list and CE’s, these lists all contain some ideas that wouldn’t be in scope for EAIF.

    • Many of these lists were put together quickly or had a low bar for inclusion. Some are mostly outdated, some may focus on worldviews you disagree with, etc. You shouldn’t treat an idea being mentioned on one of these lists as a strong vote of confidence from anyone that it’s actually a good use of time. These are usually just ideas.

    • Even if it is a great idea, you still need relevant skills and a track record to be able to put in a 💪 strong application.