AI Safety Microgrant Round
We are pleased to announce an AI Safety Microgrant Round, which will provide microgrants to field-building projects and other initiatives that can be accomplished with a small amount of funding.
We believe there are projects and individuals in the AI safety space who lack funding but have high agency and potential. We think individual donors helping to fund projects will be particularly important given recent changes in the availability of funding. For this reason, we decided to experiment with a small microgrant round as a test and demonstration of this concept.
To keep the evaluation simple, we’re focusing on field-building projects rather than technical projects, which we expect would be much more difficult to evaluate.
We are offering microgrants of up to $2,000 USD, with the total size of this round being $6,000 USD (we know this is a tiny round, but we are running it as a proof of concept). One possible way this could pan out would be two grants of $2,000 and two of $1,000, although we aren’t wedded to that split and are happy to receive requests for less. We want to fund grant requests of this size, where EA Funds possibly has a bit too much overhead.
The process is as follows:
Fill out this form at microgrant.ai (<15 min).
Shortlisted applicants receive a follow-up call within two weeks of applications closing.
We may also send an email with follow-up questions if we need more information. We would expect a reply within a few days so that we could confirm the grant within a week.
To inspire applicants, here are some examples of projects where a microgrant would have been helpful:
Chris recently received a grant through an Australia-based organisation to hire two facilitators at $1,000 each to run a local version of the AGI Safety Fundamentals course. Intro fellowships have proven to be a great way of engaging people, and we would be excited about funding something like this if it would help kickstart a new local AI safety group rather than just being an isolated project.
We might have funded something like the AI safety nudge competition to help people overcome their procrastination, and are excited about the potential to motivate a large number of people to accelerate their AI safety journeys and about experimenting with a potentially scalable intervention.
The Sydney AI Safety Fellowship was originally going to be funded by three people each contributing $2,000, which would have been enough for a coworking space, weekly lunches, and some socials for a few participants. We would be excited about funding a similar project if it were likely to attract good candidates, especially in communities and countries where nothing like this currently exists.
We are open to smaller grant applications, but in a lot of cases, small grants don’t make much of a counterfactual difference. Here are two examples of the kinds of smaller grants we would be excited about:
One of us thinks there should be a logo for AI safety, just as EA has the light bulb. This could be considered for a microgrant.
One of us recently granted $200 to half-subsidise ten maths lessons. This allowed the grantee to build up the evidence needed to subsequently secure a grant to half-subsidise their lessons. We are excited about grants that assist someone in testing their fit. One of us also granted $100 to someone in a low- or middle-income country to buy a device for self-study so they could upskill themselves.
In contrast, here are the kinds of grants we wouldn’t be very excited about:
Any project with significant downside risk
$2,000 towards a project that requires $8,000, the rest of which hasn’t been obtained yet
Funding to add one more fellow to a program that already has ten fellows. These kinds of grants could be impactful, but allowing them would likely greatly increase the number of applications we would have to evaluate. Either at least 25% of the funding should come from the microgrant round or the grant should make a qualitative difference.
Next Steps
Fill out the form at microgrant.ai by December 1, 2022. If we’re interested in your project or idea, we will get back to you within a few days after applications close. We are aiming to schedule calls with shortlisted applicants within two weeks of applications closing. Best of luck! We also encourage you to consider whether you should be submitting a grant application to EA Funds. You can email hello@microgrant.ai if you have questions.
Fine Print
This grant round is being funded by Chris Leong, Damola Morenikeji and David Kristoffersson. Thanks to A_Donor, Yanni Kyriacos, Evan Gaensbauer and Brendon Wong for their advice and assistance in this project.
We may be unable to provide grants to some applicants if there are sanctions applicable to their country.
Personally, I want to recommend that anyone working on technical AI safety without a computer newer than 2018 apply for a grant for a new one. (This is slightly less critical for grad students with access to university compute, but if it’s affecting your work at all, you should still do this.)
Hey, is this still a thing? I was just thinking there should be microgrants for AI x-risk researchers.
The round has finished. It was a one-time thing. Now that Nonlinear and Lightcone have run grant rounds, I don’t see a need to organise another round.