Sharing a piece of advice I’ve given to a few people about applying for (EA) funding.
I’ve heard various people working on early-stage projects express hesitancy about applying for EA funding because their plan isn’t “complete” enough. They don’t feel confident enough in their proposal, or think what they’re asking for is too small. They seem to assume that EA funders only want to look at proposals with long time horizons from applicants who will work full-time and who are confident their plan will work.
In my experience (I’ve done various bits of grantmaking and regularly talk to EA funders), grantmakers in EA spaces are generally happy to receive applications that don’t have these qualities. It’s okay to apply if you just want to test a project out: maybe you won’t be full-time, maybe you aren’t confident in some part of the theory of change, maybe it’s only a few months of work. You should apply and simply explain your thinking, including all of your uncertainties.
Funders are uncertain too, and often prefer to fund a few-month test rather than commit to a multi-year project with full-time staff, because tests give them useful information about you and the theory of change. Ideally, funders eventually support long-term projects too.
I’m not super confident in this take, but I ran it past a few EA funders and they agreed. Note that I think this probably doesn’t apply outside of EA; I understand many grant applications require detailed plans.
Yeah, I wish someone had told me this earlier; it would have led me to apply a lot earlier instead of “saving my chance.” There are a couple of layers to this thought process, in my opinion:
Talented people often feel like they aren’t the ideal candidates or don’t have the right qualifications.
The kind of people EA attracts generally have a track record of checking every box, so they carry this “trait” over into the EA space.
In general, from what I can glean, there’s a lot of uncertainty in fields like AI governance, even among experts.
Cultures, particularly in the Global South, punish people for being uncertain, let alone for quantifying uncertainty.