On the funding-talent balance:
When EA was starting, there was a small amount of talent, and a smaller amount of funding. As one might expect, things went slowly for the first few years.
Then, once Open Philanthropy (OP) decided to focus on X-risks, there was ~$8B in potential funding but still fairly little talent/capacity. I think the conventional wisdom then was that we were unlikely to be bottlenecked by money anytime soon, and lots of people were encouraged to do direct work.
Then the FTX Future Fund came in, and the imbalance grew further: roughly twice the funding, and more ambitious projects, but clear capacity constraints on both the funder and organization sides.
Then (1) FTX crashed, and (2) lots of smart people came into the system. Project capacity grew, AI advances freaked out a lot of people, and successful community projects helped train a lot of smart young people to work on X-risks.
But funding has not kept up. OP has been slow to hire for many x-risk roles (AI safety, movement building, outreach / fundraising). Other large funders have been slow to join in.
So now there’s a funding crunch. There are a bunch of smart-seeming AI people who I bet could have gotten funding during the FTX Future Fund (FFF) era, and likely even before then from OP, but are under the bar now.
I imagine this situation will eventually improve, but of course it would be incredibly nice if that happened sooner. EA leadership does seem to eventually fix things, but often more slowly than is ideal, with a lot of opportunity lost in the meantime.
Opportunistic people can fill in the gaps. Looking back, I think more money and leadership in the early days would have gone far. Then, more organizational/development capacity during the FFF era. Now, more funding seems unusually valuable.
If you’ve been thinking about donating to the longtermist space, specifically around AI safety, I think it’s likely that funding this year will be more useful than funding in the next 1-3 years. (Of course, I’d recommend using strong advisors or giving to funds, instead of just choosing directly, unless you can spend a fair bit of time analyzing things).
If you’re considering entering the field as a nonprofit employee, proceed with some caution. I still think the space can use great talent, but note that this is an unusually competitive time to land many paid roles or to get nonprofit grants.
Any thoughts on where e.g. 50K could be well spent?
(For longtermism)
If you have limited time to investigate or stay involved, I’d probably recommend either the LTFF or choosing a regranter you like at Manifund.
If you have a fair bit more time, and ideally the expectation of more money in the future, then I think a lot of small-to-medium (1-10 employee) organizations could use some long-term, high-touch donors. Honestly, this may come down more to fit/relationships than to identifying the absolute best org; as long as an org is funded by one of the groups listed above or by OP, money itself is somewhat fungible between orgs.
I think a lot of nonprofits have surprisingly few independent donors, or even few strong outside people who can provide decent independent takes. I might write more about this later.
(That said, there are definitely ways to be annoying or a hindrance as an active donor, so try to be really humble if you are new to this.)