(Rushed comment, but I still thought it was worth posting.)
I’m not sure what “quality-adjusted” dollars means, but in terms of raw dollars, I think net spend on AI safety is more like $200M/year rather than tens of millions.
Very rough estimates for 2022:
From OP’s website, it looks like:
- $15M to a bunch of academics
- $13M to something at MIT
- $10M to Redwood
- $10M to Constellation
- $5M to CAIS
- ~$25M of other grants (e.g. CNAS, SERI MATS)
That adds up to ~$78M.
EA Funds spends maybe $5M/year on AI safety? I’d be very surprised if it was <$1M/year.
FTX gave maybe another $100M in AI-safety-related grants, not including Anthropic (my estimate).
That gives ~$183M.
I also think safety spending at the labs (Anthropic, OpenAI, and DeepMind’s safety team) should be counted here. I’d put this at ~$50M/year, which gives a lower-bound total of ~$233M in 2022, since other people might be spending money as well.
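The rough tally above can be written out as simple arithmetic. All per-item figures below are the rough, unverified estimates from this comment (in $M), not data from any official source:

```python
# Rough 2022 AI-safety spend tally, in $M.
# Every figure here is the commenter's rough estimate, not verified data.
op_grants = {
    "academics": 15,
    "MIT": 13,
    "Redwood": 10,
    "Constellation": 10,
    "CAIS": 5,
    "other (CNAS, SERI MATS, ...)": 25,
}
op_total = sum(op_grants.values())        # itemized OP grants
ea_funds = 5                              # "maybe $5M/year"
ftx = 100                                 # "maybe another $100M", excluding Anthropic
lab_safety = 50                           # Anthropic / OpenAI / DeepMind safety teams

philanthropic = op_total + ea_funds + ftx
total = philanthropic + lab_safety

print(op_total, philanthropic, total)  # prints: 78 183 233
```

Note that the itemized OP grants sum to $78M, which is what pushes the lower-bound total to ~$233M rather than a round $200M.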
I imagine net spend in 2023 will be significantly lower than this, though; 2022 was unusually high, likely due to FTX.
Of course, spending money does not equate to impact; it’s pretty plausible that much of this money was spent very ineffectively.
(+1 to this approach for estimating neglectedness; I think dollars spent is a pretty reasonable place to start, even though quality adjustments might change the picture a lot. I also think it’s reasonable to look at number of people.)
Looks like the estimate in the 80k article is from 2020, though the callout in the biorisk article doesn’t mention it — and yeah, AIS spending has really taken off since then.
I think the OP amount should be higher because I think one should count X% of the spending on longtermist community-building as being AIS spending, for some X. [NB: I work on this team.]
I downloaded the public OP grant database for 2022 and put it here. For 2022, the sum of all grants tagged AIS and LTist community-building is ~$155M. I think a reasonable choice of X is between 50% and 100%, so taking 75% at a whim, that gives ~$115M for 2022.
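As a quick check, the ~$115M figure is just the 75% haircut applied to the ~$155M tagged total. The sketch below mirrors the comment's calculation (which applies X to the whole tagged sum); both numbers are the commenter's, not independently verified:

```python
# 2022 OP grants tagged AIS or LTist community-building, in $M
tagged_total = 155   # commenter's sum from the public OP grant database
x = 0.75             # fraction counted as AIS spending, chosen "at a whim"

ais_spend = x * tagged_total
print(round(ais_spend))  # prints: 116, i.e. roughly the ~$115M quoted
```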
Makes sense, so on the order of $300M total?
Thanks, this is helpful!