I think this is likely different from AIS being “solved,” and it necessarily has to be contextualized within the full breadth of the world’s most pressing problems, including other x-risks and s-risks, and their relative neglectedness.
One thing to keep in mind is that nothing is static. Just as attention and resources towards AIS may ebb and flow in the coming years, so will attention to other highly pressing problems, like other x-risks and s-risks.
But let’s ignore that for now and go through a few sketches that might turn into BOTECs.
80,000 Hours currently estimates that tens of millions of quality-adjusted dollars are spent on AIS per year. Commenters below estimate about $300M/year was spent on AIS in 2022, while roughly $1 billion/year (quality-adjusted) is spent on reducing bio x-risks.

So to first order, if you think bio x-risk is 10x less important ∩ tractable than AIS, then at the point where AIS has roughly $1 billion of quality-adjusted dollars/year spent on it, bio-risk is sufficiently relatively neglected that a moderate comparative advantage should push a generally talented person to work on bio-risks over AIS. Similarly, if AIS is 10x biorisk in importance ∩ tractability, you should consider AIS and bio x-risk equally neglected relative to other factors at the $10B/year mark. To be clear, this is just saying that AIS is no longer “most neglected relative to its importance,” which is a very high bar; even at $10B/year it’d arguably still be extremely neglected in absolute terms.[1]
Likewise, if you think bio x-risk is 100x less important ∩ tractable than AIS, the above numbers should be $10B/year and $100B/year, respectively.
(Some people think the difference is much more than 100x, but I personally don’t find the arguments convincing after having looked into it non-trivially. That said, I don’t have much access to private information, and no original insights.)
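To make the scaling rule explicit: a minimal sketch in Python of the arithmetic above. The logarithmic-returns assumption (marginal value of a dollar ∝ importance ∩ tractability divided by current spending) is my gloss rather than anything stated in the argument; the spending figures and ratios are the ones above.

```python
# Minimal neglectedness BOTEC. Key (hedged) assumption: logarithmic returns,
# i.e. the marginal value of a dollar on a problem is proportional to
# (importance ∩ tractability) / (current spending).

def marginal_value(importance: float, spend: float) -> float:
    # A common diminishing-returns assumption, not a fact.
    return importance / spend

BIO_SPEND = 1e9  # ~$1B/year (quality-adjusted) on bio x-risk, per above

# The two problems are equally neglected relative to their importance when
# marginal values match, i.e. when AIS spending = ratio * BIO_SPEND.
for importance_ratio in (10, 100):  # AIS assumed 10x or 100x bio, per above
    breakeven_ais_spend = importance_ratio * BIO_SPEND
    print(f"At {importance_ratio}x: equal neglectedness at "
          f"~${breakeven_ais_spend / 1e9:.0f}B/year on AIS")
# -> $10B/year at 10x and $100B/year at 100x, matching the paragraphs above.
```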
However, as mentioned at the beginning, this assumes, likely incorrectly, that resources on bio x-risk are relatively static. To the extent this assumption is false, you’d need to dynamically adjust this estimate over time.
I mention bio x-risk because it’s probably the most directly comparable problem that’s important, neglected, and also relatively scalable. If we’re thinking about decisions at the level of the individual rather than, say, the movement or large funders, so there’s no scalability constraint, there are plausibly at least a few other options that are already both extremely important and more neglected than AI safety, such that it makes sense for a nonzero number of people who are unusually suited for such work to pursue them; e.g., here’s a recent Forum argument for digital consciousness.
Note that the world probably spends tens of billions a year on reducing climate risk, and similar or greater amounts on ice cream.
(Rushed comment, but I still thought it was worth posting.)
I’m not sure what “quality-adjusted” dollars means, but in terms of raw dollars, I think net spend on AI safety is more like $200M+ per year instead of tens of millions.
Very rough estimates for 2022:
From OP’s website, it looks like:
$15M to a bunch of academics
$13M to something at MIT
$10M to Redwood
$10M to Constellation
$5M to CAIS
~$25M of other grants (e.g. CNAS, SERI MATS)
That adds up to roughly $80M.
EA Funds spends maybe $5M/year on AI safety? I’d be very surprised if it was <$1M/year.
FTX gave maybe another $100M of AI-safety-related grants in 2022 (I estimate), not including Anthropic.
That gives roughly $185M.
I also think lab safety spending, e.g. Anthropic and the OpenAI and DeepMind safety teams, should be counted here. I’d put this at something like $50M/year, which gives a lower-bound total of roughly $235M in 2022; it’s a lower bound because other people might be spending money too.
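For transparency, here is the same tally as a tiny runnable sketch; every figure is just the rough guess from this comment, not an authoritative number:

```python
# Rough 2022 AI safety spend tally, using the estimates from this comment
# (all figures are guesses in millions of USD, not authoritative numbers).
estimates_musd = {
    "Open Philanthropy AIS grants": 80,  # sum of the itemized grants above
    "EA Funds": 5,
    "FTX (excl. Anthropic)": 100,
    "Lab safety teams (Anthropic, OpenAI, DeepMind)": 50,
}

total = sum(estimates_musd.values())
for source, amount in estimates_musd.items():
    print(f"{source}: ~${amount}M")
print(f"Lower-bound total: ~${total}M")  # -> ~$235M
```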
I imagine that net spend in 2023 will be significantly lower than this, though; 2022 was unusually high, likely due to FTX things.
Of course, spending money does not equate to impact; it’s pretty plausible that much of this money was spent very ineffectively.
(+1 to this approach for estimating neglectedness; I think dollars spent is a pretty reasonable place to start, even though quality adjustments might change the picture a lot. I also think it’s reasonable to look at the number of people.)
Looks like the estimate in the 80k article is from 2020, though the callout in the biorisk article doesn’t mention it — and yeah, AIS spending has really taken off since then.
I think the OP amount should be higher, because one should count X% of the spending on longtermist community-building as AIS spending, for some X. [NB: I work on this team.]
I downloaded the public OP grant database data for 2022 and put it here. For 2022, the sum of all grants tagged AIS and LTist community-building is ~$155m. I think a reasonable choice of X is between 50% and 100%, so taking 75% at a whim, that gives ~$115m for 2022.
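As a quick sensitivity check on the choice of X, a minimal sketch; the ~$155M base figure is from the 2022 OP grants data above, and the ~$115m quoted there is the 75% case, rounded:

```python
# Sensitivity of the OP AIS estimate to X, the fraction of combined
# AIS + longtermist community-building grants counted as AIS spending.
# The ~$155M base figure comes from the 2022 OP grants data above.
BASE_MUSD = 155

for x in (0.50, 0.75, 1.00):
    print(f"X = {x:.0%}: ~${BASE_MUSD * x:.0f}M")
# -> 50%: ~$78M, 75%: ~$116M, 100%: ~$155M
```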
Makes sense, so order $300m total?
Thanks, this is helpful!