We need funding pots specifically for the non-technical elements of AI Safety.
Thanks for this post!
I agree there’s lots of room for non-technical work to help improve the impacts of AI. Still, I’m not sure funder interest and experience in non-technical AI work is what’s missing. As some examples of how this interest and experience is already present:
OpenPhil has a program officer who specializes in AI governance and policy, and OpenPhil has given out over $80 million in AI governance grants.
The current fund chair of the Long-Term Future Fund has a background in non-technical AI research.
FTX Future Fund staff have backgrounds in philosophy, economics, and law.
(So why isn’t more AI governance work happening? In case you haven’t seen it yet, you might find discussion of bottlenecks in this post interesting.)
That’s a very good point. I still feel there could be more contests, grants, orgs, etc. in this area, but you’re right that the resources are there and there’s some serious knowledge at those orgs. Perhaps talent, not funding, is the main bottleneck we need to address. The two may be interrelated to an extent.
It’s really frustrating to see so much governance talent at law conferences but very little of it within EA working on longtermist issues. I think it’s a mixture of a lack of outreach in those industries and the fact that EA’s reputation has taken a couple of dings in the media lately. People in those industries are very sensitive to PR risk, understandably.
I’ve been seriously considering writing a longtermist book for the legal and governance sector, just to get the conversation on the table, but it’s not something one can rush into.
Thanks for pointing out that post to me. It’s a great read :) Appreciate it!