The pipeline for (x-risk-focused) AI strategy/governance/forecasting careers has never been strong, especially for new researchers. But it feels particularly weak recently (e.g. no summer research programs this year from Rethink Priorities, SERI SRF, or AI Impacts, at least as of now, and as few job openings as ever). (Also no governance course from AGI Safety Fundamentals in a while and no governance-focused programs elsewhere.)[1] We’re presumably missing out on a lot of talent.
I’m not sure what the solution is, or even what the problem is—I think it’s somewhat about funding and somewhat about mentorship and mostly about [orgs not prioritizing boosting early-career folks and not supporting them for various idiosyncratic reasons] + [the community being insufficiently coordinated to realize that it’s dropping the ball and it’s nobody’s job to notice and nobody has great solutions anyway].
If you have information or takes, I’d be excited to learn. If you’ve been looking for early-career support (an educational program, way to test fit, way to gain experience, summer program, first job in AI strategy/governance/forecasting, etc.), I’d be really excited to hear your perspective (feel free to PM).
(In AI alignment, I think SERI MATS has improved the early-career pipeline dramatically—kudos to them. Maybe I should ask them why they haven’t expanded to AI strategy, or whether they have takes on that pipeline. For now, maybe they’re evidence that someone needs to prioritize improving the pipeline for it to happen...)
[1] Added on May 24: the comments naturally focused on these examples, but I wasn’t asserting that summer research programs or courses are the most important bottlenecks—they just were salient to me recently.
To help with the talent pipeline, GovAI currently runs three-month fellowships twice a year. We’ve also started offering one-year Research Scholar positions, and we’re now experimenting with a new policy program. Supporting the AI governance talent pipeline is one of our key priorities as an organization.
That being said, we’re very, very far from filling the community’s needs in this regard. We’re currently getting far more strong applications than we have open slots. (I believe our acceptance rate for the Summer Fellowship is something like 5% and will probably keep getting lower. We now need to reject people who actually seem really promising.) We’d like to scale our programs up more, but even then there will still be an enormous unmet need. I would definitely welcome more programs in this space!
I would also strongly recommend offering a version of the fellowship that aligns with US university schedules, unlike the current Summer Fellowship!
I was very glad to see the Research Scholar pathway open up; it seems exactly right for someone like me (advanced early career, is that a stable segment?).
I’m also glad to hear of the interest, although it’s too bad that the acceptance rate is lower than ideal. Then again, for many folks coming from academic grant-funding ecosystems, 5% is fairly typical, at least for major funding in my fields.
I totally agree there’s a gap here. At BlueDot Impact (/ AGI Safety Fundamentals), we’re currently working on understanding the pipeline for ourselves.
We’ll be launching another governance course in the next week, and in the longer term we’ll publish more information on governance careers on our website as we establish it for ourselves.
In the meantime, there’s great advice on this account, mostly targeted at people in the US, but there might be some transferable lessons:
https://forum.effectivealtruism.org/users/us-policy-careers
May I just add that, as someone who self-studied my way through the public reading list recently, I’d rate many of the resources there very highly.
It’s worth mentioning the Horizon Fellowship and RAND Fellowship.
I also have the impression that there’s a gap here, and I’d be interested in whether funders aren’t prioritizing it enough or whether there’s a lack of (sufficiently strong) proposals.
Another AI governance program, which just started its second round, is Training For Good’s EU Tech Policy Fellowship; I think its reading and discussion group component overlaps significantly with the AGISF program. (Beyond that, it includes policy trainings in Brussels and, for some fellows, a 4–6 month placement at an EU think tank.)
This is a timely post. It feels like funding is a critical obstacle for many organisations.
One idea: Given the recent calls by many tech industry leaders for rapid work on AI governance, is there an opportunity to request direct funding from them for independent work in this area?
To be very specific: Has someone contacted OpenAI and said: “Hey, we read with great interest your recent article about the need for governance of superintelligence. We have some very specific work (list specific items) in that area which we believe can contribute to making this happen. But we’re massively understaffed and underfunded. With $1m from you, we could put 10 researchers to work on these questions for a year. Would you be willing to fund this work?”
What’s in it for them? Two things:
1. If they are sincere (as I believe they are), then they will want this work to happen, and some groups in the EA sphere are probably better placed to make it happen than they themselves are.
2. We can offer independence (any results will be from the EA group, not from OpenAI, and not influenced or edited by OpenAI), but at the same time we can openly credit them with funding this work, which would be good PR and a show of good faith on their part.
Forgive me if this is something that everyone is already doing all the time! I’m still quite new to EA!
Given the (accusations of) conflicts of interest in OpenAI’s calls for regulation of AI, I would be quite averse to relying on OpenAI for funding for AI governance work.