AGI Safety Needs People With All Skillsets!
(Cross-posted to LessWrong)
For quite a while, I had two major misconceptions about careers and volunteering in AGI safety. And I know others have them, too:
Only people with a background in computer science or math can help with AGI safety.
Only people smarter than I am can help with AGI safety.
Both are false, and here’s why.
If we want to win the race against AI capabilities research, we indeed need as many geniuses with a background in computer science or adjacent fields as we can get. But these people would have a hard time doing their work without others who set up the organizations they work at, who build the funding ecosystem, who run the retreats where they meet like-minded people, who design their websites, who wipe their office floors. In addition, they would benefit from people who do their taxes, advise them on visa issues, help them with health problems, offer them productivity coaching, and so on. Over the last few years, AGI safety has grown into a large ecosystem of individuals and organizations, and this ecosystem depends on far more than just those who do the research.
Here is a list of skill bottlenecks I’ve found in conversations with other AI safety field builders:
Discussion group facilitators (e.g. for AGISF)
Event organizers (e.g. for AI Safety Camp, conferences)
Lawyers (e.g. for helping to set up organizations and for advising on the tax implications of novel systems we’re designing, like Impact Markets)
Founders (An alignment project incubator is in the making. Until it is ready to go public, you might want to fill out 80k’s census of everyone who could ever see themselves doing longtermist work)
Communicators (written and verbal)
Everything HR-related
Mobility/Visa support
Hiring
Salary-related things: calculating cost of living, taxes, health insurance, …
Bookkeeping: bookkeepers, accountants, accounts payable and receivable, auditors
Software engineers (to build projects like Stampy, Idea Marketplace, Impact Markets, etc.)
Particularly valuable are people who can get a broad overview of the current AGI alignment ecosystem and kickstart projects which can absorb people in a scalable manner.
…and you, too, can find your niche, even if you don’t bring any of these particular skill sets. For inspiration, here is a list of less standard career paths that have turned out to be valuable for the ecosystem:
YouTubers (Rob Miles)
Architects (Tereza Flidrova)
Cooking and cleaning staff (e.g. at CEEALAR)
Graphic designers (for logos and websites)
…
If you want to help and are not sure how: Feel free to use the comments below this post for career discussions. And make sure to reach out to 80,000 Hours and AI Safety Support for free career coaching!
I would add, “people with a technical background who also have strong writing skills.” Maybe this is subsumed by communicators but I wanted to flag it specifically.
A lot of the best researchers either don’t like to write, are slow at writing well, or simply aren’t very good at it. But there is much that needs to be written, which is why I’ve recently found writing to be one of my comparative advantages.
You do need to be somewhat technical to understand the content you’re writing about, but you don’t have to be a top-of-the-line researcher.
Related question: I have strong writing skills, but no technical background. If I wanted to learn more technical stuff so I could help technical researchers write things, what should I learn? Where should I start? (bearing in mind that I’d be starting at the 101 level for most things).
The technical track of the AGI Safety Fundamentals course is probably the best entry route; from there, you can get to know more people in the field and discover your niche.
The readings are linked on the homepage, and each week’s core readings take only 2-3 hours. With a reading group, or tutoring from, say, a bachelor’s-level computer science student, they should be totally doable without waiting for the official cohort to start.
And of course, the encouragement to reach out to 80k and AISS goes for you, too. :)
In addition, there’s a wonderful way to learn the necessary technical bits while contributing as a writer from the start: you could join the team of editors for Stampy, an interactive AGI safety FAQ in the making. It would benefit greatly from more people adding questions, as well as from people of all experience levels writing up answers to them.
An opportunity to practice your writing skills: a practicum on distilling AGI safety writing is starting soon!
Strongly agree! I actually drafted this post during a conversation between two alignment field builders, when one of them said, “Somebody should write a forum post about this!”
Communication in particular would seem to be key to successful efforts in this area, given that both premises and conclusions tend to be counterintuitive to the uninitiated.
Ineffective communication on this topic is not merely neutral; it is actively harmful. Whether it takes the form of:
Shades of fanaticism
Overconfidence & dismissiveness of alternatives
Unclear writing with heavy use of jargon
The lack of simple, widely appealing stories that explain why the problem is serious and how one might help fix it
Such communication issues:
Put a hard ceiling on the funding this cause might receive
Summon an entire community of detractors that might not otherwise have existed (with implications for EA as a whole)
I’d love to see more communicators and storytellers get a crack at this. In particular, having multiple “points of entry” to the issue that don’t require a PhD to understand, and that people can easily be linked to, could be of immense value.