Just popping in to say you might find this post of mine useful: Interested in EA/longtermist research careers? Here are my top recommended resources. Also, this comment I left on it:
“Resources that are only relevant to people interested in AI governance and (to some extent) technical AI safety
You could participate in the AGI Safety Fundamentals course’s Governance track, or—when the course isn’t running—work through all or part of the curriculum independently. This seems like an unusually good way for most people to learn about AI risk and AI governance (from a longtermist or existential-risk-focused perspective).
Description of some organizations relevant to long-term AI governance (non-exhaustive) (2021) collects and gives an overview of some organizations you might be interested in applying to. (This link is from week 7 of the AGI Safety Fundamentals course’s Governance track.)
I think Some AI Governance Research Ideas would be my top recommendation for a public list of AI governance research ideas.
But I’d suggest being discerning with this list, as I also think that some of those ideas are relatively low-priority and that the arguments presented for prioritizing those particular ideas are relatively weak, at least from a longtermist/existential-risk-focused perspective.”
“I’d suggest being discerning with this list”
Definitely agree with this!