I think being open to talking with as many people as you can about AGI x-risk is especially important now. This is a chance for it to become a mainstream political issue. Try to steer conversations about AI-induced job losses toward “the big one, the AI Apocalypse; that’s the endgame that we need to prevent”.
I’ll note that I’ve been using the term “AI Apocalypse” to refer to AGI x-risk in a non-jargony way for a while now, when talking to friends and family outside of the EA/LW/x-risk community.