I think not mentioning climate change was probably the biggest problem with the statement, even though it's technically correct not to consider it a likely extinction risk. The omission may have given the statement a tinge of right-wing political polarization: it feels like it's ignoring the elephant in the room, and that puts people on the defensive. A broader statement could have mentioned "risks to our species or our civilization, such as nuclear war, climate change and pandemics", which would widen the class of risks included. After all, some very extreme climate change scenarios could be an X-risk, just as nuclear war might "only" cause civilizational collapse rather than extinction; the boundary between the categories is blurry. These risks are also correlated and entangled (climate change could cause new pandemics by shifting pathogen habitats; AI put in charge of defense could raise international tensions and trigger a nuclear war; and so on), and acknowledging that is important.
There is unfortunately also some degree of "these things sound like Real Serious Problems, and that thing sounds like a sci-fi movie plot" going on here, and I don't think you can do much about that. The core message shouldn't be compromised to avoid that reaction, though: part of the goal is precisely to make people think "this might not be sci-fi any more, but your reality".
My impression is that there is a strong current on the left pushing the idea that EAs are one and the same with longtermists, and that both are chasing right-wing-coded worries while ignoring the real threat (climate change). Statements that climate change is not an existential threat are often misinterpreted: even if technically true, they come off as dismissal. In practice, to most people, existential risks (everyone dies) and civilizational risks (some people survive, but in a Mad Max post-apocalyptic state) are both so awful that they register as negative infinity, and warrant equal effort to avert.
I'm going to be real: I don't much trust the rationality of anyone who, at this point, believes climate change is straight-up fake, as some do; that position is patently divorced from reality. Meanwhile, plenty of people are souring on AI safety because they see it presented as something being used to steal attention from climate change. I don't know precisely how one could assess the relative impacts of these two trends, but determining that seems urgent.