I am also highly uncertain about EAs’ ability to intervene in cultural change, but I do want us to take a hard look at it and discuss it. It may be a cause that is tractable early on, but hopeless if ignored.
You may not think Hsu’s case “actually matters”, but how many turns of the wheel will it take before it is someone whose case does?
Peter Singer has taken enough controversial stances to be “cancelled” from any direction. I want the next Singer(s) to still feel free to try to figure out what really matters, and what we should do.
We needn’t take on reputational risk gratuitously, but if it is possible for EAs to coordinate to stop a Cultural Revolution, that would seem to be a Cause X candidate. Toby Ord describes a great-power war as an existential risk factor, since it would worsen our odds on AI, nuclear war, and climate change all at once. I think losing free expression would also qualify as an existential risk factor.