Maybe EA can’t affect political polarization nearly as much as the other way around—political polarization can dramatically affect EA.
EA could get “cancelled”, perhaps for its sometimes tough-to-swallow focus on prioritization and tradeoffs, for a single poorly phrased, high-profile comment, for an internal problem at an EA org, or for a perceived lack of overall diversity. Less dramatically, EA could lose influence by affiliating with a political party, or by refusing to. Meanwhile, degrading norms around discourse and epistemics could turn EA discussions into partisan battlegrounds.
This post makes a lot of sense; I agree that EA should consider which direction it would be best for the political climate to grow in. We should also spend plenty of time considering how EA will be affected by increasing polarization, and how we can respond.
EAs in policy and government seem particularly relevant here, as they are likely the most exposed to the effects of the political climate. I’d love to hear how EAs in those roles have responded, or might respond, to these kinds of dynamics.
One question: Should EA try to be “strategically neutral” or explicitly nonpartisan? Many organizations do, from think tanks to newspapers to nonprofits. What lessons can we learn from other nonpartisan groups and movements? What are their policies, how do they implement them, and what effects do they have?
Thanks for this post, very interesting!
What does “cancelling” mean, concretely? I don’t imagine the websites will be closed down. What will we lose?
Off the top of my head: the ability to host conferences without angry protesters out front, the chance to be mentioned favorably by a major mainstream news outlet, and the willingness of high-profile people to associate with EA. Look up what EA intellectuals argued in the recent past about why it would be unwise for EA to make too much noise outside the Overton window. That reasoning is still valid, except that the Overton window has begun to shift at an increasing pace.
Note that this is not meant as an endorsement of EA aligning with, or paying lip service to, political trends. I personally believe an increase in enforced epistemic bias is an existential threat to the core values of EA.