Wil—this is perfectly reasonable, and generally true. ‘Avoid schisms’ is prudent for movement-building.
However, let me try to steel-man a possible counter-argument. In modern culture, we see many examples of woke activists taking over movements and organizations from the inside, by making ideologically motivated demands, playing various victim cards, demanding more ‘diversity and inclusion’, and trying to nudge the movement/organization in partisan political directions. Ever since the ‘long march through the institutions’ (1967) and Rules for Radicals (1971), this has been refined into an extremely common and effective tactic, and it has arguably had very negative effects in academia, media, corporations, and governments.
Movements and organizations often find it difficult to protect themselves from woke takeover, because they don’t understand what’s happening, they don’t have good counter-arguments, they’re too guilt-prone and easily shamed, and they’re too conflict-averse. All too often, the movement/organization feels like they face a dilemma: either (1) give in to the woke activists, submit to all their demands, and take the movement in partisan politicized directions, or (2) accept a schism in which the woke get to take over the larger portion of the original movement, and a minority of anti-woke, ornery, heterodox contrarians go off and start their own movement (which is what we’re currently seeing in American academia, media, corporations, and state governments).
I hope EAs see what’s happening here, and understand the clear and present dangers of a woke takeover of the EA movement. We need to find a third option between accepting a woke takeover, and falling into a woke-versus-antiwoke schism. IMHO, that third option needs to be grounded in a radically honest discussion of what’s really been happening in the EA movement over the last few months. I don’t know what the optimal solution would be. But it might involve making it clear to the minority of woke activists in EA that their ideological values are simply not consistent with EA ethical and epistemic values—just as many other kinds of political, religious, and ideological values are not consistent with EA values.
I agree with much of your underlying frustration, Geoffrey, but I worry that explicitly anti-woke sentiment could encourage the perception that the woke are “not welcome here”.
So many people find and contribute to EA from woke/woke-adjacent circles like climate change activism, welfare capitalism, and animal welfare. Even if you and I disagree with their ideological views, they’re still trying to improve the world, the same way as you or I.
I hope that, much the same way EA is influenced by wokism, woke EAs are influenced by EA to refine the ethics and epistemics of their ideology. I’d like to throw in a (perhaps naive) vote for not explicitly alienating woke EAs if at all possible.
Ariel—thanks for your calm & constructive comment. I take your point that many people in movements such as animal welfare and climate change activism tend to be woke-adjacent. I also accept that everybody thinks they’re trying to improve the world, given their own values and beliefs.
It’s worth having a discussion about whether EA should be explicitly anti-woke, or woke-neutral, or pro-woke (which often includes pretending not to know what ‘woke’ means). However, there’s a variant of O’Sullivan’s Law that seems to operate in modern culture, such that any organization that is not explicitly anti-woke tends to become woke.
Hi Geoffrey, I commented earlier asking what you mean by woke, and would like to clarify that I’m not “pretending not to know what ‘woke’ means.” It’s a word that I only ever see employed derisively and I am truly not sure what it’s supposed to mean beyond “views that the speaker holds in contempt.” So if you are able to give a sense of what “woke” means to you I would appreciate it, as that would help me understand your viewpoint.
And what specific, significant “woke” pressures are being put on EA?
(I would categorize some of the ConcernedEA platform as “woke,” but I don’t get the sense that those parts of the platform are getting much support.)
Synonyms might be “SJW” or “DEI”.
Thanks, but that doesn’t actually help me engage with the objections of those who are afraid of wokism (or SJW, or DEI) weakening the movement, because each of those terms can mean so many different things.
A sampling of ideas that seem like they could be included under the umbrella of “wokism” in an EA context:
“Catering at EA events should be vegan”
“EA spaces should be welcoming for trans people”
“EA would be stronger if EAs were less homogenous”
“Reports of sexual assault and harassment should be taken seriously”
“Racism, including so-called ‘scientific’ racism, is a scourge”
As is probably evident from my comment history, I do happen to agree with all of these assertions. But I would be interested in engaging respectfully with someone who didn’t. What I can’t do is meaningfully respond to the idea that wokism, undefined, is threatening EA.
(edited to add - if anyone disagree-voting would be willing to tell me what they disagree with, I would appreciate it)
I think part of the difficulty here is that “wokism” refers to a genuine cluster of ideas and practices, but one without especially clear boundaries or a single easy definition.
What I do notice is that none of the ideas you listed, at least at the level of abstraction at which you listed them, are things that anyone, woke or anti-woke or anywhere in between, will disagree with. But I’ll try to give some analysis of what I would understand to be woke in the general vicinity of these ideas. Note that I am not asserting any normative position myself, just trying to describe what I understand these words to mean.
I don’t think veganism really has much to do with wokism. Whatever you think about EA event catering, it just seems like an orthogonal issue.
I suspect everyone would prefer that EA spaces be welcoming of trans people, but there may be disagreement on what exactly that requires on a very concrete level, or how to trade it off against other values. Should we start meetings by having everyone go around and give their pronouns? Wokism might say yes, other people (including some trans people) might say no. Should we kick people out of EA spaces for using the “wrong” pronouns? Wokism might say yes, others might say no, as that is a bad tradeoff against free speech and epistemic health.
I suspect everyone thinks reports of assault and harassment should be taken seriously. Does that mean that we believe all women? Wokism might say yes, others might say no. Does that mean that people accused should be confronted with the particular accusations against them, and allowed to present evidence in response? Wokism might say no, others might say yes, since good epistemics requires that.
I’m honestly not sure what specifically you mean by “so-called ‘scientific’ racism” or “scourge”, and I’m not sure if that’s a road worth going down.
Again, I’m not asserting any position myself here, just trying to help clarify what I think people mean by “wokism”, in the hopes that the rest of you can have a productive conversation.
“none of the ideas you listed, at least at the level of abstraction at which you listed them, are things that anyone, woke or anti-woke or anywhere in between, will disagree with”
This is a tangent, but raising my hand as someone who does disagree that EA events should generally have only vegan food. I think having good vegan food available is very important, and think you can make a good case for excluding meat, but the more you constrain the menu the harder it is for people to find the food they need. This is especially a problem for longer or residential events, where the downsides of a limited diet compound and going out to get different food can be logistically challenging.
I agree on food. I was careless with my qualifications, sorry about that.
I’m not “pretending to not know what woke means”; I genuinely think it would be constructive for you to define what you mean by it, and to explain why you think it is a threat to EA.
Some things I think you could mean:
- People who talk a lot about “positionality”
- People who look at white men distrustfully and assume they have bad intentions
- People who talk about diversity and inclusion and virtue signal about said things
- People who are part of the “culture wars” in the United States
The problem is that I genuinely do not know how you define it, or how you think it applies to EA or poses some sort of threat to EA.
Another problem is that you seem to assume you can identify whether or not someone is “woke” without actually defining what that means or really knowing the person. I don’t think that’s fair. I also think you are doing what I notice people on Twitter do, which is look at really superficial things like how someone talks or presents themselves online, and think you can categorize them as “woke” or “not-woke”. It’s just very polarizing. I actually think you and I agree on a lot more than you would assume, but because I disagree with your use of “woke” language, you assume I am “pro-woke”.
You’re welcome! I agree that the discussion is worth having, and won’t pretend to know the right answer. Your point that the de-facto choice may be between anti-woke and (eventually) pro-woke is legitimate.
One consideration we might be underestimating (I’m not just saying this; I mean it :P) is the ways EA could influence woke ideology:
Expose woke people to prioritarianism, which incorporates their perception of the effects of oppression between groups of people, while often resulting in de-facto EA conclusions.
Expose woke people to the possible moral significance of disenfranchised groups which are typically ignored in the public eye, such as future people and wild animals.
Encourage woke people to quantify their perceptions of how oppressed different groups are, and how to make tradeoffs between interventions which help different groups. This also often leads to EA conclusions. For example, under most plausible assumptions of the sentience of farmed animals, it seems likely that a given intervention in farmed animal welfare will reduce more suffering than a given anti-racist intervention.
Strong agree! The basic framework of EA, using utilitarian EV calculus to have more impact, can be adopted by folks on the left or right. People who are more into social justice and climate change can learn to have better feedback mechanisms to increase impact.
At the same time, conservative religious groups that do a ton of charity could also be led to using more effective interventions. I don’t think the EA framework needs to be politicized.
This comment implies the only relevant division is over wokery. I’m not sure why you focused only on that, but there are other ways people can practically disagree about what to do...
Wokeism has been an on-and-off discussion topic in EA for about as long as I can remember. My woke friends complain that EA is hopelessly anti-woke, and my anti-woke friends complain that EA is hopelessly woke. The predictions of political schism or ideological takeover keep being made, and keep being false.
In my opinion, we’ve already found a “third option” which works: the empathy to seek mutual understanding, the philosophical sophistication to critique fashionable ideas, and the willingness to share our perspective even when it seems unpopular.
I like this :)
I found parts of this 3-month old comment by a non-Western trans man writing about the masculinity-femininity divide to be really insightful and prescient.
Just as many people point to ‘toxic masculinity’ (which can also be present in women), I think they should also acknowledge the existence of ‘toxic femininity’ (which can also be present in men). FWIW, I think a lot of activists raised in (somewhat-functioning) democracies are underestimating the dangers of limiting free expression, the dangers of marginalizing people whose features were historically associated with having more power, and the possibility that they might be becoming more sensitive to things that they can otherwise overcome.
Hey Geoffrey, thanks for pointing this out. I agree it seems like you immediately got downvoted hard—I’ve strong agreed to try and correct that a bit.
I broadly agree with you on this, and I’m glad we’re having this conversation. However, I think framing it this way is problematic and leads to tribalism. The way I see it the ‘woke takeover’ is really just movements growing up and learning to regulate some of their sharper edges in exchange for more social acceptance and political power.
Different movements do better or worse—New Atheism is an example that was ruined by this modulation. I’m optimistic that EA can learn to become more welcoming and palatable to normal folks on the left and right, while keeping the old guard, if we play our cards right.
The largest divide seems to be between the older folks who prize unconventional dating and social norms like polyamory, radical honesty, etc., and a lot of the more “normal” folks who may be turned off by that sort of thing. For instance, in the local group I lead in Raleigh, NC, we have a large number of people with relatively standard intuitions about sex and relationships.
My biggest goal is learning how to increase their involvement and engagement in EA without turning them off—something we’ve already had to deal with a bit in the wake of SBF.
Building that middle ground framework will be tough, do you have any ideas here as to where we can start?
“The way I see it the ‘woke takeover’ is really just movements growing up and learning to regulate some of their sharper edges in exchange for more social acceptance and political power.”
I don’t agree with this part of the comment, but am aware that you may not have the particular context that may be informing Geoffrey’s view (I say “may” because I don’t want to claim to speak for Geoffrey).
These two podcasts, one by Ezra Klein with Michelle Goldberg and one by the NY Times, discuss what is roughly referred to in them as “identity politics” or “purity politics” (which other people may call “woke politics”). According to those interviewed, the effect on these movements and nonprofits has been to significantly diminish their impact on the outside world.
I also think that it would be naïve to claim that these movements were “growing up” considering how long feminism and the civil rights movement have been around. The views expressed in these podcasts also strongly disagree with your claim that they are gaining more political power.
I think these experiences, from those within nonprofits and movements on the left no less, lend support to what Geoffrey is arguing. Especially considering that the EA movement is ultimately about having the most (positive) impact on the outside world.
“The way I see it the ‘woke takeover’ is really just movements growing up and learning to regulate some of their sharper edges in exchange for more social acceptance and political power.”
I think there is some truth in movements often “growing up” over time and I agree that in some circumstances people can confuse this with “woke takeover”, but I think it’s important to have a notion of some takeover/entryism as well.
In terms of the difference: to what extent did people in the movement naturally change their views vs. to what extent was it compelled?
I suppose protest can have its place in fixing a system, but at a certain hard-to-identify point, it essentially becomes blackmail.
Wil—thanks for the constructive reply. That’s all reasonable. I’ve got to teach soon, but will try to respond properly later.
I’m grateful for this comment, because it’s an exemplar of the kind of comment that makes me feel most disappointed by the EA community.
It’s bad enough that influential EAs have caused a lot of damage to other individuals, and to the good work that might be done by the community. But it’s really upsetting that a lot of the community (at least as exemplified by the comments on the forum; I know this isn’t fully representative) doesn’t seem to take it seriously enough. We’re talking about really horrible examples of racism and sexual harassment here, not ‘woke activism’ gone too far. It hurts people directly, it repels others from the community, and it also makes it harder to further important causes.
It’s also couched in terms of ‘rationalism’ and academic integrity (“let me try to steel-man a possible counter-argument...”), rather than just coming out and saying what it is. I don’t think you’re (merely) trying to make a hypothetical argument. Similarly, the “I hope EAs see what’s [really]* happening here, and understand the clear and present dangers...” sounds alarmist to me.
*I included the [really], because it seems to me like the author of the comment is trying to lend weight to their argument by implying they are revealing something most people would otherwise miss.
I understand the frustrations you and others are voicing, but I think it’s more a lack of competence and understanding of management/power differentials/social skills from some of the higher-level EAs. I highly doubt that the upper echelons of EA are full of malicious sociopaths who are intentionally harming people.
EA has done a lot of good, and people often make mistakes. I do think we need to rectify them and punish bad behavior, but we should try to make sure we don’t alienate the old guard of EA for making mistakes in the socializing/dating world. A lot of people struggle to understand what is okay and what isn’t—I’d rather try to reconcile with or educate them than attack each other. That’s the point of this post.
Does that framing make sense to you?
Hi Wil,
My comment here was about Geoffrey Miller’s comment, rather than your original post as a whole (albeit I separately took issue with your use of “relatively petty...”), so I’m not sure I follow where you’re going here.
FWIW, if you’re referring to recently-come-to-light examples of sexual harassment and racism when you say “it’s more a lack of competence...”, then I would disagree with your characterisation. I think by saying that the likes of Owen Cotton-Barratt and Nick Bostrom aren’t “malicious sociopaths”, and that they didn’t do it ‘intentionally’, you fail to acknowledge the harm they’ve done. It’s a similar line of argument to your original post when you compare the harm done with “the survival of the human race”. I think it’s missing the point, it’s insensitive, and implies that they’re not soooo bad.
I also worry when the initial reaction to someone’s misdeeds is “let’s make sure we don’t punish them too harshly, or we’ll alienate them”, rather than “this is really wrong, and our first priority should be to make sure it doesn’t happen again”. My initial response isn’t to shed a tear for the damage to the career of the person who did the wrong thing.
I disagree with your framing this as “attacking” the people that have done wrong. If anything, it’s the people on the end of the sexual harassment that have been attacked.
I find it distasteful when people point to things like “EA has done a lot of good” or “EA has saved a lot of lives” in the context of revelations of sexual harassment etc. While it might be factually correct, I think it gives the sense that people think it’s OK to do horrible personal things as long as you donate enough to Givewell (I very much disagree).
And one final point: I don’t think “the old guard of EA” is the right frame (although I’m somewhat biased as I was involved in EA in 2011-12). I don’t believe the majority of wrongdoers are from this group, nor do I believe the majority of this group are wrongdoers.
So no, that framing does not make sense to me.
Thanks for responding. For what it’s worth, I personally think OCB should be permanently removed from any powerful position in EA, and possibly socially distanced. Strong incentives against that type of behavior, especially right now, are extremely important. I’m disappointed with the response from EVF and think it should be far harsher.
The distinction I’m trying to make is that we shouldn’t assume all powerful people in EA are bad apples as a result of this scandal breaking.
Thanks Wil. I can agree with that.
I agree with this—it is also why I disagree-voted, and no, I don’t have notifications set up for Geoffrey (as mentioned in another comment by them).
The comment felt to me like it was undermining a lot of the recent criticism regarding people in powerful positions showing, AT THE VERY LEAST, very bad judgement. The comment makes me very sad and angry.
BTW, it’s interesting that there seems to be a cadre of EAs who get notifications whenever I post something, and they immediately disagree-vote on it, within a few minutes, if it involves any criticism of wokeness. But they don’t actually reply with any concrete reasons for disagreement. Then later the other EAs who come across the post naturally tend to agree-vote more often with it.
Yet another tactic of woke takeover?
I would also be interested in what you mean by wokeness—I haven’t downvoted you but I don’t think I follow your point, largely because I don’t know what you mean by “woke takeover” or “woke-versus-antiwoke.”
I don’t get notifications for your posts, I saw it when I read the comments on this post and disagreed with it because I personally dislike the use of the word “woke” and see it as divisive in itself. It would be helpful for me if you could define what you mean by woke and explain what it means to EA. I know it is a common term used in the US and in Twitter conversations about American politics, but I would prefer not to see US political discourse language in EA unless it’s really illuminating real issues or threats. You seem to be making a lot of general claims like “Movements and organizations often find it difficult to protect themselves from woke takeover, because they don’t understand what’s happening, they don’t have good counter-arguments, they’re too guilt-prone and easily shamed, and they’re too conflict-averse”, but it’s not clear to me what you’re referring to.
Thanks for this, Lauren, I was thinking the same thing. Using labels to ‘us and them’ people isn’t helpful, and the word “woke” can mean so many different things that it’s not particularly useful.
I think it’s better to name the specific things that you are concerned about, rather than use vague labels like “woke”.
Geoffrey, or anyone really, can you please define wokeness?
I fail to see how EA’s vague opposition to being anti-woke in partisan culture wars is anything more than an internecine credible threat to open society. I say this as a neurodivergent and self-identified Black American EA who was moved by and still respects your article on viewpoint and neurodiversity, but who pragmatically votes on the left as a transpartisan because I don’t see another middle way that isn’t omnicidal.
With genuine respect, I find the blanket dismissals of wokeness to be extremely inflammatory and ineffective in eliciting the calm and respectful pushback from people who want to break new ground that you/EA/we(?) are looking for.
Also, thank you, Lauren, Nick and others for bringing attention to this.
If it makes you feel any better, there also seems to be a cadre who downvotes certain critical posts soon after they are made.
(I do not get an alert on anyone’s posts, by the way)
I will admit to having strong downvoted a number of critical posts, while having upvoted others, in order to create an incentive gradient to produce better criticism.
If we start getting less criticism, then I’ll default towards upvoting criticism more.
As far as I can tell, you can’t actually set up notifications for someone’s comments, only posts. Are you suggesting that people are using some external browser extension specific to your comments? I find your notifications theory to be extremely unlikely and borderline conspiratorial.
A more likely explanation is that a lot of people don’t agree with your opinions.