I’m a 22-year-old woman involved in Effective Altruism. I’m sad, disappointed, and scared.
Before I get to the heart of what I want to share, a couple of disclaimers:
I am one person and, as such, I recognize that my perspective is both informed and limited by my experience and identity. I would like to share my perspective in the hope that it may interact with and broaden yours.
In writing this post, my aim is not to be combative nor divisive. The values of Effective Altruism are my values (for the most part—I likely value rationality less than many), and its goals are my goals. I do not, therefore, aim to “take down” or harm the Effective Altruism community. Rather, I hope to challenge us all to think about what it means to be in community.
Who am I and why am I writing this?
I’m a 22-year-old college senior, set to graduate with a degree in Human and Organizational Development in May. I learned about Effective Altruism last fall, when I transferred to a new university and started attending my school’s EA events and fellowships. I have become increasingly involved in the Effective Altruism community over the past 14 months: participating in intro and in-depth fellowships, taking on a leadership role in my school’s EA club, and attending an EAGx and an EAG.
So why am I writing this? Because I am at a point in my life where I have to make a lot of choices: where I want to live; what type of work I want to engage in; the people whom I want to surround myself with. When I found Effective Altruism, it seemed as though I had stumbled across a movement and a community that would provide me with guidance in all three of these areas. However, as my social and intellectual circles became increasingly entangled with EA, I grew hesitant, then skeptical, then downright sad as I observed behavior (both in-person and online) from those involved in the EA community. I’m writing this because I want to be able to feel proud when I tell people that I am involved in Effective Altruism. I want to feel as if I can encourage others to join without an added list of disclaimers about the type of behavior they may encounter. Lastly, I want the Effective Altruism community to revisit and continuously strive towards what the Centre for Effective Altruism calls the core principles of EA: commitment to others, scientific mindset, openness, integrity, and collaborative spirit.
Gender and Culture
According to the EA Survey 2020 (the latest year for which I could find data), the makeup of people involved in EA was very similar to that of 2019: 76% white and 71% male. Lack of diversity within movements, organizations, and “intellectual projects” is incredibly damaging for many important reasons. CEA writes several of these reasons on their Diversity and Inclusion page, but the one I would like to highlight is We don’t want to miss important perspectives. As the website reads, “if the community isn’t able to welcome and encourage members who don’t resemble the existing community, we will consistently miss out on the perspectives of underrepresented groups.” I agree with this statement and that is why I am concerned—I’m unconvinced that this community is effective in “welcoming and encouraging” people who don’t fit the majority white-and-man mold.
I question the welcoming-ness of the EA community because, despite fitting the EA mold in many ways—I’m white, American, in my twenties, and will soon graduate from a highly-ranked college—I still often feel as if the EA community is not something I want to be a part of. I can imagine that those with even less of the predominant EA demographic characteristics face exponentially increased barriers to entry. Several (yet not all) instances in which I’ve felt this way…
A friend visited the Bay Area this summer to spend time with several of our mutual friends (all heavily involved in EA). He casually mentioned that some of them made a list ranking women in EA in the Bay that they wanted to hook up with.
At an EAG afterparty, an attendee talked about how he scheduled a one-on-one with someone because he found her attractive.
I read this blog post and the comments and controversy it generated. The amount of invalidation and general nastiness in the comments (that have since been deleted so I won’t link to them) shocked and saddened me.
I read about Kathy Forth, a woman who was heavily involved in the Effective Altruism and Rationalist communities. She committed suicide in 2018, attributing large portions of her suffering to her experiences of sexual harassment and sexual assault in these communities. She accused several people of harassment, at least one of whom is an incredibly prominent figure in the EA community. It is unclear to me what, if any, actions were taken in response to (some) of her claims and her suicide. What is clear is the pages and pages of tumblr posts and Reddit threads, some from prominent members of the EA and Rationalist communities, disparaging Kathy and denying her accusations.
To be clear, I am not saying that all women have negative experiences in the EA community, nor that all men in EA are perpetrators. People of all genders can perpetrate sexual harassment, sexual assault, and generally offensive behavior. Furthermore, I anticipate responses to some of the instances that I listed above, especially the first two, expressing that they seem pretty minor, or not that big of a deal. And maybe, in your eyes, ranking women based on how attractive they are isn’t a big deal. Maybe you think it’s funny or just doesn’t hold much significance. But, based on my own experience and those of friends and acquaintances, I can assure you that to at least some of those women who are on that list—it is a big deal. Not just because it’s immature or grossly objectifying, but because it is our small, daily actions that inform culture, and behaviors like ranking women or scheduling meetings based on attractiveness may be small, but they contribute to a culture that accepts the objectification of women and the valuing of physical attractiveness above character and contribution.
Effective Altruism Has an Emotions Problem
I’m 89% certain that Effective Altruism has an emotions problem and I’m 98% certain that it’s alienating people from the EA community. What do I mean by that?
On the forum in particular and in EA discourse in general, there is a tendency to give less weight/be more critical of posts that are more emotion-heavy and less rational. This tendency makes sense based on EA principles… to a certain extent. To stay true to the aforementioned values of scientific mindset and openness, it makes sense that we challenge people’s ideas and are truth-seeking in our comments. However, there is an important distinction between interrogating someone’s research and interrogating someone’s lived experience. I fear that the attitude of truth-seeking and challenging one another to be better has led to an inclination to suspend compassion in the absence of substantial evidence of wrongdoing. You’re allowed to be sorry that someone experienced something without fully understanding it.
There are certain unofficial norms surrounding posting on the forum that I worry alienate people who are new to the community/people whose tendencies for writing fall outside of the jargon-ridden, bulleted, and rationalist styles that are predominant. While writing this post, I have adapted my own writing style to be more in line with the typical styles on the forum because I believe it will increase my chances of being taken seriously.
An extreme extension of utilitarian, rationalist, and effective altruist logic can blind us to the negative experiences of individuals and major flaws in the EA community. I fear that people within the EA community are not always taking allegations of harm seriously out of concern that (1) there is “more impactful” work that they could be doing than investigating such allegations, (2) investigating allegations of harm against prominent individuals may damage the reputation of Effective Altruism, and (3) some individuals are having such a “high-impact” that they don’t want to find them guilty of an act that may impede such effective work.
I could tell you how tears streamed down my face as I read through accounts of women who have been harmed by people within the Effective Altruism community. I could describe how my fists curled and my jaw clenched as I scrolled through forum comments and Reddit threads full of disbelief and belittlement. I could try to convey the rising temperature of my blood as it boiled; I could explain to you that I could not focus in class for a full two days. But I don’t think that I will. I’m unsure that the Effective Altruism community has room for my anger.
What should we do?
First off, I want to acknowledge that there are some incredible, thoughtful people who are dedicating much of their time to thinking about how to foster a healthy and compassionate community within EA spaces. I recognize that the issues I have raised are complex and without a simple solution, and I appreciate those who have spent time thinking about how to make things better. Here are some ideas…
Hold yourself and your people accountable. What we do and say informs who we are. If you see someone, especially someone you’re close to, say or do something you disagree with or that you believe contributes to an unwelcoming culture, let them know. This is especially relevant to people you call your friend. In order to do good better, we must aim to make ourselves better too.
Listen to people when they tell you how they feel. Let people talk about how they feel. Humans are not devoid of emotions and we never will be; emotions are an integral characteristic of human experience, not a flaw in the way we operate. You don’t have to agree with or understand someone to extend compassion!
Recognize that in order to create a healthy, effective, and compassionate community, we need to hold people accountable for their actions regardless of status, connections, or expected-value calculations about their potential for impact. Effective Altruism will not thrive if we unquestionably uphold people who cause harm.
I’m saddened by the negative experiences people in the Effective Altruism community have had, disappointed in the ways in which some people in the community have responded, and scared that the movement will continue to be a space that does not feel safe nor welcoming for many people. I want to be a part of a community where people aim to do the most good that they can do, innovate ways to do good better, and challenge one another to be better. But, if this same community continues to also create a culture that is ok with demeaning women (or anyone, for that matter), shields individuals from accountability because of their status, and predicates their expansion of compassion on the existence of strong evidence, I want nothing to do with it.
However, I am hopeful. I think that the EA community has the potential to become a space in which all people feel supported, safe, and welcomed. The ideas for change listed above are cursory and I plan on continuing to think and talk about what we can do to positively impact this community as it grows and evolves. Thank you, truly, for your time and consideration in reading this post. I’m happy to engage with anyone who has thoughts on what I wrote and I hope this inspires both introspection and community change.
I’m one of the people (maybe the first person?) who made a post saying that (some of) Kathy’s accusations were false. I did this because those accusations were genuinely false, could have seriously damaged the lives of innocent people, and I had strong evidence of this from multiple very credible sources.
I’m extremely prepared to defend my actions here, but prefer not to do it in public in order to not further harm anyone else’s reputation (including Kathy’s). If you want more details, feel free to email me at scott@slatestarcodex.com and I will figure out how much information I can give you without violating anyone’s trust.
I’m glad you made your post about how Kathy’s accusations were false. I believe that was the right thing to do—certainly given the information you had available.
But I wish you had left this sentence out, or written it more carefully:
It was obvious to me reading this post that the author made a really serious effort to stay constructive. (Thanks for that, Maya!) It seems to me that we should recognize that, and you’re erasing an important distinction when you categorize the OP with imprudent tumblr call-out posts.
If nothing else, no one is being called out by name here, and the author doesn’t link any of the tumblr posts and Reddit threads she refers to.
I don’t think causing reputational harm to any individual was the author’s intent in writing this. Fear of unfair individual reputational harm from what’s written here seems a bit unjustified.
EDIT: After some time to cool down, I’ve removed that sentence from the comment, and somewhat edited this comment which was originally defending it.
I do think the sentence was true. By that I mean that (this is just a guess, not something I know from specifically asking them) the main reason other people were unwilling to post the information they had, was because they were worried that someone would write a public essay saying “X doesn’t believe sexual assault victims” or “EA has a culture of doubting sexual assault victims”. And they all hoped someone else would go first to mention all the evidence that these particular rumors were untrue, so that that person could be the one to get flak over this for the rest of their life (which I have, so good prediction!), instead of them. I think there’s a culture of fear around these kinds of issues that it’s useful to bring to the foreground if we want to model them correctly.
But I think you’re gesturing at a point where if I appear to be implicitly criticizing Maya for bringing that up, fewer people will bring things like that up in the future, and even if this particular episode was false, many similar ones will be true, so her bringing it up is positive expected value, so I shouldn’t sound critical in any way that discourages future people from doing things like that.
Although it’s possible that the value gained by saying this true thing is higher than the value lost by potential chilling effects, I don’t want to claim to have an opinion on this, because in fact I wrote that comment feeling pretty triggered and upset, without any effective value calculations at all. Given that it did get heavily upvoted, I can see a stronger argument for the chilling effect part and will edit it out.
Hi Scott,
Thank you for both of your comments. I appreciate you explaining why you wrote a post about Kathy and I think it’s useful context for people to understand as they are thinking about these issues. My intention was not to call anybody out, rather, to point to a pattern of behavior that I observed and describe how it made me (and could make others) feel.
Hi Maya! Thank you for posting about your experience. I think it is valuable to have this perspective and I’m sure it wasn’t easy to write and post publicly. I’m not sure if you reached out to Scott, but if you did and made any updates regarding your belief of Kathy Forth’s accusations, then I do think it would be very impactful if you could update your post to reflect that. It seems like this one part of your post triggered a lot of old trauma in the community and likely overshadowed the other concerns contained in the post. I believe an update (no matter in which direction) could really improve trust in the capacity for good-faith discussions around this difficult topic.
Hi Constance!
Thanks for your comment—for all who are interested, I did reach out to Scott and he provided me with an in-depth explanation of some of the context behind Kathy’s accusations and suicide. His explanation provided me with a deeper understanding of the situation and helped me realize that action was taken to check the validity of some of Kathy’s claims and that there was a more involved and nuanced response to the situation than I realized initially.
Hi Maya, glad to hear that that was the outcome of your deeper dive. If you’re comfortable, I think it might be good if you edited a comment about this into your top-level post (and maybe that’s what Constance meant?). A lot of people read posts but then don’t read the comments, so they might not otherwise know you updated on this (very important-seeming, to me) question—e.g. something like “Edit: after checking out some of the claims raised in the comments, I now think the situation was more like [whatever you think].”
Yes updating and creating an “Edit:” right after point #4 would be the ideal place to put this update so that it reaches the most readers.
Maya, I’m glad that you talked to Scott and got more information. I hope that the deeper context has provided some reassurance to you that there exist parts of EA as a community that do care about the concerns of women and that there is a path available to change the culture.
Thanks for removing the sentence.
I’m sorry you’ve gotten flak. I don’t think you deserve it. I think you did the right thing, and the silence of other people “in the know” doesn’t reflect particularly well on them. (Not in the sense that we should call them out, but in the sense that they should maybe think about whether they knowingly let a likely-innocent person suffer unjust reputation harm.)
Agreed. I think the culture of fear goes in both directions. Women often seem to fear making accusations.
Not what I was gesturing at, but potentially valid.
My thinking is that attempts to share info “in good faith” should not be punished, regardless of whether that info pushes towards condemnation vs exoneration. (We can debate what exactly counts as “good faith”, but I think it should be defined ~symmetrically for both types of info. I’d like more discussion of what constitutes “good faith”, and fewer implications that [call-outs/denials] are always bad. I’m open to super restrictive definitions of “good faith”, like “only share info with CEA’s community health team and trust them to take appropriate action” or similar.)
In any case, my main goal was to get you to reciprocate what I saw as the OP’s attempt to be less triggered/more constructive, so thanks for that.
I did not know Kathy well, but I did meet and talk with her at length on a number of occasions in EA/aligned spaces. We talked about cultural issues in the movement and for what it is worth, she came across as someone of good character, good judgement and measured takes.
I am not across the particulars of her accusations, and I feel matters like this belong in actual courts, not forums. I don’t think cherry-picked criticisms of her claims are appropriate.
I think EA will continue to stumble on this issue, and our downfall as a movement will continue to be our handling of deontologically or virtuously abhorrent behaviour.
I think the author of this forum post has made points of great importance. In particular, their critique of the style of writing required to be taken seriously and understood in the manner intended is novel.
I want to strong agree with this post, but a forum glitch is preventing me from doing so, so mentally add +x agreement karma to the tally. [Edit: fixed and upvoted now]
I have also heard from at least one very credible source that at least one of Kathy’s accusations had been professionally investigated and found without any merit.
Maybe also worth adding that the way she wrote the post would, in a healthy person, be intentionally misleading, and was at least incredibly careless given the strength of the accusation. E.g. there was some line to the effect of ‘CFAR are involved in child abuse’, where the claim was link-highlighted in a way that strongly suggested corroborating evidence but, as in that paraphrase, the link in fact just went directly to whatever the equivalent website was then for CFAR’s summer camp.
It’s uncomfortable berating the dead, but much more important to preserve the living from incredibly irresponsible aspersions like this.
Thanks for flagging that we had a bug affecting voting! It should be fixed now, please let me know if you see any more related issues.
While this is important (clarifying of misinformation), I want to mention that I don’t think this takes away from the main message of the post. I think it’s important to remember that even with a culture of rationality, there are times when we won’t have enough information to say what happened (unlike in Scotts case), and for that reason Mayas post is very relevant and I am glad it was shared.
It also doesn’t seem appropriate to mention this post as “calling out”. While it’s legitimate to fear reputations being damaged with unsubstantiated claims, this post doesn’t strike me as doing such.
I came to the comments here to also comment quickly on Kathy Forth’s unfortunate death and her allegations. I knew her personally (she sublet in my apartment in Australia for 7 months in 2014, but more meaningfully in terms of knowing her, we also overlapped at Melbourne meetups many times, and knew many mutual people). Like Scott, I believe she was not making true accusations (though I think she genuinely thought they were true).
I would have said more, but will follow Scott’s lead in not sharing more details. Feel free to DM me.
Just to draw some attention to the “(some of)”: Kathy claimed in her suicide note that her actions had led to one person being banned from EA events. My understanding is that she made a mixture of accusations that were corroborated and ones that weren’t, including the ones you refer to. I think this is interesting because it means both:
Kathy was not just a liar who made everything up to cause trouble. I would guess she really was hurt, and directed responsibility for that hurt to a mixture of the right and wrong places. (Maybe no-one thought this, but I just want to make clear that we don’t have to choose between “she was right about everything” and “she was wrong about everything”.)
Kathy was not ignored by the community. Her accusations were taken seriously enough to be investigated, and some of those investigations led to people being banned from events or groups. Reddit may talk shit about her, but the people in a position to do something listened.
(I should say that what I’m saying is mostly based on what Kathy said in her public writings combined with second-or-third hand accounts, and despite talking a little to Kathy at the time I’m missing almost all the details of what actually happened. Feel free to contradict me if something I said seems untrue.)
(edit 2023-10-09: revised “more than one” to “one” because I’m sure that’s true and am not sure the “more than” is true)
For the record, I knew Kathy for several years, initially through a local Less Wrong community, and considered her a friend for some time. I endorse Scott’s assessment, but I’ll emphasise that I think she believed the accusations she made.
Relevant to this post: Many people tried to help Kathy, from 3 groups that I’m aware of. People gave a lot of time and energy. Speaking for myself and what I observed in our local community, I believe we prioritised helping her over protecting our community and over our own wellbeing.
In the end things went poorly on all three, for the community, other individuals and especially for Kathy. But it wasn’t for lack of caring.
If something similar happened today, we would have much more support, through the EA community health team. (I was more involved in LW at the time, and wasn’t aware of support available through EA. The team might not have existed yet in a formal capacity.)
I don’t take Kathy’s letter at face value. However I’m glad to see Julia’s comment confirming that Kathy’s accusations were investigated (I would expect no less) and in one case acted upon.
As for the analytical vs emotional – it’s hard to express my emotions around this in written words. And especially hard to do so without saying more than I think is appropriate.
(This is the first public comment I’ve made on this subject. I only make it because it’s understandably still an issue of concern, and few people have much context for it.)
Regardless of the accuracy of this comment, it makes me sad that the top comment on this post is adversarial/argumentative and showing little emotional understanding/empathy (particularly the line “getting called out in posts like this one”). I think it unfortunately demonstrates well the point the author made about EA having an emotions problem:
To be honest I’m relieved this is one of the top comments. I’ve seen Kathy mentioned a few times recently in a way I didn’t think was accurate and I didn’t feel able to respond. I think anyone who comes across her story will have questions and I’m glad someone’s addressed the questions even if it’s just in a limited way.
My point isn’t about the information contained in the comment, it’s about the tone.
I very rarely engage in karma voting, and didn’t do so for this comment either. That said, one relevant point is that the comment with the most karma gets to sit at the top of the comments section. That means that many people probably vote with an intention to functionally “pin” a comment, and it may not be so much that they think the comment should represent the most important reaction to a post, as that they think it provides crucial context for readers. I think this comment does provide context on the part of this otherwise very good and important post that made me most uncomfortable as stated. I also agree that Alexander’s tone isn’t great, though I read it in almost the opposite way from you (as an emotional reaction in defense of his friends who came forward about Forth).
Scott Alexander’s response is the first time I see that there is someone I can contact who has substantial claims of evidence that Kathy Forth’s accusations were false. I’ve heard of Kathy twice in the last month (don’t remember hearing about her at all before then), as have others in my local community. Many find Scott Alexander’s response valuable, which is why it is the top comment. A large part of the EA community appears to only recently be learning about Kathy Forth.
I personally think Scott shows immense emotional maturity responding to this in a context where he is opening himself up to huge scrutiny, including the criticism of being told he is being too adversarial and lacking empathy. He removed the sentence in question after some reflection, updating immediately and explaining his thought process, empathising with another’s perspective and recognizing his own emotional state that led him to include that sentence. To me these seem the hallmarks of a well emotionally regulated individual. If it isn’t, what does a person with emotional understanding/empathy do differently in this situation?
Before you answer that question, let’s take a moment to actually highlight what the situation even is:
Kathy Forth was a human, a member of our community, who committed suicide. Given the serious implications of Kathy Forth’s accusations if they were to be true, it seems that we should place a lot of value on anything that can confirm or deny the veracity of Kathy Forth’s story. Do you disagree?
This is the sentence you don’t like that Scott brought up. He removed it.
OP wrote the following words that, for lack of a better word, triggered Scott. She also has the opportunity to amend or qualify these words:
Nowhere in what OP writes above does she even seem to entertain the possibility that at least some of Kathy’s major accusations could be false (she says “It is unclear to me what, if any, actions were taken in response to (some) of her claims and her suicide” but nothing akin to “It is unclear to me whether Kathy Forth’s accusations were true”).
Kathy Forth’s story is a really, really serious accusation. Whether we believe it is true or false would and should significantly update our priors around how the EA community treats the concerns of women. If viewed to be true, it would frame the experiences of women and the EA community’s prior response to them in a very sinister light. If viewed to be false, then concerns 1+2 don’t have a broader sinister context and could be more optimistically corrected with improved management and a culture shift in EA. For example...
I agree that Swapcard at EAGs should, in the vast majority of cases, not be used for dating (my dudes in EA… this past weekend at EAGx Berkeley I met a woman who was flat-out frustrated that she was hit on through Swapcard one-on-ones twice within 24 hours… facepalm)
I agree profusely that ranking women by how much you want to have sex with them and publicly sharing that casually is creepy as all hell (my dudes in EA… how it is possible that so many of you can be so into “instrumental rationality” and yet act so obviously against your own self-interests when it comes to your behaviour around women is beyond me… facepalm)
These issues seem like something we can come together and fix. But if Kathy Forth’s accusations are true, then it is implied that EA as a community is much more sinister and not interested in addressing the concerns of women. If Kathy Forth’s accusations are true, then there is a deep rot within the EA community. There is no “facepalm” joke I’d be able to light-heartedly say about it, as if it were a thing we can reasonably fix. If Kathy Forth’s accusations are true, OP is right to be scared.
OP’s framing of Kathy Forth’s experience strongly implies they view Kathy Forth’s accusations as more true than false. I don’t know if she believes this because either:
a) She had bad experiences in EA and that made her update more towards “Kathy Forth’s story is probably true”
b) She thought Kathy Forth’s story is probably more true than false, and that made her update her experiences in EA as more bad (in a broader sinister context) than they otherwise would be
But the epistemic particulars are beside the point, because either way, if Scott is able to provide compelling evidence that parts of Kathy Forth’s story are false, this could help OP (and everyone else) feel less sad, disappointed, and scared, and that can only be a good thing.
OP, if you are reading this, I realize the EA community can seem intimidating. I empathize strongly with you when you expressed concern about how others will judge your writing style and take you less seriously if it did not conform to forum standards (why hello there… my still haven’t-made-one-post self… something I am still embarrassed about given how long I’ve been a part of EA). That said, I am confident that if you made a very short post just like “Please, what happened to Kathy Forth and why? I need to know, I can’t sleep. I feel sad, disappointed, and scared.” the EA community would have responded with compassion—not caring that the post didn’t abide by some set of forum norms. But, regardless, you did put effort into a longer post here so I just want to say I’m glad you posted this.
Responding to the attention on Kathy’s specific case (I’m aware I’m adding more to it): I think we’re detracting from the key argument that the EA community as a whole is neglecting to validate and support community members who experience bad things in the community.
In this post, it’s women and sexual assault primarily. But there are other posts (1, 2) exemplifying ways the EA community itself can and should prioritise internal community health. Arguing over the truth of one specific example might detract from recognising that this may be a systemic problem.
edit: after discussion below & other comments on this post, I feel less strongly about the claim “EA community is bad at addressing harm”, but stand by / am clarifying my general point, which is that the veracity of Kathy’s claims doesn’t detract from any of the other valid points that Maya makes and I don’t think people should discount the rest of these points.
A suggestion to people who are approaching this from a “was Kathy lying?” lens: I think it’s also important to understand this post in the context of the broader movement around sexual assault and violence. The reason this kind of thing stings to a woman in the community is because it says “this is how this community will react if you speak up about harm; this is not a welcoming place for you if you are a survivor.” It’s not about whether Kathy, in particular, was falsely accusing others.
The way I read Maya’s critique here is “there were major accusations of major harm done, and we collectively brushed it off instead of engaging with how this person felt harmed;” which is distinct from “she was right and the perpetrator should be punished”. This is a call for the EA community to be more transparent and fair in how it deals with accusations of wrongdoing, not a callout post of anybody.
Perhaps I would feel differently if I knew of examples of the EA community publicly holding men accountable for harm to women, but as it stands AFAIK we have a lot of examples like those Maya pointed out and not much transparent accountability for them. :/ Would be very happy to be corrected about that.
(Maya, I know it’s probably really hard to see that the first reply on your post is an example of exactly the problem you’re describing, so I just want to add in case you see this that I relate to a lot of what you’ve shared and you have an open offer to DM me if you need someone to hold space for your anger!)
Predictably, I disagree with this in the strongest possible terms.
If someone says false and horrible things to destroy other people’s reputation, the story is “someone said false and horrible things to destroy other people’s reputation”. Not “in some other situation this could have been true”. It might be true! But discussion around the false rumors isn’t the time to talk about that.
Suppose the shoe was on the other foot, and some man (Bob) spread some kind of false and horrible rumor about a woman (Alice). Maybe he says that she only got a good position in her organization by sleeping her way to the top. If this was false, the story isn’t “we need to engage with the ways Bob felt harmed and make him feel valid.” It’s not “the ‘Bob lied’ lens is harsh and unproductive”. It’s “we condemn these false and damaging rumors”. If the headline story is anything else, I don’t trust the community involved one bit, and I would be terrified to be associated with it.
I understand that sexual assault is especially scary, and that it may seem jarring to compare it to less serious accusations like Bob’s. But the original post says we need to express emotions more, and I wanted to try to convey an emotional sense of how scary this position feels to me. Sexual assault is really bad and we need strong norms about it. But we’ve been talking a lot about consequentialism vs. deontology lately, and where each of these is vs. isn’t appropriate. And I think saying “sexual assault is so bad, that for the greater good we need to focus on supporting accusations around it, even when they’re false and will destroy people’s lives” is exactly the bad kind of consequentialism that never works in real life. The specific reason it never works in real life is that once you’re known for throwing the occasional victim under the bus for the greater good, everyone is terrified of associating with you.
This is surprising to me; I know of several cases of people being banned from EA events for harm to women. When I’ve tried to give grants to people, I have gotten unexpected emails from EA higher-ups involved in a monitoring system, who told me that one of those people secretly had a history of harming women and that I should reconsider the grant on that basis. I have personally, at some physical risk to myself, forced a somewhat-resistant person to leave one of my events because they had a history of harm to women (this was Giego C; I think it was clear-cut enough to be okay to name a name here; I know most orgs have already banned him, and if your org hasn’t then I recommend they do too—email me and I can explain why). I know of some other cases where men caused less severe cases of harm or discomfort to women, there were very long discussions by (mostly female members of) EA leadership about whether they should be allowed to continue in their roles, and after some kind of semi-formal proceeding, with the agreement of the victim, after an apology, it was decided that they should be allowed to continue in their roles, sometimes with extra supervision. There’s an entire EA Community Health Team with several employees and a mid-six-figure budget, and a substantial fraction of their job is holding men accountable for harm to women. If none of this existed, maybe I’d feel differently. But right now my experience of EA is that they try really hard to prevent harm to women, so hard that the current disagreement isn’t whether to ban some man accused of harming women, but whether it was okay for me to mention that a false accusation was false.
Again in honor of the original post saying we should be more open about our emotions: I’m sorry for bringing this up. I know everyone hates having to argue about these topics. Realistically I’m writing this because I’m triggered and doing it as a compulsion, and maybe you also wrote your post because you’re triggered and doing it as a compulsion, and maybe Maya wrote her post because she’s triggered and doing it as a compulsion. This is a terrible topic where a lot of people have been hurt and have strong feelings, and I don’t know how to avoid this kind of cycle where we all argue about horrible things in circles. But I am genuinely scared of living in a community where nobody can save good people from false accusations because some kind of mis-aimed concern about the greater good has created a culture of fear around ever speaking out. I have seen something like this happen to other communities I once loved and really don’t want it to happen here. I’m open to talking further by email if you want to continue this conversation in a way that would be awkward on a public forum.
Thank you, this is clarifying for me and I hope for others.
Responses to me, including yours, have helped me update my thinking on how the EA community handles gendered violence. I wasn’t aware of these cases and am glad, and hope that other women seeing this might also feel more supported within EA knowing this. I realize there are obvious reasons why these things aren’t very public, but I hope that somehow we can make it clearer to women that Kathy’s case, and the community’s response, was an outlier.
I would still push back against the gender-reversal false equivalency that you and others have mentioned. EA doesn’t exist in a bubble. We live in a world where survivors, and in particular women, are not supported, not believed, and victim-blamed. Therefore I think it is pretty reasonable to have a prior that we should take accusations seriously and respond to them delicately. The Forum, if anywhere on earth, should be a place where we can have the nuanced understanding that (1) the accusations were false AND (2) because we live in a world where true accusations against powerful men are often disbelieved, causing avoidable harm to victims, we need to keep that context in mind while condemning said false accusations.
So to clarify my stance: I don’t think it was wrong to mention that the false accusation is false. I think it seems dismissive and insensitive to do so without any acknowledgement of the rest of the post. I don’t think it would have hurt your point to say “yes, EA is a male-dominated culture and we need to take seriously the harms done to women in our community. In this specific instance, the accusations were false, and I don’t believe the community’s response to these accusations is representative of how we handle harm.”
I think the disconnect here is that you are responding to / care about this specific claim, which you have close knowledge of. I know nothing about it, and am responding to / care about the larger claim about EA’s culture. I believe that Maya’s post is not trying to make truth claims about Kathy’s case and is more meant to point out a broad trend in EA culture, and I’m trying to encourage people to read it as such, and not let the wrongness of Kathy’s claims undermine Maya’s overall point.
(edit: basically, I agree with your comment above.)
Thanks for your thoughtful response.
I’m trying to figure out how much of a response to give, and how to balance saying what I believe vs. avoiding any chance to make people feel unwelcome, or inflicting an unpleasant politicized debate on people who don’t want to read it. This comment is a bad compromise between all these things and I apologize for it, but:
I think the Kathy situation is typical of how effective altruists respond to these issues and what their failure modes are. I think “everyone knows” (in Zvi’s sense of the term, where it’s such strong conventional wisdom that nobody ever checks if it’s true) that the typical response to rape accusations is to challenge and victim-blame survivors. And that although this may be true in some times and places, the typical response in this community is the one which, in fact, actually happened—immediate belief by anyone who didn’t know the situation, and a culture of fear preventing those who did know the situation from speaking out. I think it’s useful to acknowledge and push back against that culture of fear.
(this is also why I stressed the existence of the amazing Community Safety team—I think “everyone knows” that EA doesn’t do anything to hold men accountable for harm, whereas in fact it tries incredibly hard to do this and I’m super impressed by everyone involved)
I acknowledge that makes it sound like we have opposing cultural goals—you want to increase the degree to which people feel comfortable expressing that EA’s culture might be harmful to women, I want to increase the degree to which people feel comfortable pushing back against claims to that effect which aren’t true. I think there is some subtle complicated sense in which we might not actually have opposing cultural goals, but I agree that to a first-order approximation they sure do seem different. And I realize this is an annoyingly stereotypical situation: I, as a cis man, coming into a thread like this and saying I’m worried about false accusations and chilling effects. My only two defenses are, first, that I only got this way because of specific real and harmful false accusations, that I tried to do an extreme amount of homework on them before calling them false, and that I only ever bring them up in the context of defending my decision there. And second, that I hope I’m possible to work with and feel safe around, despite my cultural goals, because I want to have a firm deontological commitment to promoting true things and opposing false things, in a way that doesn’t refer to my broader cultural goals at any point.
Thanks, I realize this is a tricky thing to talk about publicly (certainly trickier for you, as someone whose name people actually know, than for me, who can say whatever I want!). I’m coming in with a stronger prior from “the outside world”, where I’ve seen multiple friends ignored/disbelieved/attacked for telling their stories of sexual violence, so maybe I need to better calibrate for intra-EA-community response. I agree/hope that our goals shouldn’t be at odds, and that’s what I was trying to say that maybe did not come across: I didn’t want people to come away from your comment thinking “ah, Maya’s wrong and people shouldn’t criticize EA culture.” I wanted them to come away both knowing the truth about this specific situation AND thinking more broadly about EA culture, because I think this post makes a lot of other very good points that don’t rely on the Kathy claims. (And thinking more broadly could include updating positively like I did, although I didn’t expect that would be the case when I made that comment!)
You’re probably right that it’s not worth giving much more of a response, but I appreciate you engaging with this!
As another data point: I’m a woman, I think I’m the main reason a particular man has been banned from a lot of EA events under certain conditions and I think CEA’s Community Health team have handled this situation extremely well.
But on balance, I’ve found that men in EA treat me with a lot more respect than men do outside of EA. And if anything, I think any complaints I do make are taken too seriously.
This doesn’t excuse bad behaviour of course, even if my experience were typical. But I have always wondered why so much of our energy goes into how women feel in this community vs people with other marginalised characteristics, some of whom no doubt also feel “sad, disappointed, and scared” in EA (e.g. discussions nominally of “diversity and inclusion” often end up just being discussions of how to treat women better).
Thank you for sharing that!
For what it’s worth, I think “discussions of DEI end up becoming discussions about women” is pretty common—not to say it’s excusable, but I don’t think that’s unique to EA.
In the cases like this I’ve been most closely involved in, the women who have reported have not wanted to publicise the event, so sometimes action has been taken but you wouldn’t have heard about it. (I also don’t think it’s a good habit to try to maximise transparency about interpersonal relationships tbh.)
Yeah, this is very fair and I agree that transparency is not always the right call. To clarify, I’ll say that my stance here, medium confidence, is: (1) in instances in which the victim/survivor has already made their accusations public, or in instances where the harm is already necessarily something that isn’t interpersonal [e.g. hotness ranking], the process of accountability or repair, or at least the fact that one exists, should be public; (2) it should be transparent what kind of process a victim can expect when harm happens.
There’s some literature around procedural justice and trust that indicates that people feel better and trust the outcomes of a process more when it is transparent and invites engagement, regardless of whether the actual outcome favors them or not.
I am glad to hear that there have been cases where women have felt safe reporting and action has been taken!
(edited to delete a para about CEA community health team’s work that I realized was wrong, after seeing this page linked below)
I’d agree I’d favour systems that help people feel confident in the outcome even when it doesn’t favour them, and would like to see EA do better in these areas!
I’m not too confident about this, but one reason you may not have heard about men being held accountable in EA is that it’s not the sort of thing you necessarily publicize. For example, I helped a friend who was raped by a member of the AI safety research community. He blocked her on LessWrong, then posted a deceptive self-vindicating article mischaracterizing her and patting himself on the back.
I told her what was going on and helped her post her response once she’d crafted it via my account. Downvotes ensued for the guy. Eventually he deleted the post.
That’s one example of what (very partial) accountability looks like, but the end result in this case was a decrease in visibility for an anti-accountability post. And except for this thread, I’m not going around talking about my involvement in the situation.
I don’t know how much of the imbalance this accounts for, nor am I claiming that everything is fine. It’s just something to keep in mind as one aspect of parsing the situation.
Thank you, yeah I think I may be overindexing on a few public examples (not being privy to the private examples that you and others in thread have brought up). Glad to hear that there are plenty of examples of the community responding well to protect victims/survivors.
I still also don’t think everything’s fine, but unsure to what extent EA is worse than the rest of the world, where things are also not fine on this front.
I wonder if it would be helpful to have some kind of (heavily anonymized, e.g. summarizing across years) summary statistics about the number of such incidents brought up to CEA community health (since they are the main group collecting such info) and how they were dealt with / what victims choose to do to balance out the public accounts.
Does the appendix in Julia’s post here do what you’re looking for?
https://forum.effectivealtruism.org/posts/NbkxLDECvdGuB95gW/the-community-health-team-s-work-on-interpersonal-harm-in
Yeah I think it does! It might be good to highlight that in a way more people would read it (e.g. I read that post + the appendix but forgot it was there!)
I’m strongly in favour of this—it often feels like the need is to make this public so it becomes something the entire community is responsible for—as opposed to how it currently is (private and something CEA’s comm health mainly is responsible for).
FWIW this is exactly how I feel about gender-based issues in EA!
Can you link to your post? I’m asking in order to avoid the (probably already existing) situation where people see that “some of Forth’s accusations” are allegedly not true, but they don’t know which, so they just doubt all of them.
If someone has a record of repeatedly making accusations that have been proven false, I think it is reasonable and prudent to “just doubt all” their accusations. This person was clearly terribly ill and did not get the help she needed and deserved. It’s painfully clear from reading her heartbreaking note that she was wildly out of touch with reality.
I wouldn’t want anyone to have the impression that Kathy wasn’t given extensive support, or that she wasn’t offered appropriate help. She definitely was, repeatedly and over a long period of time.
Could more effective help have been given? I honestly don’t know, but it was well beyond my ability and capacity at the time.
It was a painful and heartbreaking situation. I think that’s as much as I can say publicly.
Thank you for clarifying. To be clear, I am basing everything I said on the contents of her note and the publicly available things written since (including Scott Alexander’s and Julia Wise’s comments on this post). I don’t know anything about her situation beyond that. It sounds like an impossible situation—I’m sorry for your loss.
Maya, I’m so sorry that things have made you feel this way. I know you’re not alone in this. As Catherine said earlier, either of us (and the rest of the community health team) are here to talk and try to support.
I agree it’s very important that no one should get away with mistreating others because of their status, money, etc. One of the concerns you raise related to this is an accusation that Kathy Forth made. When Kathy raised concerns related to EA, I investigated all the cases where she gave me enough information to do so. In one case, her information allowed me to confirm that a person had acted badly, and to keep them out of EA Global.
At one point we arranged for an independent third party attorney who specialized in workplace sexual harassment claims to investigate a different accusation that Kathy made. After interviewing Kathy, the accused person, and some other people who had been nearby at the time, the investigator concluded that the evidence did not support Kathy’s claim about what had happened. I don’t think Kathy intended to misrepresent anything, but I think her interpretation of what happened was different than what most people’s would have been.
I do want people to know that a lack of visible action doesn’t mean that no one looked into a situation or took it seriously. More here about why there may not be much visible action.
I think these problems are really hard to deal with fairly and well. I’m sure my team doesn’t always have the balance right, but you can read more about our approach here.
Cultural change also can’t all be handled by CEA or any one centralized source. We want to support organizers, employers, online spaces, and other EA spaces in building a healthy culture. To anyone who’s an organizer or other person shaping the culture of an EA space, we’re here to talk if you’d like to.
This is a really good comment IMO, especially the final paragraph which I’m reproducing to avoid it getting lost in a comment which mostly focuses elsewhere.
Hey Maya, I’m Catherine - one of the contact people on CEA’s community health team (along with Julia Wise). I’m so so sorry to hear about your experiences, and the experiences of your friends. I share your sadness and much of your anger too. I’ll PM you, as I think it could be helpful for me to chat with you about the specific problems (if you are able to share more detail) and possible steps.
If anyone else reading this comment has encountered similar problems in the EA community, I would be very grateful to hear from you too. Here is more info on what we do.
Ways to get in touch with Julia and me:
Email: Julia: julia.wise@centreforeffectivealtruism.org, Catherine: catherine@centreforeffectivealtruism.org
Form (you can choose to be anonymous)
For what it’s worth, I’m a 30 year old woman who’s been involved with EA for eight years and my experience so far has been overwhelmingly welcoming and respectful. This has been true for all of my female EA friends as well. The only difference in treatment I have ever noticed is being slightly more likely to get speaking engagements.
Just posting about this anonymously because I’ve found these sorts of topics can lead to particularly vicious arguments, and I’d rather spend my emotional energy on other things.
Hi hi :) Are you involved in the Magnify Mentoring community at all? I’ve been poorly for the last couple of weeks so I’m a bit behind but I founded and run MM. Personally, I’d also love to chat :) Feel free to reach out anytime. Super Warmly, Kathryn
(saying this in a friend capacity and in shock that I haven’t introduced you two already) - you two should definitely talk!
Awesome :) Really looking forward to it :)
Hi Kathryn,
I’m not yet involved in Magnify Mentoring but would love to be and would love to chat! I’ll message you privately.
Hey Maya, I like your post. It has a very EA conversational style to it which will hopefully help it be well received and I’m guessing took some effort.
A problem I can’t figure out, which you or someone else might be able to help suggest solutions to -
- If I (or someone else) post about something emotional without suggestions for action, everyone’s compassionate but nothing happens, or people suggest actions that I don’t think would help
- If I (or someone else) post about something emotional and suggest some actions that could help fix it, people start debating those actions, and that doesn’t feel like the emotions are being listened to
- But just accepting actions because they’re linked to a bad experience isn’t the right answer either, because someone could have really useful experience to share but their suggestions might be totally wrong
If anyone has any suggestions, I’d welcome them!
I think you may be on the right track with how you wrote this comment actually—taking a moment to let the person know they were heard before switching to problem-solving mode.
IMO social media websites should sometimes give users a reminder to do this after they hit the “submit” button, but before their comment is posted. Perhaps the submit button could check whether a particular tag is present on the original post?
This is well put.
I think people can say that debating is their way of trying to care. Not a full solution, but I think people sometimes don’t realise this.
Maybe one way to address this would be separate posts? The first raises the problems, shares emotions. The second suggests particular actions that could help.
Maya—thanks for a thoughtful, considered, balanced, and constructive post.
Regarding the issue that ‘Effective Altruism Has an Emotions Problem’: this is very tricky, insofar as it raises the issue of neurodiversity.
I’ve got Aspergers, and I’m ‘out’ about it (e.g. in this and many other interviews and writings). That means I’m highly systematizing, overly rational (by neurotypical standards), more interested in ideas than in most people, and not always able to understand other people’s emotions, values, or social norms. I’m much stronger on ‘affective empathy’ (feeling distressed by the suffering of others) than on ‘cognitive empathy’ (understanding their beliefs & desires using Theory of Mind).
Let’s be honest. A lot of us in EA have Aspergers, or are ‘on the autism spectrum’. EA is, to a substantial degree, an attempt by neurodivergent people to combine our rational systematizing with our affective empathy—to integrate our heads and our hearts, as they actually work, not as neurotypical people think they should work.
This has led to an EA culture that is incredibly welcoming, supportive, and appreciative of neurodivergent people, and that capitalizes on our distinctive strengths. For those of us who are ‘Aspy’, nerdy, or otherwise eccentric by ‘normie’ standards, EA has been an oasis of rationality in a desert of emotionality, virtue-signaling, hypocrisy, and scope-insensitivity.
Granted, it is often helpful to remind neurodivergent people that we can try to improve our emotional skills, sensitivity, and cognitive empathy.
However, I worry that if we try to address this ‘emotions problem’ in ways that might feel awkward, alienating, and unnatural to many neurodivergent people in EA, we’ll lose a lot of what makes EA special and valuable.
I have no idea how to solve this problem, or how to strike the right balance between welcoming and valuing neurodiversity, versus welcoming and valuing more neurotypical norms around emotions and cognitive empathy. I just wanted to introduce this concern, and see what everybody else thinks about it.
I really appreciated your comment and think it’s important to acknowledge neurodiversity and ensure neurodivergent people feel welcome. I say this as someone who is neurotypical and who agrees with Maya’s reflections on emotions within EA.
Not sure I have time to post my thoughts in depth, but I think the tension within EA between rational and intuitive emotional intelligence is worth a lot more thought. It’s a tension / trade-off I’ve picked up on in the EA professional realm: where people aren’t getting on in EA organisations, where people aren’t feeling heard, and where the working culture becomes one that’s more afraid of losing status / in a threat mindset than supportive, to the detriment of employees.
Maybe as a counter to what you’re saying, some of the people who helped me best own and articulate my emotions (in the context of another EA repeatedly undermining me) are bay-area rationalist EAs who you might describe as neurodivergent. Why? I think a lot of people from that community have just done the work on themselves to recognise emotions in themselves, and consequently in others. And this is driven by valuing emotions / internal worlds intrinsically—in that integrating head and heart way you write about—and then getting better in that domain.
So to link this back to Maya’s post:
agree with making sure EA is truly inclusive and, in being better at responding to emotions and traumatic experiences, doesn’t swing to excluding neurodivergent people,
I think this tension / trade-off goes beyond the social realm and into the professional, and
I would like to play up how many neurodivergent people—especially those who might instinctively behave in a way that creates the culture Maya has highlighted as problematic - can actually be really good at creating an emotionally responsive and caring environment.
Happy to discuss further time permitting (which is sadly not on my side!)
howdoyousay—thanks for this supportive post.
I agree that many neurodivergent people can develop quite a good set of emotional skills (like some of your Bay Area rationalists did), and can promote emotionally responsive and caring environments.
(When I teach my undergrad course on ‘Human Emotions’—syllabus here—one of my goals is to help neurodivergent students improve their understanding of the evolutionary origins and adaptive functions of specific emotions, so they take them more seriously as human phenomena worth understanding.)
My main concern is that EA should not become just another activist movement where emotions over-ride reason, where ‘lived experience’ gets prioritized over quantitative data, and where neurodivergent people get cancelled, shunned, and stigmatized for the slightest violations of social norms, or for ‘offending’ neurotypical people.
You’re right that striking the right balance is worth a lot more discussion—although my sense is that, so far, EA as a community has actually done remarkably well on this issue!
Throwing out one possible approach:
People think about where they have blindspots around reading certain styles of writing, and acknowledge that in those areas, they may not get the point being made, even if there is an important point
When someone makes a post that communicates in a way that you identify as your blindspot, you think about whether you can respond in the same style that they communicated.
If you can—do so. If you can’t—you don’t have to respond to the post at all. This is the crux of my suggestion. If you just see the world differently from someone else, so much so that responding to it would involve a clash of your worldviews, it’s okay to just leave it alone. I think “let it go” is an undervalued approach on every internet forum, and especially so here.
That’s my best guess at a strategy that works both for someone who systematizes a lot reading an “overly” emotional post, and for someone who systematizes very little reading an “overly” analytical post. But I agree this is something of a wicked problem and we need some way to tackle it. In the absence of an explicit approach, I think the OP is right to point out that people will just respond in an analytical way to emotional posts and that may not help anyone at all.
Thank you very much for your perspective! I recently wrote about something closely related to this “emotions problem” but hadn’t considered how the EA community offered a home for neurodivergent folks. I have now added a disclaimer making sure we ‘normies’ remember to keep you in mind!
This post resonated with me, and I am very glad it was posted. I find it really disheartening that the last two posts on women’s experiences in EA have gotten sidetracked in the comments: this one has gotten sidetracked by discussion about Kathy Forth, and the last one got sidetracked by discussion about polyamory. Neither discussion has ended up producing suggestions that would actually help women, which, in my opinion, would be the constructive outcome.
Importantly: I believe it would be incredibly valuable for EA to be seen as a place that is welcoming to women. I think EA is missing out on a huge pool of talent here, and on a huge pool of potential funders! I think people underestimate (1) the degree to which women are put off of EA via bad vibes on the internet (e.g. by scanning the comments on posts like these, and not seeing constructive suggestions to improve women’s experiences), and (2) the huge loss it is for EA to fail to attract more women. Are you funding constrained or talent constrained? Women have talent and money, and we’re (slightly) >50% of the population! I think that ‘bad vibes’ push a significant fraction of women away. In addition, I think it contributes to bad PR for EA in general.
I note that I don’t think it’s a good plan to rely on surveys of EA women (or anecdotes of ‘I am an EA woman and I haven’t had an issue’), because most women who sense ‘bad vibes’ in EA communities are not going to stick around to answer a survey. It’s an incredibly biased sample.
Things I would like to see (it’s possible that this exists somewhere and I haven’t found it). I focus mainly on the general issue of ‘how women are perceived and recruited in EA’ (rather than on allegations of sexual assault), because I think it is a really neglected issue:
Data to assess the extent of the problem (bearing in mind that we already have the somewhat alarming stat that only around 30% of EAs are women):
what proportion of senior positions at top EA orgs are occupied by men versus women? I suspect (though I could be wrong) that most are held by men.
what proportion of talks at EAG are given by men versus women? Do fewer people attend talks given by women than by men, and are people more or less likely to book a 1-on-1 with a man or a woman?
do male sounding names get more karma/ upvotes/ comments on the EA forum?
where do women ‘drop out’ of the EA pipeline? I’m assuming it’s at initial recruitment, but I don’t know. I think working this out is crucial to increasing the number of women (and therefore to improving women’s experience more broadly; I believe this would likely flow naturally from achieving a more balanced gender ratio).
2. Some potential actions to make EA more welcoming to women:
more women in top leadership positions (and increasing the visibility of these women). I think this has huge benefits: it will make EA more welcoming to women in general, and I suspect that having a broader range of viewpoints will also benefit EA.
if the data suggests that women’s work/posts/talks achieve less social influence than men’s (this is my suspicion), highlight women’s work so it doesn’t get drowned out
doing our best to achieve a reasonable gender ratio among people giving talks at conferences (I assume this is done already, though?)
potentially employing an expert on this (someone with experience at a major non-EA org); this is hardly a problem unique to EA, and they may have the fresh take/outside opinion that would be helpful to EA.
This might be where I disagree with others, but I do not think that EA topics are inherently less attractive to women (compared to men), or that women are somehow inherently less skilled at pursuing EA topics. Instead, I am concerned that there is a real lack of deep and critical thought about why women get put off EA; I think the community is missing out on a huge untapped benefit here. I hope that we can apply the data-driven, research-oriented, analytical skills (what attracted me to EA in the first place) to this issue.
(I also want to explicitly note that I think the work done by the community health team is excellent, and this post is not intended as a criticism of them. In addition, note that many of the points I make here also apply to other groups that are under-represented in EA, even though I have focused on women.)
Hey, I’ve been a follower of the community for years, and your post made me want to create an account and leave a thought. Full disclosure: I’m trans FtM from a non-Western country, and I’ve never been to an EA meetup but have always been curious. I’m listing these characteristics because I think they’re directly relevant to what I describe below. Also, I have massive Asperger’s (and my English is not perfect), so I want to explicitly note that I 100% value your post, support having a discussion about this, and have no great solutions. Flagging this because the tone of my text may not convey it otherwise.
Some thoughts, ordered from most to least banal:
Punishment for confirmed bad sexual behavior should be swift, harsh, and very likely. Great care must be exercised to make the abused come forward.
Great care must be exercised to thoroughly vet accusations and protect the rights of the accused.
#1 and #2 involve a false positive/negative tradeoff between each other, and I suspect a solution to satisfy a good chunk of people across many backgrounds is likely impossible. This is likely not news to anyone reading this, so I apologize for wasting your time so far.
I think your post adeptly highlights why EA as a movement will (IMO) almost inevitably collapse or split.
I want to elaborate on the last point most of all. As an observer, I always thought that the core point of EA was recognizing the fallibility of human emotion in guiding our priorities, since strong emotional states can cloud our thinking (AFAIK this is not very controversial and is supported by research). This is why Paul Bloom’s “Against Empathy” is referenced relatively frequently in the community, right? My (not very important) personal opinion is that this is a partial picture, because emotion can also be a driver/inspiration to care about injustices. Of course, the difficult question is whether you can get the best parts of emotion into the movement without getting the worst parts as well.
But there’s something broader that concerns me, and I think it spells trouble for the EA movement as a whole. I will start from a story that has to do with my FtM self. When I started hormonal therapy, I noticed that some of my basic intuitions started changing, feelings and views I wasn’t even aware existed. I had kind of taken them for granted; they were transparent to me. For example, before, I felt a lot of instinctive anger at injustice. I still feel it now, but it’s more detached and principled. “Less emotional” doesn’t quite describe it; it’s something about the anger becoming more calculating. I can be a bit more removed and hypothetical about different problems; I can turn off my empathy a bit (which actually felt scary!) to be more analytical. And to repeat, I had assumed my previous emotional responses were the same as everyone’s; it didn’t even occur to me that things could be any other way (I’m sorry if this all sounds like pseudoscience).
Now, a lot changed in me that could have produced the above, and even though the other trans people I know have on average undergone a similar switch (depending on the direction), I didn’t think much of it. So it could be a totally useless anecdote, but it did get me reading about the “gender divide” in Western politics. To simplify greatly, there seems to be a pretty convincing argument that the gradual “feminization” of Western cultures over the last several decades drives a lot of the conflict in politics and across institutions, as it sits in tension with the typical “masculine” culture. And this is rooted partly in the fact that women and men, on average, have different views on things like the speech/harm tradeoff: women tend to prefer limiting speech to minimize harm, whereas men lean in the other direction. (Please note that these are average, population-level statements, and the distributions of these traits overlap a lot!)
These are some of the more notable things I’ve read on this divide:
NYT coverage of the gender divide in American politics: https://archive.ph/1LAwU
One interesting highlight:
A Quillette piece with a lot more stats showing patterns like those in the piece above, broadly showing that women on average lean toward diversity/inclusion over speech/”truth-seeking” when faced with a choice, and men lean in the other direction: https://quillette.com/2022/10/08/sex-and-the-academy/
A bunch of Marginal Revolution posts, highlighting what the authors consider to be both “good” and “bad” parts of “feminization”:
https://marginalrevolution.com/marginalrevolution/2019/12/sex-differences-in-personality-are-large-and-important.html
https://marginalrevolution.com/marginalrevolution/2021/09/a-simple-illustration-of-the-benefits-of-feminization.html
https://marginalrevolution.com/marginalrevolution/2021/08/when-did-we-all-become-women.html
https://marginalrevolution.com/marginalrevolution/2022/03/the-complexities-of-feminization.html
https://marginalrevolution.com/marginalrevolution/2020/01/the-feminization-of-society-installment-1637.html
https://marginalrevolution.com/marginalrevolution/2017/05/much-educational-political-polarization-due-feminization.html
A very polemical, negative, and not particularly epistemically sound piece critical of feminization: https://richardhanania.substack.com/p/womens-tears-win-in-the-marketplace (including it here solely to show a clear “anti” voice; I’d read this much more critically than anything else in the list)
A similarly negative piece, but from someone I’d consider of higher integrity: http://www.arnoldkling.com/blog/feminized-culture/
Okay, so what’s the point of my super-rambley comment? In short, whether my and my trans friends’ anecdotal experiences are related to this or not, I think centuries of masculine-type culture have not left us prepared for a world where culture becomes feminized. We are now entering this world and have to work out how to strike a balance between these sets of values. And the “Effective Altruism” movement has historically been (to my understanding) a movement that prioritizes asking difficult (even if very offensive) questions and analyzing costs/benefits in a way that increases accuracy (in part by trying to be less emotional/swayed by the topic of the analysis). It seems to me that, as designed, the movement either has to split in two or morph into a charity/NGO like many others.
Pessimistically, I do not think there’s a way to accommodate these sets of views. Having to some limited extent “experienced both,” I don’t even know how to feel about them myself, internally, and struggle with this daily. How can we expect an entire movement to align then? Perhaps if the leaders and big thinkers of the movement come together, recognize this as the urgent problem, and figure out a way to approach it, there is some sort of resolution (a Scott Alexander megapost/initiative to organize this?). But again, my prediction is that the unobstructed, calculated, detached, low-anger-type discourse would have to give way as EA becomes more inclusive and accommodating of feminized culture.
P.S. You may read this and wonder what this has to do with handling sexual assault, and I just want to clarify that I don’t necessarily think much about the masculinized/feminized culture divide affects how well we can handle sexual assault, but I think we do see the difference in how the topic is discussed. And much more broadly than that, this post made me think of how the divide affects how true EA stays to the principles it was originally founded on (without framing them as “good” or “bad”).
P.P.S. I’m not a great writer/thinker, and I’m sure this comment is rambling/confusing. I just hope there is something useful in it; it seems to me there is. If someone can take it and make a more coherent point out of it, please do. If Scott Alexander could write a post tackling the masculinized/feminized culture divide and how it would affect a movement like EA, I’d be very happy, because he would be able to research and write about this properly, 100,000x better than me. I’m not even involved in EA. Thanks for making it so far into my comment!
Upvoted: you’re pointing to an important tension (truth-telling vs inclusiveness).
However I don’t believe this requires the movement to split. There are more and less skilful ways to tell the truth and there are more and less skilful ways to be inclusive.
Both are important to our mission. We can continue to improve at both.
We can’t simply aim to maximise one or the other, though. E.g. Even if someone valued truth-telling above all else, a lack of inclusiveness would keep the movement small, controversial and marginalised.
I think the crux is that the first commenter thinks EA is near the Pareto frontier of rationality vs. warmth, so in practice we can’t improve both; we must pull toward one side or the other.
confused_puppy: this is an excellent, informative, and fascinating comment. Thanks for sharing this, especially the observations about how testosterone nudged your emotional responses to certain ethical issues.
Anecdotally, as another person from a non-Western country (currently living in the West), it’s quite disconcerting to me that (certain parts of) Western cultures have become as you described, and I come from a very Westernized city of that country/continent and grew up being taught by Western teachers in Westernized schools where I spoke to everyone in English.
This isn’t limited to EA. It looks like this has increased, and seems likely to continue increasing, tensions between Western and non-Western cultures. I can’t say for sure that this would be the primary reason EA splits (perhaps into a more rationalist-leaning side and a more progressive-leaning side), but I’m also pessimistic about cooperation/reaching compromises.
It would be nice to imagine that aspiring to be a rational, moral community makes us one, but it’s just not so. All the problems in the culture at large will be manifest in EA, with our own virtues and our own flaws relative to baseline.
And that’s not to mitigate: a friend of mine was raped by a member of the Bay Area AI safety community. Predators can get a lot of money and social clout and use it to survive even after their misbehavior comes to light.
I don’t know how to deal with it except to address specific issues as they come to light. I guess I would just say that you are not alone in your concern for these issues, and that others do take significant action to address them. I support what I think of as a sort of “safety culture” for relationships, sexuality, race, and culture in the EA movement, which to me means promoting an openness to the issues, a culture of taking them seriously, and taking real steps to address them when they come up. So I see your post as beneficial in promoting that safety culture.
Hey AllAmericanBreakfast. I’m Catherine from the Community Health team. I’m so, so sorry to hear that your friend was raped. If at all possible, I want to make sure they have support and justice, and that the perpetrator doesn’t have the opportunity to do this again. It doesn’t matter if your friend doesn’t identify as EA; if your friend or the perpetrator is involved in the EA community in any way, we’re here to do our best to help. I’ll reach out via PM.
Hey :) I was raped before I was involved in EA. I normally find these discussions hard and frustrating. I feel we often talk past one another, and the people with similar experiences withdraw because it’s still painful / they get frustrated and hurt. I would like people like me to know:
1. There are a lot of people with experiences similar to mine who are active in the EA community. You may not see them here because of the aforementioned issue, but we are here.
2. There are a lot of people who take these issues very seriously, including me.
3. I trust and endorse Catherine Low entirely. She has seen it all with me and has been kind, empathetic, and not unilateral.
4. To the extent possible, please consider reporting to Catherine and the community health team, to the police, or to both. Kirsten is entirely right that this is horrifically unfair and you have no obligation to do so, but it is very important for the safety of other community members that people with a track record of sexual (or any) violence not be in positions of power in any institution or community.
5. If there is anything whatsoever I can do, you should hit me up: talking openly about my experiences (I have a blog draft about how I coped with my rape that I’m happy to share), an adamant vouch for Catherine and CEA’s team, or just generally a cup of tea.
I’m very sad to hear about this.
I don’t understand why the community health team is not able to handle this kind of thing. Did your friend make a report? Does the community health team need more funding or employees? Are they afraid to take on people with clout?
Even if the accused is doing a lot of good work, if the accusation is found to be credible, at the very least we should ensure the accused does not occupy a position of responsibility. If they are serious about AI safety, they should agree to this measure themselves, for the sake of guarding humanity’s future.
EAs should work to ensure that positions of responsibility are occupied by people of exemplary moral character, in my view. (Edit for clarification: I don’t want my view rounded off to “EAs should work to ensure positions of responsibility are occupied by the people who are hardest to cancel”. For example, my notion of “exemplary moral character” accounts for the possibility that failure to report on false accusations made by Kathy Forth could represent a character deficit, even if such failure-to-report makes one harder to cancel. I also think that everyone is flawed, and ability to recognize and learn from one’s mistakes is really important.)
My friend is not part of EA, she was just at an EA-adjacent organization, where the community health team does not have reach AFAIK.
Seems to me she should be talking to them anyway.
I am confident this comes from a good place but I really really dislike that this comment is telling (the friend of) someone who was raped what she should do. People who have been raped can respond however they want, whether they decide to report the situation or not is entirely up to them, and I hate when people act like there is one correct response.
Thanks Kirsten.
I’m interested in understanding your position better. Do you agree there are circumstances under which reporting a crime is the correct response? (Would you agree that an FTX employee blowing the whistle on SBF would be the correct response, for example?) If you can think of at least one scenario where you think reporting a crime is the correct response, maybe you could outline how this scenario differs? (For the purpose of our discussion, I’m assuming that the current crime is serious, unambiguous, and unrepented, constituting significant evidence that the perpetrator will cause major harm to others.)
My first guess is you think there’s something unique about rape such that the associated trauma means reporting can cause suffering. In that case, this would appear to be a straightforward demandingness dilemma—one’s feeling about the statement “it is correct to report rape” might be similar to one’s feeling about the statement “it is correct to forgo luxuries to donate to effective charities”. In both cases you’re looking at taking on discomfort yourself in order to do good for others. (In my mind the key considerations for demandingness dilemmas are: how much good you’re doing for others, how much discomfort you’re taking on, and what is personally psychologically sustainable for you. And I think saying “Seems to me they should [do the demanding thing]” is generally OK.)
Thanks for any thoughts you’re willing to share.
Hey Truck Driver Wannabe (great Forum name by the way) - I’m a medical doctor and have recently completed extra training in helping people who’ve experienced sexual assault. There are no ‘shoulds’ (except that the perpetrator should not have done it). I can’t do this topic justice in a Forum commentary (nor would I want to) but if you’d like to contact me directly, I’m happy to talk to you more about this.
Hi Truck Driver Wannabe,
I really appreciate your effort to understand the other side of the argument and I see why you are confused about the reaction.
For me, I find the idea that a person has any responsibility whatsoever to involve the CEA community health team in any matter regarding their personal life (including and especially sexual assault) baffling. Reporting to CEA is not obviously net harm-reducing, because a predator who is kicked out of CEA-sponsored events can and will just move to another community and continue their predatory behavior elsewhere. And that is assuming CEA handles the situation perfectly.
I also don’t think a person has such a responsibility to report to law enforcement, only partly because law enforcement has generally not earned a reputation for handling these cases well.
If we lived in a different world where law enforcement was more competent in these cases, then I agree this would be a straightforward demandingness dilemma. However, I don’t expect anyone to be publicly retraumatized in the service of helping strangers and I think it is extremely unfair to do so. Being publicly humiliated, mocked, disbelieved, called names, concern trolled, having every past sexual and romantic encounter up for public scrutiny, and being forced to publicly and repeatedly detail the most horrifying moments of your life is not even almost the same as, say, donating ten percent of your income. All or many of these things often happen to people who report sexual assault to a responsible and thorough law enforcement agency that does all the right things and has ample resources.
In general I don’t think it’s that healthy to expect others to give a certain amount of their time or money or anything else. I think we should all set an example in our own lives and be public about why we make the choices we do, but respect that others have the right to choose what and how much they give (emotionally and otherwise). But even if I didn’t believe that in general, I would still believe it in case of sexual assault.
Hi Monica, thanks for the reply.
Suppose my original comment was
And I got these replies:
“I find the idea that a person has any responsibility whatsoever to donate to $EA_CHARITY baffling.”
“Donating to $EA_CHARITY is not obviously net harm reducing. Their work may funge against other efforts. And even if they do perfect work, solving poverty in the developing world still leaves developed-world poverty as a major problem.”
“The person reading your comment could be almost broke, such that if they donated to $EA_CHARITY they would be homeless and destitute. It is unreasonable for us to ask anyone to make that sacrifice.”
“Other charities which claim to solve the problem $EA_CHARITY works on have been found to be scams. Don’t be surprised if they sell your credit card details to cybercriminals.”
“People have the right to choose how much they give.”
These are all valid replies I agree with partially or fully.
But they all seem to operate under the assumption that I hold a much different position than the one I actually hold. I’m not totally sure what I did to give people the mistaken impression.
Maybe I just need to learn to avoid triggering people.
In any case, I think you and I agree more than we disagree.
I am quite new to EA and here are some of my novice thoughts reading this post:
I am non-white and non-male, and I usually have a very low threshold for social gatherings making me feel aware of those two facts. The EA community has hardly made me feel so.
I am neurodivergent as well, and so was my partner of the last two years; I’d say I am more emotionally present than outright Asperger’s. I have seen my requests to consider emotions, and rational explanations for them, be well appreciated and incorporated into actions by my partner. Our relationship gets stronger the more it is talked about. The framing of ‘shouldn’t do XYZ because it will alienate aspie EAs’ is very biased IMO, and counterproductive to the issue.
As much as we hate it, EA has a status issue. The EA Forum is a place for elite (written) stuff. I have heard people say they hate posts talking about emotional stuff/experiences, or anything that is not jargon- or math-heavy. I don’t know what to do about it; I am just saying the OP’s experience rings a bell.
I have been hit on at EAGs when not seeking that, and it is as annoying as it ever is. I usually brush it off by rationalising that the person probably has a terrible social radar (aspie) and doesn’t know how not to act, and hence I don’t report it, thinking ‘probably wasn’t their intention’. I realise doing so never addresses the issue.
As someone who has been broken up with because my partner acted on a joke made in a very serious 80k career call, because they thought that was the right thing to do even if it broke them, I know that the groups running EA cannot entirely control the actions of aspie members. But EAs do respond to well-rounded discussions and numbers. We just need to open up dialogue about this.
I recently worked for an EA organization and went in expecting a loving, caring environment for solving the best problems. I left feeling burnt out, unheard, and drowned in the religiousness of the cause. It might be an isolated, singular experience, but I was tired of feeling guilty for taking time off for myself, even though doing that was explicitly and highly encouraged. How many new EAs have stopped doing things they derived pleasure from after joining? Stopped hanging out with non-EA friends because ‘they don’t get it’ or ‘it’s too much to explain’? How many miss both things? (Good polling questions, IMO.) Go out and grab a boba tea with that one friend who chats a lot about Netflix. You will be fine. Better, even.
(I have written this in bed without being mindful of using jargon to appeal to a certain group of people. I intended to do that.)
Thank you for sharing such a brave, thoughtful and balanced post.
Thank you for having the courage to say this out loud.
Just remember that the “EA community” does not have a monopoly on doing good. If the atmosphere is toxic, people are just going to leave, and rightfully so. You will be better off, and do more good overall, in environments and spaces where you feel respected and safe. If EA is not committed to such a space, they will slowly die off, and be replaced by an organisation that is.
That’s a true point, but I don’t think it’s a good outcome to aim for. EA should strive to exist with the best, most highly aligned people doing good, and I think we need a culture that prioritises people’s lived experiences, feelings, and interactions for that to happen.
Of course. I’m not involved in any irl EA communities so I can’t really judge how bad/good they are. It would be better if the current movement survived with good community norms in place, but if it doesn’t, it’s not the literal end of the world, a new, better community will replace it.
I had sort of the same reaction. To me, “doing the most good” is something I live by. I don’t identify as EA, but as altruistic.
I find it sort of interesting when people refer to the “EA community” and make efforts to change it. I mean, they’re not wrong, but from another perspective, the “EA community” is almost an overgeneralization. Like for instance there are animal rights activists, longtermists, and climate change activists all getting to know each other through EA. There are going to be toxic people or cliques, and it’s sort of weird to say “EA is ____” when plenty of people “within EA” have never met each other and have nothing to do with each other.
Just some thoughts. I don’t disagree with the original post.
In a world where the EA culture is bad, but where no-one else is really doing any better, we may not be able to be replaced in this way, and it becomes even more important to ensure that we get the culture right here.
I think an attitude of irreplaceability can be dangerous: someone could easily make the mistake of thinking that bad apples need to be protected and covered for in order to preserve the movement as a whole. (I certainly hope nobody here thinks this way, but it has happened before in other movements.)
In truth, the ideas aren’t going away. Individual people can be replaced, and new groups can form in the event of a blowup. Try and fix the culture, sure, but if it’s too far gone, don’t be afraid to blow the whistle and blow it up, in the long term it’s healthier.
What does “blow it up” mean for an EA who decides the culture is beyond fixing, but who doesn’t have significant power within the community? Is it leaving the community in search for a better one to do good in?
Pretty much, yeah, along with exposing bad actors if you judge it safe to do so. There are plenty of non-EA orgs doing work that is effective; you can work with them and try to bring in principles of effectiveness.
Also, toxic communities are inherently unsustainable; eventually enough people will leave that a splinter group can be formed.
Some suggested remedies. I know some of these are weird, but I honestly think they are good. Many proposed solutions don’t attempt to manage the problem proportionately to its scale, in a distributed way, or with correct incentives; I think these do:
Poll to understand the scale of the problem
Let’s learn how many people feel this way. Is there a link between this and the number of women in the community? We don’t have to guess this stuff; we can just know.
People at EAGs can report people who used meetings to flirt with them in a way they didn’t like. Slowly increase punishments (I suggest probabilistic bans from EAGs, e.g. a 5% chance you are banned for 6 months) until the harms to women are less than the cost of the bans. I like flirting at EAG parties, so I think there is a different tone there, but it seems fine for there to be a high risk attached to flirting during the day without knowing the other person will appreciate it.
I like probabilistic bans because most of the time they are just a warning but they still sometimes have bite.
(This image is from the last time we had this discourse. I’d guess it would replicate in a representative poll: most women don’t want to be flirted with at a conference during the day, though some do. As I say, it seems we should increase the cost of doing so.)
People sometimes argue that I’m too harsh on this. But currently I think the harms from people being flirted with who don’t want to be are greater than the harms to those who would have their freedom curtailed, so I suggest we try it.
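To make the probabilistic-ban proposal above concrete, here is a minimal sketch of how the escalation could work. All names and parameters here (the 5% base rate, the doubling per repeat report) are hypothetical illustrations of the idea, not a worked-out policy:

```python
import random

# Sketch of the probabilistic-ban idea: each upheld report triggers a
# ban with some probability, and the probability escalates with repeat
# reports. All parameters are illustrative, not proposals.
BASE_BAN_PROBABILITY = 0.05   # e.g. a 5% chance of a 6-month ban
ESCALATION_FACTOR = 2.0       # probability doubles per prior upheld report


def ban_probability(prior_reports: int) -> float:
    """Probability of a ban, given the number of prior upheld reports."""
    return min(1.0, BASE_BAN_PROBABILITY * ESCALATION_FACTOR ** prior_reports)


def decide_ban(prior_reports: int, rng: random.Random) -> bool:
    """Sample whether this particular report results in a ban."""
    return rng.random() < ban_probability(prior_reports)
```

The appeal of this shape, as the comment notes, is that a first report usually amounts to a warning (the ban rarely fires), while the expected cost rises quickly for repeat behaviour; the community can then tune the base rate until the deterrent effect outweighs the cost of the bans.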
I unironically support people who have been harmed gossiping about people who have done so. If you hear a bad rumour about someone, by all means check it, but I think it’s okay to share what someone has said to you.
There are costs to this in terms of community trust, so consider carefully whether rumours are true, but I still think we under-gossip, tbh.
There should be a clear process for what happens around bad behaviour in relation to EAGs in particular, and a way for people to be forgiven for bad behaviour (given credible change, on timescales based on badness). EA should not operate on reasonable doubt but on balance of harms (and I say this as someone who sometimes falls afoul of this stuff). The harms to all involved matter equally, whereas “beyond reasonable doubt” generally ignores harms to the accuser, IMO.
Scandal markets. I unironically think there should be a Manifold market on whether any EA above a certain reputation level will be found guilty of harassment by an independent investigator. Then people can share their information by betting privately. Investigations happen at random.
This sounds mechanistic and weird, but imagine if it was normal, would we remove it? I doubt it
Prediction markets are distributed whistleblowing
“What if the accused manipulates the market?” This increases liquidity and draws attention to the market
“Wouldn’t it feel awful/be tasteless for powerful people to have markets on whether they would harass someone?” I think the status quo is worse. I don’t mind putting additional burdens on people in positions of power. And I am confident that this would decrease the likelihood of the kind of big scandal that destroys other communities.
Because I believe you can’t advocate for this without having a market yourself, mine is here.
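For readers unfamiliar with how a market like this aggregates private information, here is a sketch of a logarithmic market scoring rule (LMSR), the standard automated market maker behind platforms such as Manifold. Everything here is illustrative; the liquidity parameter and trade sizes are made up:

```python
import math

# Illustrative LMSR market maker for a binary market such as
# "will X be found responsible by an independent investigator?".
# Traders buy YES/NO shares from the market maker; each trade moves
# the implied probability toward the trader's private belief.
class BinaryLMSR:
    def __init__(self, liquidity: float = 100.0):
        self.b = liquidity   # higher b = prices move more slowly per trade
        self.q_yes = 0.0     # outstanding YES shares
        self.q_no = 0.0      # outstanding NO shares

    def _cost(self, q_yes: float, q_no: float) -> float:
        # LMSR cost function: C(q) = b * ln(e^(q_yes/b) + e^(q_no/b))
        return self.b * math.log(math.exp(q_yes / self.b) + math.exp(q_no / self.b))

    def price_yes(self) -> float:
        """Current implied probability of YES."""
        e_yes = math.exp(self.q_yes / self.b)
        e_no = math.exp(self.q_no / self.b)
        return e_yes / (e_yes + e_no)

    def buy_yes(self, shares: float) -> float:
        """Buy YES shares; returns the cost paid to the market maker."""
        before = self._cost(self.q_yes, self.q_no)
        self.q_yes += shares
        return self._cost(self.q_yes, self.q_no) - before


market = BinaryLMSR(liquidity=100.0)
# The price starts at 0.5; one trader's private bet pushes it above 0.5,
# publishing their information without publishing their identity.
market.buy_yes(50.0)
```

This is the sense in which "prediction markets are distributed whistleblowing": the mechanism lets someone with private knowledge move the public probability at a personal financial stake, without having to come forward publicly.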
Polling seems like a good idea
I’m worried a lot of this is missing the point, and potentially missing important solutions. I’m going to use EAG for my examples here, as I think it is the strongest case of what I’m describing, but I think my argument generalises to a lot of scenarios and spaces in the EA community.
In my mind, there are two competing things going on here:
At an EAG, you are likely to meet people who are at a similar stage in their life to you, who have similar interests, and who are likely to be both intelligent and altruistic, both attractive qualities. If you meet one of these people, and they feel similarly about you, you could enjoy some flavour of romance together, and it would be mutually fulfilling. Things being mutually fulfilling between parties is self-evidently a good thing.
At an EAG, some people, primarily women, have bad experiences as a result of others’ romantic attention. These experiences can range from uncomfortable to traumatic. I think these negative experiences can then be grouped into two further categories:
Those that are the result of malicious intent
Those that are the result of power dynamics, and can arise despite positive intentions.
I think your solutions are primarily concerned with the 2a category, and when reading it I was reminded of this comment, which I think puts it better than I could. There are people with malicious intent in every community, and I don’t think EA requires any particularly novel solution to deal with them. I agree with Isabel in that I’m also worried that when these threads come up, people will spend their efforts trying to either gauge the size of the problem, or theorise the optimal solution, rather than take any meaningful action.
I think 2b can be equally as damaging, and more should be done about it. Because EA is such a small, well-resourced community, there are especially strong power dynamics at play between individuals. As discussed in the blog post linked above, the EA community does not have strong boundaries between professional and romantic lives; in fact it seems especially tolerant of this intermingling. I claim this is a strongly negative thing. If a prospective future employer/grantmaker/”senior leader” starts flirting with me at EAG, even if they are being incredibly respectful and only have good intentions, I am under a lot of pressure to cooperate, even if that’s not what I want at all. If at the start I do genuinely reciprocate that attraction, and we engage in some kind of romantic interaction, and I later change my mind, there is again a huge pressure on me not to leave the arrangement, even though leaving is what I want to do.
I’m not suggesting that EAs shouldn’t date one another, but I am suggesting a much stronger acknowledgement of the power dynamics at play, both on an individual and a community level. Due to the lack of community emphasis, I suspect many beneficiaries of power dynamics in these situations do not think of themselves that way, and so may inadvertently do harm (this isn’t aimed at you personally; I don’t know whether you are or aren’t aware of this). It seems plausible to me that this would also help with 2a, as well as make the community feel more inclusive.
I’m worried your subdivision misses a significant proportion of harms that don’t fall into either category. For instance, interactions that don’t involve malice or power dynamics and are innocuous in isolation but harmful when repeated. This repetition can be made more likely by imbalanced gender ratios.
I think being flirted with during the day at an EAG like Nathan discussed above is a good example of this. If you’re flirted with once over the weekend, perhaps it’s fine or even nice, especially if it’s from the person you found most interesting. But if you’re flirted with several times, you may start to feel uncomfortable.
Well, if a conference has 3x more men than women and 1-on-1s are matched uniformly at random, then women have 3x more cross-gender 1-on-1s than men. Assuming all people are equally likely to flirt with someone of a different gender than them, it’s very possible that the average man receives a comfortable amount of flirting while the average woman receives an uncomfortable amount.
And it probably gets worse when one considers that these are random variables and we don’t care about the average but rather about how many people exceed the uncomfortable threshold and to what degree. And perhaps worse again if certain “attractive” people are more likely to receive flirting.
Overall, my point is that behaviors and norms that would be fine with balanced gender ratios can be harmful with imbalanced ones. Unfortunately, we have imbalanced ones and we need to adapt accordingly.
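A back-of-the-envelope simulation can make the asymmetry concrete. This is only a sketch with hypothetical numbers (the attendee counts, number of 1-on-1s, per-meeting flirt probability, and discomfort threshold are all made-up parameters, not data about any actual conference), but it shows how a uniform flirt rate plus an imbalanced gender ratio produces very different experiences by gender:

```python
import random

def flirts_received(cross_meetings, flirt_prob):
    # Each cross-gender meeting independently produces a flirt
    # with probability flirt_prob (hypothetical parameter).
    return sum(random.random() < flirt_prob for _ in range(cross_meetings))

def simulate(n_men=300, n_women=100, meetings=10, flirt_prob=0.05,
             threshold=2, trials=2000, seed=0):
    """Fraction of women vs. men who receive >= threshold flirts,
    assuming 1-on-1 partners are chosen uniformly at random."""
    random.seed(seed)
    n = n_men + n_women
    # Chance that a uniformly chosen meeting partner is of the other gender:
    p_cross_w = n_men / (n - 1)    # for a woman
    p_cross_m = n_women / (n - 1)  # for a man
    over_w = over_m = 0
    for _ in range(trials):
        cross_w = sum(random.random() < p_cross_w for _ in range(meetings))
        cross_m = sum(random.random() < p_cross_m for _ in range(meetings))
        over_w += flirts_received(cross_w, flirt_prob) >= threshold
        over_m += flirts_received(cross_m, flirt_prob) >= threshold
    return over_w / trials, over_m / trials

if __name__ == "__main__":
    w, m = simulate()
    print(f"women over threshold: {w:.1%}, men over threshold: {m:.1%}")
```

With a 3:1 ratio, a woman’s expected number of cross-gender meetings is about three times a man’s, so the share of women crossing any fixed discomfort threshold is much larger than the share of men, even though everyone flirts at the same rate.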
[edit: addressed]
Thanks for pointing this out—I’d pasted the wrong link, and have edited my original comment.
My sense is that gauging the size of the problem and thinking of good solutions is a useful thing to do.
Also, I like the intermingling of romantic and professional lives, but I don’t like the harm it causes. And I think we can attempt solutions that attack the specific harms without damaging the other benefits.
Punishments for people who make people uncomfortable at EAGs seem like a good idea
idk about “punishments” exactly; I would like EAG organizers to prioritise preventing harm, rather than acting as a justice system. Preventing harm is sometimes going to mean making clear to people that they should stop doing what they’re doing, and sometimes going to mean temporarily or permanently excluding people. These things look like punishments but I don’t know if I’d describe them as such.
A transparent process with room for forgiveness, but one that considers harms to all parties (rather than underweighting harms to the accuser)
Scandal markets are a good idea
Gossip is good
How about an opt-in speed dating event in the evening? That way the 40+% of women who desire flirts can obtain them, and there is no need or excuse to flirt with people during professional 1-on-1s.
If the conference organizers aren’t comfortable organizing a speed dating event, maybe one of the women who wants to be flirted with could step up and organize it unofficially. Could do lottery admission to keep the gender ratio even.
Edit: An EA matchmaking service is another idea
2nd edit: Amanda Askell says she likes ambiguity. Maybe you women should put your heads together on this
I’m sure they’ll discuss it at the next big meeting.
I apologize for making so many edits instead of submitting separate comments the way Nathan did. Based on checking the vote tallies on this comment repeatedly, I think it got most agreevotes after the first edit and before the second one (I believe the agreevote was at around +8 at one point), suggesting that matchmaking is the idea that people like the most. Also, by “maybe you women should put your heads together on this” I was essentially suggesting a panel or focus group. I find myself increasingly unenthusiastic about participating in this thread. I think it could use a little more assumption of good faith and sense of humor instead of what feels like eagerness to take offense.
Thank you for posting this. I was so sad to see the recent post you linked to be removed by its author from the forum, and as depressing as the subject matter of your post is, it cheers me up that someone else is eloquently and forcefully speaking up. Your voice and experience are important to EA’s success, and I hope that you will keep talking and pushing for change.
Thank you! Glad it resonated.
I’d counter that the focus on race and gender is very US-centric rather than culturally universal. I volunteer at a local charity where gender proportions are heavily skewed, with women being the bigger group. I neither find it a problem nor think any diversity measures should be introduced. It also seems fairly intuitive to me that it is the people who are the most privileged who can focus on problems such as AGI Safety and existential risk, rather than those who struggle financially to live on a week-to-week basis.
Hey Maya! I just wanted to thank you for sharing your experience!
I’m sure it wasn’t easy to write it up and it took a lot of courage, and I’m really glad you did it!
Speaking for myself, sometimes I fear compassion will be used as an attack to push for concessions, so before I outwardly express it, I check whether I agree with criticisms. In that sense, discussion can be me taking something more seriously, not less. Now I’m not saying that’s helpful but I do think there are different communication styles at play here.
I’m sad to hear that you’ve felt the way you describe.
I genuinely find this fascinating. I don’t think I’ve ever felt worried that expressing empathy would be used as a push for concessions, nor have I expressed it with that intent. I think your experience might be common though, perhaps among men in particular, and I think we should talk about it more. Thanks for putting this out there.
Yeah, I think this article is a bit of a case in point. What is the author wanting if not significant changes, and what are many comments doing if not discussing whether that is reasonable?
Discussion can be compassionate. Disagreement can be compassionate. In fact, I’d argue that failing to have compassion and empathy for someone making a point is going to pretty seriously impair your ability to engage with the point, and even if you do you’re going to have a hard time communicating about it in a way that will be heard. I think seeing these things as in tension is a mistake.
I would like a polling question on this. I think that, say, 10-30% of women have had 2 or more belittling experiences at an EAG, and that’s bad. But I read this paragraph and it seems alien to me. What % of women+nb folks have this experience in EA?
Re: “But I read this paragraph and it seems alien to me. What % of women+nb folks have this experience in EA?
‘I could tell you how tears streamed down my face as I read through accounts of women who have been harmed by people within the Effective Altruism community.’”
In the interest of reducing alienation, here’s some anecdata and context. Maya’s reaction wasn’t alien to me at all.
Among my female friends, having this type of reaction at some point was basically a developmental milestone. It wasn’t unique to EA. I expect such a survey would be more useful if you did some sort of national poll of women and compared it to women in EA.
Most (85%?) compassionate women I know reached a point in our teens or 20s where we got properly distressed and angry over gender-based* injustices that we and others had experienced. Most of us had experienced something bad or knew someone who did by then and it sucked.
If we happened to be involved in a specific “Good Community” that claimed some moral high ground (eg a church, an EA group, an “honorable family” etc), we hoped bad treatment happened less there and that we were safer there. Finding out, inevitably, that even in such a community, some of our peers and moral heroes also demean, harass, or rape each other is really awful. Our level of emotionality varied by personality type, but it sucked for all of us.
For Maya, this happened at its height about EA. For me, it happened about church. I’m no longer involved in religion. By the time I got to EA though, I expected some bad behavior even from “good people,” so I was just impressed the EA community health team existed to deal with it and didn’t have Maya’s intense reaction. I still do have a visceral reaction to others’ stories of harm though.
I don’t know if the base rate of problems is higher or lower in EA groups compared to other communities. I think the resources dedicated to good responses are higher than in other communities, which I feel good about.
I think we should try to track our base rates if we can, and plan to always dedicate resources to things like the Community Health team’s prevention and response efforts, because that is the price of admission for running healthy human groups.
I don’t mean to be callous or cynical, but I unfortunately now regard gender-based* violence as a gross and terrible part of being human that is present everywhere.
So, all communities and conscientious community members will need to contend with this unfortunate aspect of reality at some point and learn to talk about it in a healthy way, despite the pain. We’ll all need to have conversations like this sometimes, again and again, to address the pain involved, and it will bring up varying degrees of emotion and discomfort with that emotion, that are all pretty “normal”. It’s the price of admission for being alive and part of a social species.
Maybe, hopefully, the base rate of problems stays low within our microcultures and gets lower across the centuries as we keep learning how to human.
I hope saying this doesn’t belittle Maya’s concerns at all. They’re real and I’m glad she raised them so people could respond.
Fwiw, some of the discourse about these issues does seem more pointed, more naïve, more fearful, and less obviously compassionate in EA than e.g. in my church (maybe because of the gender and rationality skew in EA), so talking about it on the Forum felt worse than talking about it at church overall and has sometimes made me emotional. However, discourse among EAs is less shame-laden than discourse in church, and more nuanced than discourse on e.g. Twitter, so sometimes the Forum is preferable. I am more invested in EA communities, so I’m willing to put more effort into these conversations than I would elsewhere.
To avoid intense frustrations though, I usually choose to talk about this stuff only with EA women, or 1:1 with EA men, rather than dealing with a Forum furor.
(*tbc, I see the same experiences happen for other characteristics that people realize they may be unfairly targeted for, like race, sexuality, neurodivergence, etc.)
What specific sort of things would you like to see that would make you feel like you were in a more compassionate environment?
On the Forum? Or IRL?
In real life, I’ve selected to be around very compassionate people in EA and outside EA.
On the Forum… more men who “translate” experiences into ones that other men understand and don’t feel threatened by might help. I’ve noticed Will Bradshaw does this sometimes. Ozzie too. AGB sometimes.
Kirsten, Ivy, and Julia Wise do it often too. I know that for a lot of women, it’s really frustrating to be treated so skeptically when we raise personal experiences or views that vary from men’s experiences.
When I’m 1:1 with my hyper-rational or autistic male friends in person, we figure out how to understand each other compassionately, so I know these individuals don’t usually mean to be as callous as they sound online.
When I’m with my EA women friends, we talk about our personal experience and the broader social issues with tons of nuance and we appreciate that no one is shamed for not sounding Tribe-y enough (for any given tribe).
But on the Forum, it’s just so often super annoying to engage. I try to simply talk about my actual life sometimes, but I know I’m going to ping someone’s “woke” alarm and end up in a stupid thread of comments.
So I consider the Forum fine for a certain kind of information exchange but mostly a lost cause for mutual understanding of anything that hits interpersonal emotional chords. I only comment here about that sort of thing when I have a lot of downtime and extra emotional bandwidth.
Kind of both here. Do you think general activities aimed at building compassion would also help, given that compassion seems to be the thing you value in people for making an environment comfortable for you? I’m wondering if this might be an unintended effect of general compassion-building for animal welfare, and whether interventions there might overlap with the ones best suited for this.
Sorry for my unfamiliarity, but could you explain what you mean by “translate experiences”? I feel as if I’ve probably interacted with what you’re talking about, but am not sure what exactly that is mapping onto.
I also hear you, and am sorry that that has been your experience of the forum. But I really think it might be worth reconsidering that stance, because there’s a real chance the forum has changed in a significant way. Maybe try it out again and see how it goes, in some small way that wouldn’t be too inconvenient for you if it hasn’t improved? I suggest it only because I think having as many perspectives on here as possible is a great thing, and because the emotional things we really care about are generally some of the best things to interact over.
Also don’t feel the need to respond to questions (even like the ones I’m asking) if it helps increase bandwidth for other things that don’t drain you similarly.
You asked about translation. I feel tired trying to explain this and I know that’s not your fault! But it’s why I just don’t think the Forum works well for this topic.
My guess is that talking about “women’s issues” on the Forum feels as similarly taxing to me as it does for most AI safety researchers to respond to people whose reaction to AGI concerns is, “ugh, tech bros are at it again” or even a well-intentioned, “I bet being in Silicon Valley skews your perspective on this. How many non-SV people have the kinds of concerns you mention?”
Most of us are tired of that naïve convo, esp with someone who thinks they have an informed take. Where do you even start?
Someone they trust has to say, “I’m like you and I’m concerned about this; it’s not just tech bro hype, and here’s why.” They have to translate across the inferential distance, ignorance, trust gap, and knowledge gap. They need to have the patience, time, and investment in bridging the gap.
Agreed that it would be very helpful to have a widely distributed survey about this, ideally with in-depth conversations. Quantitative and qualitative data seem to be lacking, while there seems to be a lot of anecdotal evidence. Wondering if CEA or RP could lead such work, or whether an independent organization should do it.
I would mainly like it to be easy to fill out so that the results are representative. I think it’s pretty easy for surveys like this to end up only filled in by people with the strongest opinions.
If it’s worth anything, I would expect that figure to be between 28% and 43%. (I’m anchoring on your estimate; I would probably have guessed somewhere around 40-55% if I hadn’t read your comment.)
I’ve been away from the Forum and just saw this comment. When you say “that figure”, what are you referring to?
What systems/solutions currently exist for “dealing with” misconduct, harassment, or assault after it happens?
What systems should exist?
I feel some hesitation about solutions that involves handing the power to blacklist or “punish” people to one agency.
But it’s really hard for individuals to publicly post about other people’s problematic behavior.
A friend of mine in the EA community told me they had been sexually harassed and stalked by another EA member and were considering posting on social media about it. I encouraged them to do so. They were scared of potential backlash, so they didn’t.
But I didn’t post about it at all. It feels inappropriate for me to do so on someone else’s behalf, especially since I’m not particularly wrapped up in Y’s life.
I wonder if scandal markets could potentially be useful (as Scott Alexander discussed in a recent thread), or something else inspired by scandal markets.
I think it’s plausible that we could use scandal markets on high-profile people in the EA community.
Scott writes, and I pretty strongly agree:
“I’m tired of bad things happening, and then learning there was a “whisper network” of people who knew about it all along but didn’t tell potential victims. It’s unreasonable to expect people with suspicions to come out and make controversial accusations about powerful people on limited evidence. But a prediction market seems like a good fit for this use case.”
But the majority of the people who do/will do/have done problematic things are unlikely to be high-profile enough to have a scandal market made about them.
It seems plausible to me that a well-designed system would be able to effectively deal with the kinds of issues the OP talks about and things like financial misconduct.
But I feel like there are a few challenges:
I can’t think of any community that effectively deals with misconduct that isn’t also authoritarian-esque (the CCP comes to mind). (That being said, please comment if you know of other communities or systems that effectively deal with misconduct.)
And a well designed system should reflect that different problematic actions require different responses.
I think this points to the weakness of a centralized system: most people agree that things like rape should lead to removal from the community. But a lot of things are debatable (like making a ranked list of women someone wants to hook up with), and if CEA or whomever implemented some response as “the authority”, it would almost certainly be opposed by some for being too lenient and by some for being too harsh.
It almost feels like making public knowledge of these kinds of things is the right thing to do, because then people will react accordingly.
But simply saying “we’re going to publicize every distasteful thing others do, so that people can decide for themselves how they should respond” feels bad for a lot of reasons. For one thing, it would erode trust between members if people felt like they might be publicly outed for small infractions.
See this post: The community health team’s work on interpersonal harm in the community
I actually think people should be complaining to, or even complaining about, the community health team significantly more than they are. People on that team are paid to address problems like misconduct/harassment/assault. Complaints like Maya’s should be a key performance metric for them.
In my view, there should be a stronger default of people like Maya contacting the community health team to say “hey, I heard about women getting ranked in a way that made me uncomfortable”. And the community health team privately contacting the rankers to say “hey, you aren’t helping our goal of a warm professional community that welcomes a wide variety of people and incentivizes them to care about doing good over being hot”. Some might find this draconian—to clarify, I don’t think disciplinary action is justified here. I just think these conversations would be positive expected utility if done well.
I like your recommendations, and I wish that they were norms in EA. A couple questions:
(1) Two of your recommendations focus on asking EAs to do a better job of holding bad actors accountable. Succeeding at holding others accountable takes both emotional intelligence and courage. Some EAs might want to hold bad actors accountable, but fail to recognize bad behavior. Other EAs might want to hold bad actors accountable but freeze in the moment, whether due to stress, uncertainty about how to take action, or fear of consequences. There’s a military saying that goes something like: “Under pressure, you don’t rise to the occasion, you sink to the level of your training.” Would it increase the rate at which EAs hold each other accountable for bad behavior if EAs were “trained” in what bad behavior looks like in the EA community and in scripts or procedures for how to respond, or do you think that approach would not be a fit here?
(2) How would you phrase your recommendations if they were specifically directed to EA leadership rather than to the community at large?
These are both very important questions. For (1), I think it depends on the circumstance, in all honesty. For example, the same way that volunteers are often trained before EAGs and EAGxs, I could see participants receiving something (as part of the behavior guidelines) outlining scenarios and describing why they were examples of inappropriate or appropriate behavior. However, I think it would be extremely difficult to “train” all members of the EA community, as people are involved in many different capacities. For (2), I think that, despite all situations involving interpersonal harm and conflict being unique and complex, it could be useful to have more transparency in some areas. I don’t mean naming specific individuals and discussing all the details of each case; I mean something more like “X action is unacceptable and will result in Y consequence if found to be true.” Another note: my suggestions were aimed at EA community members because I truly believe that, often, people simply do not understand how their actions and words make others feel. I hope that by raising awareness of this, people will be motivated to change themselves without necessitating external conflict (although I understand that’s not always the case).
I’ve heard a number of stories of women feeling uncomfortable in EA spaces and they sadden me every time.
I feel the way you do. I feel your pain. Hugs and solidarity.
For the first and second example you listed, I think they fail the gender-reversal test. If it had been a woman who said she’d arranged a one-on-one with a man because he was handsome, nobody would feel upset. Similarly if a group of girlfriends were privately ranking which of the men in the area they’d most like to sleep with.
Interestingly, this actually happened at an EA workplace of mine once. I talked to the people involved and told them how it made me feel. They seemed surprised, then felt guilty, and after some discussion and debate (we are EAs after all), they decided to not do it anymore.
I think this was more just a matter of low EQ and not thinking things through, rather than an objectification of women.
Personally, I would be extremely upset, and report it to the community health team
Thanks for sharing this. I think people have a tendency to overgeneralise about what “men” or “women” care about when having these conversations.
I notice I’m confused. If a woman said “Ooh, he’s attractive. I should set up a one-on-one with him”, you would report that to the community health team?
Why?
This seems like ordinary and harmless behavior. Maybe not the most strategic way to have a good conversation or get a good long term partner, but hardly a threat to the community.
(Although this is only one sentence. Maybe she only made this judgment after seeing that he had oodles of EA Forum karma, which is obviously the only correct way to evaluate mate quality 😛 )
I’m struggling to understand the thought process that would lead to this being reported.
I’m sure other people have answers about why they’d prefer not to have people book meetings based on attraction, but I’d like to say I support this kind of thing being reported to the Community Health team. The EAG team have repeatedly asked people not to use EAG or the Swapcard app for flirting. 1-1s at EAG are for networking, and if you’re just asking to meet someone because you think they’re attractive, there’s a good chance you’re wasting their time. It’s also sexualizing someone who presumably doesn’t want to be because they’re at a work event. Reporting this kind of breach of EAG rules seems entirely appropriate!
https://twitter.com/amylabenz/status/1558435599668895745?s=46&t=unZ0UrHR9pJNN03keeNXcw
Oh yeah, if you’re referring to it being at conferences, I can see that.
I don’t agree, but I can at least see why. Thanks for clarifying.
As Kirsten mentioned, the context of it being an EA conference is key.
I would assume it was a joke, if she was serious I would tell her not to, if she did it I would report it.
Because EAG(x) conferences exist to enable people to do the most good, conference time is very scarce, and misusing a 1-1 slot means someone is missing out on a potentially useful 1-1. Also, these kinds of interactions make it much harder for me to ask extremely talented and motivated people I know to participate in these events, and for me to participate personally. For people who really just want to do the most good, and are not looking for dates, this kind of interaction is very aversive.
Thankfully, in my experience, it’s not ordinary, the vast majority of people schedule 1-1s at EAGs to discuss ways to do more good. Also, as we can see from these posts and my personal reaction, it’s not always harmless. I really value EAG time! I really don’t want to ask my most altruistic and talented friends to come to EAGs and then have them hit on, especially young ones that are choosing careers! There are other conferences and meetups for people that are looking for that.
I don’t share your belief that asking people for 1-on-1s at EAGs only because you find them attractive is bad in general (although I’m open to saying it’s sometimes or even often wrong). I would like to understand your perspective though. Some questions:
1. What fraction of men/women that go to these events would you predict to prefer people not to do this? I’d be interested to see some data on this and let community norms be influenced by that.
2. How much is deceit the problem for you, where someone asks for a 1-on-1 pretending that they are interested in the other person for professional reasons? For example, what if my message clearly indicates that I’m not interested in the other person purely for networking or professional reasons, but it says something like:
“Hey, you seem cool, I think we share some interests in x,y, z (which aren’t professional/career/impact-related topics)! Would you be interested to have a quick chat about x,y or z at some point? No worries if you’d prefer to focus exclusively on more focused networking, that’s totally understandable.”
If I write a message like that because I find someone attractive (in some form), does that seem wrong to you? :) Genuinely curious about your reaction and am open to changing my mind, but this seems currently fine to me. I worry that if such a thing is entirely prohibited, so much value in new beautiful relationships is lost.
Yes, you’re still contributing to harm (at least probabilistically), because the norm and expectation is currently that EAG / Swapcard shouldn’t be used as a speed-dating tool. So if you’re reaching out only because you find someone attractive despite that, you are explicitly going against what other parties are expecting when engaging with Swapcard, and they don’t have a way to opt out of receiving your norm-breaking message.
I’ll also mention that you’re arguing for the scenario of asking people for 1-1s at EAGS “only because you find them attractive”. This means it would also allow for messages like, “Hey, I find you attractive and I’d love to meet.” Would you also defend this? If not, what separates the two messages, and why did you choose the example you gave?
Sure, a new beautiful relationship is valuable, but how many non-work Swapcard messages lead to a new beautiful relationship? Put yourself in the shoes of an undergrad who is attending EAG for the first time, wishing to learn more about a potential career in biosecurity or animal welfare or AI safety. Now imagine they receive a message from you, and 50 other people who also find them attractive. This doesn’t seem like a good conference experience, nor a good introduction to the EA community. It also complicates the situation with people they want to reach out to, as it increases uncertainty around whether people they want to meet with are responding in a purely professional sense, or whether they are just being opportunistic. Then there’s an additional layer of complexity when you add in things around power dynamics etc. Having shared professional standards and norms goes some way to reducing this uncertainty, but people need to actually follow them.
If you are worried that you’ll lose the opportunity for beautiful relationships at EAGs, then there’s nothing stopping you from attending something after the conference wraps up for the day, or even organising some kind of speed-dating thing yourself. But note how your organised speed-dating event would be something people choose to opt in to, unlike sending solicitation DMs via an app intended to be used for professional / networking purposes (or some other purpose explicit on their profile—i.e. if you’re sending that DM to someone whose profile says “DM me if you’re interested in dating me”, then this doesn’t apply. The appropriateness of that is a separate convo though).
Some questions for you:
You say you’re “open to changing your mind”—what would this look like? What kind of harm would need to be possible for you to believe that the expected benefit of a new beautiful relationship isn’t worth it?
What’s the case that it’s the role of CEA and EAG to facilitate new beautiful relationships? Do you apply this standard to other communities and conferences you attend?
I’ll also note Kirsten’s comment above, which already talks about why it could plausibly be bad “in general”:
”The EAG team have repeatedly asked people not to use EAG or the Swapcard app for flirting. 1-1s at EAG are for networking, and if you’re just asking to meet someone because you think they’re attractive, there’s a good chance you’re wasting their time. It’s also sexualizing someone who presumably doesn’t want to be because they’re at a work event.”
And Lorenzo’s comment above:
”Because EAG(x) conferences exist to enable people to do the most good, conference time is very scarce, misusing a 1-1 slot means someone is missing out on a potentially useful 1-1. Also, these kinds of interactions make it much harder for me to ask extremely talented and motivated people I know to participate in these events, and for me to participate personally. For people that really just want to do the most good, and are not looking for dates, this kind of interaction is very aversive.”
Before EAGSF this year, I mentioned on Twitter that people could note their dating availability on their SwapCard profile as a way to prevent the scenarios above, where people ask others for meetings because they are romantically interested in them. That way, anyone interested could contact them off-site instead, and EAGs would hopefully have more people focused on attending for career reasons. My thought was that if you don’t do something like this, people are just going to continue hiding their intentions (though I’m sure some would still do this regardless).
I was criticized for saying this. Some people said the suggestion left them with an uncomfortable feeling, because it put in their minds that someone might be doing a 1-on-1 with them because they find them attractive. Fair enough! That holds even if you, say, link to an off-site dating doc or contact info they can reach after the conference. I had hoped we could make explicit the fact that people in the community are obviously looking to date others in the community and are finding that very difficult. Instead, my guess is that we are left in a situation where people will set up 1-on-1s because they find someone attractive, even if they don’t admit it. I do not condone this, and it’s not something I’ve done (for all the reasons listed in this thread).
Personally, I do not plan to ask anyone out from the community at any point. Initially, I had hoped to find someone with similar values, but there just doesn’t seem to be any place where it would be appropriate. Not even parties. It’s just not worth the effort to figure out how to ask out an EA lady in a way that’s considered acceptable. This might sound extreme to some, but I don’t find it worth the mental energy to navigate, and I just want to be in career-mode (and, at most, friendship-mode) when engaging with other EAs. More importantly, there’s too much mixing of work and fun, and it leads to uncomfortable situations and posts like this.
I’m not making a judgement on what others should do, but hopefully whichever way the community goes, it becomes more welcoming for people who want to do good.
Thanks for the thoughtful reply! I think I actually agree with many of your points.
The strong disagreement with my comment definitely makes me think that I’m likely wrong here. I’ve revised my position a bit, and I suspect that if I were more careful and precise in stating what I now tend to believe, we wouldn’t disagree that much. So let me do that:
1. It seems ~always wrong or inappropriate to ask someone for an EAG 1-on-1 if that’s purely out of sexual attraction. (The “~” is there for weird edge cases.)
I’m less sure whether it’s always wrong to accept an invitation for a meeting if you find yourself having these motivations. What if there’s a plausible case for them benefitting from meeting you, but on introspection, you don’t think that’s motivating you to a significant extent?
I think the sexual motivations make this behaviour feel especially aversive, and I’m a little less confident about the case where the attraction is purely non-sexual.
I’m open to saying that, because any of the above is wrong in most cases, we should have a norm against doing any of this, but I think that needs more argument than I have seen so far. (I feel generally a bit puzzled/worried that people seem to take such strong stances here on the basis of what seem to me like, at best, moderately strong arguments.)
(Meta-comment: in practice, I would imagine that it’s almost always a mix of different motives; at least in my case, I think attraction is often partially based on shared intellectual interests, a shared commitment to improving the world, etc.)
2. It does not seem generally wrong to me to ask someone for an EAG 1-on-1 if that’s to a significant extent because you find them attractive (in a non-sexual way), but also for various other reasons like shared interest in some cause areas. In fact that seems largely fine to me.
Denying this seems like a strong claim for which I haven’t seen sufficiently compelling arguments. Why is this generally harmful in expectation or what are the overriding non-consequentialist considerations against this?
>> I’ll also mention that you’re arguing for the scenario of asking people for 1-1s at EAGS “only because you find them attractive”. This means it would also allow for messages like, “Hey, I find you attractive and I’d love to meet.” Would you also defend this?
I don’t think one thing straightforwardly implies the other; I think different norms might apply for what kinds of motivations are appropriate and what ways of expressing them are. I do think you are pointing to an inconsistency here, because I don’t think such a message would be appropriate at all, and I also don’t want people to be actively deceptive about their motives for meeting someone. Maybe you’re right and the only way to resolve this is to say that it’s wrong in general to ask someone when your motivations are purely attraction-based. I think there might be some edge cases here, but I’m fine saying that this is roughly right.
Unfortunately I think I’m going to check out of the conversation here. I appreciate the engagement and the real-time updates, but I get the sense that this isn’t going to be a very productive use of time.
Here are some quick thoughts, hastily written:
RE: 1 and 2) generally
Basically, I think this is all super susceptible to motivated reasoning, such that you might take actions that feel totally fine to you but still come across poorly to the person you meet. Exactly what counts as “comes across poorly” is going to vary between individuals and contexts, and I don’t want to answer on behalf of all women here.
Here are some potentially useful heuristics:
Are you risking pushing any boundaries, or making any requests that you wouldn’t make if the person in question was otherwise identical but unattractive to you?
Are your actions clearly distinguishable from someone like this?
What would happen if everyone justified the same kinds of actions in the way you did? Would this be a safer, more welcoming community?
Imagine you have a 17yo sister going to a conference for the first time, looking to meet people in the field. You care a lot about her and you feel pretty protective. What kind of people would you feel most comfortable with? Are you the kind of person you’d trust her with? It shouldn’t take a hypothetical younger sister to prompt the kind of empathy that’s required here, but some people I know find hypotheticals like this useful.
RE: 2) more specifically
Again, I disagree. Let’s say we have already established EAGs as a place for professional interactions and networking, and this is my expectation going in. And let’s say I only want to meet people who are interested in me in a purely professional capacity. Let’s say I don’t want to second-guess whether the people wanting to talk to me are interested in my work or in something else. How do I make sure I don’t receive a message from people who might reach out to me “to a significant extent” because they find me attractive? (Say, because I don’t want them to start hitting on me mid-1-1, or to make me feel like this isn’t a professional meeting?)
I found this part of the comment slightly frustrating because you’ve basically just repeated the same premise and changed it from requesting 1-1s “only because you find them attractive”, to requesting 1-1s if the reason is “to a significant extent because you find them attractive”.
The same issues clearly still apply, just to a slightly less problematic extent. What’s next, are you going to come back and ask me “what if the attractive part is just to a moderate extent”? I guess it feels like you’re not actually engaging with the points I raised, so I’m pretty uncertain about what you’re trying to achieve here.
Also, I don’t see how the burden of proof is on me to deny a claim that you haven’t justified. You’re the one that’s come along with a new claim and just said “in fact that seems largely fine to me”. Presumably I can just say, “Well, in fact that seems largely not fine to me.” You’re the one suggesting the existing norms are overly restrictive in some way, so you’re the one that should justify that claim, but I don’t actually think you’ve done this. So again—what’s the case that it’s the role of CEA and EAG to facilitate new beautiful relationships? Do you apply this standard to other communities and conferences you attend?
End of the day—if you’re asking for 1-1s that you otherwise wouldn’t because you want the potential for a new beautiful relationship (which is the only reason you’ve given so far for endorsing this approach), but the other person doesn’t, then you are going into the conversation with different incentives in mind. I’m not here to tell you how to live your life, but this is definitely well within the kind of behaviour that some women would find uncomfortable. EAGs aren’t about facilitating you to meet people you find attractive, and you might disagree with this, but you haven’t made a case for why you think this would be better on net / in expectation.
Yes, the inconsistency I’m pointing at here is that the position you’re trying to defend does in fact allow the message you don’t find appropriate. My guess to the cause of the inconsistency is probably because the comment you suggested shares more in common with similarly phrased messages that are used in a professional networking context, and is more likely to be misunderstood as something more innocuous or professional. Otherwise, the message you consider inappropriate is less deceptive and is intended to achieve the stated purpose of creating new beautiful relationships, no?
Thanks for the reply. Just one comment, because you said you didn’t want to engage more and I feel similar:
>>Also, I don’t see how the burden of proof is on me to deny a claim that you haven’t justified. You’re the one that’s come along with a new claim and just said “in fact that seems largely fine to me”. Presumably I can just say “Well, in fact that seems largely not fine to me”. You’re the one suggesting the existing norms are overly restrictive in some way, so you’re the one that should justify it, but I don’t actually think you’ve done this. So again—what’s the case that it’s the role of CEA and EAG to facilitate new beautiful relationships? Do you apply this standard to other communities and conferences you attend?
I think the burden of proof is clearly on you, because denying 2) seems to me like an a priori (prior to knowing the details of the discussed actions) extremely unlikely claim: take any other kind of action, and ask how often we can really say that literally all actions of that kind are wrong. Not even with lying or stealing is that true. Denying a universal statement of that kind is, I think, a priori extremely likely to be correct (at least if the set of actions is large). I think this is clearest if you are sympathetic to some form of consequentialism. That’s why I think 2) doesn’t need much argument in its favor, but your position needs very strong arguments to be plausible.
Thank you for writing this. I’m sure it was very difficult to do, and so I really appreciate it.
I strongly agree with this. Have you seen Michel’s post on emotional altruism? It doesn’t get to your points specifically, but it similarly addresses the need for more open emotion in the movement.
I also want to add something that, in my experience, cannot really be ignored when speaking about the expression of emotion in EA in particular. EA has a lot of people who are on the autism spectrum and who may relate to emotion differently, particularly in the way they speak about it. There are others who aren’t on the spectrum but similarly have a natural inclination to be less, or differently, publicly emotional. EA/rationality can feel rather welcoming (like “my people”) to people like this (which is good—welcoming people whose brains work differently is good), and this may produce a feedback loop. This is not at all to deny your recommendations in this particular area. Rather, it is to acknowledge that some proportion (far from all) of what you call the “emotions problem” is probably just people being themselves in a way we should find acceptable, which means I am a bit more confused about how best to address it.
Thanks for pointing this out—I think this is a key point AND I think it is inflected by gender. My guess (not being an expert on autism, but being somewhat of an expert on gender) is that women who are autistic are more likely to learn, over time, how to display and react to emotion “like normal people”, because women build social capital through relational and emotional actions. Personal experience (I am a woman, to a first-degree approximation): as a child I did not really understand emotion and generally felt aversion when other people expressed it. Over time I learned how to feel and respond to others’ emotions in a socially normative way, through observation, self-reflection, and learning.
This is not to say that those of us in EA who are naturally different w.r.t. our emotional processing should feel bad/abnormal, but to say that EA would be a more welcoming community, especially to women, if people in EA learned how to process and respond to “normative” emotional expressions. Someone above said that EAs see debate as an expression of caring, and I (a) am the same way and (b) understand that most people are not! I’ve learned to ask “are you looking for discussion and finding solutions together, or are you not ready for that yet?” (Similarly, people with more normative emotional expression entering EA should learn to ask/adapt to the person they’re talking to.) I’ve been in spaces that I think are very good at this and have a cultural norm of it.
This is an important consideration, thanks for bringing it up! I pretty much agree with all of it.
Thanks for sharing. I think EAs are only ethically better than other people under consequentialist ethics, but are just as bad as anyone else when it comes to virtues and obeying good social rules, which is sad, because we can and should do better.
I don’t think this is true. I’m not sure how you’d measure or verify/refute this, but I suspect that the average EA man objectifies women much less than the average non-EA man. It’s just that we have an imbalanced gender ratio, so these incidents are disproportionately concentrated onto a few women, which is really unfair to them.
Tacking on with Robi, I strongly disagree with this, and would point to the fact that there are deontologists and virtue ethicists in our midst who mark their own moral life by the things you speak of EAs being “just as bad” at. But in the spirit of trying to understand your position further, what specific things do you think EAs are just as bad at?
I thought I would surface some of the points from the post and allow people to express opinions on them. I know this can seem off, but I think cheaply allowing us to see what the community thinks about stuff is useful.
Agreevote if you think this is good/fine, disagreevote if you think it’s bad.
Imagine the opposite situation—a group of women talking in detail about which men they would want to hook up with.
Agreevote if you think that would be good/fine, disagreevote if you think that’s bad.
Agreevote if you think this is good/fine, disagreevote if you think it’s bad.
I have heard 2+ accounts of this (heck, as I’ve apologised for before, I’ve done it), so I think it’s pretty common.
My stance is that EAGs should have a high penalty for making people feel they aren’t valued for their work. People can take the risk if they want to but there should be a high penalty if people are bad at it. Social gatherings and afterparties are, in my opinion, the place to flirt without a risk of some kind of community sanction.
Agreevote if you think the nasty comments were from people in the EA community, disagreevote if you think they weren’t.
Agreevote if you think the actions here are, on balance, bad; disagreevote if you disagree.
Here’s a relevant tweet thread.
One response from Ollie:
I feel like we need stronger communication coming from event organizers regarding these things. Even though it doesn’t affect me personally.
I overall agree with the ideas presented in this post, and I think they deserve more attention. I think the above part is especially true. It’s true that discriminatory tendencies in a community doing good don’t erase its overall positive impact. HOWEVER, it does, as you state, exclude some people from helping. And if that “some people” is 50% (in some countries more) of college graduates, that seems like a really big problem.
Thank you for writing this!
Hi, thank you for your post, and I’m sorry to hear about your (and others’) bad experience in EA. However, I think if your experience in EA has mostly been in the Bay Area, you might have an unrepresentative perspective on EA as a whole. Most of the worst incidents of the type you mention I’ve heard about in EA have taken place in the Bay Area, I’m not sure why.
I’ve mostly been involved in the Western European and Spanish-speaking EA communities, and as far as I know there have been far fewer incidents here. Of course, this might just be because these communities are smaller, or I might simply not have heard of incidents that have taken place. Maybe it’s my perspective that’s unrepresentative.
In any case, if you haven’t tried it yet, consider spending more time in other EA communities.
Your comment (at least as it reads, which may differ from your intentions) comes across as “that’s a particularly problematic location, just go to a different one”.
That doesn’t solve the problem. It doesn’t hold the Bay* or any other community accountable, or push for change in a positive direction. I think that sort of logic is a common response to what Maya writes about and doesn’t help or make anything better.
*and this is coming from an ex-Berkeley community builder
I agree that would be an unhelpful takeaway from this post/these experiences
I have only been to the Bay Area once, and I felt a culture shock from the degree of materialism and individualism I experienced in the community. On one occasion, I tried to call it out publicly and got rebuffed by a group.
However, I do think it’s unfair that the Bay Area is presented as representative of the wider EA movement in a way that, for example, EA Berlin wouldn’t be.
I haven’t really spent time with the community there, so I’m curious about the individualist & materialist point. Could you expand on that a bit more?
What is it about the Bay Area that makes these issues more prevalent or severe, if indeed they are? It seems worth finding out if we want to push for change in a positive direction.
My thesis here revolves around the overlap between tech and EA culture and how this shapes the demographics. We should expect higher rates of youth, whiteness, maleness, and willingness to move for high pay in the Bay Area because of the influx of people moving for tech jobs in the past 10 years. There could also be some kind of weird sexual competition exacerbated by scarcity.
Here are some other unusual things about the Bay Area which may contribute to the “vibes” mentioned:
Founder effects: Bay Area EA organizations tend to be more focused on AI and therefore look to hire tech-types, growing the presence of people who fit this demographic (these orgs also could have been founded in the Bay because of these demographics, it’s unclear to me which came first)
Extremely high wealth inequality and the correlation of wealth with other things EAs select for (e.g. educational attainment) likely means EA in the Bay selects much harder for wealth than in other places
Racism has a profound influence on US society. In my experience, people who are unfamiliar with both the history and the modern-day effects of race in America (or are from more homogeneous countries) are worse at creating welcoming spaces and seem to underappreciate the value of creating diverse groups
There is a high prevalence and acceptance of hookup culture and casual sex
There’s high tolerance for non-traditional relationships by broader society
The US is one of the most individualistic cultures in the world according to cultural psychology measures
Overall, the Bay Area is very unlike the rest of the world on most demographic criteria, and it’s plausible that different outreach strategies are needed there in order to find driven and altruistic people with a diversity of ideas and approaches to doing good.
Really appreciated this comment and found it interesting, thanks!
My guess is that it’s because the Bay Area has a lot of professional power entangled in it, such that power dynamics emerge much more easily there than elsewhere.
Hi Maya,
Thanks for sharing your experience! I hope it improves.
I really like this point.
Thank you, Maya, for sharing your concerns about some of the challenges you’ve experienced in the EA space. It can feel especially disappointing when those we hold in high regard prove to be all-too-human.
It’s a truism to state that the challenges you point out are occurring with regularity in and out of the EA community. However, as part of an EA ecosystem minority group, you bear an undue share of the burden.
As you point out, attracting a more diverse group to the EA community is critical to keeping EA relevant in the future. Diversity is not a “nice-to-have.” Diversity is a mission imperative if we are to collectively maximize effectiveness.
To be clear, attracting and developing diversity will create friction in the short term. In the best case, that will manifest as low-grade friction in perpetuity as new voices are introduced to challenge our thinking. The resulting conversations, thought processes and solutions will be more effective in the long term.
The movement may well be near an inflection point regarding accountability. Holding certain influencers or leaders in an “untouchable” status will only harm the EA community in the long term. Accountability is a baseline assumption that must be met for EA principles to be considered credible.
When talking about Sam Bankman-Fried, I have read a number of times the claim that EA failed because it didn’t put sufficient effort into checking his background. It might be worthwhile to fund a new organization, ideally as independent as possible from other orgs, whose sole reason for existence is to look into powerful people in EA and criticize them when warranted.
While it might be great if CEA were able to fill that role, they happen to be an org that in the past didn’t honor a confidentiality promise when people came to them with criticism of powerful people in EA, and they didn’t think this was enough of a problem to list it on their mistakes page.
You can actually see a description of what happened on CEA’s Mistakes page.
Okay, it’s good to see that it’s finally there, as it wasn’t the last time I publicly complained about it. At the time, it seemed like apologizing deep in a comment thread was the only action CEA felt was warranted.
You say that like it is a bad thing. I would suggest reading [Against Empathy](https://en.wikipedia.org/wiki/Against_Empathy#Empathy_versus_Compassion) to understand why it is a very good thing.
We need fewer emotion-driven decisions and more rational ones; that is the whole point of EA.
The world, in general, is less rational and more emotion-driven than most of us would consider optimal.
EA pushes back against this trend, and as such is far more on the calculating side than the emotional side. This is good—but the correct amount of emotion is not zero, and it’s quite easy for people like us who try to be more calculating to over-index on calculation and forget to feel either empathy OR compassion for others. It can be true that the world is too emotion-driven while a specific group isn’t emotion-driven enough. Whether that’s true of EA or not... I’m not sure.
I am unpleasantly surprised by what I hear in this and other posts, which make it sound like there is a sort of EA “party scene”. It sounds like EA men (and maybe others) need to focus a lot more on working their butts off every single day to help others, and a lot less (or ideally not at all, but that can be tough) on trying to get laid or dating.
Yeah, how dare they not dedicate every waking hour to helping others, the audacity!
In all seriousness, I’m pretty sure the problem here is things like people not respecting others’ boundaries, people not recognizing power dynamics, the culture that normalizes this, and the institutions that don’t adequately mitigate this risk—not the fact that people trying to do good in their lives can also spend parts of their lives socializing.
That goes without saying.