Hubris and coldness within EA (my experience)
Hi all.
Like a lot of people who have had a connection to EA, I am appalled by the close ties between the FTX scandal and EA. But not surprised.
The EA community events I attended totally killed my passion for EA. I attended an EA Global conference in London and left feeling really, really sad. Before the conference I was told I was not important enough, or not worth the time, to get career advice. One person I'd met before at local EA events made it clear that he didn't want to waste time talking to me (this was in the guide, by the way: make it clear if you don't think someone is worth your time). Well, it certainly left me unconfident and uncomfortable about approaching anyone else. I found the whole thing miserable. Everyone went out to take a photo for the conference and I didn't bother. I don't want to be part of a community I didn't feel happy in.
On a less personal level, I overheard some unpleasant conversations about how EA should only be reserved for the intellectual elite (whatever the fuck that is) and how diversity didn’t really matter. How they were annoyed that women got talks just for being women.
Honestly, the whole place just reeked of hubris—everyone was so sure they were right, people had no interest in you as a person. I have never experienced more unfriendly, self-important, uncompassionate people in my life (I am 31 now). It was of course the last time I was ever involved with anything EA related.
Maybe you read this and dismiss it with "yeah, but the issues are too important to waste time on petty small talk or showing interest in others." Or "your subjective experience doesn't matter." Or "we talk about rationality and complex ideas here, not personal opinions."
But that is the whole point I'm trying to make. When you take away the human element, when you're so focused on grandiose ideas and certain of your perfect rationality, you end up dismissing the fast thinking necessary to make good ethical decisions. Anyone who values human kindness would run a mile from someone who doesn't have the respect to listen to the person talking to them, and who makes clear that their video game is valued above that person. It is similar to the long history of Musk's contempt for ordinary people.
EA just seems so focused on being ethical that it forgot how to be nice. In my opinion, a new more inclusive organisation with a focus on making a positive impact needs to be created—with a better name.
- EA & LW Forums Weekly Summary (7th Nov − 13th Nov 22′) by 16 Nov 2022 3:04 UTC; 38 points) (
- EA & LW Forums Weekly Summary (14th Nov − 27th Nov 22′) by 29 Nov 2022 22:59 UTC; 22 points) (
- EA & LW Forums Weekly Summary (14th Nov − 27th Nov 22′) by 29 Nov 2022 23:00 UTC; 21 points) (LessWrong;
- EA & LW Forums Weekly Summary (7th Nov − 13th Nov 22′) by 16 Nov 2022 3:04 UTC; 19 points) (LessWrong;
- 16 Nov 2022 3:27 UTC; 5 points) 's comment on Selective truth-telling: concerns about EA leadership communication. by (
- 15 Nov 2022 21:21 UTC; -38 points) 's comment on Selective truth-telling: concerns about EA leadership communication. by (
I imagine it feels challenging to share that, and I applaud you for doing so.
While my EA experiences have been much more positive than yours, I do not doubt your account. For many of the points you mention, I can see milder versions in my own experience. I believe your post points towards something important.
Hi James,
Thanks for writing this—it's difficult/intimidating to write and post things of this nature on here, and it's also really important and valuable. So thanks for sharing your experience.
Please don't read this response as being critical/dismissive of your experiences—I have no doubt that these dynamics do exist, and that these types of interaction do happen (too frequently) in EA spaces. It makes me unhappy to know that well-intentioned people who want to make a difference in the world are turned off by interacting with some people in the EA community, or by attending some EA events.
I do want to say, though, for fairness' sake, that as a member of an ethnic, religious, and geographical minority in the EA community, I feel valued and respected. I don't think the attitudes or opinions of the people you describe in your post are that common in the wider community, and the vast majority of the EAs I know would be upset to hear that another EA had behaved the way you report.
This preempts the overall theme of the thoughts I had while reading your post: we make the mistake of thinking about the EA community, and EA events, as monolithic or homogeneous (in some ways—it is obviously homogeneous in many others). The points below aren't directed at you, but they're relevant here.
1. Specifically about EA events:
People attend EA events (especially EAG(x)s) for many different reasons. Some people go to expand their network in a specific way or in a specific domain. Others go to further their work or certain objectives and are singularly focused on doing so. Others attend because they value the social and communal spirit of being in a big gathering of altruistically motivated people. However, in my opinion, we should not lose track of the fact that these events exist to improve/enhance attendees' positive impact on the world, and to improve the wellbeing of the beings we serve—those suffering in the developing world, animals in factory farms and elsewhere, and the disenfranchised yet to be. We shouldn't view conferences primarily as places for the EA community to congratulate and celebrate ourselves and have a jolly good time. Given how limited/scarce time is at these events, I do think it's reasonable for people to be mindful of the way they use their time, and to be open in communicating when they think an interaction isn't producing value (to other people, not just the participants of the interaction). But the way they do that can vary in appropriateness, and it's hard to see a reason to do it in a way that insults the other person when a non-insulting alternative could easily have been deployed.
2. Generally about the EA community:
There are people from overlapping communities, sub-groups, and differently-motivated backgrounds in the EA community—let alone people with differing moral schools of thought, cause-area interests, and needs from the EA community. Not to caricature you or try to psychologically analyse you, but the types of complaints in your post point to the kinds of deficiencies that would be most noticed by someone who highly values the social and communal nature of the EA community, which many people do. However, it's easy to forget that not everyone cares about the community—many people in the EA community care about the community and its network for purely instrumental reasons (in that it's only valuable because it helps them achieve their goals). I'm sorry that the community interactions you had were so negative and not what you'd want them to be like. However, there are lots of other places where 'nice' people abound that you could be part of at the same time as being part of the EA community. One thing I worry about is EAs trying to use the EA community/ecosystem to fulfil every possible social function/need, because it's clearly not set up to do that. Please don't abandon EA values or goals just because of these interactions—being an EA is about how you live your life and how you behave and treat others.
(again, I really do mean this all in the most understanding and sympathetic way—I hope it comes across, but I apologise if it doesn’t).
I was on the fence between posting this under my name vs. using an anonymous account. I decided to go ahead, because this is something I’ve discussed with other folks and it’s something I feel pretty strongly about. I wanted to write this comment both to validate your experience and to say a few words about how I see the path forward.
I’ve had those experiences too: feeling dismissed, shut down, or like I’m not worth someone’s time.
But—and maybe this is because I have a stubborn, contrary, slightly masochistic, “oh yeah? I’ll show you” streak—I stuck around. I’m not saying that this is the only way to go; if hanging out with other people in the EA community is causing you pain, I don’t want that for you and it is 100% OK to go and do your own thing.
But if you can: stick around.
Because here’s the thing: not everyone is like that. I’d go so far as to say that folks with the attitude above are in the minority. There are SO many humane, warm, kind people in this movement. There are people with a sense of humor and a healthy bit of self-doubt and a generous willingness to meet others where they are. When I hang out with them, I feel inspired to work harder and do more good and to continue to be part of this community. And I’ve made it my task to find those people, encourage them, and make sure they stick around too.
If you (and I’m addressing anyone reading this, not just James) have a vision for what you want a given community to look like, you can stick around and help bring it to life. We get to create the communities we want to be a part of—how awesome is that? For my part, that’s what I’m striving to do. And I’m here to encourage others to do the same.
Posting from an alt account...
I definitely feel that hubris and elitism damage the exploration of new ideas within the EA space. Speaking from my experience with a new idea that I founded a non-profit to promote, I have found the EA community generally unhelpful, with a few notable exceptions.
I was encouraged when I heard 80k and other sources discuss the value of exploration in conjunction with exploitation. This would mean that if there isn't yet evidence to support a new idea or intervention, but there is a plausible mechanism of impact, search costs are usually warranted. However, when I discussed my idea, there was typically an exaltation of "red-teaming" with very little discussion of developing the idea or validating it empirically. I know that strongly evaluating the possible limitations and downsides of new ideas is indispensable, but the degree to which this is valorized over idea development is absurd.
My experience interacting briefly with people with some power and influence in EA was rather disappointing as well. I always had the impression that organizations assumed the existing thought leaders already knew essentially all of the areas in which fruitful interventions might be found or ideas might have merit. As far as EA using its resources to aid projects, the central consideration has seemed to be the connections one has made. A defense of this allocation can be made in that people with good ideas will eventually become known within EA, but the process is slow and selects for those with networking skills and patience.
When I applied for a grant from EAIF, the rejection came with no explanation; the stated reason was that they did not have time. The notion that EAIF lacks the resources to hire enough staff to explain the deficiencies in a grant proposal is absurd. It seems to me that either they don't hold people trying to contribute to EA in new ways in high regard, or an explanation would potentially render them in some way accountable.
My local EA group is nice, but the focus seems to be valorizing EA's heroes rather than supporting the ideas of its members. They are often unwilling to dedicate any thought or time to new members' ideas. I would think that such groups could be working together to explore and develop potentially high-EV ways to better the world. Instead, my experience has been that of a fan club for thought leaders.
I cannot help but feel concerned that there are many people with awesome, potentially high-EV ideas who lack my stubbornness and/or confidence in their own ideas. I am a very strong believer in the core of EA: using reason to ascertain how to do the greatest good and doggedly pursuing it. I would think it would be immensely high-EV to cultivate new ideas, evaluate the means and costs of empirically testing them, and help EAs actually implement those tests.
EA purports to value new ideas, but in practice it often appears unhelpful, and even smothering, when it comes to their development.
Can I simply concur by pointing out an example from just yesterday. A woman by the name of Keerthana Gopalakrishnan posted on the forum relating her experience of being sexually harassed at EA events. The general response was not one of empathy, or an attempt to understand how the movement might address this serious problem. Instead, it was to prod, poke, and question, scrutinizing her every claim as if it were a philosophy essay. The apparent assumption being that there is only one genre of writing—analytic, rationalistic—and anything short of that, no matter how important, is beneath consideration. I think it was nothing short of disgraceful.
“In my opinion, a new more inclusive organisation with a focus on making a positive impact needs to be created—with a better name.”
This seems right to me, and it should be focused on recruiting from a different part of the population and having a very different culture than EA. Of course the end goal should still be doing lots of good in the world as efficiently as possible.
At the same time, EA and the EA brand should remain, and without any big cultural changes (i.e. still mostly utilitarian, nerdy, philosophy-grad-heavy, using explicit EV calculations, and heavy on jargon).
We need more than one effectiveness focused altruism brand.
Related: There is EA the actual movement, and EA the philosophy. I wonder how much we are losing out on by not having a clear line between the two. Maybe internally this distinction can be carefully navigated, but to an outsider it is one and the same. I wonder if that might be one of the things that could be improved about EA.
I’m not sure I agree with this. As far as I can tell the EA community has always been quite focused on being inclusive, kind and welcoming—see for instance this and this post from CEA, which are both years old. I’m very sorry to hear about the OP’s experiences of course, and honestly surprised personally since my own experience has been a lot more positive. However, this doesn’t automatically imply to me that we need a whole new community or something to that effect.
I would see this more as presenting an opportunity to improve our culture and amend any failures that are currently happening despite the efforts of a lot of community leaders. I don't think there's a 'fundamental flaw' in how the EA community is trying to operate in that respect. Also, it seems to me that distancing the EA brand in the way you're suggesting would potentially incentivize it to become even less human and amiable—because then it would be distinguished by being the 'weird, rationalist/philosophical community'. (Not to mention that it would seemingly decrease opportunities for collaboration with the 'other community' and create confusion for those looking to get involved in EA.)
Edit: Just to be clear, I’m not making any general claims here about how successful the EA community has been in implementing the ideals I mentioned above. Obviously this post points to updating against that.
My view is that there should be a 'weird, rationalist/philosophical altruistic community' that is allowed to be as inhuman, non-amiable, etc. as it wants to be. That is a good thing because it is a useful sort of place for certain types of people to find each other and interact, and because it will come up with ideas that wouldn't be found if this group were thoroughly mixed with other sub-groups of people.
I’m very sorry to hear you’ve had these experiences, thanks for sharing them though.
My experiences have been very different to this but I can (sadly) see how this might happen.
If you want to chat I’d be more than happy to some time and can promise to be welcoming and hope I can help in any way I can.
Feel free to DM me 😀
I’d be interested to know if there’s any psychological research on how niceness and being ethical may be related.
For example, prior to the FTX incident, I didn’t usually give money to beggars, on the grounds that it was ineffective altruism. But now I’m starting to wonder if giving money to beggars is an easy way to cultivate benevolence in oneself, and cultivating benevolence in oneself is an important way to improve as an EA.
Does walking past beggars & rehearsing reasons why you won’t give them money end up corroding your character over time, such that you eventually become comfortable doing what Sam did?
H/T these tweets.
There is a plethora of research on the subject, including a growing body of evidence which suggests we are born with a sense of compassion, empathy, and fairness. Paul Bloom has done some amazing research with babies at the Yale psych lab, and more recently the University of Washington published a study suggesting altruism is innate.
A brief overview of Paul Bloom’s work:
The Moral Life of Babies, Yale Psychology Professor Paul Bloom finds the origins of morality in infants
More on the study from the University of Washington:
Additionally, I’m pasting part of an article here that I found interesting, and it has several relevant studies linked within. I hope this helps.
On the flip side, here is an interesting article by a Stanford professor explaining why arrogance is the biggest risk to unethical behavior for organizations:
I don’t think that not giving beggars money corrodes your character, though I do think giving beggars money improves it. This can easily be extended from “giving beggars money” to “performing any small, not highly effective good deed”. Personally, it was getting into a habit of doing regular good deeds, however small or “ineffective” that moved me from “I intellectually agree with EA, but...maybe later” to “I am actually going to give 10% of my money away”. I still actively look for opportunities to do small good deeds for that reason—investing in one’s own character pays immense dividends over time, whether EA-flavored or not, and is thus a good thing to do for its own sake.
Truly sorry to hear about your experience and thank you for sharing it.
Relevant?
Some chaotic thoughts on this.
I somewhat agree with you. I think the specific way in which some EAs interact with other people, especially at conferences, can be very off-putting. As a community organizer myself, and someone who thinks that expanding the EA community is important, I think people really should work on their communication skills and manners if we want to expand as a community. For context: I am a member of a minority myself, I joined EA about a year ago, and I am a community organizer. 70% of my interactions with EA people have been positive, but 30% were not-so-great.
I have brought some friends interested in EA to conferences, and some of them did not have the most positive experience. One of the most common complaints was that people seemed argumentative, wanting to argue with and check every statement another person made. Now, while it is important to examine people's lines of thought, it can also come off as socially unacceptable. Some things can be challenged—but some things are too personal to people's lived experiences. Being 'socially awkward' or introverted is not an excuse to hurt someone's feelings or to come off as rude. If you want more people to relate to your ideas, you should probably learn how to make them relatable and be more open to experiences or ideas that don't seem rational to you.
Telling someone you don’t want to waste your time on them just like that is plain rude, no possible approach or personality trait can justify this behavior. It’s really easy to say “Hey I am so sorry, I have a very tight schedule and don’t have time for any more 1-on-1s, I hope you understand!”. You can literally copy a polite sentence and respond to everyone with that if you are not interested…
Making others feel they are not important enough is literally a recipe for pushing people away from the EA movement. No one has a monopoly on the movement, even people who think they are 'very high up', 'important', and 'in the mentoring position'. It's frankly very annoying that some people think they are much better than others, and that other people with insecurities fuel that belief and make them feel oh-so-important. This top-down approach often does not work. Most of the people who think they are so much more important than you probably are not.
I agree with you, as I have often felt some weirdness from certain EAs. I think part of it comes from the fact that several of these EAs were born and raised in a context where they don't experience much oppression, and have the privilege of thinking only theoretically about issues that affect some people very concretely. I am not a cultural leftist at all, but I do think that insensitivity to some experiences is a problem in EA at times.
That being said, it's only one subset of people in EA who act the way you and I experienced. There are also people who navigate social contexts better, and talking with them has made my experience in EA much more enjoyable.
Maybe one of the things we could do as a community is bring in people to provide communication training, or run more workshops on how to communicate with others in and out of the community more effectively. It is quite unbelievable that we need communication training just to make someone from the US or Germany understand that someone who is a woman, queer, or from Eastern Europe, the Middle East, or South Asia will probably experience things differently, and that their Western explain-it-all attitude would not be the most effective.
Hi James, thanks for sharing this. As others have said, it is a difficult thing to do. I’m actually weirdly looking forward to the EA criticisms that will come out of this FTX business. You often hear of the abstract need for criticism and “red-teaming” but not much about the actual criticisms.
I think your story chimes with a bigger difficulty in the EA movement: how small-scale effectiveness measures (i.e. not talking to junior EAs) end up stymying the movement on a larger scale (by being unfriendly and putting people off).
I’m also worried about whether a utilitarian movement really can value integrity, friendliness etc. I can see how it might see the value in appearing to have integrity or appearing to value diversity. But if those things get in the way of effectiveness, won’t they be covertly canned?
I’m a 30y.o. in London and consider myself fairly friendly. If you want to talk about stuff, get in touch.
I’m confused about this. A lot of criticisms and red-teaming occurred during the recent competition. Maybe you could clarify what you meant?
Perhaps this was unfair of me. I mean as a casual user of EA social media spaces before last week, I came across non-strawman criticisms, or even expressions of personal doubt, quite rarely. Like any movement, I think there’s a hidden pull to virtue-signal (even when this is explicitly recognised as a danger), and it certainly seems like the FTX thing has given more people confidence to air reservations they had been keeping to themselves (and I don’t mean the people saying “I saw this coming and didn’t tell anyone”).
Thanks for pointing me to the red-teaming contest. I read the summaries of the 3 top winners, and I guess I was using the wrong definition of red-teaming in my comment here. I'm interested in fundamental criticisms of EA as a philosophy and as a movement. Not necessarily because I'm looking to disavow EA, but because a) I want to know how best to communicate it to a sceptical audience and b) I think such criticisms can be useful in deciding what to prioritise in meta-EA.
I guess the way I see it, the more intellectually solid a movement is, the more effort it takes to produce a solid criticism of it. So if a movement is intellectually solid, a lot of the criticism on social media will end up being very bad, because social media pushes towards lower effort than other formats such as the EA Forum.
(Another way of putting this: if you're going to go to all the effort of making a proper critique, why post it on Facebook rather than the EA Forum, where you'll get deeper engagement?)
I've had a similarly awful experience, although with less direct negative feedback. Without going into too much detail, I have deep domain expertise in a top EA cause area. While some of the EAs I've met working on this problem are brilliant, some are dangerously naive and overconfident to the point of likely causing immense harm over their careers. Unfortunately, some of these people work for funders. I have tried to get excited about doing more in the EA community for years but just couldn't overlook these seemingly obvious errors in judgement.
Side note: not everyone talks openly about their wealth, and there are many people who could be medium-to-major donors whom EA regularly burns through these bad professional interactions. There are a lot of people like me who are highly sympathetic to the EA worldview, with ~tens of millions of net worth, who could plausibly become EA donors or at least % pledgers. The hubris around these two highly volatile, illiquid funding sources has been astonishing to witness as an outsider.
I’ve definitely seen well-meaning people mess up interactions without realizing it in my area (non-EA related). This seems like a really important point and your experience seems very relevant given all the recent talk about boards and governance. Would love to hear more of your thoughts either here or privately.
Thank you for your post, you’ve articulated what several others have privately shared with me. It is precisely because of the attitude you described that these individuals are reluctant to address this issue within the community.
While there are many good, kind, and intelligent people in EA, there are also some who are not.
In my humble opinion, EA might benefit from spending more time studying and embodying the moral principles of virtue ethics, and less time on utilitarianism and the rationalist community.
"How you treat the one reveals how you regard the many, because everyone is ultimately a one."
Stephen R. Covey, author of “7 Habits of Highly Effective People”
Thank you for sharing. I have had similar experiences myself. Since the FTX saga, I have found myself poring over Twitter and the EA Forum, and I have realized that this is, in part, due to wanting to find comments like yours that would confirm what I had already known deep down:
1. Doing good better doesn’t necessarily equate to being a good human being. I’m sure there are good EA humans who are looking to do the most good with their resources, but it seems to me that the former outweighs the latter when it comes down to it.
2. Intellectual snobbery/superiority is seen as something to be proud of. EA has grown from originally focusing on global health and poverty, to animal welfare, to longtermism—quite a mixed bag actually. It seems like you have to accept the priorities presented by lead EA orgs/thinkers to qualify as EA (degrees of variation are OK as long as you still subscribe to them in general). Any disagreement you have is seen as you not coming from the same intellectual depths as the Oxbridge/Ivy League philosophers and economists who dominate the EA movement.
I want to add: I’ve had a few similar experiences of being rudely dismissed where the person doing the rude dismissing was just wrong about the issue at hand. I mean, you, dear reader, obviously don’t know whether they were wrong or I was wrong, but that’s the conclusion I drew.
Furthermore, I think Gell-Mann amnesia is relevant here: the reason I’m so confident that my counterpart was wrong in these instances is because I happened to have a better understanding of the particular issues—but for most issues I don’t have a better understanding than most other people. So this might be more common than my couple of experiences suggest.
I’ve had a roughly equal number of good experiences working with EAs, and overwhelmingly good experiences at conferences (EAGx Australia only).
Thank you for sharing your perspective, I have heard many people sharing similar intuitions. And I agree, EA events often lack a certain humanity. Even local groups are supposed to be outreach vehicles first, and only some try to also be communities of mutual support.
That said, less official community-driven grassroots events feel much better. I’ve been to a bunch here in Germany. But yes, EA-branded events can feel inhumanely corporate and soulless.
I try to do my part by bringing humanity where I can and I encourage anyone reading this to be thoughtful in their interactions with the community.
Really sorry to hear about your experiences. I hope that if you attend further events, you have better experiences.
I think it is valid for EA to think carefully about its target audience and whether we want to target "intellectual elites" or recruit more broadly, but that doesn't mean people should be rude at events, and I'm sad whenever I hear people discuss this in an unproductive way.