EA and Belonging word splurge
Epistemic Status: Similar to my other rants / posts, I will follow the investigative strategy of identifying my own personal problems and projecting these onto the EA community. I also half-read a chapter of a Brene Brown book which talks about belonging, and I will use the investigative strategy of using that to explain everything.
I’ve felt a decreased sense of belonging in the EA community which leads me to the inexorable conclusion that EA is broken or belonging constrained or something. I’ll use that as my starting point and work backwards from there.
Prioritising between people isn’t great for belonging
The community is built around things like maximising impact, and prioritisation. Find the best, ignore the rest.
Initially, it seemed like this was more focused on prioritising between opportunities, eg. donation or career opportunities. But it seems like this has in some sense bled into a culture of prioritising between people, and that doing this has become more explicit and normalised.
Eg. words I see a lot in EA recruitment: talented, promising, high-potential, ambitious. (Sometimes I ask myself, wait a minute… am I talented, promising, high-potential, ambitious?) It seems like EA groups are encouraged to focus on the highest-potential community members, as that’s where they can have the most impact.
But the trouble is, it’s not particularly nice to be in a community where you’re being assessed and sized up all the time, and different nice things (jobs, respect, people listening to you, money) are given out based on how well you stack up.
Basically, it’s pretty hard for a community with a culture of prioritisation to do a good job of providing people with a sense of belonging.
Also, heavy-tailed distributions: EAs love them. Some donation opportunities/jobs are so much more impactful than others. If the thing you’re doing isn’t in the good bit of the tail, it basically rounds to zero. This is kind of annoying when, by definition, most of the things in a heavy-tailed distribution aren’t in the good bit.
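To make the heavy-tail point concrete, here’s a small illustrative sketch (not EA data; the Pareto shape parameter 1.2 and the sample size are arbitrary assumptions of mine) showing that in a heavy-tailed sample the top 1% carries a large chunk of the total, while the median item is tiny relative to the best one:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Draw 10,000 hypothetical "impact" values from a Pareto distribution
# (shape alpha = 1.2, an arbitrary illustrative choice).
samples = sorted(random.paretovariate(1.2) for _ in range(10_000))

total = sum(samples)
top_1_percent_share = sum(samples[-100:]) / total  # share held by top 100 draws
median = samples[len(samples) // 2]
best = samples[-1]

print(f"Top 1% share of total: {top_1_percent_share:.0%}")
print(f"Median as a fraction of the best: {median / best:.5f}")
```

Most draws land nowhere near the top, which is exactly the “rounds to zero” feeling described above.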
Is belonging effective?
A sense of belonging seems nice, but maybe it’s just a nice-to-have, like extra legroom on flights or not working on weekends. Fun, but not necessary if you care about having an impact.
I think my take is that for most people, myself included, it’s a necessity. Pursuing world optimisation is only really possible with a basis of belonging.
Here’s a nice image from Brene Brown’s book which I’ve lightly edited for clarity.
I think the EA community provides some sense of belonging, but probably not enough to properly keep people going. Things can then get a bit complicated, with EA being a community built around world optimisation.
If people have a not-quite-fully-met need to belong, and the EA community is one of their main sources of a sense of belonging, they’ll feel more pressure to fit in with the EA community—eg. by drinking the same food, espousing the same beliefs, talking in the same way etc.
I don’t understand how to belong to something as massive and distributed as EA. Instead I belong to little pieces: wee communities nested inside the movement. I belong to my org. I belong to EA Bristol. I belong to EA Twitter and DEAM, though I often wish I didn’t (which breaches your children’s definition). Common themes: co-location or daily group chats, shared memories and individuation, lulz. Which of these are you not getting?
[I see that many people ‘belong to’ structures as massive and nebulous as EA, e.g. religions. But I don’t really get it.
I’m not the ideal person to talk about this because I have an anomalously low need for belonging and don’t really get it in general.]
Belonging vs. fitting in
Brene Brown asked some eighth graders to come up with the differences between ‘fitting in’ and ‘belonging’.
Some of their things:
Belonging is being somewhere where you want to be, and they want you. Fitting in is being somewhere where you want to be, but they don’t care one way or the other.
Belonging is being accepted for being you. Fitting in is being accepted for being like everyone else.
I always find it a bit embarrassing when eighth graders have more emotional insight than I do, which, alongside their poor understanding of Bayesianism, is why I tend to avoid hanging out with them.
I’ve had experiences of both belonging and fitting in with EA, but I’ve felt like the fitting in category has become larger over time, or at least I’ve become more aware of it.
Belonging, fitting in, and why do EAs look the same?
There’s this thing where after people have been in EA for a while, they start looking the same. They drink the same Huel, use the same words, have the same partners, read the same econ blogs… so what’s up with that?
Let’s take Brene Brown’s insightful eighth graders as a starting point:
Belonging is being accepted for being you. Fitting in is being accepted for being like everyone else.
Things are good and nice hypothesis: EAs end up looking the same because they identify and converge on more rational and effective ways of doing things. EA enables people to be their true selves, and EAs’ true selves are rational and effective, which is why everyone’s true selves drink Huel.
Cynical hypothesis: EAs end up looking the same because people want to fit in, and they can do that by making themselves more like other people. I drink Huel because it tells other people that I am rational and effective, and I can get over the lack of the experience of being nourished by reminding myself that Huel is scientifically actually more nourishing than a meal chewed while sat round a dinner table with other people.
Fitting in with one group makes it harder to fit in with other groups + me being annoying
One maybe sad thing on the cynical hypothesis is that the strategy for fitting in in one group, eg. adopting all these EA lifestyle things, decreases the fit in other groups, and so increases the dependence on the first group… eg. the more I ask my non-EA friends what their inside views on AI timelines are, the more they’re like, this guy has lost the plot, and stop making eye contact with me.
(In my research for this post I asked a friend ‘Did I become more annoying when I got into the whole EA stuff? It would be helpful if you could say yes because it will help me with this point I’m trying to make.’ And he said ‘Well there was this thing where you were a bit annoying to have conversations with about the world and politics and stuff, because you had this whole EA thing and so thought that everything else wasn’t important and wasn’t worth talking about, because the obvious answer was do whatever is most effective… but tbh otherwise not really, you were always kind of annoying.’)
I found this post harder to understand than the rest of the series. The thing you’re describing makes sense in theory, but I haven’t seen it in practice and I’m not sure what it would look like.
What EA-related lifestyle changes would other people find alienating? Veganism? Not participating in especially expensive activities? Talking about EA?
I haven’t found “talking about EA” to be a problem, as long as I’m not trying to sell my friends on it without their asking first. I don’t think EA is unique in this way — I’d be annoyed if my religious friends tried to proselytize to me or if my activist friends were pressuring me to come and protest with them.
If I talk about my job or what I’ve been reading lately in the sense of “here’s my life update”, that goes fine, because we’re all sharing those kinds of life updates. I avoid the EA-jargon bits of my job and focus on human stories or funny anecdotes. (Similarly, my programmer friends don’t share coding-related stories I won’t understand.)
And then, when we’re not sharing stories, we’re doing things like gaming or hiking or remembering the good times, all of which seem orthogonal to EA. But all friendships are different, and I assume I’m overlooking obstacles that other people have encountered.
(Also, props for doing the research!)