Epistemic Status: Thinking out loud.

Also: Pardon the long comment, I didn’t have the time to write a short one. No one is under any obligation to address everything or even most things I said when writing replies.
Over the past 4 years of being involved in-person with EA, my instinctive reaction to this problem has mostly been to argue against it whenever anyone tells me they personally think like this.
I think I can ‘argue’ convincingly against doing the things on your list, or at least against the angst that comes with them.
I talk about how you should compare yourself to reality, not to other people. I talk about the fact that if I generalise your reasoning, this means you think all your friends should also feel bad about themselves—and also (to borrow your example) so should Carl Shulman. He could be 100x more impactful. He’s not the single most impactful person, and this is a heavy-tailed world, so apparently he should feel like barely a percentage point of worth when comparing himself to some other people.
I bring up Sam Altman’s line, where he says the most impactful people “Spend ~1 year exploring broadly, ~4 years relentless focus executing on the most interesting direction, repeat” which is in direct conflict with “[constantly] obsess about their own personal impact and how big it is” and “generally feel miserable about themselves because they’re not helping the world more”. Altman’s quote is about allocating time to different kinds of thinking and not thinking all the thoughts all the time.
In line with the latter, I often try to identify the trains of thought running through someone’s mind that are causing them pain, and to help them set aside specific times for dealing with those thoughts, rather than letting them run constantly.
I have conversations like this:
I hear that you are constantly worrying about how much money you’re spending because you could be donating it. I think your mental space is very important, so let me suggest that instead of filling it with this constant worry, you could set a few hours aside every month to figure out whether your current budget is okay, and otherwise not think about it.
Would you trust that process? Can we come up with a process you would trust? Do you want to involve your friends in your process to hold you accountable? Do you want to think about it fortnightly? Do you want to write your conclusion down on paper and stick it to your wall?
It’d be good to have a commitment like this to rely on. Rather than comparing your moral worth to a starving African child every time you’re hungry and need food, I want you to be able to honestly say to yourself: “I’ve set time aside for figuring out what financial tradeoffs I can make here, and I trust myself to make the right call at that time, so thinking about it more right now isn’t worthwhile; I’ll just follow what I decided to do.”
And yet somehow, given what has always felt to me like a successful attempt to clearly lay out the considerations, the problem persists, and people are not reliably cured after I talk to them. I mean, I think I have helped, and helped some people substantially, but I’ve not solved the general problem.
When a problem persists like this, especially for numerous people, I’ve started to look instead to incentives and social equilibria.
Incentives and Social Equilibria in EA
Here is a different set of observations.
A lot of the most successful parts of EA culture are very mission-oriented. We’re primarily here to get sh*t done, and this is more central than finding friends or feeling warm fuzzies.
EA is the primary place in the world for smart young people who reason using altruism and empathy to make friends with lots of others who think in similar ways, and to get advice about how to live their lives and build their careers.
EA is new and young, it’s substantially built over the internet, and it doesn’t have many community elements to it. It’s mostly a global network of people with a strong intellectual and emotional connection, rather than a village community where all the communal roles can be relied on to be filled by different townsfolk (caretakers, leaders, parents, party organisers, police, lawyers, etc.).
Large EA social events are often the primary way many people interact with people who may hire them in the future, or whom they may wish to hire. For many people who identify as “EA”, these events are also the primary environment in which they can interact with widely respected EAs who might offer them jobs some day. This is in contrast with parties within major companies or universities, where there is a very explicit career path that leads to promotion. In OpenPhil’s RA hiring round, I think there were over 1000 applications, of which I believe they have hired and kept 4 people. Other orgs’ hiring is similarly slow. This suggests that in general you shouldn’t expect a career progression within orgs run by the most widely respected EAs.
Many people are trying to devote their entire lives to EA and EA goals, and give up on being committed members of other cultures and communities in the pursuit of this. (I was once at a talk where Anna Salamon noted, with sadness, that many people seem to stop having hobbies as they move closer into EA/Rationality.)
This puts a very different pressure on social events. Failing to impress someone at a party or other event sometimes feels not merely like a social disappointment, but like a blow to your whole career, financial security, and social standing among your friends and acquaintances. If the other people you mainly socialise with also attend those parties (as is true for me), these large events in many ways set the norms for social events in the rest of your life, with other gatherings heavily influenced by the dynamics of what is rewarded/punished in those environments.
I think this puts many people in bad negotiating positions. With many other communities (e.g. hobby communities built around sports/arts, or centuries-old professional communities like academia/finance), if the one you’re in isn’t healthy for you, it’s always an option to find another sport, or another company. But, speaking personally, I don’t feel there are many other communities that are going to be able to proactively deal with the technological challenges of this century, that are smart and versatile and competent and care enough about humanity and its future to work on the existential problems. I mean, it’s not like there aren’t other places I could do good work, but I’d have to sacrifice a lot of who I am and what I care about to feel at home within them. So leaving doesn’t tend to feel like much of an option (and I haven’t even written about all the evolutionary parts of my brain screaming at me never to do anything socially risky, never mind decide to leave my tribe).
So the standards of the mission are used as the standards of the community, and the community is basically hanging off of the mission, which leads people to apply those standards to themselves in places one would never normally apply them (e.g. self-worth and respect from friends).
Further Thoughts
Hmm, on reflection, something about the above feels a bit stronger than the truth (read: false). As with other healthy professional communities, I think in many parts of EA and rationality the main way to get professional respect is to actually build useful things and have interesting ideas, far more than to have good social interactions at parties[1]. What I’m trying to talk about is the strange effect it has when there’s also something like a community or social group built around these professional groups, one that people devote their lives to, and that isn’t massively selective, insofar as it’s not just the set of people who work full-time on EA projects, but anyone who identifies with EA or likes it.
I think it is interesting, though, to imagine a fairly competent company with hundreds of employees, and to ask what would happen if a group of people tried to build their entire social lives around the network inside that company, and genuinely tried to live in accordance with the value judgements that company made, where the CEO and top executives were the most respected. Not only would this community exist inside the company, but lots of other people who like what the company is doing would turn up to its events, and also be judged precisely in accordance with how much utility they’re providing the company, and how they’re evaluated by the company. And they’d keep trying to get hired by the company, even though there are more people in the community than in the company by like 10x, or maybe 100x.
I think that’s a world where I’d expect to see blogposts, by people both in the community and throughout the company, saying things like “I know we all try to judge ourselves by where we stand in the company, but if you die having never become a top executive, or even been hired, maybe you shouldn’t feel like your life has been a tragic waste?” And these get mixed in with weird, straightforwardly false messages that people sometimes say behind closed doors just to keep themselves sane, like “Ah, it only matters how much you tried, not whether you got hired” and “Just caring about the company is enough; it doesn’t matter if you never actually helped the company make money.”
When the company actually matters, and you actually care about outcomes, these memes are at best unhelpful, but when the majority of community members around the company can’t do anything to affect the trajectory of the company, and the community uses this standard in place of other social standards, these sorts of memes are used to avoid losing your mind.
--
[1] Also, EA (much more than the LessWrong in-person diaspora) has parts that aren’t trying to be a community or a company, but a movement, and that has further weird interactions with the other parts.
Related: https://forum.effectivealtruism.org/posts/rrkEWw8gg6jPS7Dw3/the-home-base-of-ea
“But EA orgs can’t be inclusive, so we should have a separate social space for EA’s that is inclusive. Working at an EA org shouldn’t be the only option for one’s sanity.”