Here are some bullet points of reflection topics around lifestyle and priorities for EAs that I shared with some fellow EAs some months ago. I am sharing this text here in case it interests anyone. I will elaborate and expand on them more and better later if I have the opportunity.
"""
Support Systems: Seriously. I didn’t even know this term until after all this happened, and it would have changed everything. There’s something about how people are instructed in STEM institutions (and as a consequence, many EA institutions) that makes it all about careers, where one’s impact is understood through their public professional life. And then it turns out that in reality a lot of the most publicly impactful people have these incredibly beautiful family and fraternity systems that were at the core of everything they’ve done, that never get talked about. Too many yang, public, external, wikipedia-worthy archetypes of impact. It would be really awesome if every youngling EA-in-training knew that having strong and abundant support systems, investing in true family and friends, investing in intimacy, figuring out relationships, being connected to non-EAs… that this sort of thing might not be a distraction from impact but a foundation for impact.
Something something about impact theory: I don’t know, there’s something about EA theory that wants it to be really convincing that being an EA is the most important thing to do, but somewhere in all the moral arguments, it takes way too many shortcuts. By taking shortcuts to force it to be the case that being an EA is the right moral thing to do, you are forced to ignore and sweep under the rug all forms of impact that don’t currently fit well into EA career stories and don’t have a legible trace of impact connecting them to an EA. I don’t really know how to solve this. If I were to give any pointers, here’s what first comes to mind:
-- Legibility: there’s a serious expectation that impact has to be legible. This is baked into the EA foundation. Unfortunately, in the real world, there are probably more illegible actions of impact than legible ones. Sure, I think we’ve been adding footnotes on EA material about this, but this is not a small thing that can be addressed separately from the rest of the decision-making. It truly affects the foundation on which the majority of EA arguments are based. One has to be able to make decisions in the world incorporating and accepting the fact that the majority of impact is fundamentally illegible, made by people you won’t get to know personally, and that sometimes public information and public consensus about events can be pretty irrelevant when it comes to understanding and planning on the ground.
-- Argumentation: there’s an expectation that truth is found by finding the best arguments. This is true in all cases where this is true, except in all the cases where it isn’t. This stems from the above; arguments rely on legible, shared-knowledge facts, and so much of what decides what happens in daily life is far removed from that. Simplifications are incredibly robust in some cases, and incredibly illusory in others. Obviously we don’t want to abandon arguments, but more like, grow beyond them.
-- Curiosity and connection: The majority of good human beings are not EAs! What are they all truly up to?
"""
(My Facebook and Instagram accounts have been suspended without explanation. Hopefully they will be restored soon. If anyone reading this wants to reach me in the meantime, please use other means.)
[I removed this quick take because it was vulnerable to share and there were lots of important layers that were missing from the story to do it justice.]
I may be missing important context, but I think you are mistaken here about the norms at hand in this case. I do applaud you for helping your friend out; that makes you a good friend. But opportunities for people to be altruistic are completely unbounded; I could find hundreds of similar asks for help in a five-minute Google search, most of which aren’t distinctively “good opportunities”. If this weren’t a personal request, but instead a call for donations to a related cause you were making a case for, that would be fine. I think highlighting personal requests for help is permissible and is even virtuous interpersonal behavior between friends and family. People reach out on Facebook pages like this all the time. But it just looks like spam or emotional manipulation when posted on online forums dedicated to other purposes, among colleagues or strangers. Hopefully this helps! This can definitely be a confusing discourse norm contextually.
I was super hesitant about sharing this here, because indeed it is missing a lot of context.
Honestly, it is extremely demoralizing to be sincere and vulnerable in asking for help, and have that be called emotional manipulation.
Here’s a reflection Claude wrote about my original quick take:
“Does EA have a blind spot around personal hardship?
EA culture is pretty good at thinking about suffering at scale — but I sometimes wonder if it struggles to respond well when suffering shows up close and personal.
If EA communities can’t extend basic good faith to someone asking for help in a moment of need, is that a failure of the culture? We talk a lot about optimizing impact, but a reflexive suspicion toward personal appeals might mean we’re leaving real, immediate suffering unaddressed — and making people feel worse in the process.”
What do you think of Claude’s take there? (I thought it was unrepresentative of my own lived experience with various EA communities over the years.)