I’ve been thinking about this quite a bit recently. It’s not that I see myself as a “mediocre” EA, and in fact I work with EAs, so I am engaging with the community through my work. But I feel like a lot of the attitudes around career planning in EA sort of assume that you are formidable within a particular, rather narrow mould. You talk about mediocre EAs, but I’d also extend this to people who have strong skills and expertise that’s not obviously convertible into ‘working in the main EA cause areas’.
And the thing is, this kind of makes sense: like, if you’re a hardcore EA, it makes sense to give lots of attention and resources to people who can be super successful in the main EA cause areas, and comparatively neglect people who can’t. Inasmuch as the community’s primary aim is to do more good according to a specific set of assumptions and values, and not to be a fuzzy warm inclusive space, it makes sense that there aren’t a lot of resources for people who are less able to have an impact. But it’s kind of annoying if you’re one of those people!
Or like: most EA jobs are crazy competitive nowadays. And from the point of view of “EA” (as an ideology), that’s fine; impactful jobs should have large hiring pools of talented committed people. But from the point of view of people in the hiring pool, who are constantly applying to and getting rejected from EA jobs—or competitive non-EA jobs—because they’ve been persuaded these are the only jobs worth having, it kinda sucks.
There’s this well-known post ‘don’t be bycatch’; I currently suspect that EA structurally generates bycatch. By ‘structurally’ I mean ‘the behaviour of powerful actors in EA is kinda reasonable, but also it predictably creates situations where lots of altruistic, committed people get drawn into the community but can’t succeed within the paradigms the community has defined as success’.
a lot of the attitudes around career planning in EA sort of assume that you are formidable within a particular, rather narrow mould
This is something I’ve thought about before, but I really like that you put it into words.
If you will indulge me in rambling/ranting a little: I remember looking at 80k’s guidance on careers in the area of Improving China-Western coordination a few years ago. China is an area that I know a bit about and wanted to make a core of my career.[1] I was disappointed that most of their recommendations were not realistic for someone who wasn’t an ‘elite.’[2] I came away with the general impression that the authors didn’t really grasp the realities and challenges involved for a non-Chinese person building a career in China, but in retrospect it could simply have been written with Ivy League grads in mind, in which case I was not the target audience. Many of the options listed that struck me as unavailable/unrealistic would be much more feasible if I had an undergraduate degree from Princeton or Yale or Stanford. So as a person who already had a decent amount of China-relevant knowledge and experience, I basically came away understanding that “there isn’t any way for me to contribute to this area” and that I would have to have some sort of an ‘in’ with a professional network in order to contribute professionally.
[1] China was the main focus of my bachelor’s degree, which included learning the language, and I had lived in Beijing for about eight years at that point.
[2] Examples of these include recommendations to work as a foreign journalist (which is very competitive and prestigious in China), apply for top scholarships for special master’s degrees in China (one scholarship requires applicants to be younger than 29, the other prefers candidates younger than 25), work at a think tank, work for the Ford Foundation, work for the Gates Foundation, work at top Chinese companies, and so on.
I’d also extend this to people who have strong skills and expertise that’s not obviously convertible into ‘working in the main EA cause areas’.
I think this is a key part. “Main EA cause areas” does centre a lot on a small minority of people with very specific technical skills and the academic track record needed to participate (especially if you’re taking 80,000 Hours for guidance on that front).
But people can have a lot of impact in areas like fundraising with a completely different skillset (one that is less likely to benefit from a quantitative degree from an elite university) or earn well enough to give a lot without having any skills in research report writing, epidemiology or computer science.
And if your background isn’t one that the “do cutting-edge research or make lots of money to give away” advice is tailored to at all, there are a lot of organizations doing a lot of effective good that really, really need people with the right motivations allied to less niche skillsets. So I don’t think people should feel they’re not a ‘success’ if they end up doing GHD work rather than paying for it, and if their organization isn’t particularly adjacent to EA they might have more scope to positively influence its impactfulness.
Also, people shouldn’t label themselves mediocre :)
There are people who are good at EA-related thinking, and people who are less good at that.
There are people who are good at accumulating resume padding, and people who are less good at that.
Although these are correlated, there will still be plenty of people who are good at EA thinking but bad at accumulating resume padding. You can think of these people as having fallen through the cracks of the system.
Advances in LLMs give me the impression that we’re roughly 2-5 years out from most EA orgs becoming much better at correctly identifying and drawing talent from this pool, e.g. via higher-quality summaries of posts and notes, or by tracing the upstream origins of original ideas.
I’m less optimistic about solutions to conflict theory/value alignment issues, but advances in talent sourcing and measurement might give orgs more room to focus hiring/evaluation energy on character traits. If talent is easy to measure, there’s less incentive to shrug and select candidates based on metrics that have historically correlated with talent, e.g. credentials.