Getting People Excited About More EA Careers: A New Community Building Challenge
Summary
The discussion around the current EA org hiring landscape has highlighted that some people may disproportionately favour working at explicit EA organisations over other EA careers. Current community building may contribute to this effect in two ways: by disproportionately rewarding people for wanting to work at EA orgs, and by providing less assistance in finding work at non-EA organisations. This may lead to reduced motivation for people who fail to secure jobs at explicit EA organisations, and to talent misallocation. Strengthening ‘core’ community building within professional groups and finding new signals to quickly demonstrate EA commitment and understanding may help to make a larger variety of careers exciting to committed EAs and thus reduce the effect.
Introduction
It has become evident in the past few months that many highly qualified applicants are unable to land a job at explicitly EA-motivated organisations. A recent EA Forum post by an unsuccessful applicant provided an illustrative example of the situation and resonated strongly.
In the discussion that followed, several commenters stated a significant preference for working at EA organisations rather than pursuing further skills building or relevant work at other institutions. For example, Max Daniel writes:
“I echo the impression that several people I’ve talked to—including myself—were or are overly focussed on finding a job at a major EA org. This applies both in terms of time spent and number of applications submitted, and in terms of more fuzzy notions such as how much status or success is associated with roles. I’m less sure if I disagreed with these people about the actual impact of ‘EA jobs’ vs. the next best option, but it’s at least plausible to me that (relative to my own impression) some of them overvalue the relative impact of ‘EA jobs’.”
The author of the post also shared this impression:
“My other top options [...] felt like “now I have to work really hard for some time, and maybe later I will be able to contribute to X-risk reduction”. So still like I haven’t quite made it. In contrast, working for a major EA org (in my imagination) felt like “Yes, I have made it now. I am doing super valuable, long-termism-relevant work”.”
It is great news that many people are very excited to work at explicit EA organisations. However, there is a sense that jobs at EA-motivated employers are disproportionately desirable to some, relative to the expected impact of these options. In the discussion there seemed to be little disagreement with the claim that many options at non-EA employers could well be competitive with jobs at explicit EA organisations in terms of expected value. That is, there are many career paths outside EA organisations that would be competitive even if you had secured a job offer from a major EA org, and even adjusting for the risk that you end up less successful in that path than you’d hoped. However, there still appears to be a substantial preference on a more ‘fuzzy’ level for working at these explicit EA organisations. To be clear: it is awesome that working at EA orgs is desirable to many, but concerning that people may find other EA careers less appealing for the wrong reasons.
If this is the case then it may lead to a misallocation of talent within the community, and reduced motivation for many other EA career paths as they are perceived (or felt) to be less appealing. In this post I want to explore whether some aspects of the EA community contribute to this effect and how community building efforts could help reduce it. In other words, I want to start a conversation on how we can make EA roles outside explicit EA organisations similarly exciting and emotionally appealing to committed EAs.
There are many more issues arising from the current hiring situation that I am not looking into here (What to do with all the people? Does the high number of applicants affect the counterfactuals for working at an EA org? What emotional effects does a very meritocratic and competitive community have on some members?), which may well be more important, but I am sure others will write more on these (see also: 1 2 3).
The cause
Some reasons for this “EA org bias” have been discussed in the comments (e.g. here and here). Surely an individual’s preferences for a role at an EA org will be influenced by many factors. However, I want to highlight two aspects where the current community may disproportionately favour careers at EA orgs.
Social incentives favour working at explicit EA organisations
In recent years EA community building efforts have focused on building a core community of very motivated and well-coordinated individuals. As a side effect, there is an increased need to quickly determine how committed a community member is to EA, and how sophisticated their understanding of the core concepts is.
For example, if I were looking for collaborators on a project within, say, biosecurity or politics, I would need to know that a collaborator shared my motivation and commitment before sharing confidential details about the project. Likewise, group leaders may spend more effort on members whom they perceive to be very committed to EA and to have a sound understanding of the concepts, for example by inviting them to more advanced discussion groups or retreats, or by referring them to others in the community.
Working at explicit EA organisations, or at least telling others that you want to, is a fantastic way to signal both commitment to and understanding of EA. For example, if you go to EA Global and introduce yourself as an FHI researcher, you’ll have an easier time finding potential collaborators than if you work at a non-EA think tank, even if you are equally committed and nuanced in your understanding of EA concepts, and possibly even if your think tank is more influential than FHI. If you tell others in your local group that you’re applying to CEA and turning down your offer from Bridgewater, they’ll probably rank you highly in their impact report and safely consider you a community building ‘success’. If you had ‘just’ started an economics PhD, they might be less sure.
It is noteworthy that the effect goes both ways: community builders will use the desire to work at an EA organisation as a quick proxy for commitment & understanding, and members will use it as a fast way to signal these.
Relatedly, if you’re very excited about EA, you’ll probably want to discuss the ideas with people who are at the cutting edge. After reading about the ‘core’ and ‘periphery’ EA community, and hearing that EA is very much trust- and network-based, it’s understandable that many want to become part of this elusive ‘core’ community. However, the only clear access point to it is working at an EA organisation.
Finally, I suspect you sometimes feel you have to ‘justify’ your career choices to other community members if you’re going down a more speculative path. The easiest way to avoid this is to follow a standard EA career path, such as working at an explicitly EA-motivated employer.
It requires less independent work to want to apply to EA organisations
Another quote from the discussion:
“Somehow, EA positions were the positions I heard about. These were the positions that 80000 hours emailed me about, that I got invitations to apply for, that I found on the websites I was visiting anyway, and so on. Of course, the bar for applying for such a position is lower than if you first have to find positions yourself.”
I actually think 80,000 Hours does a decent job of including many different roles on their job board and linking to relevant resources in their career reviews. However, some career paths still require significant additional effort. If you want to work at EA orgs, you have to learn all about EA. If you want to work at non-EA orgs, you have to both learn all about EA to know what to look for and then learn all about the other field you’re getting into to figure out which opportunities are actually good. Plus, if you’re unsure which EA org to work for, lots of people will have good advice. This looks different if you’re looking for the best option within, say, biosecurity.
I don’t think this is an issue of people being lazy. After learning all the EA concepts, thinking hard about cause prioritisation, testing what types of work would fit you, and so on, it is no small ask to also do a big survey of a non-EA field to figure out the most promising opportunities. A mindset of looking for the very best option does not make this task less daunting.
At the same time, if community members build expertise in these fields early they may have a particularly large impact by helping others to follow them in these paths.
Current community building probably does not help to reduce the effect. For example, speaker events or discussion groups are more likely to focus on questions around cause prioritisation and community building (the domain of EA orgs), and rarely on how to land a marketing job at a big corporation or a place in an economics PhD programme. Moreover, some students will stop going to career-based societies, which in turn means they know less about non-EA careers. I suspect most EA career advice is now given by group leaders (whether full-time or volunteer), who also know a lot more about EA organisations than about promising professors in academia or excellent corporations to work at for skills building.
The problem
The described situation may, among other things, lead to two related negative effects:
Reduced motivation. If someone incorrectly thinks or feels that working at an explicit EA organisation is much higher impact or more desirable than the alternatives, being rejected by these organisations may be particularly painful and dissatisfying. Being less excited about the other options may then also reduce their motivation to work hard in these careers, reducing their overall impact. Or, in the worst case, they might not work hard to figure out the next best option at all, as they suspect ‘real’ EA careers are out of their reach anyway and their impact does not matter much.
Talent misallocation. Someone who would be very excited to work at an EA organisation might be less excited about taking other training opportunities. For example, imagine a budding cause prioritisation researcher who is rejected in 2019 by top EA research organisations and thus chooses to work at a different explicit EA organisation. However, this organisation focuses on a different cause area or does not provide good mentorship. The budding researcher could well be better off doing a great economics or philosophy PhD with a leading professor if she didn’t have this preference for working at an EA organisation.
A major caveat here is that it’s very unclear to me how significant this “EA org bias” actually is, i.e. how pervasive it is among applicants and community members, and how strongly individuals experience it. In reality the situation will rarely be as clear-cut, and it will be hard to judge in each individual case whether there is in fact an “EA org bias” at work. I believe my final conclusions are probably robustly good regardless, but it’s hard to sense how urgent or important resolving the issue is.
Below I list four examples of areas that I perceive to be presently undervalued among recent, highly dedicated EA university graduates. Each probably deserves more justification than I give, but I hope they can illustrate the discussion. Also, I actually have no idea what the majority of highly dedicated EA university graduates are thinking (apart from being one myself), so I may get this totally wrong.
Graduate studies with great professors or at top universities in subjects like economics and philosophy (for cause prioritisation research), public policy, international relations and psychology (for policy) or synthetic biology and epidemiology (for biorisk). 80,000 Hours has been talking a lot about this being a good idea, though my impression is it has resonated less than the advice about working for EA organisations.
Skills building in non-EA organisations such as start-ups, management consultancies or big corporations to develop operations, marketing and/or management skills. For example, a significant fraction of current CEA staff gained substantial professional experience before joining.
Earning to give seems to be unpopular among highly dedicated EA university graduates. I share the concern, put forward repeatedly in this forum (e.g. in this thread), that EtG has somehow become ‘uncool’. However, quantitative trading is still one of 80,000 Hours’ priority paths. The career path also has a remarkably high upside potential, making it in principle competitive with most other options.
A final category includes speculative, non-standard or niche career paths. If people disproportionately aim for standard roles at EA organisations, we may be missing out on unconventional opportunities that have high expected impact or exploration value, or that contradict mainstream EA community beliefs. More commonly discussed examples include expertise in nanotechnology or geoengineering, and attempts at institutional reform. I suspect there are more interesting examples that, by their very nature of being niche, I do not know about.
As a side note, we do not appear to have this problem in some other areas, like technical AI safety. Machine learning PhDs or jobs at leading AI research organisations (e.g. DeepMind) seem to be clear paths, can absorb a lot of people and have a lot of community status associated with them. I’m unsure how people interested in AI policy or China feel, as these areas have received significant attention but the concrete career paths seem a bit less clear than EA orgs or technical AI safety.
Some ideas to improve
Strong professional ‘core’ communities
If working at explicit EA organisations is currently perceived to be the only way to join the ‘core’ community, the natural response is to build and foster more ‘core’ EA communities around people working in other fields.
An example where this may already be partly in place is EA London’s finance community. A graduate can look forward to taking an earning-to-give role in London, meeting key donors in the community and having interesting discussions there.
However, this is not yet the case if you work for the WHO to become a biosecurity specialist, if you’re working in a tech start-up in Berlin for skills building or if you’re doing a geoengineering PhD at a university outside an EA hub.
Local EA groups are certainly trying to partly provide this, and they will be part of the answer. However, it seems unlikely we’ll have a sufficient density of community members in more than a small number of places.
A possible way of looking at this is to see the goal of getting people to take EA jobs as composed of ‘push’ and ‘pull’ factors. The ‘push’ is the original community building that makes people want to contribute in the first place, and it is what a lot of community building efforts have focused on so far. The comparatively neglected ‘pull’ side is where we make EA careers as attractive as possible.
EA organisations appear to be good at the ‘pull’ side of things, and that is great. Building strong communities around people working at organisations that are not explicitly EA could be one way to improve the ‘pull’ strength of these career paths.
Surely this is not the only benefit of building good professional networks, but I suspect it is an underappreciated one.
An obstacle is that, unlike in local groups, it seems to be nobody’s ‘job’ to worry about the EA community within their professional sphere. I hope that some people will take the initiative anyway, and luckily I can think of a few people who are already doing this to some extent.
Finding new signals for commitment & understanding
There are a number of ways we can create new signals in the community. Some ideas include:
Retreats. Retreats have become a popular activity of local groups. CEA also runs retreats for certain groups occasionally, but apart from the AI safety camp I am not aware of other initiatives that target professional groups. I would imagine these would be useful among, for example, people working in government or policy, PhD students in economics/philosophy looking to do cause prioritisation work, people working at corporations or start-ups to build their skills, and many more fields. Being invited to attend such a retreat, or knowing about them and deciding to come, could serve as a decent signal. The main issue, again, seems to be that right now few people would feel responsible for organising such an event (unlike in local groups, which usually have some sort of hierarchy in place).
Workshops. These are similar to retreats, but maybe more appealing. For example, our graduate students looking for cause prioritisation work could spend a few weeks each year working together on their cause prio ideas.
Fellowships. I’m thinking along the lines of Open Phil’s AI fellows program. The GPI fellowships, or Open Phil’s grants for people working on GCBRs, seem like great ideas along these lines in other areas. However, most GPI fellowships are only available to Oxford students (though the Global Priorities Fellows are another excellent example), and Open Phil’s program didn’t aim to solicit applications from people who had secured funding independently. There seems to be a lot of opportunity to do interesting things here, though I am aware that if the existing organisations were to provide these fellowships, this would come with substantial opportunity costs.
Relevant side projects. EA-relevant projects may also be a way to build close connections with collaborators within and outside of EA organisations, create some direct value, and demonstrate your commitment and understanding. For example, graduate students may be able to write analyses within their area of subject expertise, or people training in operations roles may be able to help with community events or infrastructure.
These ideas are meant more as a starting point for discussion than as a complete analysis. I’m sure others will have more good ideas.
Many thanks to Alex Barry, James Aung, Jan Brauner and Eve McCormick for providing feedback on earlier drafts of this post.
Part of this is caused by (over)use of a metric called impact-adjusted significant career plan changes (IASPCs). In a way, you get exactly what you optimise for. Quoting from the 80k website:
Scoring the options you recommend
Skills building in non-EA organisations such as start-ups … scores either 0.1 or 1, so 100x or 10x less valuable, in comparison to changing plan to work at GiveWell
Earning to give … scores 10x less valuable
Speculative options … 10x–100x less valuable
It’s worth emphasising that the metric being optimised is changing plans. How the difference between someone actually switching and someone merely switching their plan is handled is likely inconsistent across places and orgs.
Taken literally, the best thing for a large number of people under this metric is to switch their plan to working for Open Phil, and to consider other options as failure.
Taken literally, the best thing to do for student group community builders is to convince everyone to switch plans in this way, and count that as success.
So it is not a bias. Quite the opposite: it is a very literal interpretation of the objective function, which was explicitly specified several years ago.
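To make this concrete, here is a minimal sketch (assuming the 2015 weights quoted above, plus entirely made-up success probabilities and impact numbers) of how optimising a plan-change score can come apart from expected realised impact:

```python
# Hypothetical illustration of Goodharting on a plan-change metric.
# The scores follow the 2015 IASPC ratings quoted above; the success
# probabilities and realised-impact values are invented for the example.

plans = {
    # plan: (metric score awarded for the plan change,
    #        assumed probability of succeeding on that path,
    #        assumed impact if successful)
    "apply to EA org": (10, 0.05, 100),
    "econ PhD": (1, 0.50, 100),
    "skills building": (0.1, 0.80, 50),
}

for plan, (score, p_success, impact) in plans.items():
    expected_impact = p_success * impact
    print(f"{plan:>16}: metric score = {score:>4}, "
          f"expected realised impact = {expected_impact:.1f}")

# With these made-up numbers the metric ranks "apply to EA org" 10x above
# "econ PhD", while the expected realised impact ordering is reversed.
```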
The meta-level point people should take from this is:
If you are in a position of influence, you should be super-careful before you introduce anything like a quantitative metric into EA culture. EAs love measuring impact, are optimisers, and will Goodhart hard.
I think IASPCs handle these things well, and I think there’s some misinterpretation going on. What makes a strong plan change under this metric is determined by whatever 80,000 Hours thinks is most important, which currently includes academic, industry, EA org and government roles. These priorities also change in response to new information and needs. The problem Sebastian is worried about seems like more of a big deal: maybe some orgs or local groups are defining their metrics mostly in terms of one category, or it’s easy to naively optimise for one category at the expense of the others.
The part about counting impact from skill-building and direct work differently simply seems like correct accounting: EA orgs should credit themselves with substantially more impact for a plan change which has already led to impact than for one which might do so in the future, most obviously because the latter has a <100% probability of turning into the former.
I also think the metric works fine with Sebastian’s point that quant trading can be competitive with other priority paths. You seem to imply that the use of IASPCs contradicts his advice, but you point to a non-priority rating for ‘earn to give in a medium income career’, which is not quant trading!† 80,000 Hours explicitly list quant trading as a priority path (as Seb pointed out in the post), so if an org uses IASPCs as one of their metrics they should be excited to see people with those particular skills go down that route. (If any readers land quant jobs in London, please do say hi :) )
I agree that misapplication of this or similar metrics is dangerous, and that if e.g. some local groups are just optimising for EA-branded orgs instead of at least the full swathe of priority paths, there’s a big opportunity to improve. All the normal caveats about using metrics sensibly continue to apply.
All views my own.
†As a former trader, I felt the need to put an exclamation mark somewhere in this paragraph.
Ultimately, the more you ground the metric in “what some sensible people think is important and makes sense right now”, the more nuance it has, and the more it tracks reality. The text in my quote is a verbatim copy from the page describing the metric from 2015, so I think it’s highly relevant for understanding how IASPCs were understood. I agree that the 80k career guide as a whole actually has much more nuance, and suggests approaches like “figure out what will be needed in the future and prepare for that”.
The whole accounting still seems wrong: by definition, what’s counted is “… caused them to change the career path they intend to pursue …”; this is still several steps away from impact. If someone changes their intention to pursue jobs at EA orgs, it is counted as impact, even if the fraction of people making such plans who will succeed is low.
For specificity, would you agree that someone who was two years away from graduation in 2016 and decided to change their career plan to pursuing a job at CEA would have been counted as impact 10, while someone switching from a plan to go into industry to pursuing a PhD in econ would have been counted as 1, and someone deciding to stay in, let’s say, cognitive neuroscience would have been counted as 0?
Hi Jan,
I just wrote a bit more about how we measure IASPCs in another comment on this thread. We don’t use a precise formula and the details are important so I can’t say exactly how we’d rate a particular change at this level of generality.
That said, we take into account someone’s degree of follow through when we score their plan change, such that very few of our highest rated plan changes (rated-100 or more) are from people who are not currently doing impactful work.
Of the rated 10’s, our analysis in 2017 found:
30% have changed their plans but not yet passed a major “milestone” in their shift. Most of these people have applied to a new graduate programme but not yet received an offer.
30% have reached a milestone, but are still building career capital (e.g. entered graduate school, or taken a high-earning job but not yet donated much).
40% have already started having an impact (e.g. have published research, taken a non-profit job).
I agree if we didn’t take follow through into account it would lead to some scores that were far removed from expected impact such as the hypothetical you’ve described.
Hope this clarifies things.
Hey Jan and Howie,
thanks very much for the clarifying discussion. The fact that this discussion is happening at all (see also the high number of votes on the comments) illustrates that there is at least some confusion around rating EA org vs. non-EA org careers, which is a bit concerning in itself.
FWIW, my original claim was not that people (whether at 80k or in the community) get the rational analysis part wrong. And a career path where actual impact is a few years off should totally get a reduced expected value and rating. (My claim in the initial post is that many of the other paths are still competitive with EA org roles.) There is little actual disagreement that quant trading is a great career.
My worry is that many soft factors may cause people to develop preferences that are not in line with the EV reasoning, and that may reduce motivation and/or lead to people overly focused on jobs at explicit EA employers.
Also, when you pursue some of these careers you lack a ‘stamp of approval’ from 80k, one that you kind of don’t need when following a ‘standard’ path like working at CEA/FHI/80k/OPP or doing a top ML PhD, even if all of them were rated 10. (In coaching days this was better, because you could just tell your doubting student group leader that this is what 80k wants you to do :) )
Hey Sebastian,
I’m sympathetic to your comment. The fact that (I think) 80k is not making this particular mistake in its IASPC system does not imply that there’s nothing to be concerned about. I think your post as well as some of the comments in other threads do a good job of laying out many of the factors pushing people toward jobs at explicitly EA orgs.
Hi Jan, thanks for your thoughts. Kit’s response is fairly close to our views.
The most important thing we want to emphasize is that at 80,000 Hours we definitely don’t think that working at an EA org is the only valuable thing for people to do. I think that taken as a whole, our writing reflects that.
The best way to quickly get a sense of our views is to read through our high impact careers article, especially the list of 10 priority paths. Only one of these is working at an EA org.
I think our job board, problem profiles, podcast and so on give a similar sense of how much we value people working outside EA orgs.
A second key point is that when we score plan changes, we do not have a strict formula. We score changes based on our overall views of which paths are high-impact and assess many of the plan changes, especially the larger ones, on an individual basis, rather than simply putting them in a category. As an approximation, those we most prioritise are those represented by the 10 priority paths.
Of our top rated plan changes, only 25% involve people working at EA orgs.
Fortunately, scoring on a case by case basis makes our scoring less vulnerable to Goodharting. Unfortunately, it means that it’s difficult for us to communicate exactly how we score plan changes to others. When we do so, it’s generally a few sentences, which are just aimed at giving people a sense of how impactful 80,000 Hours is as an organisation. These explanations are not intended to be career advice and it would be a shame if people have been taking them as such.
The specific sentences you quote are a bit out of date and we explain the categories differently in a draft of our annual review, which we hope to publish in the coming months. For example, we often score a plan change as rated-10 if somebody takes up a particularly valuable skill-building opportunity within one of our priority paths.
I hope that helps answer your concern!
For what it’s worth, given how few EA orgs there are relative to the number of highly dedicated EAs, and how large the world outside of EA is (e.g. in terms of institutions and orgs that work in important areas or are reasonably good at teaching important skills), 25% actually strikes me as a high figure. Even if this impression is right, there might be good reasons for the figure being that high: for example, it’s natural, and doesn’t necessarily reflect any mistake, that 80K knows more about which careers at EA orgs are high-impact, can do a better job of finding people for them, and so on. However, I would be surprised if, as the EA movement becomes more mature, the optimal proportion stayed as high.
(I didn’t read your comment as explicitly agreeing or disagreeing with anything in the above paragraph, just wanted to share my intuitive reaction.)
Thank you for your comments here, they’ve helped me understand 80K’s current thinking on the issue raised by the OP.
Thanks for the thoughts, Max. As you suggest in your parenthetical, we aren’t saying that 25% of the community ought to be working at EA orgs. The distribution of the plan changes we cause is also affected by things like our network being strongest within EA. That figure is also calculated from a fairly small number of our highest impact plan changes, so it could easily change a lot over time.
Personally, I agree with your take that the optimal percentage of the community working at EA orgs is less than 25%.
To clarify the concern: I’m generally not so much worried about how you use the metric internally, but about other people using it. That was probably not clear from my comment.
I understand it was probably never intended as something which others should use, either for guiding their decisions or for evaluating their efforts.
Have you thought about making this into a post? This is the first I’ve heard about this and find it really compelling and interesting and totally worth a larger discussion.
[My personal opinion here; not speaking on behalf of 80k, where I work]
Fwiw, I personally get particularly excited when I meet somebody who’s working on a problem I consider a priority and/or shares my values, but is working towards them from outside of an EA org. Somebody at a place like a non-EA think tank is particularly likely to be able to teach me a lot, because their network and knowledge of whatever area they’re working on is less likely to overlap with my own than that of somebody working at an EA org.
I strongly second this view. Based on my experience working at a foundation (and talking to many global health/development researchers outside the “core” of EA), and my experience meeting many people at EA Global, MIRI workshops, etc., I find that I’m especially excited to meet someone from outside who has a fresh perspective on an old topic, or helps me learn about some new corner of the high-impact research world.
(Also, they sometimes have a lot more raw experience; a magazine editor just learning about EA may know more about effective public communication than people who work in communications within EA orgs, because they’ve been working in that field since before EA existed in contexts where they were exposed to an audience 100x the size of what most EA orgs deal with.)
If I were to meet a fairly new UN employee at EA Global, I’d have just as many questions for them as for, say, a fairly new GiveWell researcher. The latter may work in an organization that is more tightly aligned with my values, but the former may have a sharper view of what the world looks like “up close”.
I’ll cross-link to a comment I just made on the original “EA jobs” thread, arguing for a point I expect to spend a lot of time expressing in the near future: Earning-to-give is, in fact, cool and respectable and worthy of admiration, even if it doesn’t happen to be the highest-impact career you can find.
I haven’t heard many people try to conflate “impact” with “coolness”, and I try not to do so myself. Even if your job isn’t at the top of the 80,000 Hours board, that doesn’t mean you aren’t doing something incredible with your life, or that your efforts don’t matter in the grand scheme of things.
It is true that some work saves more lives in expectation, or further boosts “the odds of a flourishing future”, etc. But it’s not like we spend all our time reproaching ourselves for not starting multibillion-dollar companies or becoming World Bank executives, even though those “jobs” are probably higher-impact than Open Phil jobs.
If 100 people apply for a research role, all of whom really want to help the world as much as possible, and only 10 people get that role, does that imply that we’ve now somehow sorted those 100 people into “coolest” and “less cool”? If someone was having a bad week and submitted a poor work test, are they “less cool” than they would have been in the counterfactual world where their brain was firing on all cylinders and they got the job?
We should be working on important problems and using money to implement promising solutions. In the distant future, when we’ve seen how it all played out, we’ll have a good sense for whose work turned out to be “most impactful”, or which donor dollars made the biggest difference. But whether we’re eating algae in a world ruled by ALLFED or celebrating Aaron Hamlin’s election as world president via approval voting, I hope we’ll still keep in mind that every person who worked or donated, every person who did their best to help the world through evidence and reason, was a part of the grand story.
I think another potential cause, which I have at least observed in myself, is risk aversion. EA organisations are widely thought of as good career paths, which makes the choice easier to justify to others, but also to yourself. If I pursue more niche roles, I am less certain that they will be high impact, because I am relying only on my own judgment. This does justify some preference for EA organisations, but I agree there is probably an overemphasis on them in the community.
Thanks for sharing, I suspect this might be somewhat common. I’ve speculated about a related cause in another comment.
There is a post about niche skills (which include some career paths) at this URL: https://forum.effectivealtruism.org/posts/32dPBBXA2neyLfi3A/what-skills-would-you-like-1-5-eas-to-develop
Nice ideas here, Sebastian. I wanted to clarify what you mean by professional core groups; the example you gave of EA London’s finance community sounds like a professional group within a local group. In my view the current challenge is that many cities don’t even have community groups at all, much less the ability to subdivide based on profession.
I think it makes sense for EA to build community both along the lines of geography and along the lines of profession (without respect to geography), e.g. EAs in healthcare. Of these two I think the priority should be the former, because it gives people a far stronger sense of engagement and community.
You correctly identified that the difficulty in building sustainable local groups is that no one is responsible for maintaining them. EA should move towards setting up professional community builders in key cities to keep EAs spiritually tied to the movement even if they aren’t working in the top orgs. I imagine a lot of people are hesitant about this idea because they view it as wasted resources, but I suspect that’s wrong and that these groups will become net financial contributors to the movement.
Hey, I’m thinking of professional ‘groups’ or strong networks without respect to geography, though I would guess that some professions will cluster around certain geographies. E.g. in finance you’d expect EAs to be mainly in London, Frankfurt, New York etc. And it would be preferable for members to be in as few locations as possible.
I agree that local groups are very important, and plausibly more important than professional groups. However, local groups work largely by getting members more involved in the community and providing ‘push’ factors towards EA careers. I think the next frontier of community building will be to add the ‘pull’ factors. We have made a lot of progress on the local groups side; now it is time to think about the next challenge.
Re professional community builders: this is already happening, and that’s good. But they are largely working on getting members more engaged, rather than on building strong professional ‘core’ communities (though some people do work in this direction, it is not a main focus).
I suspect the driving force will be volunteers at the start, similar to how student groups initially got started. These would be people who are already well-connected and have some experience in their field. This would also get around the issue that EA orgs may currently not have the resources for such projects. I doubt funding will be an issue if the volunteers have these properties.