Pitfalls to consider when community-building with young people

This is a quick outline of two worries that come up for me when I consider EA’s focus on community-building amongst university-age people, and sometimes younger. I am mostly focussed on possible negative consequences for young people rather than for EA itself. I don’t offer potential solutions to these worries, but rather try to explain my thinking and then pose the questions sitting at the top of my mind.

Intro

At a past lunch with coworkers, I brought up the topic of “Sprout EAs”. Currently, this is the term I’m using to describe people who have spent their entire full-time professional career in the EA ecosystem, becoming involved at university age, or occasionally, high school age.[1]

Anyways, there are two things I worry about with this group:

Worry one: Sprout EAs stay in EA because it is often easier to stay in things than to leave, especially when you’re young

There’s your standard status quo bias, which can become particularly salient around graduation time. At that point, many people are under-resourced and pushing towards more stable self-reliance, uncertain what next steps to take, and relatively early in their journey of life and their professional career. Many undergraduate students are familiar with the “unsure what to do next? Just do grad school!” meme, because when so much of your adult life is ahead of you and you’re confused, it’s enticing to do more of what you know.

In a similar vein: I think those entering the professional world who have become heavily embedded in EA during their time as a student have a lot of force pushing them to remain in the EA ecosystem. Maybe this doesn’t really matter, because maybe lots of them will find jobs they really enjoy, have an impact, and develop into their adult life, and it’s all good. And maybe it’s kind of a moot point, because you have to choose something. This is just a fact about life and being young, and how is anyone supposed to address the reality that “young people have to make choices and there are lots of uncontrollable factors influencing those choices”?

But, if EA is going to put concerted effort into community building on university campuses, and sometimes with high school students, these are probably important dynamics to think about. Additionally, EA has some unique and potent qualities that can grab young people:

  • It can offer a very clear career-path, which is incredibly comforting

  • It can offer a sense of meaning

  • It can offer a social community

All these things have the potential to make “off-boarding” from EA extra difficult, especially at a time in life when people generally have fewer internal, social, experiential, and material resources. I worry about young people who could gain a lot of personal benefit from “off-boarding”, or just distancing themselves a bit more from EA, yet struggle to do so (for reasons of the flavour described above), forget this is even an option, or find it too mentally aversive to consider.

Worry two: EA offers young people things it isn’t “trying to” or “built to” offer, which can lead to negative outcomes for individuals

I think this is an important point that can get muddled. There’s the thing EA “actually is,” which is debatable and a bit abstract. It’s a community, an idea, maybe a question? It’s not a solved, prescriptive, infallible philosophy. It is, maybe, a powerful framework with a highly active professional and social community built around it, attempting to do good. But the way it hits people differs quite a bit. No one can control whether EA fills holes in people’s lives, even if that isn’t an express or even desirable goal.

On one level, EA can easily hit as a straightforward career plan and life purpose that young people can scoop up and run with, if they’re positioned to do so. That anyone can scoop up, of course. But young people, being young and often more impressionable, less established, etc., can be particularly positioned to scoop. I don’t know how to avoid that or if avoiding it is even possible. However, this reality leads to a whole host of outcomes, many of which are not that concerning, and some that are a little concerning.

Example of something not concerning to me: Undergraduate student A hears about EA and gets really excited. They like biology already, but weren’t sure what career path to pursue. They decide to do biosecurity research at graduate school based on 80,000 Hours.[2] They join an EA-aligned biosecurity lab, they enjoy it and are good at their work. They conduct rigorous research and also live their life, and that’s that. Maybe they just “scooped up the idea and ran,” without thinking too hard about whether they’re a longtermist or what their cause prioritization is, etc., and really, that’s completely fine. It’s not to say they didn’t think at all, but perhaps, in no small part, they chose biosecurity research because they liked the sense of community and meaning that came with their work, they were planning to do something research-related anyway, and they didn’t scrutinize that too much.

Also not concerning: Undergraduate student B hears about EA and gets really excited. They become actively involved in their local EA group and make lots of friends. They think deeply about the core principles of EA and potential career paths they could pursue. They spend one summer completing a research internship at their university and another summer completing a research internship on existential risk. After weighing many factors, they decide to build career capital by working for the US government. They really enjoy participating in EA (through meet-ups and online spaces) and find it personally fulfilling, but have also tried to stress-test it with friends outside EA. They live with a friend from university who is also interested in a policy career, but not particularly EA.

Example of something more concerning: Undergraduate student C hears about EA and gets really excited. They go to all their university’s EA meet-ups, and soon, all of their friends are “EAs”. They graduate and aren’t sure what they want to do, but they know they want it to be at an explicitly “EA org”. This feels emotionally important to them because EA has sort of totalised their social environment and mental space. They’ve really internalised the idea that having an impact is imperative and the way to do that is to do EA-labeled things. They live in an EA house with people working at various EA organisations. They…

  1. Keep trying to get a job at an EA organisation (in research or operations, whatever really), and it’s hard, so they accept sub-optimal work environments.

  2. Can’t find a position at an explicitly EA-aligned organisation and this is really upsetting. They overextend themselves throughout the job-hunting process and take on significant personal costs, well past the point at which they should have pivoted to investigating roles that are not explicitly EA-aligned.

  3. Over-optimise on “getting an EA job” and learn how to “talk the talk”. They’re good at sounding as though they have reasoning transparency and have thought deeply about their values. On some level, they genuinely have. But the core thing driving them is “get an EA job” (whether consciously or not; probably not).

Numbers 1 and 2 are damaging to individuals, and number 3 could be damaging to the community/epistemic environment. I have no idea how frequently these things happen. There is anecdotal evidence that number 1 has happened at least a handful of times, and I can think of a few examples that map onto number 2 pretty well, though perhaps less extreme.

Number 3 is one outcome that could lead to the slow erosion of epistemic rigour over time. This concern is certainly not novel and has been discussed at length in different places (e.g. Bad Omens in Current Community Building).

I also want to caveat: both of these worries are surely present in other (many?) professional spaces, and there is an extent to which these realities are unavoidable. But they are worries nonetheless and worth thinking about.

Questions

As a result of these worries, the questions I have are:

  1. Are we considering these types of dynamics when deciding how to structure community-building efforts targeted at “young people”? I’m especially worried about potential programs targeted at high school students, for whom I imagine this is all further amplified (generally speaking).

  2. How much of these worries is just “the reality of life and the world and trying to do things”, versus “dynamics we could better consider and try to avoid”? In what ways do we currently encourage or guard against these dynamics, if any?

  3. How can we all keep these dynamics in mind when offering advice to more junior people interested in working within EA, especially those of us explicitly working on community-building? What does that look like?

  1. ^

    “Sprout EA” is not a great term, I’m sorry. I tried asking ChatGPT and it suggested “Eternal Change Agents” and “Continuous Impact Enthusiasts,” among others, which are bad. My friend Kirsten suggested “Career EAs,” but then the acronym is CEAs, and abbreviations are only meant to bear so many loads (ideally, one, but EA tends to push it). Let me know if you have any ideas. The term I actually like is “EA Babies,” but it comes across as infantilising :(

  2. ^

    My friend wanted to point out that biosecurity is, in fact, an interdisciplinary cause area, and you should check out Biosecurity needs engineers and materials scientists and Laboratory Biorisk Management Needs More Social Scientists.