Community building in effective altruism (panel discussion)
Introduction
This post is a write-up of a panel discussion on community building held at EA Global: Boston 2023. The panel was moderated by Gabe Mukobi. Gabe currently leads Stanford AI Alignment and previously led Stanford Effective Altruism.
The three panelists were:
Jessica McCurdy, Head of Groups, Centre for Effective Altruism
Kuhan Jeyapragasan, Co-founder, Cambridge Boston Alignment Initiative
Alix Pham, Co-director, EA Switzerland
Below is a transcript of the discussion, which has been lightly edited for clarity. The panelists covered five main topics:
Their career paths and current roles
Community building’s definition and scope
How community building contributes to career growth and skill development
Pros and cons of EA and cause-specific community building
‘Big-tent’ versus ‘small-tent’ EA
Career paths and current work
Gabe: Please tell us a little bit about how you got into your current roles. What are you doing now and why is that work important to you? Why is it impactful? Jessica, do you want to start?
Jessica: Yes, thanks. Most of my history is around university community building, so my answers here will be informed by that (although now I’m also thinking more about city and national groups, as well as EA virtual programs).
I got involved in community building through my university group. I just found EA to be this amazing thing, where people care about impartiality and doing as much good as possible, are really smart, and are great to be around. Also, you can focus on EA as a full-time job and have a really big impact on the world. That’s all very exciting, and after testing my potential fit in different areas, I found I really got a lot of energy from and excelled in the community-building sector.
I’ve also seen a lot of impact stories coming out of the things that I’m doing, which is really cool — especially if you’re trying to work on longtermist causes, since feedback loops are hard there. One example is a retreat that I ran a few years ago and that led to possibly two counterfactual Open Philanthropy hires. It’s quite exciting to see really talented people getting motivated to go into different things.
I also believe that community building is often quite counterfactual. People are very influenced by their circumstances — the other people they hang out with and who happens to recommend opportunities. So, I think those of us doing community building are really well-placed to help put others in a good position.
Kuhan: I’ll focus on two particular areas of interest that I’ve been exploring recently. One is policy and government engagement. Like many others, I think I’ve been pretty surprised by how seriously the US and UK governments have been taking AI. I’ve been trying to think through the implications of that for community building and strategy for people interested in improving the long-term future. So, I’m currently contracting with the Horizon Institute to run an AI policy workshop, and I have also been thinking about other field-building efforts that might be especially high-leverage. For example, I’ve been concerned about the upcoming US presidential election for a while, which could have several implications for AI safety, especially if the timeline to “crazytown AI” is on the shorter end. Field building might involve AI safety advising for the Trump campaign, or other work to help position us for AI safety regardless of the results of the election.
Here’s another angle I’ve been thinking about more: as AI gets more mainstream, perhaps resources should shift away from education and outreach and more towards supporting people who are already concerned, helping them turn that concern into the most impactful outcomes. That could entail more mid- to late-career field building. What does better infrastructure look like to support people who are already concerned about these issues?
Lastly, if mainstream society is going to get more and more concerned about certain important topics, then on the margin, other topics that are less concerning to non-impartial altruistic actors might be especially important for people in the EA community to focus on, such as digital sentience, AI-human cooperation, or the other weirder things that we think are important but most people don’t.
Alix: At EA Switzerland we’re doing national community building. That means we support city and university groups, as well as cause-specific groups. We also run countrywide events, such as retreats and online programs.
We think of EA community building in terms of two pipelines. There’s the “EA principle pipeline” for building the ideas, concepts, and frameworks that we use. There’s also the cause-specific pipeline, which is more about the work itself. We want to help kickstart, support, and build bridges between the cause-specific pipeline and the EA principle pipeline.
I think in the end it boils down to the people whom you support. And what I really love about my job as a community builder is that we’re able to empower people, and thus projects, to have more impact. One interesting lesson that I’ve learned — even though it’s been said before — is that people need permission. It’s really important to give people permission to validate their choices and affirm their agency so that they can initiate change and realize their potential. I think it’s super powerful to remember that.
However, there’s something I’ve been confused and conflicted about. Community building is about giving people space and a social ecosystem. They should feel like they belong. It shouldn’t feel like there’s a lot of gatekeeping. It should be inclusive. We should welcome people to join EA. At the same time, EA is about finding uniquely good opportunities and people who can have tremendous impact. And this, by definition, is exclusive. I have found it very hard to navigate these opposing forces in EA community building.
For example, I’ve noticed that when people learn about existential risks, they discover the inclusive part — they join conferences and hear about the topics. Then they want to scale up, contribute, and participate. But the field seems extremely competitive. It’s very hard to find places where you can actually skill up and contribute. And I don’t think EA is exclusive because we want it to be. It’s just that maybe we don’t have enough opportunities to offer. And then I feel we’re missing out on some talent, people, and opportunities just because we don’t have enough space for them. Therefore, I wish more of us doing community building at the national, city, and university levels would focus more on intermediate opportunities — stuff that would allow us to accept more people, probably on a local level. That would allow us to give space to those people and not lose them just because they cannot find their path.
I’d also like to give a shoutout to community building. It’s hard to retain people in community-building jobs. People usually want to do direct work. But community building is really rewarding. As Jessica said, it has short feedback loops. So that’s kind of nice. You also get to upskill on a very large number of fronts. I can’t even begin to enumerate all of them. And it’s the backbone of the community. I think there’s a need for people who are here to prepare the ground for others to do direct work.
I’ve also been thinking about how people approach community building depending on where they are. I might be wrong about this, but it feels like there are some cultures that are overrepresented on the EA Forum, and in grantmaking and research organizations. It’s probably a founder effect and the result of English being a common native language. I’m afraid that if that’s true, the general community is missing perspectives from the broader community. That could be an obstacle to reaching epistemic pluralism and getting a variety of worldviews and opinions within EA that would probably improve the way we’re doing things as a community. I can’t think of easy actions to take to improve this. But I think we might need to think about it and work on it. It’s an issue that I wanted to bring up here.
To conclude, I think community building is about accompanying people on their EA journey. It’s about finding the right balance between being all-welcoming and identifying unique talents and trying to find the right opportunities for them. But in the end, I’m just also very happy and grateful that we can build this thriving and impactful community where people can hopefully find their calling.
Community building’s definition and scope
Gabe: Great. Thank you all. There are definitely a lot of hot topics to talk about — and we’ll dive into those.
To get us started — and especially for the newer people — it’s important to build some context. Maybe a lot of people are uncertain about what community building means. When I was getting involved in EA, it seemed like maybe it was just about having a lot of one-on-ones, or running a local group, and that was all. But what is the range of activities? Could you expand our Overton window of what community building can be?
Kuhan: Sure. For better or for worse, I often just think of my job as doing as much good as I can. I think broadly about what actions would be useful for someone to take and then narrow that down to the actions I can take specifically. For example, I wasn’t a student at Harvard or MIT, but I decided to move to Boston because I thought it would be more impactful for me to work on community building there than at Stanford, where things were more established and we had great successors like Gabe to continue making sure things went well.
I think a lot of impact comes from noticing inefficiencies — and there are more of those in the impact market, since financial profit isn’t on the line. I just try really hard to figure out what those inefficiencies are and be proactive about picking up the million-dollar impact bills lying on the ground. You can get surprisingly far by doing straightforwardly useful things and by trying hard.
So, I think a lot of the value in community building is about finding those impact goals and combining them with specific implementation details to actually make good things happen in the world that wouldn’t otherwise.
Gabe: That sounds awesome, but also kind of vague. Could you give us some more examples?
Kuhan: Yeah, I think starting the AI Alignment group at MIT and doing university community building seems to have a pretty strong track record. MIT’s EA and AI safety scene had not been super active for the last several years, and since I was friends with a few students at MIT, I thought I could take my experience from Stanford and transport that over to MIT. I’ve done pretty straightforward, obvious things like running programming, emailing all the students about AI safety and EA, having a lot of one-on-ones, and giving presentations. As a result, we’ve been able to grow the community pretty quickly. Also, I’d like to give a shoutout to HAIST, the Harvard AI Safety Team, for providing a lot of the programming that we actually just copied to get started.
Jessica: I can add to that. One of the ways I think about community building is as “meta EA,” which is just community building on the ground: introducing people to EA ideas and having one-on-one conversations that are quite influential for people. Then there’s “meta-meta EA,” which is about organizing people to do those things. I think you can actually get a lot of leverage by doing something like seeding a university group or advising the people who are on the ground to do this meta work, just to have more coordination and gather more lessons learned.
You can also target different areas of the pipeline. There’s the big end of the funnel in which we’re doing broad outreach. I think you have to be quite careful with your broad outreach. I don’t necessarily recommend that all university students be trained to do broad outreach. That requires a certain skill set and strategy. Also, it’s probably only cost-effective if you actually have a very big scope.
Then you can go into the middle of the funnel, which includes things like having people come to this EA Global event. Running EAG and the EA Forum are types of community building where we’re fostering the ideas within EA. There are also things like seeding opportunities. EA Switzerland has done a good job of this — they’ve provided really good opportunities for people to get a taste of research roles and work on projects. That’s another type.
Then, way deep down in the funnel, there are things like working to connect people with specific jobs — recruiting or headhunting.
Kuhan: Another concrete example I forgot to mention is from the time when I was running Stanford EA and SERI (Stanford Existential Risks Initiative). At the time there were far fewer summer opportunities, internships, and research programs than there are now. I noticed that as a big bottleneck for more junior people to get more involved. Part of me was like, “Who am I to run a research program? I’m not a researcher.” I felt like there were a lot of insanely difficult and complex questions that I had a hard time understanding. But also, I thought, “If not me, then who?” I had a pretty unique opportunity at Stanford, which is one of the very few universities with an official academic institute focused on high-priority issues.
So, one summer, we ran a virtual research program with about 70 participants, which was maybe too many, but we had pretty impactful outcomes. For example, compute governance work was largely started by Lennart Heim’s SERI summer internship. So, identifying big bottlenecks and noticing unique opportunities to address them, given your circumstances, can be really valuable.
Gabe: Yeah, I really like the perspective of thinking about where the bottlenecks are, where the challenges are, and just going to solve those. A lot of people think of community building as something where you take a bottom-up approach and keep running something that already exists or make incremental improvements, when really, you can take a top-down approach and actually solve the big problems in the world.
Did you have more to add, Alix?
Alix: Yeah, I’m thinking about how important it was for us to realize the comparative advantages that we have in a specific location. Spending time thinking about that is quite important. Or, if that doesn’t yield anything significant, then maybe consider where the best place to answer your questions is.
We offer group support, because we’re a national community-building organization. We support local groups. We have ecosystem support, which involves supporting charities. We also have Swiss AI safety camps and support several alternative protein projects. So, coordinating those people and projects and providing them with our operational infrastructure is a big part of our work.
Then, as Jessica said, we also provide individual support and career advising; we connect people with resources, with other people, and with opportunities. We do less of the outreach and communications, because I think those things are hard to do. You need to think hard about it to do it well.
How community building contributes to career growth and skill development
Gabe: Speaking of filling gaps in the system and helping with people’s careers, a lot of people may think of community building as more of an opportunity to exploit impact now rather than one where you explore and develop yourself. But in your introductions, you mentioned that community building can actually help people develop a lot of critical skills and grow in their careers. Can you talk more about that?
Alix: Sure. I think we’re probably all examples of that. When you volunteer or work full-time on community building, you realize that you need to learn about multiple different cause areas just to be able to talk about them with people who come to you for career advice. You work with them to find out more about various areas and understand who the stakeholders are.
You also learn how to run an organization and do legal, financial, and administrative work. When you’re early in your career like many of us, those are skills that are useful for the rest of your life — and that you often don’t get to learn elsewhere. You also learn about strategy and operations. For example, you might think about who the right board of directors might be for your organization. There are many things like that. You don’t think about them when you accept this kind of role, but you end up doing them anyway, and you learn a lot.
Kuhan: To follow up on that, one framing of community building that I like is accruing flexible resources. Money is one example of a flexible resource; you can use money to buy bed nets or to fund someone to do research on AI safety, or to do anything in between. But, you know, motivated people who are trying to figure out how to do the most good and who then actually do it represent an even more flexible resource, because they can earn money and then donate it to effective causes. In addition, people can do direct work, advocacy, or community building. So, in terms of the explore-exploit tradeoff, I think of community building as one of the best ways to build capacity to do more exploration in the future.
Jessica: I often joke that I learned way more practical skills by running my EA university group than I did in my classes. I think that’s probably true. I learned how to do project management and people management. And I learned so many important things about how to actually make things happen. And those feel like quite useful and transferable skills to have, along with entrepreneurial types of skills. So, if you want to go into other areas, running a really good EA group shows that you have agency and can get things done. That’s a really desirable quality. People often ask us to recommend people for jobs.
Another option is to skill up in a certain area. Maybe you want to learn a lot more about AI. You could do that on your own. Or you could do that with other people at the same time, and then you’re not only learning, but also doing community building.
So, I think there are a lot of transferable skills. Many people who’ve done community building have then gone on to other things, including direct work in areas like technical AI safety or starting their own charity. Some people stay in community building, too.
Gabe: I’m just curious: Jessica, do you think the degree to which people are building skills changes as they get into the more meta levels of EA community building that you mentioned earlier?
Jessica: Meta-meta community building often involves more people management and maybe more strategy, although I would argue that even on the ground at a university, or in a small group, you should be thinking about strategy quite hard. Some strategies are way more impactful than others. And it’s really useful to find out how you could be having 10 times as much impact, and I think there are relatively simple ways to do that.
Pros and cons of EA and cause-specific community building
Gabe: Next, I’d like to get some hot takes. To get started, maybe a particularly hot topic now is EA community building versus cause-specific community building. But AI safety in particular has been highlighted a lot. Jessica and Kuhan, you actually wrote some dueling, yin-and-yang posts about this on the Forum (see Kuhan’s post here and Jessica’s here). Would you like to get us started on your takes on how people should think about the relationship between EA and AI safety community building?
Jessica: I don’t know about “dueling” —
Gabe: Maybe it’s more that they’re complementary.
Jessica: I’ve been thinking about this quite a lot. And I basically think it comes down to doing what you excel at. I think you should think a lot about what you believe the biggest problems in the world are and have your own views on this. Let’s say you believe that AI safety is the most important thing. That doesn’t necessarily mean you should automatically go into cause-specific community building. EA has a really great track record and also offers a lot of flexibility and principles that have led to many great developments in AI safety. For some people, however, it might make sense to go directly into AI safety.
Kuhan: I think since writing the post that Gabe was referring to, I’ve probably moved more toward thinking that EA community building has real value and creates something special that you don’t get from cause-specific movement building. Specifically, I’ve been surprised at how much the people who have gotten really involved in our AI safety group also got into EA afterwards, or were already EAs (and that’s what led to them getting involved in AI safety). I think there’s something about taking ideas seriously and actually taking action on your beliefs that the EA community cultivates in a way that is pretty rare.
Also, as I mentioned earlier, as AI becomes more mainstream, it might be the case that it’s much more impactful to work on issues that aren’t clearly important to everyone in the world, in the way that existential risk is. I’m not sure how true that is. But it wouldn’t surprise me if major world governments and others become pretty concerned about AI existential risk, but very few people care about morally relevant digital sentience, and what to do about that (along with all the other issues).
Gabe: To build on that, there is also maybe another difference in the people or audience you’re addressing with EA community building versus cause-specific community building. When you’re talking to EAs, there is already some kind of shared understanding around doing the most good, which makes certain things easier and certain things harder. How would you recommend people approach cause-specific community building with this different audience?
Kuhan: One thing I’ve been thinking about more as these issues become more mainstream is that a lot of the most impactful work that needs to be done will happen outside of EA or EA-motivated organizations.
I think Alix brought up the concentration of funding in the EA community. Consider the Gates Foundation, for example, which I believe is significantly better funded than Open Philanthropy. I don’t think that concerns about AI are unique to EA. Having more external actors getting concerned or informed enough to start channeling funding into the ecosystem could help address a lot of the concerns around conformity, power imbalances, and groupthink that can happen with one centralized funder.
I also mentioned earlier that a lot of positions in government will be increasingly important as the government gets more and more concerned about AI. So, I’m thinking about where impactful work will happen, and what skills and profiles we’ll need to do those kinds of work.
Alix: I think both community building focused on EA principles and more cause-specific community building are important. They have different roles in bringing about impactful changes. We probably need to find the right balance between them.
Jessica: I think these different audiences are attracted to different things. And it’s quite plausible people can get involved in AI safety without having to go through the whole EA pipeline. It just makes a lot of sense for these groups to exist. You want a diversity of community-building efforts.
Big-tent versus small-tent EA
Gabe: Yeah, speaking of diversity, another topic that was big earlier this year was big-tent EA versus more specific EA. Should we have larger events and groups that are open to a wider range of people, or do more targeted outreach to specific audiences? Alix, you run an EA group for a whole country — what’s your experience on this topic?
Alix: I’m not sure to what extent running a countrywide group relates to big-tent versus smaller-tent community building. I think that’s one of the ideas I’m most confused about because I feel there are very strong arguments for each. You want small-tent events for people who have carefully formed opinions and can explain why they have them.
At the same time, big-tent EA makes sense if we want EA to be a philosophy that everybody knows about and considers when deciding where to donate their money and what kind of work to do, even if the causes they choose aren’t necessarily what we see as the most pressing problems. Perhaps someone doesn’t want to do AI — they want to do climate change, or they want to de-pollute the oceans. And maybe if EA were mainstream, people would only think about the best interventions in the field, or gravitate toward certain causes just because everybody else is focused on them, rather than choosing the most important cause areas. So, it’s kind of hard to navigate. I think it also relates a bit to the inclusivity-exclusivity issue that I explained earlier.
Gabe: Can you say more about that?
Alix: I don’t know if there’s more to say. I want people who join the communities that we build to feel welcome, and like they’ve found others like them. I want them to be able to join social events and conferences, and be inspired to do important work, because I believe in the concepts and the frameworks of EA.
But in my day-to-day job as a community builder, I want to put energy where it’s going to yield the most impact. You need to choose whom you’re going to spend time with, what kind of programs you’re going to run, and which cause areas to focus on in conferences and retreats. And yet you want people to be aware of other things as well. How do you factor in cause neutrality? That’s my current conflict: how to do my job and choose what will yield the most impact, while still welcoming everyone.
Jessica: I think there’s an important distinction within big-tent EA around how you define EA. I define EA based on core principles that are really important, like being impartial. If you’re going to be like, “this is my cause” — and not be open to other causes — EA is probably not the best place for you. And I think that sometimes when people talk about big-tent EA they just want their pet cause to be included. I feel less motivated by that.
Then there’s this other question around inclusivity and diversity. How many people should we be focusing on? The unfortunate truth is there are heavy tails with talent in the same way that there are heavy tails with causes that we can work on. I wish homelessness didn’t exist and I could spend my time preventing it from happening; similarly, I wish I could be providing EA advice and community building to everyone who’s potentially interested in it. But we are in triage. We have such limited time and resources. So, if we really want to have the biggest possible impact on the world, we need to target the people who are most likely to do that. I don’t necessarily love what this implies. There are other unfortunate truths — for example, the people we target like hanging out with each other a lot, and sometimes don’t like bigger events. I wish the world wasn’t like that. But it kind of is. So we should keep that in mind, if having a big impact is our goal.
Kuhan: One thing I’ve been thinking about, as someone who’s done pretty targeted outreach at Stanford, Harvard, and MIT, is that there are certain forms of targeted outreach that might be significantly more impactful than focusing on young, motivated, sharp, and talented college students. The community is drastically lacking people who are right of center — and who could potentially be part of Republican administrations moving forward on AI, or serve in roles where seniority, credentials, and expertise are pretty important. Our community is very young. It does not have many 40-plus-year-old people whom we can recommend for various positions.
So, for people who are a part of those communities, or are able to do more targeted outreach to them, that could represent a comparative advantage. You could be especially impactful if you’re well-suited to do that kind of community building.
Gabe: Yeah, thank you. I really like those more nuanced takes on big- and small-tent EA. We’ll need to end here, but thanks so much to Jessica, Kuhan, and Alix for your thoughts on community building in EA.
Executive summary: Community building in effective altruism encompasses a wide range of activities aimed at growing the movement and developing talent, with debates around the merits of cause-specific versus general EA outreach and “big tent” versus targeted approaches.
Key points:
Community building activities include running local groups, organizing events, providing career advice, seeding opportunities, and addressing bottlenecks in the EA pipeline.
Community building roles offer valuable skill development in areas like project management, strategy, and operations.
There are tradeoffs between EA-focused and cause-specific community building, with both approaches having merits depending on the context.
Tension exists between being inclusive to grow the movement and targeting high-impact individuals/opportunities.
Targeted outreach to underrepresented groups (e.g. older professionals, conservatives) may be especially impactful.
Community builders should consider their comparative advantages and local context when deciding how to focus their efforts.