This is a great post! Upvoted. I appreciate the exceptionally clear writing and the wealth of examples, even if I’m about 50/50 on agreeing with your specific points.
I haven’t been involved in university community building for a long time, and don’t have enough data on current strategies to respond comprehensively. Instead, a few scattered thoughts:
I was talking to a friend a little while ago who went to an EA intro talk and is now doing one of 80,000 Hours’ recommended career paths, with a top score for direct impact. She’s also one of the most charismatic people I know, and she cares deeply about doing good, with a healthy practical streak.
She’s not an EA, and she’s not going to be. She told me that she likes the concept and the framing, and that since the intro talk she’s often found that when faced with big ethical questions it’s useful to ask “what would an EA do”. But she’s not an EA.
I don’t like using “EA” as a noun. But if we do want to refer to some people as “EAs”, I think your friend has the most important characteristics described by that term.
Using EA’s core ideas as a factor in big decisions + caring a lot about doing good + strong practical bent + working on promising career path = yes, you are someone who practices effective altruism (which seems, to me, like the best definition of “an EA”). You don’t have to attend the conferences or wear the t-shirts to qualify.
Second, despite some pushback, current EA community building doctrine seems to focus heavily on producing ‘Highly Engaged EAs’ (HEAs). It is relatively easy to tell if someone is a HEA.
Not sure about current doctrine, but my impression is that “HEA” isn’t meant to be a binary category. Based on your statement:
Is one EA in government policy worth more than a hundred civil servants who, though not card-carrying EAs, have seriously considered the ideas and are in touch with engaged EAs who can call them up if need be? What about great managers and entrepreneurs?
I’d be surprised if even the most literal interpretation of any community-building advice would have an organizer favoring “one person in policy” over “one hundred policy people being interested in EA” (feels an order of magnitude off, maybe?).
Bolding for emphasis: People often overestimate how important “full-time EA people” are to the movement, relative to people who “have seriously considered the ideas and are in touch”.
That’s largely because people who discuss EA online are frequently in the first group. But when it comes to impactful projects, a massive amount of work is done by people who are very focused on their own work and less interested in EA qua EA.
When I see my contacts excitedly discussing a project, it often looks like “this person who was briefly involved with group X/is friends with person Y is now pursuing project Z, and we think EA played a role”. The person in question will often have zero connection with “the EA community” at large, no Forum account, etc.
You see less of this on the Forum because “this person got a job/grant” and “this person has started a new project” aren’t exciting posts unless the person in question writes them. And the non-Forum-y people don’t write those posts!
I asked around, and quickly stumbled upon some people who confidently told me that EA was an organisation that wanted to trick me into signing away my future income to them in exchange for being part of their gang.
I got this reaction a lot when I was starting up Yale EA in 2014, despite coming up with all my messaging alone and having no connection to the wider EA community. Requests to donate large amounts of money are suspicious!
I’d expect to see less of this reaction now that donating and pledge-taking get less emphasis than in 2014, especially in college groups. But I think it’s hard to avoid while also trying to convey that the things we care about are really important.
(Doesn’t mean we shouldn’t try, but I wouldn’t see the “donations are a scam” perspective as strong evidence that organizers are making the wrong choices.)
If better epistemics trades off against getting more alignment researchers, maybe you think it’s not worth doing. However, it’s not clear at all that this is the case.
Almost everyone I’ve interacted with in EA leadership/CB is obsessed with good epistemics — they value them highly when recruiting/evaluating people, much more than any other personal trait (with rare exceptions, e.g. strong technical skills in roles where those are crucial).*
My impression is that they’d be happy to trade a bunch of alignment for epistemic skill/virtue at the margin for most people, as long as alignment didn’t dip to the point where they had no interest in working on a priority problem.
This doesn’t mean that current CB strategy is necessarily encouraging good epistemics. (I’m sure it varies dramatically between and within groups.) It’s possible for a group’s strategy not to achieve the ends they want — and it’s easier to Goodhart on alignment than epistemics, because the former is easier to measure.
But I am confident that leaders’ true desire is “find people who have great epistemics [and are somewhat aligned]”, not “find people who are extremely aligned [and have okay epistemics]”.
*To clarify my perspective: I’ve seen discussion of 100+ candidates for jobs/funding in EA. Alignment comes up often, but mostly as a checkbox/afterthought, while “how the person thinks” is the dominant focus most of the time. Many terms are used — clear thinking, independent thinking, nuanced thinking — but they point to the same cluster of traits.
You should expect there to be whole types of reason (like ‘you guys seem way more zealous than I’m comfortable with’) which you’ll be notably less likely to hear about relative to how much people think it, especially if you’re not prioritising getting this kind of feedback.
This is very true!
One good way to hear a wider range of feedback is to have friends and activities totally separate from your EA work who can give you a more “normal” perspective on these things. This was automatic for me in college; our EA group was tiny and there wasn’t much to do, so we all had lots of other stuff going on, and I’d been making friends for years before I discovered EA.
I gather that EA groups are now, in some cases, more like sports teams or music groups — things that can easily consume most of someone’s non-class hours and leave them in a place where most of their friends are in the same club. It’s good to have a close-knit group of altruistic friends, but spending all of your time around other people in EA will limit your perspective; guard against this!
(Also, having hobbies not related to your life’s central purpose seems healthy for a lot of reasons.)
Assume that people find you more authoritative, important, and hard-to-criticise than you think you are. It’s usually not enough to be open to criticism—you have to actually seek it out or visibly reward it in front of other potential critics.
Also very true!
Flagging this because it is very hard to account for properly; I’ve had to adjust my expectation of how hard-to-criticize I am several times (especially after I started getting jobs within EA).
Minor elaboration on your last point: a piece of advice I got from someone who did psychological research on how to solicit criticism was to brainstorm what someone’s most likely criticism of you would be, and then offer that up when requesting criticism, since this is a credible indication that you’re open to it. Examples:
“Hey, do you have any critical feedback on the last discussion I ran? I talked a lot about AI stuff, but I know that can be kind of alienating for people who have more interest in political action than technology development… Does that seem right? Is there other stuff I’m missing?”
“Hey, I’m looking for criticism on my leadership of this group. One thing I was worried about is that I make time for 1:1s with new members, but not so much with people who have been in the group for more than a year...”
“Did you think there was anything off about our booth last week? I noticed we were the only group handing out free books; maybe that looked weird. Did you notice anything else?”
Appreciate your comments, Aaron.
You say:
But I am confident that leaders’ true desire is “find people who have great epistemics [and are somewhat aligned]”, not “find people who are extremely aligned [and have okay epistemics]”.
I think that’s true for a lot of hires. But does that hold equally true when you think of hiring community builders specifically?
In my experience (5-ish people), leaders’ epistemic criteria seem less stringent for community building. Familiarity with EA, friendliness, and productivity seemed more salient.
This is a tricky question to answer, and there’s some validity to your perspective here.
I was speaking too broadly when I said there were only “rare exceptions” to epistemics being the top consideration.
Imagine three people applying to jobs:
Alice: 3/5 friendliness, 3/5 productivity, 5/5 epistemics
Bob: 5/5 friendliness, 3/5 productivity, 3/5 epistemics
Carol: 3/5 friendliness, 5/5 productivity, 3/5 epistemics
I could imagine Bob beating Alice for a “build a new group” role (though I think many CB people would prefer Alice), because friendliness is so crucial.
I could imagine Carol beating Alice for an ops role.
But if I were applying to a wide range of positions in EA and had to pick one trait to max out on my character sheet, I’d choose “epistemics” if my goal were to stand out in a bunch of different interview processes and end up with at least one job.
One complicating factor is that there are only a few plausible candidates (sometimes only one) for a given group leadership position. Maybe the people most likely to actually want those roles are the ones who are really sociable and gung-ho about EA, while the people who aren’t as sociable (but have great epistemics) go into other positions. This state of affairs allows for “EA leaders love epistemics” and “group leaders stand out for other traits” at the same time.
Finally, you mentioned “familiarity” as a separate trait from epistemics, but I see them as conceptually similar when it comes to thinking about group leaders.
Common questions I see about group leaders include “could this person explain these topics in a nuanced way?” and “could this person successfully lead a deep, thoughtful discussion on these topics?” These and other similar questions involve familiarity, but also the ability to look at something from multiple angles, engage seriously with questions (rather than just reciting a canned answer), and do other “good epistemics” things.