This is a great post! Upvoted. I appreciate the exceptionally clear writing and the wealth of examples, even if I’m about 50/50 on agreeing with your specific points.
I haven’t been involved in university community building for a long time, and don’t have enough data on current strategies to respond comprehensively. Instead, a few scattered thoughts:
> I was talking to a friend a little while ago who went to an EA intro talk and is now doing one of 80,000 Hours’ recommended career paths, with a top score for direct impact. She’s also one of the most charismatic people I know, and she cares deeply about doing good, with a healthy practical streak.
>
> She’s not an EA, and she’s not going to be. She told me that she likes the concept and the framing, and that since the intro talk she’s often found that when faced with big ethical questions it’s useful to ask “what would an EA do”. But she’s not an EA.
I don’t like using “EA” as a noun. But if we do want to refer to some people as “EAs”, I think your friend has the most important characteristics described by that term.
Using EA’s core ideas as a factor in big decisions + caring a lot about doing good + strong practical bent + working on a promising career path = yes, you are someone who practices effective altruism (which seems, to me, like the best definition of “an EA”). You don’t have to attend the conferences or wear the t-shirts to qualify.
> Second, despite some pushback, current EA community building doctrine seems to focus heavily on producing “Highly Engaged EAs” (HEAs). It is relatively easy to tell if someone is a HEA.
Not sure about current doctrine, but my impression is that “HEA” isn’t meant to be a binary category. Based on your statement:
> Is one EA in government policy worth more than a hundred civil servants who, though not card-carrying EAs, have seriously considered the ideas and are in touch with engaged EAs who can call them up if need be? What about great managers and entrepreneurs?
I’d be surprised if even the most literal interpretation of any community-building advice would have an organizer favoring “one person in policy” over “one hundred policy people being interested in EA” (feels an order of magnitude off, maybe?).
Bolding for emphasis: **People often overestimate how important “full-time EA people” are to the movement, relative to people who “have seriously considered the ideas and are in touch”.**
That’s largely because people who discuss EA online are frequently in the first group. But when it comes to impactful projects, a massive amount of work is done by people who are very focused on their own work and less interested in EA qua EA.
When I see my contacts excitedly discussing a project, it often looks like “this person who was briefly involved with group X/is friends with person Y is now pursuing project Z, and we think EA played a role”. The person in question will often have zero connection with “the EA community” at large, no Forum account, etc.
You see less of this on the Forum because “this person got a job/grant” and “this person has started a new project” aren’t exciting posts unless the person in question writes them. And the non-Forum-y people don’t write those posts!
> I asked around, and quickly stumbled upon some people who confidently told me that EA was an organisation that wanted to trick me into signing away my future income to them in exchange for being part of their gang.
I got this reaction a lot when I was starting up Yale EA in 2014, despite coming up with all my messaging alone and having no connection to the wider EA community. Requests to donate large amounts of money are suspicious!
I’d expect to see less of this reaction now that donating and pledge-taking get less emphasis than in 2014, especially in college groups. But I think it’s hard to avoid while also trying to convey that the things we care about are really important.
(Doesn’t mean we shouldn’t try, but I wouldn’t see the “donations are a scam” perspective as strong evidence that organizers are making the wrong choices.)
> If better epistemics trades off against getting more alignment researchers, maybe you think it’s not worth doing. However, it’s not clear at all that this is the case.
Almost everyone I’ve interacted with in EA leadership/CB is obsessed with good epistemics: they value them highly when recruiting and evaluating people, much more than any other personal trait (with rare exceptions, e.g. strong technical skills in roles where those are crucial).*
My impression is that they’d be happy to trade a bunch of alignment for epistemic skill/virtue at the margin for most people, as long as alignment didn’t dip to the point where the person had no interest in working on a priority problem.
This doesn’t mean that current CB strategy is necessarily encouraging good epistemics. (I’m sure it varies dramatically between and within groups.) It’s possible for a group’s strategy not to achieve the ends they want, and it’s easier to Goodhart on alignment than epistemics, because the former is easier to measure.
But I am confident that leaders’ true desire is “find people who have great epistemics [and are somewhat aligned]”, not “find people who are extremely aligned [and have okay epistemics]”.
*To clarify my perspective: I’ve seen discussion of 100+ candidates for jobs/funding in EA. Alignment comes up often, but mostly as a checkbox/afterthought, while “how the person thinks” is the dominant focus most of the time. Many terms are used (clear thinking, independent thinking, nuanced thinking), but they point to the same cluster of traits.
> You should expect there to be whole types of reason (like “you guys seem way more zealous than I’m comfortable with”) which you’ll be notably less likely to hear about relative to how much people think it, especially if you’re not prioritising getting this kind of feedback.
This is very true!
One good way to hear a wider range of feedback is to have friends and activities totally separate from your EA work who can give you a more “normal” perspective on these things. This was automatic for me in college; our EA group was tiny and there wasn’t much to do, so we all had lots of other stuff going on, and I’d been making friends for years before I discovered EA.
I gather that EA groups are now, in some cases, more like sports teams or music groups: things that can easily consume most of someone’s non-class hours and leave them in a place where most of their friends are in the same club. It’s good to have a close-knit group of altruistic friends, but spending all of your time around other people in EA will limit your perspective; guard against this!
(Also, having hobbies not related to your life’s central purpose seems healthy for a lot of reasons.)
> Assume that people find you more authoritative, important, and hard-to-criticise than you think you are. It’s usually not enough to be open to criticism; you have to actually seek it out or visibly reward it in front of other potential critics.
Also very true!
Flagging this because it is very hard to account for properly: I’ve had to adjust my expectation of how hard-to-criticize I am several times (especially after I started getting jobs within EA).
Minor elaboration on your last point: a piece of advice I got from someone who did psychological research on how to solicit criticism was to try to brainstorm what someone’s most likely criticism of you would be, and then offer that up when requesting criticism, as this is a credible indication that you’re open to it. Examples:
“Hey, do you have any critical feedback on the last discussion I ran? I talked a lot about AI stuff, but I know that can be kind of alienating for people who have more interest in political action than technology development… Does that seem right? Is there other stuff I’m missing?”
“Hey, I’m looking for criticism on my leadership of this group. One thing I was worried about is that I make time for 1:1s with new members, but not so much with people who have been in the group for more than one year...”
“Did you think there was anything off about our booth last week? I noticed we were the only group handing out free books; maybe that looked weird. Did you notice anything else?”
Appreciate your comments, Aaron.
You say:

> But I am confident that leaders’ true desire is “find people who have great epistemics [and are somewhat aligned]”, not “find people who are extremely aligned [and have okay epistemics]”.
I think that’s true for a lot of hires. But does that hold equally true when you think of hiring community builders specifically?
In my experience (5-ish people), leaders’ epistemic criteria seem less stringent for community building. Familiarity with EA, friendliness, and productivity seemed more salient.
This is a tricky question to answer, and there’s some validity to your perspective here.
I was speaking too broadly when I said there were “rare exceptions” to epistemics being the top consideration.
Imagine three people applying to jobs:
Alice: 3/5 friendliness, 3/5 productivity, 5/5 epistemics
Bob: 5/5 friendliness, 3/5 productivity, 3/5 epistemics
Carol: 3/5 friendliness, 5/5 productivity, 3/5 epistemics
I could imagine Bob beating Alice for a “build a new group” role (though I think many CB people would prefer Alice), because friendliness is so crucial.
I could imagine Carol beating Alice for an ops role.
But if I were applying to a wide range of positions in EA and had to pick one trait to max out on my character sheet, I’d choose “epistemics” if my goal were to stand out in a bunch of different interview processes and end up with at least one job.
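To make that intuition concrete, here’s a toy sketch (my own illustration; the processes and their weights are invented, and nothing here reflects real hiring criteria). If every process puts at least some weight on epistemics while the specialty trait varies, the candidate who maxed epistemics has the best worst-case score across processes:

```python
# Hypothetical trait ratings for the three candidates above, and
# invented weights for three hiring processes. All numbers are
# illustrative assumptions, not real EA hiring data.
candidates = {
    "Alice": {"friendliness": 3, "productivity": 3, "epistemics": 5},
    "Bob":   {"friendliness": 5, "productivity": 3, "epistemics": 3},
    "Carol": {"friendliness": 3, "productivity": 5, "epistemics": 3},
}

# Each process emphasizes its specialty trait, but every process
# puts some weight on epistemics.
processes = {
    "group-building": {"friendliness": 0.5, "productivity": 0.2, "epistemics": 0.3},
    "ops":            {"friendliness": 0.2, "productivity": 0.5, "epistemics": 0.3},
    "research":       {"friendliness": 0.1, "productivity": 0.3, "epistemics": 0.6},
}

def score(traits, weights):
    """Weighted sum of a candidate's trait ratings under one process."""
    return sum(weights[t] * traits[t] for t in weights)

for name, traits in candidates.items():
    scores = {p: round(score(traits, w), 1) for p, w in processes.items()}
    # Alice's worst score (3.6) beats Bob's (3.2) and Carol's (3.4):
    # maxing the universally valued trait travels well across processes.
    print(name, scores, "worst:", min(scores.values()))
```

Bob and Carol each top one specialty process, but Alice is competitive everywhere, which is the “end up with at least one job” logic.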
One complicating factor is that there are only a few plausible candidates (sometimes only one) for a given group leadership position. Maybe the people most likely to actually want those roles are the ones who are really sociable and gung-ho about EA, while the people who aren’t as sociable (but have great epistemics) go into other positions. This state of affairs allows “EA leaders love epistemics” and “group leaders stand out for other traits” to both be true at the same time.
Finally, you mentioned “familiarity” as a separate trait from epistemics, but I see them as conceptually similar when it comes to thinking about group leaders.
Common questions I see about group leaders include “could this person explain these topics in a nuanced way?” and “could this person successfully lead a deep, thoughtful discussion on these topics?” These and other similar questions involve familiarity, but also the ability to look at something from multiple angles, engage seriously with questions (rather than just reciting a canned answer), and do other “good epistemics” things.