I like some of the ideas:
Cohosting events with other movements
Seeding groups in more countries
Projects to bring in skills that EA currently lacks (e.g. an EA Communications Fellowship, a writer’s retreat, etc.)
On the other hand:
I think that the author undervalues value alignment, and how the natural tendency is regression toward the norm unless specific action is taken to avoid it
I agree that as EA scales, we will be less able to rely on personal relationships, but I see no reason to impose those costs now
I agree that it may affect our reputation in the outside world, but I don’t think it’s worth increasing the risk of bad hires to attempt to satisfy our critics.
I’m worried about the tensions that EA being both a social and a professional community entails, but I don’t have a good solution to this and maybe the status quo is the least bad option?
Thanks for the thoughtful feedback Chris!
I think that the author undervalues value alignment, and how the natural tendency is regression toward the norm unless specific action is taken to avoid it
I think there is a difference between “value alignment” and “personal connection”. I agree that the former is important, and I think the latter is often used (mostly successfully) as a tool to encourage the former. I addressed one aspect of this in the Hiring Managers section.
I agree that as EA scales, we will be less able to rely on personal relationships, but I see no reason to impose those costs now
Fair, but I worry that if we’re not prepared for this then the costs will be greater, more sudden, and confusing, e.g. people starting to feel that EA is no longer fun or good and not knowing why. I think it’s good to be thinking about these things and make the tactical choice to do nothing, rather than leaving “overreliance on personal connections can be bad” out of our strategic arsenal completely.
I agree that it may affect our reputation in the outside world, but I don’t think it’s worth increasing the risk of bad hires to attempt to satisfy our critics.
I don’t think my suggestions for hiring managers would increase the risk of bad hires. In fact, I think moving away from “my friend is friends with this person” and towards “this person demonstrates that they care deeply about this mission” would decrease the risk of bad hires. (Sorry if this doesn’t make sense, but I don’t want to go on for too long in a comment.)
moving away from “my friend is friends with this person”
I hadn’t thought of your post in these explicit terms till now, but now that you write it like that I remember that indeed I’ve already applied to a program which explicitly asked for a reference the head organizer knows personally.
I was rejected from that program twice, though I obviously can’t know if the reason was related, and I may still apply in the future.
Explicitly asking for a reference the head organizer knows personally.
That feels pretty bad to me! I can imagine some reason that this would be necessary for some programs, but in general requiring this doesn’t seem healthy.
I find the request for references on the EA Funds’ application to be a good middle ground. There are several sentences to it, but the most relevant one is:
References by people who are directly involved in effective altruism and adjacent communities are particularly useful, especially if we are likely to be familiar with their work and thinking.
It’s clearly useful to already be in the fund managers’ network, but it’s also clearly not required. Of course there’s always a difference between the policy and the practice, but this is a pretty good public policy from my perspective.
I should probably be more precise and say the phrasing was something like “preferably someone who [organizer] knows”.
But since this is presented as the better option, I don’t think I see much difference between the two, as you’d expect the actual filtering process to favor exactly those people in the organizer’s network.
I think there is a difference between “value alignment” and “personal connection”
Agreed. I was responding to:
Hiring managers should post jobs in more places, and be less dismissive of “non-EA” applicants
Although we might be more on the same page than I was thinking as you write:
I’m not saying that we should stop caring about whether candidates and employees understand and care about their organization’s mission. The mistake is assuming that the only people who understand and believe in my organization’s mission are members of the effective altruism community
I guess my position is that there may be some people who don’t identify with EA who would be really valuable; but it’s also the case that being EA is valuable beyond just caring about the mission in that EAs are likely to have a lot of useful frames.
Fair, but I worry that if we’re not prepared for this then the costs will be greater, more sudden, and confusing
I’d be surprised if it changed that fast. Like even if a bunch of additional people joined the community, you’d still know the people that you know.
I think the extent to which “member of the EA community” comes along with a certain way of thinking (i.e. “a lot of useful frames”) is exaggerated by many people I’ve heard talk about this sort of thing. I think ~50% of the perceived similarity is better described as similar ways of speaking and knowledge of jargon. I think there are actually not that many people who have fully internalized new ways of thinking that are (1) very rare outside of EA, and (2) shared across most EA hiring managers.
Another way to put this would be: I think EA hiring managers often weight “membership in the EA community” significantly more highly than they should. I think our disagreement is mostly about how much this factor should be weighted.
Fair point on the fast-changing thing. I have some thoughts, but they’re not very clear and I think what you said is reasonable. One very rough take: yes, you’d still know the people you know, but you might go from “I know 50% of the people in AI alignment” to “I know 10% of the people in AI alignment” in 3 months, which could be disorienting and demoralizing. So it’s more of a relative thing than the absolute number of people you know.