I think there is a difference between “value alignment” and “personal connection”.
Agreed. I was responding to:
Hiring managers should post jobs in more places, and be less dismissive of “non-EA” applicants
Although we might be more on the same page than I was thinking as you write:
I’m not saying that we should stop caring about whether candidates and employees understand and care about their organization’s mission. The mistake is assuming that the only people who understand and believe in my organization’s mission are members of the effective altruism community
I guess my position is that there may be some people who don’t identify with EA who would be really valuable; but it’s also the case that being EA is valuable beyond just caring about the mission in that EAs are likely to have a lot of useful frames.
Fair, but I worry that if we’re not prepared for this, then the costs will be greater, more sudden, and more confusing.
I’d be surprised if it changed that fast. Like even if a bunch of additional people joined the community, you’d still know the people that you know.
I think the extent to which “member of the EA community” comes along with a certain way of thinking (i.e. “a lot of useful frames”) is exaggerated by many people I’ve heard talk about this sort of thing. I think ~50% of the perceived similarity is better described as similar ways of speaking and knowledge of jargon. I think there are actually not that many people who have fully internalized new ways of thinking that are (1) very rare outside of EA, and (2) shared across most EA hiring managers.
Another way to put this would be: I think EA hiring managers often weight “membership in the EA community” significantly more heavily than they should. I think our disagreement is mostly about how much this factor should be weighted.
Fair point on the fast-changing thing. I have some thoughts, but they’re not very clear, and I think what you said is reasonable. One very rough take: yes, you’d still know the people you know, but you might go from “I know 50% of the people in AI alignment” to “I know 10% of the people in AI alignment” in 3 months, which could be disorienting and demoralizing. So it’s more of a relative thing than the absolute number of people you know.