I feel like a lot of this is downstream from people being reluctant to hire experienced people who aren’t already associated with EA. Particularly for things like operations roles, experience doing similar work is going to make far more of a difference to effectiveness than deep belief in EA values.
When Coke needs to hire new people, they don’t look for people with a deep love of sugary drink brands; they find people in similar roles at other companies and offer them money. I feel like the reason EA orgs are reluctant to do this is that there’s a degree of exceptionalism in EA.
I agree that it’s downstream of this, but I strongly agree with ideopunk that mission alignment is a reasonable requirement to have.* A (perhaps the) major cause of organizations becoming dysfunctional as they grow is that people within the organization act in ways that are good for them but bad for the organization overall: for example, fudging numbers to make themselves look more successful, asking for more headcount when they don’t really need it, or doing things that are short-term good but long-term bad (on the assumption that they’ll have moved on before the bad stuff kicks in). (Cf. the book Moral Mazes.) Hiring mission-aligned people is one of the best ways to provide a check on that type of behavior.
*I think some orgs should maybe be more open to hiring people who are aligned with the org’s particular mission but not part of the EA community (e.g., that’s Wave’s main hiring demographic), but for orgs with more “hardcore EA” missions, it’s not clear how much that expands their applicant pool.
In Fortune 500 companies, you rarely find people who are exceptional from the get-go. Most of those who have succeeded were allowed to grow within structured, multidisciplinary environments that gave them room to combine ideas.
Can EA develop the EA/longtermist attitude in exceptionally talented people? I believe digging into this question rigorously could show every EA founder and director how to approach developing management talent.
It’s pretty common in values-driven organisations to ask for some amount of value alignment. The other day I helped a friend with a resume for an organisation that asked applicants to care about its feminist mission.
In my opinion this is a reasonable thing to ask for and expect. Sharing (overarching) values improves decision-making, and requiring it can help prevent value drift in an org.
What qualifies as ‘a (sufficient) amount of value alignment’? I have worked with many people who agreed with the premise of moving money to the worst off but found the actual practices of many self-identifying EAs hard to fathom.
Also, ‘it’s pretty common’ strikes me as an insufficient argument—many practices are common and bad. More data seems needed.