Organizational alignment
[Note: this was written in response to being asked for my thoughts on how important it is for EA orgs to hire aligned staff as they scale. Thanks to Sam Bankman-Fried for comments and for significant influence on these thoughts.]
Organizational alignment is a really hard problem; even putting EA aside and just thinking about organizations that are trying to make money or pursue some other goal, I think it's still one of the biggest problems that organizations face.
There’s this Elon Musk quote that SBF likes to reference: “Every person in your company is a vector. Your progress is determined by the sum of all vectors.” By default, as you scale an organization, those vectors will all end up pointing in random different directions. Getting their directions more aligned is one of the key things you have to do to scale.
I think most of the stuff I have to say about organizational alignment applies to both EA and non-EA organizations. There are two differences for EA orgs that I can think of:
- EA gives you an extra tool in your alignment toolbox, in that other EAs will tend to be more aligned with your organization.
- The less legible your goals are, the more important it is to hire aligned people. If your goal is “make money,” you at least have something to fall back on, in that you can do stuff like see how much money people make and compensate them accordingly. If your goal is “do research to determine priorities for making the long-term future go well,” that's much harder to measure, so it seems more important for people to be aligned.
I think the broad ways to do organizational alignment are:
- building loyalty/trust
  - being fair to employees, treating them well, trusting them, etc.
- compensation
  - compensating in equity or something similar
  - making compensation heavily dependent on how much value you add
- culture
  - setting an organization-level culture that emphasizes teamwork, deemphasizes individual status, etc.
- management
  - you can pass the buck somewhat by just having good and aligned managers
- hiring
  - just hiring people who are going to be aligned in the first place
Unsurprisingly, I think the more decision-making power someone has, the more important it is for them to be aligned. There are a few ways I know of to have employees be super aligned with your org:
- getting people who are dedicated EAs
- getting people with a strong sense of personal loyalty to the org and/or the people in it
- getting people who have relatively linear utility in money, and compensating them well in equity-type things
In practice, I think what I tend to do in my hiring is:
- to first order, mostly just hire whoever is most competent and don't worry too much about alignment
  - I put some weight on a potential hire being EA, but not a ton
  - I think trying to, e.g., hire only EAs would have been fatally limiting to our growth
    - though this depends on what the talent pools look like for the specific roles you're trying to hire for
- but when deciding whether to give people a lot of decision-making power or have them manage a lot of people, prioritize very strongly how aligned they are
  - this alignment can come from EA or from something else
  - working with them and getting to know them over time is probably more helpful for determining this than “whether they identify as EA”
The caveat is that this is at an organization with fairly legible goals; the less legible things are, the more I'd expect hiring EAs to be important.