I’m reading Brotopia: Breaking Up the Boys’ Club of Silicon Valley, and this paragraph stuck in my head. I’m wondering about EA and “mission alignment” and similar things.

Which brings me to a point the PayPal Mafia member Keith Rabois raised early in this book: he told me that it’s important to hire people who agree with your “first principles”—for example, whether to focus on growth or profitability and, more broadly, the company’s mission and how to pursue it. I’d agree. If your mission is to encourage people to share more online, you shouldn’t hire someone who believes people don’t really want to make their private lives public, or you’ll spend a lot of time arguing, time you don’t have to waste when you’re trying to build a company. But those who believe in your mission and how to execute it aren’t limited to people who look and act like you. To combat this tendency, you must first be explicit about what your first principles are. And then, for all of the reasons we discussed, go out of your way to find people who agree with your first principles and who don’t look like you. Because if you don’t build a diverse team when you start, as you scale, it will be incomparably harder to do so.
The parallels seem pretty obvious to me, and here is my altered version:
If your mission is to improve the long-term future, you shouldn’t hire someone who believes that most of the value is in the next 0 to 50 years. If your mission is to reduce animal suffering, you shouldn’t hire someone who hates animals. But those who believe in your mission and how to execute it aren’t limited to people who look and act like you.
I think this leads me back to two ideas that I’ve been bouncing around. First, be clear about whether a particular role needs to be mission-aligned at all. Second, be clear about the level/extent to which it needs to be mission-aligned (3 out of 10? 8 out of 10?). Does the person you hire to handle physical security need to care about AI safety risk scenarios? If your mission is to reduce animal suffering, should you hire someone who wants to do that but is simply less intense about it? A person who spends 5% of their free time thinking about this when you spend 60% of yours? I do think that mission alignment is important for some roles, but it is hard to specify without really understanding the work.[1]
[1] As an example of “understanding the work,” my superficial guess is that someone planning an EAG event probably doesn’t need to know all about EA in order to book conference rooms, arrange catering, set up sound & lighting, etc. But I don’t know, because I haven’t done that job, managed that job, or closely observed that job. Maybe a lot of EA context really is necessary to make the many little decisions that would otherwise make the event a noticeably worse experience for attendees. Indeed, pretty much the only thing I’m confident about here is that we can’t make strong claims about a role unless we really understand the work.