Reading this, I guess I’ll just post the second half of this memo that I wrote here as well, since it has some additional points that seem valuable to the discussion:
When I play forward the future, I can imagine a few different outcomes, assuming that my basic hunches about the dynamics here are correct at all:
I think it would not surprise me that much if many of us do fall prey to the temptation to use the wealth and resources around us for personal gain, or as a tool towards building our own empire, or come to equate “big” with “good”. I think the world’s smartest people will generally pick up on us not really aiming for the common good, but I do think we have a lot of trust to spend down, and could potentially keep this up for a few years. I expect eventually this will cause the decline of our reputation and ability to really attract resources and talent, and hopefully something new and good will form in our ashes before the story of humanity ends.
But I think in many, possibly most, of the worlds where we start spending resources aggressively, whether for personal gain, or because we do really have a bold vision for how to change the future, the relationships of the central benefactors to the community will change. I think it’s easy to forget that for most of us, the reputation and wealth of the community is ultimately borrowed, and when Dustin, or Cari or Sam or Jaan or Eliezer or Nick Bostrom see how their reputation or resources get used, they will already be on high-alert for people trying to take their name and their resources, and be ready to take them away when it seems like they are no longer obviously used for public benefit. I think in many of those worlds we will be forced to run projects in a legible way; or we will choose to run them illegibly, and be surprised by how few of the “pledged” resources were ultimately available for them.
And of course in many other worlds, we learn to handle the pressures of an ecosystem where trust is harder to come by, and we scale, and find new ways of building trust, and take advantage of the resources at our fingertips.
Or maybe we split up into different factions and groups, and let many of the resources that we could reach go to waste, as they ultimately get used by people who don’t seem very aligned to us, but some of us think this loss is worth it to maintain an environment where we can think more freely and with less pressure.
Of course, all of this is likely to be far too detailed to be an accurate prediction of what will happen. I expect reality will successfully surprise me, and I am not at all confident I am reading the dynamics of the situation correctly. But the above is where my current thinking is at, and is the closest to a single expectation I can form, at least when trying to forecast what will happen to people currently in EA leadership.
To take a bit more of an object-level stance: I currently, very tentatively, believe this shift is not worth it. I don't actually have any plans that seem hopeful or exciting to me that really scale with a lot more money or resources, and I would really prefer to spend more time without needing to worry about full-time people scheming about how to get specifically me to like them.
However, I do see the hope and potential in actually going out and spending the money and reputation we have to maybe get much larger fractions of the world’s talent to dedicate themselves to ensuring a flourishing future and preventing humanity’s extinction. I have inklings and plans that could maybe scale. But I am worried that I’ve already started trying to primarily answer the question “but what plans can meaningfully absorb all this money?” instead of the question of “but what plans actually have the highest chance of success?”, and that this substitution has made me worse, not better, at actually solving the problem.
I think historically we’ve lacked important forms of ambition. And I am excited about us actually thinking big. But I currently don’t know how to do it well. Hopefully this memo will make the conversations about this better, and maybe will help us orient towards this situation more healthily.
To onlookers: There's often a low amount of resolution and expertise in some of the comments and concerns on LW and the EA Forum, and this creates "bycatch" and reduces clarity. With uncertainty, I'll lay out one story that seems to match the concerns in the parent comment.
Strong Spending
I’m not entirely sure this is correct, but for large EA spending, I usually think of the following:
30%–70% growth in headcount at established institutions, sustained for multiple years
Near-six-figure salaries for junior talent, and well-over-six-figure salaries for very good talent and for management who can scale and build an organization (people who could earn multiple times that in the private sector and who can cause an organization to exist and have impact)
Seven-figure salaries for extreme talent (the world's best in applied math and CS, top lawyers)
Discretionary spending
Buying operations, consulting and other services
All of the above is manageable, even sort of fundamental, for a good leader, ED, or CEO. This is why quality leadership is so important: to hire and integrate this talent well and to manage this spending. This is OK.
This spending is considered "high," but it isn't really, by real-world standards.
Next-level Next-level
Now, distinct from the above, there's a whole other reference class of spending, where:
People can get an amount of cash that is a large fraction of all spending in an existing EA cause area in one raise.
The internal environment is largely "deep tech," not related to customers or operations.
So I'm thinking of valuations like those of trendy companies in the 2010s tech sector.
I'm not sure, but my model of an organization that can raise 8 figures per person in a Series B, for spending that is pretty much purely CapEx (as opposed to capital to support operations or lower-margin activity, e.g. inventory or logistics), is that its internal activity is really, really different from the "high" spending described above.
There are issues here that are hard to appreciate.
Facebook's raises were really hot and oversubscribed. But building the company was a drama fest for the founders, and there was also a nuclear-reactor-hot business with viral growth. That meant epic fires to put out every week, customers and partners, and the actual scaling issues of hockey-stick growth (not this meta-level business-advice discussion on the forum). It's a mess. So the CEO, and even junior people, have to deal.
But once you're raising that amount in deep-tech mode, I have guesses about how people think, feel, and behave inside a company with valuations in the 8–9 figures per person. My guess is that the attractiveness, incentives, and beliefs in that environment are really different from even the hottest startups, even those where junior people exit with 7 figures of income.
To be concrete, the issues on the rest of EA might be that:
Even strong EA CEOs won't be able to hire much EA talent, like software developers (though they should be worried about hiring pretty much anyone, really). If they do hire, they won't be able to keep people at comfortable, above-EA salaries without worrying about attrition.
Every person who can convincingly claim or signal interest in the cause area will inherently be treated very differently in any discussion or interaction, in a deep way that I don't think EA has seen.
Dynamics emerge such that good people no longer feel comfortable joining the cause area.
Again, this is not "strong spending" but the "next level, next level" world: funding that is hard to match in human history for any for-profit, plus a nature of work that is different from any other.
I'm not sure, but in situations where this sort of dynamic or resource gradient appears, it isn't resolved by the high gradient stopping (people don't stop funding or founding institutions), because the original money is driven by underlying forces that are really strong. My guess is that a lot of attempts to fight it directly would be counterproductive.
Typically in those situations, I think the best path is moderation and focusing on development and culture in other cause areas.
These are some very important points, thanks for taking the time to write them out.
I just made an account here (I've only ever commented on LW before) specifically to stress how vital it is to soberly assess the change in incentives, because even the best people have strengths and weaknesses that need to be adapted to.
“Show me the incentives and I will show you the outcome”—Charlie Munger