Post-FTX reflections from a random EA
[I wrote this over a year ago—posting it now for Draft Amnesty. I don’t necessarily still endorse my conclusions here, and I’m not planning to respond to comments, sorry!]
One tiny, tiny upside of the FTX crisis: I had interesting new thoughts!
A problem with EA having a few big funders:
The common argument I’d heard against billionaires funding EA was that these billionaires’ biases would affect how funding was spent—e.g. that Dustin Moskovitz would push OpenPhil to spend money to prevent Big Tech regulation.
I’m not very convinced by this: EAs as a group aspire to cause neutrality and to finding the most effective ways of doing good, and seem to have a strong awareness of how personal biases can affect this.
Anecdotally, Dustin doesn’t have much of a role in deciding what OpenPhil spends its money on. See this tweet showing he didn’t know it was OpenPhil that paid for the £10 million Wytham Abbey. [Edit: can’t find the tweet link, sorry!]
However, the FTX crisis convinced me of a related argument.
EA funding coming from a specific billionaire gives the EA community a stake in whatever organisation, industry or system that billionaire’s wealth is tied up in.
For example, I’m somewhat invested in the success of Asana because a significant portion of Dustin Moskovitz’s wealth is tied up in it.
Extrapolating a little: an EA community backed by a wealthy donor has more to lose. Our dreams of what we could do with that wealth, plus plain old loss aversion, can subtly push every EA towards supporting whatever that predicted wealth depends on. This could be anything from a specific product, to Silicon Valley, to the capitalist economic system.
How severe this is depends on how the billionaire’s wealth is structured. Bill Gates’ wealth is mainly spread across the stock market rather than concentrated in Microsoft, so the effect is less strong (though it could still tie people to the stock market doing well).
Sometimes you might judge it worth supporting a specific product because EA causes would get much more funding if it prospers. Sometimes it definitely isn’t. Be explicit with yourself about why you have a positive opinion of an organisation, or of Silicon Valley. Don’t let positive feelings about the money something brings to EA cloud your opinion of the thing as a whole.
Potential EAs or EA collaborators could legitimately be put off EA if they experience it as full of Silicon Valley devotees, blind to that place’s faults. And to the extent that truth-seeking is a core EA value, this damages that too.
My overall takeaway boils down to the common rationalist message: know why you believe what you believe.
I’m not arguing against EA allying with billionaire donors in this post; in my opinion, the money they could contribute to high-impact causes outweighs these concerns.
However, I think we as a community should be more aware of the subtle impacts of billionaire funding. Don’t blind yourself to the flaws of a company, culture or economic system because it currently benefits EA. Be clear with yourself about what you think of EA’s funding sources, and most importantly, remember that positive impact (and whatever intangible values you also believe in!) is what ultimately matters.
Smaller thoughts:
Posts about EA funding should be clearer that projected funding is only a prediction, and should lay out other possible scenarios, to reduce the chance that people grow too attached to the big number.
I think Ben Todd’s post about EA having $50 billion made people grow attached to EA having lots of money—when much of it was pledged donations that didn’t really exist yet! I know EAs who made decisions based on the large amount of money pledged to EA, and I suspect some of them didn’t give much thought to alternative worlds.
My suggestions for framing future EA funding situations:
Explain various other scenarios, as they might not be apparent to casual readers. (People also vary in how they think. I’m someone for whom writing really helps me think, so just reading about something doesn’t necessarily mean I critique it enough…)
Frame it as ‘this is one possible world’ rather than ‘we have this money!’, because losing money feels very bad.
EA needs more people with experience: a strike against relying on talented young people
Newspaper articles have drawn attention to the sheer… incompetence of Alameda and FTX. No accounting department? To say nothing of ‘losing’ eight billion dollars.
A meme I’ve seen in EA is that young people can do anything if we’re talented and ambitious enough. I’m all for aiming high, but the FTX crisis has made me update towards EA needing more professional experience.
If an EA organisation runs like a company, it should learn from the business world, which has had decades to refine marketing, operations, recruiting, etc. This might look like young EAs going into the business world to learn these skills, or EA organisations hiring aligned business professionals. There should also be more thought about how important EA-alignment actually is for each role in an EA org.
This might already exist in word-of-mouth form that hasn’t reached me.
Something about the reaction and its effects on me:
The FTX crisis produced emotional effects that scared me.
It cost me focus and sleep, almost like a tiny breakup.
It took more than a week for me to feel okay again.
I read everything I could find on the EA forum and in my Twitter circles.
This was really scary. I’d never experienced anything like this before.
But hearing from other people experiencing similar effects made me feel much less scared. I hadn’t known this was a thing that could happen, and had felt like something was wrong with me.
I now understand when my university sends out support messages when bad things happen in the world.
Overall, the crisis opened me to a new part of human experience.
80,000 Hours portrays interviewees in a positive light:
One strong feeling I had after the FTX crisis was: ‘But the 80k podcast gave a really positive view of him!’
In hindsight, this is something 80k does to all its guests.
Anecdotally, I’ve heard lots of people strongly disagree with Ian Morris’s ideas in this episode, but the impression I got from the episode was ‘this is true and really cool’.
80k could challenge guests a little on the show, ask for the strongest counterarguments to some of their ideas, or issue a disclaimer saying they don’t necessarily wholeheartedly endorse whoever they host on their podcast.
However, challenging people might make guests more hesitant to come on the show.
I think there should be more community awareness that Rob Wiblin portraying someone as super cool on 80k doesn’t mean they are fully vetted and supported by the EA community.
I think a blog post making clear 80k’s general stance towards its guests would help, and this should also be passed along by word of mouth.