Leverage and Nonlinear are very peripheral to EA, and (if the allegations are true) they mostly harmed EAs rather than people outside the movement.
I will again remind people that Leverage at some point had approximately succeeded at a corporate takeover of CEA, placing Leverage-aligned people as both the CEO and the second-in-command of the organization. They really were not very peripheral to EA; they were just covert about it.
That’s indeed shocking, and now that you mention it, I also remember the Leverage takeover attempt via the Pareto Fellowship. Maybe I’m too relaxed about this, but it feels to me like there’s no nearby possible world where this situation would have kept going? Pretty much everyone I talked to in EA made remarks about how Leverage “is a cult,” and the Leverage person became CEA’s CEO not as the result of a long CEO search process but because the previous CEO left abruptly and there were few immediate staff-internal options. The CEO (edit: CEA!) board eventually intervened and installed Max Dalton, who was a good leader. Those events happened long ago, and in my view they tentatively(?) suggest that the EA community had a good-enough self-correction mechanism that schemers don’t tend to stay in central positions of power for long. I concede that we can count these as near misses, and maybe even as evidence of (often successfully fended off) tensions within EA culture and in who it attracts, but I’m not yet on board with reading these data points as “evidence for problems with EA as it is now” rather than “the sort of thing that happens both inside and outside EA as soon as you’re trying to have more influence.”
I think the self-correction mechanism was not very strong. If Tara (who was also strongly supportive of the Leverage faction, which is why she placed Larissa in charge) had stayed, I think that would have been the long-term equilibrium of the organization. The primary reason the equilibrium collapsed is that Tara left to found Alameda.
Interesting; I didn’t remember this about Tara.
Two data points in the other direction:
A few months (maybe up to nine months, but possibly as little as one month; I don’t remember the timing) before Larissa had to leave CEA, a friend and I talked to a competent-seeming CEA staff member who was about to leave the org (or had recently left; I don’t remember the details) because the org seemed like a mess and had bad leadership. I’m not sure whether Leverage was mentioned; I could imagine that it was, but I don’t remember the details, and my most salient memory is that I thought of it as “not good leadership for CEA.” My friend and I encouraged them to stay and speak up to try to change leadership, but the person had had enough for the time being or had some other reason to leave (again, I don’t remember the details). In any case, this person left CEA at the time without a plan to voice their view that the org was in a bad state. I don’t know whether they gave an exit interview, deliberately sought out trustees, talked to friends, or said nothing at all; I didn’t stay in touch. However, I do remember that my friend and I discussed whether we should at some point get back in touch with this former CEA staff member and encourage them to find out whether more of their former colleagues were dissatisfied and whether we could cause a wave of change at CEA. We were so far removed and had so few contacts with anyone who actually worked there that it would’ve been a bit silly for us to get involved. And I’m not saying we would’ve done it; it’s easy to talk about stuff like that and then not do anything. Still, I feel like this anecdote suggests that there are sometimes more people interested and invested in good community outcomes than one might think, and multiple pathways to beneficial leadership change (it’s very possible this former staff member had nothing to do with the eventual chain of causes that led to the leadership change, which would mean that multiple groups of people were independently expressing worried sentiments about CEA at that time).
At one point somewhere between 2017 and 2018, someone influential in EA encouraged me to share more about specific things that happened in the EA orgs I worked at, because they were sometimes talking to other people who were “also interested in the health of EA orgs / health of the EA community.” (To avoid confusion, this was not the community health team.) This suggests that people somewhat systematically keep an eye on things, and even if CEA were to get temporarily taken over by a Silicon Valley cultish community, probably someone would try to do something about it eventually. (Even if it’s just writing an EA Forum post to create common knowledge that a specific org has been taken over and is no longer similar to what it was when it was founded. We see that posts did eventually get written about Leverage, for instance, and the main reasons it didn’t happen earlier are probably that many people thought “oh, everyone knows already” and that, like anywhere else, few people actually take the time to do community-useful small bits of work when they can just wait for someone else to do it.)
By the way, this discussion (mostly my initial comment and what it’s in reaction to, not so much the specifics of CEA history) reminded me of this comment about the difficulty of discussing issues around culture and desired norms. It seems like maybe we’d be better off discussing what each of us thinks the best steps forward are for improving EA culture, or finding a way to promote some kind of EA-relevant message (EA itself, the importance of AI alignment, etc.) and doing movement building around that, so the discussion isn’t at risk of backfiring.
“The CEA board”, right?