Unfortunately, I think there is a correlation between traits and rituals that are seen as “weird” and the emergence of this bad behaviour.
For example, it seems like far more of these incidents are occurring in the rationalist community than in the EA community, despite the overlap between the two. It also seems like far more is happening in the Bay Area than in other places.
I don’t think your model sufficiently explains why this would be the case, whereas my theory posits that Bay Area rationalists have a much greater density of “cult ingredients” than, say, the EA meetup at some university in Belgium or wherever. People there have lives and friends outside of EA, they don’t fully buy into the worldview, and so on.
I’m not trying to throw the word “cult” out as a pejorative conversation-ender. I don’t think Leverage was a full-on religious cult, but it had enough in common with one that it ended up having the same harmful effects on people. I think the rationalist worldview makes it easier to form these sorts of harmful groups, which does not mean the worldview is inherently wrong or bad, just that you need to be careful.
For example, it seems like far more of these incidents are occurring in the rationalist community than in the EA community, despite the overlap between the two. It also seems like far more is happening in the Bay Area than in other places.
I don’t think your model sufficiently explains why this would be the case
One explanation could simply be that social pressure in non-rationalist EA mostly pushes people towards “boring” practices rather than “extreme” practices.
For example, it seems like far more of these incidents are occurring in the rationalist community than in the EA community, despite the overlap between the two. It also seems like far more is happening in the Bay Area than in other places.
I think this premise is false. Both early CEA and FTX had quite little to do with the rationality community, and early Leverage was also more embedded in EA land than in rationality land (though they were in the Bay Area, which is definitely relevant).
Agree that the Bay Area in general feels somewhat destabilizing in a relevant way, though again, FTX, which is by far the most important datapoint (I care about preventing FTX at least 10x more than I care about all the other cases), was not located in the Bay Area and indeed had relatively few Bay Area connections.
I mostly agree with your larger point here, especially about the relative importance of FTX, but early Leverage was far more rationalist than it was EA. As of 2013, Leverage staff was >50% Sequences-quoting rationalists, including multiple ex-SIAI and one ex-MetaMed, compared with exactly one person (Mark, who cofounded THINK) who was arguably more of an EA than a rationalist. Leverage taught at CFAR workshops before they held the first EA Summit. Circa 2013, Leverage donors had strong overlap with SIAI/MIRI donors but not with CEA donors, and so on.
What happened at early CEA?