Still, it’s hard to see how tweaking EA can lead to a product that we and others would be excited about growing.
It’s not clear to me how far this is the case.
Re. the EA community: evidence from our community survey, run with CEA, suggests a relatively limited reduction in morale post-FTX.
Re. non-EA audiences, our work reported here and here (though still unpublished due to lack of capacity) suggests relatively low negative effects in the broader population (including among elite US students specifically).
I agree that:
Selection bias (from EAs with more negative reactions dropping out) could mean that the true effects are more negative.
I agree that if we knew large numbers of people were leaving EA this would be another useful datapoint, though I’ve not seen much evidence of this myself. Formally surveying the community to see how many people they know of who have left could be useful to adjudicate this.
We could also conduct a ‘non-EA Survey’ which tries to reach people who have dropped out of EA, or who would be in EA’s target audience but who declined to join EA (most likely via referrals), which would be more systematic than anecdotal evidence. RP discussed doing this with researchers/community builders at another org, but hasn’t run it due to lack of capacity/lack of funding.
If many engaged EAs are dropping out but growth is continuing only because “new recruits are young and naive about EA’s failings,” this is bad.
That said, I see little reason to think this is the case.
In addition, EA’s recent growth rates seem higher than I would expect if we were seeing considerable dropout.
Especially considering that we have the excellent option of just talking directly about the issues that matter to us, and doing field-building around those ideas: AI safety, Global Priorities Research, and so on. This would be a relatively clean slate, allowing us to do more (as outlined in 11), to discourage RB, and to stop bad actors.
It’s pretty unclear to me that we would expect these alternatives to do better.
One major factor is that it’s not clear that these particular ideas/fields are in a reputationally better position than EA. Longtermist work may have been burned by FTX as much as or more than EA was. AI safety and other existential risk work have their own reputational vulnerabilities. And newer ideas/fields like ‘Global Priorities Research’ could suffer from being seen as essentially a rebrand of EA, especially if they share many of the same key figures/funding sources/topics of concern, which (per your 11a) risks being seen as deceptive. Empirical work to assess these questions seems quite tractable and neglected.
Re. your 10f-g, I’m less sanguine that the effects of a ‘reset’ of our culture/practices would be net positive. It seems like it may be harder to maintain a good culture across multiple fragmented fields in general. Moreover, as suggested by Arden’s point number 1 here, there are some reasons to think that basing work solely around a specific cause may engender a less good culture than EA, given EA’s overt focus on promoting certain virtues.
Thanks for this. I found the uncited claims about EA’s “reputational collapse” in the OP quite frustrating and appreciated this more data-driven response.
I agree; the most striking part of this article was that this core assumption had no numerical data to back it up, only the author’s own discussions with high-level EAs.
“Due to the reputational collapse of EA”
High-level EAs are more likely to have had closer involvement with SBF/FTX, and therefore more likely to have suffered higher levels of reputational loss than the average EA, or even the movement as a whole. I would confidently guess that the “200-800” EAs who lost big on FTX would skew heavily towards the top of the leadership structure.
The three studies cited in this comment, along with a few from community organisers, provide evidence that yes, EA has suffered reputationally, but it has hardly collapsed. Why did the OP not at least mention those? Maybe because it was a February draft, but I would have thought revising it to cite what data is available would have been a good idea.
The OP might be right that the situation is worse than the research we have suggests, but in that case I would have thought that making arguments against the validity of that research would have been a good idea.
FYI, I’ve just released a post which offers significantly more empirical data on how FTX has impacted EA. FTX’s collapse seems to mark a clear and sizable deterioration across a variety of different EA metrics.