Great post! Another excellent example of the invaluable work Rethink Priorities does.
My observations and takeaways from this latest survey are:
Overall satisfaction is low. On a 10-point scale, a 6.77 rating isn’t very good. Only ~12% give a rating of 9 or 10, and <40% give a rating of 8 or higher. (I’m aware that a 6.8 rating on a 10-point scale still suggests more satisfaction than dissatisfaction, but given that we’re talking about people rating their satisfaction with a community they’ve chosen to participate in, I think we should interpret this result as damning with faint praise).
The change in satisfaction from pre-FTX to now is quite large. The original survey in the immediate aftermath of FTX’s collapse found a substantial drop in satisfaction vs. pre-FTX. One commenter at the time observed that “[decreased satisfaction by 0.5-1 points on a 10-point scale is] a big drop! In practice I’ve only ever seen this type of satisfaction scale give results between about 7/10 through 9.5/10… so that decline is a real big chunk of the scale’s de facto range.” This latest survey suggests that satisfaction has dropped another ~0.2-0.4 points. That means from pre-FTX to today, satisfaction has probably dropped something like 0.7-1.4 points, which is huge for a 10-point scale with a much narrower effective range (and the drop could easily be understated due to selection bias). The mean reversion of satisfaction to pre-FTX levels that some expected has not happened.
Selection bias is probably causing us to overstate satisfaction, and understate the degree to which satisfaction has dropped. As the report notes, selection bias could plausibly cut both ways. That said, I think most people would guess that selection bias is operating in the direction of excluding the least satisfied members from the data set. And the data seems to offer some support for that view. Looking at the distribution of satisfaction scores in late 2022 vs. late 2023, we see the reduction in satisfaction over that period coming from fewer people giving high ratings of 9 or 10, and more people giving low ratings of 3-5. But for the lowest ratings, we see basically no change in the number of people giving a rating of 2, and fewer people (almost nobody) now giving a rating of 1. Given falling satisfaction, we should expect more people to be giving these lowest ratings; the fact that we don’t see this suggests that the least satisfied people are excluded from the sample.
The behavioral changes described in this survey are significant and warrant attention. The number of people making concrete behavioral changes surprised me (even though I already believed FTX had a significant negative impact on community growth and activity). I was particularly surprised at how many people reported permanently ceasing to promote EA or engage in key EA activities, especially since selection bias likely suppresses the reported numbers. The December 2022 survey indicated a community desire for additional transparency, reflections, and post-mortems. This more recent survey suggests the community has been unimpressed by the limited public efforts on these fronts, with concrete negative implications.
This analysis shows large drops in satisfaction and significant behavior changes, but only gives limited insight into what caused them. Issues like JDEI concerns and concerns about cause prioritization are clearly reasons why people might be dissatisfied, but it’s not obvious why these longstanding issues should cause people to be more dissatisfied now than they were in late 2022. We can hypothesize that the changes were driven by FTX (and to a lesser extent subsequent scandals), but it would be preferable to look directly at the data. I think it would be very valuable to closely examine the cohort of people who report having changed their behavior. (It would also be interesting to look at a cohort of people who report significant decreases in satisfaction, but it seems more important to understand drivers of behavior change.) For this cohort, I would like to know:
What are the largest causes of dissatisfaction? How does this cohort compare to the full sample?
Does dissatisfaction appear to be driven by one major factor, or a confluence of concerns (as Luke Freeman has suggested might be the case)?
How much did reported satisfaction decrease since previous surveys?
What levels of satisfaction [or sizes of decreases in satisfaction] are associated with behavior change? (I.e., can we say something like “if satisfaction drops to 6 or lower [or decreases by 1 or more points], that’s when we start seeing people shift behaviors”? A rough sketch of how this could be estimated is at the end of this comment.)
How does perception of issues for this cohort compare to the full sample?
Do perceptions of issues look different for modest behavior changes (e.g. stopping referring to “EA” but still promoting) vs. larger behavior changes (e.g. permanently stopping promoting)?
Do we have any insights into the temporarily stopped promoting cohort (a sizeable 15% of the sample)? Do we have any idea whether these are people who stopped promoting for a few months but are now promoting again vs. people who are still not promoting but expect to do so again at some point? I realize the most recent survey might not give enough information to tell, but this seems like a very important issue to understand and something we might want to dig into in the full EA Survey.
How much overlap is there across the different types of behavior change?
Updating analyses of community growth seems like it should be a high priority. The last public effort was based on data collected in August 2023. I’ve been a longstanding proponent of conducting regular analyses of community growth, but refreshing that analysis (perhaps with some methodological tweaks) seems particularly important given the results of this survey. Various metrics that are publicly available don’t look great, continuing trends I first noted in October (though note that some of the metrics that performed strongest in the immediate aftermath of FTX, most notably 80k’s metrics, aren’t publicly available):
Donations and donors to EA Funds continue to shrink year over year. The decline in donations has been a multi-year trend, but the decline in donors represents a reversal of a multi-year growth trend, and the timing of that reversal aligns perfectly with FTX’s collapse.
GWWC donations and donors were down in 2023 vs. 2022, and pledges in Dec 23/Jan 24 were also down vs. Dec 22/Jan 23 even though that latter period immediately followed FTX’s collapse and GWWC was “instructed to pause all our giving season campaigns around the time of the crisis.”
EA Forum metrics, which had experienced years of consistent and rapid growth leading up to FTX’s collapse, have either flatlined at the levels they were at 2 years ago or have been steadily falling since the collapse (even as resources invested in the forum have increased significantly since then).
Per CEA’s dashboard, the last few months have seen the lowest traffic ever to the effectivealtruism.org intro page (in a data series going back to 2017), and the EA newsletter subscription base has shown no sign of regaining the growth trajectory it was on for 2 years prior to FTX’s collapse; since the collapse, subscribers have actually shrunk slightly.
As Jason astutely observed, the massive increase in public awareness around AI over the last year should have provided a significant tailwind to many metrics. I would argue the same is true for the hugely successful promotion around WWOTF. And demographic factors should generally support an expectation of growth in donation-related metrics (almost everyone in the EA community is at an age where they are entering the workforce or increasing their earnings; virtually nobody is at retirement age). The weak performance we’ve observed in publicly available metrics looks even worse in this context.
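To make the threshold question above concrete, here’s a rough sketch of the kind of analysis I have in mind: a logistic regression of a binary “made any behavior change” indicator on satisfaction, reading the crossover point off the fitted model. This is a minimal sketch with hypothetical column names and simulated toy data standing in for the actual survey variables.

```python
# Hypothetical sketch only: the column names ("satisfaction", "changed_behavior")
# and the simulated data are stand-ins for the real survey extract.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated illustrative data; replace with the actual survey responses.
rng = np.random.default_rng(0)
satisfaction = rng.integers(1, 11, size=500)  # 1-10 satisfaction ratings
p_change = 1 / (1 + np.exp(1.2 * (satisfaction - 5.5)))  # fake: changes concentrated below ~5.5
changed = rng.binomial(1, p_change)
df = pd.DataFrame({"satisfaction": satisfaction, "changed_behavior": changed})

# Logistic regression of "made any behavior change" on satisfaction.
X = sm.add_constant(df["satisfaction"])
fit = sm.Logit(df["changed_behavior"], X).fit(disp=False)
print(fit.summary())

# Satisfaction score at which the predicted probability of a change crosses 50%.
b0, b1 = fit.params["const"], fit.params["satisfaction"]
print("Estimated crossover point:", -b0 / b1)
```

The same approach could be run with change in satisfaction since the previous survey as the predictor, per the bracketed version of the question.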
Many thanks!
I think it would be very valuable to closely examine the cohort of people who report having changed their behavior...
All behaviour changes were correlated with each other (except for stopping referring to EA, while still promoting it, which was associated with temporarily stopping promoting EA, but somewhat negatively associated with other changes).
All behaviour changes were associated with lower satisfaction, with most behavioural changes common only among people with satisfaction below the midpoint, and quite rare with satisfaction above the midpoint (again, with the exception of stopping referring to EA, while still promoting it, which was more common across levels).
People who reported a behavioural change were more likely, on the whole, to mention factors as reasons for dissatisfaction. (When interpreting these results, it’s important to account for the fact that people being more/less likely to mention a factor at a particular importance level might be explained by their being less/more likely to mention it at a different importance level, with less difference in their overall propensity to mention it.)
Similarly, there was no obvious pattern of particular factors being especially associated with lower satisfaction. In general, people who mentioned any given factor were less satisfied.
In principle, we could do more to assess whether any particular factors predict particular behavioural changes, controlling for relevant factors, but it might make more sense to wait for the next full iteration of the EA Survey, when we’ll have a larger sample size and can ask people explicitly whether each of these things is a factor (rather than relying on people spontaneously mentioning them). See the sketch after this list.
For the other measures, differences are largely as expected: people who made a behaviour change were more likely to desire more community change, more likely to strongly agree there’s a leadership vacuum,[1] and less trusting than people who had not made a behaviour change.
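A minimal sketch of the kind of model I have in mind, with made-up variable names and simulated data standing in for the real survey extract (not our actual variables or pipeline):

```python
# Illustrative sketch only: variable names and simulated data are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 800
df = pd.DataFrame({
    "stopped_promoting": rng.binomial(1, 0.2, n),     # outcome: one specific behaviour change
    "mentions_scandal": rng.binomial(1, 0.3, n),      # candidate factor (spontaneously mentioned)
    "mentions_cause_prio": rng.binomial(1, 0.25, n),  # another candidate factor
    "satisfaction": rng.integers(1, 11, n),           # control: overall satisfaction (1-10)
    "engagement": rng.integers(1, 6, n),              # control: self-reported engagement (1-5)
})

# Does mentioning a given factor predict this behaviour change,
# controlling for satisfaction and engagement?
fit = smf.logit(
    "stopped_promoting ~ mentions_scandal + mentions_cause_prio"
    " + satisfaction + engagement",
    data=df,
).fit(disp=False)
print(fit.summary())  # coefficients on the factor dummies are the quantities of interest
```

With the current sample size, and with factors only spontaneously mentioned, I’d treat any such estimates as exploratory; the explicit factor questions in the next EA Survey would make this much more informative.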
Updating analyses of community growth seems like it should be a high priority… I’ve been a longstanding proponent of conducting regular analyses of community growth…
I still agree with this; unfortunately, we’ve been unsuccessful in securing any funding for more analysis of community growth metrics.
I personally don’t put too much weight on this question. I worry that it’s somewhat leading, and that people who are generally more dissatisfied are more likely to agree with it, when it’s unclear that a leadership vacuum is really an active concern for people or that it’s what’s driving their dissatisfaction.
Thanks so much for this additional data and analysis! Really interesting stuff here. To me, the most interesting things are:
The midpoint of the satisfaction scale being a pretty good threshold for the point at which we see behavior changes
The relatively high frequency of people with high satisfaction temporarily stopping promoting EA (and the general flatness of this curve)
I was surprised that for the cohort that changed their behavior, “scandal” was just one of many reasons for dissatisfaction and didn’t really stand out. The data you provide looks quite consistent with Luke Freeman’s observation: “My impression is that there was a confluence of things peaking around the FTX collapse. There was building hostility towards some of the more avant garde EA ideas and actions of those associated with EA[1] (towards both accurate and also misrepresentations of those ideas and actions) that seemed to get traction just prior to the launch of WWOTF, which meant there were a lot of people/opinions that got a lot more light when WWOTF was getting lots of attention and FTX failed so spectacularly. Then there was so much energy and angst in the system (both within the community and its critics) that I think the other issues compounded more than any individual one would have. The confluence of all this has sadly left a bad taste in a lot of people’s mouths that costs fairly uncontroversially good things a lot in terms of action and advocacy.”
The relatively high frequency of people with high satisfaction temporarily stopping promoting EA (and the general flatness of this curve)
Agreed. I think that people temporarily stopping promoting EA is compatible with people who are still completely on board with EA deciding that it’s strategically unwise to publicly promote it at a time when there’s lots of negative discussion of it in the media. Likewise with still promoting EA but stopping referring to it as “EA”, which also showed high levels across the board.
I think the prevalence of these behaviours points to the importance of more empirical research on the EA brand and how it compares to alternative brands, or to just referring to individual causes or projects (see our proposal here). I think it’s entirely possible that the term “EA” itself has been tarnished and that people do better to promote ideas and projects without explicitly branding them as EA. But there’s a real cost to just promoting things piecemeal or using alternative terms (e.g. “have you heard of ‘high impact careers’ / ‘existential security’?”), rather than referring to a unified, established brand. So it’s not clear a priori whether this is a positive move.
I was surprised that for the cohort that changed their behavior, “scandal” was just one of many reasons for dissatisfaction and didn’t really stand out. The data you provide looks quite consistent with Luke Freeman’s observation: “My impression is that there was a confluence of things peaking around the FTX collapse...”
Agreed. One possible explanation, other than it just being a coincidence of factors, is that the FTX crisis and subsequent revelations dented faith in EA leadership, and made people more receptive to other concerns. (I think historically, much of the community has been extremely deferential to core EA orgs and more or less assumed they know what they’re doing come what may).
Certainly it’s true that many of the other factors (e.g. dissatisfaction with cause prioritisation, diversity, and elitism) had been cited for a while. It’s also true that even before FTX (though it still holds for 2022), people who had been in the community longer tended to be less satisfied with the community, even though higher engagement was associated with higher satisfaction.[1] While the implications of this for the average satisfaction level of the community depend on how many newer vs. older EAs we have at a given time, this is compatible with a story where EAs generally become less satisfied with the community over time.
Note that this is the opposite direction to what you’d see if less satisfied people drop out, leaving more satisfied people remaining in earlier cohorts. That said, the linked analyses (for individual years) can’t rule out the possibility that earlier cohorts have just always been distinctively less satisfied, which would require a comparison across years.
David—this is a helpful and reasonable comment.
I suspect that many EAs tactically and temporarily suppressed their use of EA language after the FTX debacle, when they knew that EA had suffered a (hopefully transient) setback.
This may actually be quite analogous to the cyclical patterns of outreach and enthusiasm that we see in crypto investing itself. The post-FTX 2022-2023 bear market in crypto was reflected in a lot of ‘crypto influencers’ just not talking very much about crypto for a year or two, when investor sentiment was very low. Then, as the price action picked up from the second half of 2023 through now, optimism returned, and the Bitcoin ETFs got approved by the SEC, people started talking about crypto again. So it has gone with every four-year cycle in crypto.
The thing to note here is that in the dark depths of the ‘crypto winter’ (esp. early 2023), it seemed like confidence and optimism might never return. (Which is, of course, why token prices were so low). But, things did improve, as the short-term sting of the FTX scandal faded.
So, hopefully, things may go with EA itself, as we emerge from this low point in our collective sentiment.
Excellent points, everything you write here makes a lot of sense to me. I really hope you’re able to find funding for the proposal to research the EA brand relative to other alternatives. That seems like a really fundamental issue to understand, and your proposed study could provide a lot of valuable information for a very modest price.