Hey @AnonymousEAForumAccount, I’m sorry for not responding to this earlier, and thank you as always for your thoughtful engagement with our strategy; I genuinely appreciate it. As context, I work closely with Jessica on coordinating the growth pillar within CEA.
Going through your comments line by line:
Prioritizing high-value community assets.
As Toby and Sarah have mentioned, I’m really excited that we’re prioritizing work to improve the quality of these programs and expand their reach! I won’t say more since I think my colleagues have covered it.
Creation of good, public growth dashboards.
Thanks for your bids here — responding by category:
Re: the existing CEA dashboard:
I’m glad it has been a valuable product, and I apologize that it has not always been kept consistently up to date. We’ve been unusually short-staffed on the technical capacity needed to maintain this data in the last few months (in part because I’ve moved into a more generalist role), but we are working on finding it a consistent owner internally.
I’m also excited about the value of this dashboard for helping the community track growth in CEA’s products!
Re: a public dashboard for EA growth as a whole:
I agree that a well-maintained and easily interpretable dashboard of EA-relevant growth metrics would be a major win. I wouldn’t rule out prioritizing a project like this, but right now we are prioritizing doing the foundational investigation work ourselves.
From past experience running similar projects, I expect this would be a major time investment, both to keep the data fresh and to coordinate with many external stakeholders. If we report core growth metrics for many orgs (especially if that includes metrics that weren’t previously made public, which is IMO where the main value-add would be), I think we want to do so responsibly and accurately, which takes time!
This is all to say I’d want to think hard about the capacity tradeoffs on our side, and am not immediately convinced it is worth prioritizing, but I’d be excited to revisit this down the line.
Thoughtful reflection on growth measurement.
To take a step back, I think we’d broadly agree that much less effort historically has been put into investigating the question of “How much is EA growing and in what ways?” than we both would like. This is still a very shallow research area relative to where I’d like the EA community to be, and while I think we have made important progress in the last few years, I’d be interested in more work here.
In terms of the specific analysis you point to, we’ve stopped relying on this exact methodology internally so haven’t prioritized following up on it, although if someone wanted to try grading our line-by-line predictions based on e.g. our dashboard + public information (some linked from the post), I’d be pretty excited about that.
I have some quibbles about how “obviously off” the analysis is in retrospect (my confidence intervals around the top-line numbers were pretty wide, and the analysis was importantly not just tracking growth in principles-first EA community building projects, which I think changes its interpretation), but I won’t dive deep into these for the sake of time.
Transparency about growth strategy and targets.
Thanks for prompting us on this! For transparency, our top priority right now remains making sure we endorse, and are able to reach, our growth targets, and I expect this will take up the majority of our growth-specific attention in Q2-Q3. I think that’s appropriate for solidifying our work internally, and I’m excited for us to share more in due course.
I was extremely surprised to see the claim in the OP that “Growth has long been at the core of our mission.”
I wonder if we are talking past each other here (I’m surprised at your surprise!), although perhaps this wording could also have been clearer. As a community building org, a major way I think CEA has become more successful over time is by building up our programs. For instance, I think of the growth in our EAG and EAGx portfolio from pre- to post-pandemic times, and the scaling of our Ongoing Support Program for university group organisers, as two emblematic examples of programs finding their product-market-impact fit and then scaling up to achieve more impact over time.
I think what’s new here is that after a period of being focused on building foundations internally (in part to prepare for growth), we are now moving back towards a more unified, growth-focused strategy across CEA.
Thanks Angelina for your engagement and your thoughtful response, and sorry for my slow reply!
Re: dashboards, I’m very sympathetic to the difficulties of collecting metrics from across numerous organizations. It would be great to see what we can learn from that broader data set, but if that is too difficult to realistically keep up to date then the broader dashboard shouldn’t be the goal. The existing CEA dashboard has enough information to build a “good enough” growth dashboard that could easily be updated and would be a vast upgrade to EA’s understanding of its growth.
But for that to happen, the dashboard would need to transition from a collection of charts showing metrics for different program areas to a dashboard that actually measures growth rates in those metrics and program areas over different time frames, shows how those growth rates have evolved, aggregates and compares them across metrics and time frames, and summarizes the results. (IMO you could even drop some of the less important metrics from the current dashboard. Ideally you would also add important and easily/consistently available metrics like Google search activity for EA and Wikipedia page views.)
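To make that concrete, here’s a rough sketch of the kind of growth-rate summary I mean. This is not CEA’s dashboard code; the metric names and numbers are entirely made up, and the point is just the shape of the output (growth rates per metric over multiple time frames, plus an aggregate headline figure) rather than raw metric charts:

```python
# A minimal sketch of the kind of growth summary described above.
# Metric names and numbers are hypothetical, for illustration only.
import statistics
from typing import Dict, List

# Hypothetical quarterly series for a few dashboard metrics (oldest -> newest).
metrics: Dict[str, List[float]] = {
    "EAG applications":      [1200, 1300, 1250, 1400, 1500],
    "Intro course signups":  [800, 850, 900, 950, 1000],
    "Forum monthly actives": [4000, 4100, 4050, 4200, 4300],
}

def growth_rate(series: List[float], periods_back: int) -> float:
    """Percent change between the latest value and the value `periods_back` quarters earlier."""
    old, new = series[-1 - periods_back], series[-1]
    return (new - old) / old * 100

# Per-metric growth over two time frames, then one aggregate headline number.
for name, series in metrics.items():
    qoq = growth_rate(series, 1)  # vs. previous quarter
    yoy = growth_rate(series, 4)  # vs. same quarter a year earlier
    print(f"{name:24s} QoQ: {qoq:+6.1f}%   YoY: {yoy:+6.1f}%")

median_yoy = statistics.median(growth_rate(s, 4) for s in metrics.values())
print(f"Median YoY growth across metrics: {median_yoy:+.1f}%")
```

Even something this simple, refreshed regularly from the data already behind the existing dashboard, would tell readers far more about growth than the current per-program charts do.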
Re: transparency around growth targets, let me explain why “I was extremely surprised to see the claim in the OP that ‘Growth has long been at the core of our mission.’” In my experience, organizations that have growth at the core of their mission won’t shut up about growth. It’s the headline of their communications, not in a vague sense but in a well-defined and quantified sense (e.g. “last quarter our primary metric, defined in such and such a way, grew at x%”). There’s an emphasis on understanding the specific drivers of, and bottlenecks to, growth.
In contrast, the community has been expressing confusion at CEA’s unwillingness to measure growth for nearly a decade. We’ve seen remarkably little communication from CEA about how fast it believes the community is growing or how it even thinks about measuring that. Your post estimating growth rates is an exception, but even that was framed as a “first stab”, it left important methodological questions unresolved, and it has since been abandoned. If growth is so important to CEA, why don’t we know what CEA thinks EA’s growth rate has been over the last several years? And if, as Zach says in the OP, growth has been “deprioritized” post-FTX and “during 2024, we explicitly deprioritized trying to grow the EA community”, why weren’t these decisions clearly communicated at the time?
CEA will at times mention that a specific program area or two has experienced rapid growth, but those mentions typically occur in a vacuum, without any context about how fast other programs are growing (which can make it seem like cherry-picking). When CEA has talked about its high-level strategy, I haven’t drawn the conclusion that growth was “at the core of the mission”; the focus has been more on things like “creating and sustaining high-quality discussion spaces.” And the strategy has often seemed to place more emphasis on targeting particularly high-leverage groups (e.g. elite universities) than on approaches that are more scalable (e.g. targeting universities that are both good and big, prioritizing virtual programs, etc.). In my view, CEA has focused much more on throttling community growth back to levels it views as healthy than on growing itself or raising the capacity to grow faster in a healthy way. Maybe that was a good decision, but I see it very differently from placing growth at the core of the mission.
Re: the intersection of community assets and transparency around growth strategy: since I have your ear, I want to point out a problem that I really hope you’ll address.
On its “mistakes” page, CEA acknowledges that “At times, we’ve carried out projects that we presented as broadly EA that in fact overrepresented some views or cause areas that CEA favored. We should have either worked harder to make these projects genuinely representative, or have communicated that they were not representative”. The page goes on to list examples of this mistake that span a decade.
Right now, under “who runs this website”, the effectivealtruism.org site simply mentions CEA and links to CEA’s website. If someone looks at the “mission” (previously “strategy”) page on CEA’s site, in the “how we think about moderation” section one learns that “When representing this diverse community we think that we have a duty to be thoughtful about how we approach moderation and content curation… We think that we can do this without taking an organizational stance on which cause or strategy is most effective.”
It is only if one then clicks through to a more detailed post about moderation and curation that one learns that “Of the cause-area-specific materials, roughly 50% focuses on existential risk reduction (especially AI risk and pandemic risk), 15% on animal welfare, and 20% on global development, and 15% on other causes (including broader longtermism).”
Yet even that more detailed page does not explain that the top “Factors that shape CEA’s cause prioritization… (and, for example, why AI safety currently receives more attention than other specific causes)” are: “the opinions of CEA staff”, “our funders” (“The reality is that the majority of our funding comes from Open Philanthropy’s Global Catastrophic Risks Capacity Building Team, which focuses primarily on risks from emerging technologies”), and “The views of people who have thought a lot about cause prioritization”, and that the views of the EA community are not included in these factors. This information can only be found in a forum post Zach wrote, which is not linked from anywhere on CEA’s website. So someone coming from effectivealtruism.org would have no way to find it.
I hope that part of prioritizing community assets like effectivealtruism.org will include transparency around how and why the content those assets use is created. The status quo looks to me like a continuation of the mistakes of the past.
Just noting that I’ve seen and haven’t forgotten about this, thank you! The CEA team is currently at a retreat, so I’ll be slower to respond. Let me know if there are a few particular points you’d be most interested in talking through (I might be able to prioritize that faster).
Thanks for this! No rush on my end (as you can probably tell from my cadence), but FWIW the points I’m most interested in are: 1) whether a public growth dashboard (even if it mostly or entirely focuses on CEA data) is planned, and 2) CEA’s communication around how it handles content/cause curation for community resources like effectivealtruism.org.
Enjoy the retreat!