CEA grew a lot in the past year

For CEA’s Q3 update, we’re sharing multiple posts on different aspects of our work.

Over the past year, we’ve doubled down on the strategy we set out last year. The key metrics we were targeting increased significantly (often more than doubling), and we made many strong hires to nearly double our headcount.

So, unless you’ve been paying a lot of attention, CEA is probably somewhat different from what you think.[1]

Our strategy

We think that humanity will have a better chance of surviving this century, and of sharply reducing present suffering, if there are many more highly-engaged EAs (“HEAs”). By this, we mean people who are motivated in part by an impartial care for others[2], who are thinking very carefully about how they can best help others, and who are taking some significant actions to help (most likely through their careers).[3]

In the recent past, people we’d consider “highly engaged” have done a lot to improve human lives, reduce the suffering of animals, develop our understanding of risks from emerging technologies, and build up the effective altruism community.

To increase the number of HEAs working on important problems, we are nurturing discussion spaces: places where people can come together to discuss how to effectively help others, and where they can motivate, support, and coordinate with each other.

In particular, we do this via university groups and conferences, both of which have a strong track record of getting people deeply interested in EA ideas, and then helping them find ways to pursue impactful work (as evidenced, for instance, by OpenPhil’s recent survey).

Recent progress

Some highlights:

  • For front-facing programs, the key metrics we focus on have (more than) doubled in the last 12-24 months. For instance:

    • The events team is on track to facilitate roughly twice as many new connections as they did in 2019. We hope this means that many more people are getting mentorship, advice, and career opportunities.

    • We had as many calls with group leaders in the last three months as we did in the whole of last year.

    • More generally, it seems that the activities of EA groups are growing rapidly (by perhaps as much as 400% in some areas), and our support (via retreats, funding, etc.) is contributing to this.

    • The number of hours people spent logged in to the EA Forum doubled in the last year. Many more people are regularly engaging with some of the community’s best new ideas and resources.

  • We introduced and grew some new products and programs to complement our existing ones:

    • Virtual Programs, which have helped over 1000 people learn more about EA ideas in the past year. This has helped to seed new groups, get people more involved in EA, and ensure that high-fidelity versions of EA ideas are shared worldwide.

      • The latest EA Handbook, a set of readings based on the curriculum for our introductory virtual program, which has helped hundreds of additional readers work through similar content at their own pace.

    • A new groups/events platform, which we hope will make it much easier for people to transition from online engagement to in-person engagement.

We think this type of progress is critical, because it means that more people are being exposed to, and then engaging deeply with, the ideas of effective altruism. We are still assessing how well this progress translated into more people taking action to help others over the last year, but given previous data, we expect to see a strong connection between these figures and the number of people who go on to work on important problems.

As for CEA’s internal progress:

  • Our team size nearly doubled, and we were especially pleased with the hires we made this year.

  • More subjectively, when we held our team retreat in September, I felt that:

    • There was a lot more clarity about what we’re doing, and how everyone’s work fits together (in the past, CEA sometimes struggled with a lack of strategic clarity).

    • I was more excited about the current team than about any previous CEA team I’ve been a part of, due to a combination of the people and the culture. (Though I’m also excited to see whether we can make further improvements here.)

Mistakes and reflections

I think the most important specific mistakes we made during this period were:

  • The Meta Coordination Forum (a retreat for leaders in the EA meta space) was less valuable than it could have been due to a variety of mistakes we made. We plan to make major changes to address these issues. (More details in the events post.)

  • For at least one hiring round (but not all rounds), I think we should have communicated more promptly with applicants and given them more detailed feedback. Assistants are now supporting hiring managers with emails, and we have updated towards giving more substantive feedback to applicants who make it far in our process.

I also plan to spend part of the next few months reflecting on questions like:

  • Should we be more ambitious, and aim to move more quickly than we currently are?

  • What can we do to make sure we’re not displacing even better community building efforts? And if others do begin to offer similar services, what are the best ways to collaborate effectively with them?

If you are interested in helping us, let me know: finding the right people to hire will help us move forward on many of these improvements, and we’re always keen to diversify our funding base.


  1. This probably applies to most organizations you’re not tracking closely, but I think the scale of change is maybe greater with CEA.

  2. Without regard to factors like someone’s nationality, birthdate, or species, except insofar as those things might actually be morally relevant.

  3. For each of these attributes, we set quite a high bar. And when we evaluate whether we’d think of someone as “highly engaged”, we either interview them or look for other strong evidence (such as their having been hired by an organization with high standards and a strong connection to the EA movement).