Project on organizational reforms in EA: summary

Earlier in 2023, I (Julia) put together a project to look at possible reforms in EA. The main people on this were me, from the community health team at CEA; Ozzie Gooen of the Quantified Uncertainty Research Institute; and Sam Donald of Open Philanthropy. About a dozen other people from across the community gave input on the project.

Work this project has carried out

Information-gathering

  • Julia interviewed ~20 people based in 6 countries about their views on where EA reforms would be most useful.

    • Interviewees included people with experience on boards inside and outside EA, some current and former leaders of EA organizations, and people with expertise in specific areas like whistleblowing systems.

  • Julia read and cataloged nearly all the posts and comments about reform on the EA Forum from the past year, plus some of the main ones from the year before that.

  • Separately, Sam collated a longlist of reform ideas from the EA Forum, as part of Open Philanthropy’s look at this area.

  • We gathered about a dozen people interested in different areas of reform into a Slack workspace and shared some ideas and documents there for discussion.

An overview of possible areas of reform

  • Here’s our list of further possible reform projects. We took on a few of these, but most are larger than the scope of this project.

  • We’re sharing this list in case it’s useful for future projects. However, there isn’t a consensus that all of these ideas should be pursued.

Advice / resources produced during this project

Projects and programs we’d like to see

We think these projects are promising, but they’re sizable or ongoing projects that we don’t have the capacity to carry out. If you’re interested in working on or funding any of these, let’s talk!

  • More investigation capacity, to look at organizations or individuals where something shady might be happening.

  • More capacity for risk management across EA broadly, rather than each org handling it separately.

  • Better HR / staff policy resources for organizations, e.g. referrals to services like HR and legal advising that “get” concepts like tradeoffs.

  • A comprehensive investigation into FTX<>EA connections / problems; as far as we know, no one is currently doing this.

    • EV’s investigation has a defined scope that won’t cover all the things EAs want to know, and its results won’t necessarily be published.

Context on this project

This project was one relatively small piece of work to help reform EA, and there’s a lot more work we’d be interested to see. It ended up being roughly two person-months of work, mostly from Julia.

The project came out of a period when there was a lot of energy around possible changes to EA in the aftermath of the FTX crisis. Some of the ideas we considered were focused on that situation, but many concerned other areas where the functioning of EA organizations or the EA ecosystem could be improved.

After looking at a lot of ideas for reform, we didn’t find many recommendations or projects that seemed like clear wins; often some thoughtful people considered a project promising while others thought it could be net negative. Other changes (such as having a wider range of aligned funders) seemed more clearly beneficial but less tractable.

At first this project had an overly grand name (“EA reform taskforce”) that may have given the impression it was more official or comprehensive than it really was; we now view that name as a mistake. We hope we didn’t crowd out other work here, as we certainly haven’t covered it all. We did talk with some other people interested in pursuing their own work on reforms in parallel.

We’re happy to be in touch if you’re considering work in a related area and want to compare notes or talk through lessons learned from our project.