A Brief Argument for Rapid Movement Growth
I believe that effective altruism has been growing more slowly than is ideal and has the capacity to grow faster. Faster growth has an extremely positive best-case scenario, and a worst-case scenario that is both highly unlikely and no worse than the status quo. Finally, rapid growth is highly impactful, highly tractable, and highly neglected, even compared to other movement-building work, and I believe it should be a focus of EA movement-building. For now, I would like to open a possible series on the subject by presenting a simple thought experiment that conveys my intuition in favor of rapid growth. Later, I plan to address potential criticisms at length.
A strong presumption in favor of growth
Let’s imagine that the EA movement as a whole went for mass appeal back when it started, and continued that through to the present day. Newspaper columns, online ads, clubs at every major university in the world, etc. And as a result, EA has a hundred times as many people in it as it does now.
Also, for the sake of argument, let’s imagine that the community as a whole screwed up massively in doing so. 99% of EAs, in this world, misunderstand at least one core EA principle. Many of them just wanted to get high-paying tech jobs, and thought EA would help them with networking. Basically everyone thinks that earning to give is the be-all and end-all of impact, and most of them think that just means “donating money.”
And suppose that you are now the head of a major EA organization, tasked with solving this problem. What solutions would you go for?
Probably there are a lot of good answers. You could open more intro fellowships to teach core EA principles to new people. You could write more resources intended for new EAs to learn about different cause areas. You could adopt new procedures internally to ensure proper organization even in the face of many new hires who don’t yet know the ropes.
At no stage in this process of proposing solutions would you suggest kicking out the 99% of EAs who misunderstand core EA principles.
This makes it seem that status quo bias is a large factor in current decisions against rapid growth strategies, suggesting that rapid growth strategies are neglected compared to other movement-building approaches. In later posts, I plan to address the most common criticisms of rapid growth strategies, and show that they are not significant enough to override the intuitive presumption in favor of rapid growth.
Of course there are some theoretical reasons for growing fast. But theory only gets you so far on this issue. Rather, this question depends on whether growing EA is currently promising (I lean against) compared to other projects one could grow. Even if EA looks like the right thing to build, you need to talk to people who have seen EA grow and contract at various rates over the last 15 years, to understand which modes of growth have been healthier and have contributed to gained capacity, rather than just an increase in raw numbers. In my experience, one of the least healthy phases of EA was when there was the heaviest emphasis on growth, perhaps around 1.5–4 years ago, whereas it seemed to do better pretty much all of the other times.
Your post seems serious, but it was posted on April 1st, so just double-checking.
It is serious, and in my time zone, it wasn’t April 1.
Appreciate the post. I agree that EA could probably be bigger, and this post made me update toward being more likely to act on that. It also got me thinking about weird possibilities like “missing generations” of EAs (produced by swings in community-building effort) making the movement less likely to carry over into the future.
To push back on your hypothetical conclusion with a counterexample I thought of: I have a vague impression that even the 1% of people who understood the principles would have their impact somehow diminished (imagine an EAG where only 1 in 100 attendees are highly engaged; it seems harder to trust people and find the right ones). Or the majority of EAs might make things worse for the better ones through some spillover effects. One could look for proxy/adjacent groups where we do expect 1% to be actually impactful…
Thanks for posting :)
Thanks :3
This is my first post, and I’m so happy to see that you appreciate it. I’ll try to address the “dilution” concern in more depth in my later posts.