This is a very cool question I hoped to think about more. Here are the six I came up with (in a draft that I'm unlikely to finish for various reasons), but without further exploration of what they would look like:
1. Collapse. The size and quality of the group of people who identify as community members drop by more than 50%.
2. Splintering. Most people identify as '[cause area/faction] first, EA second or not at all'.
3. Plateau/stunted growth. Influence and quality stagnate (i.e. size and quality change by −50% to +100%).
4. Harmless flawed realization. EA becomes influential without making a decidedly positive impact.
5. Harmful flawed realization. EA becomes influential and has a significantly negative impact.
6. 'Extinction'. No one identifies as part of the EA community anymore.
I also asked Will MacAskill for "x-risks to EA", and he said:
- The brand or culture becomes regarded as toxic, and that severely hampers long-run growth. (Think: New Atheism.)
- A PR disaster, especially among some of the leadership. (Think: New Atheism and Elevatorgate.)
- Fizzle: it just ekes along but doesn't grow very much, loses momentum, and goes out of fashion.
Anyway, if you want to continue with this, you could pick yours (or a combination of risks with input from the community) and run a poll asking for people's probability estimates for each risk.