Thanks. A few more quick thoughts! I actually don’t think that the community failed here. I am pretty sure that all communities have bad actors, and EA probably has fewer of them than many communities that are doing quite well. We hold ourselves to a very high standard.
With that said, I agree that we should think more about what we want the community to become and how to get it there, and would love to hear some visions. Here is a quick attempt at making some predictions for how things go in the next 10-20 years (if we are still around):
- EA becomes an established academic discipline and more mainstream, something like what I think happened with psychology, economics and various other new disciplines/intellectual innovations.
- Major conceptual innovations, like using the ITN framework and expected value in philanthropic settings, go mainstream and become widely used in relevant settings (a toy sketch of both ideas follows this list).
- The overall EA identity weakens, and EA becomes one of many relatively uninteresting topics that people are interested in rather than a cool new identity to have.
- People start forming identities and groups focused on current and new cause areas rather than EA.
- Cause areas/EA-related actions (e.g., AI safety or earning to give) become like academic/professional communities with their own conferences and subcultures. We start seeing some quite well-managed and formalised community building by specific organisations as they scale up.
- We never properly manage things at the EA community level, due to our decentralised nature and the impossible-to-match increase in the scale and complexity of all the causes and their sub-communities. We continue to have informal groups etc., but it never gets much more formalised or larger in scale than it is now.
- EAGs continue. They function like academic conferences and start being funded by large cause area orgs to find hires and to let people in their areas meet and collaborate, etc.
- The EA Forum becomes much less used, having achieved its goal of facilitating and incubating many new ideas, concepts and connections that have now spun off into their own more specific areas of focus or gone mainstream. Again, it’s like a psychology forum at this point, and most people are more interested in something more specific, so they use more specific forums.
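For anyone who hasn’t run into the two frameworks named above, here is a minimal sketch of what “expected value” and ITN scoring look like in practice. To be clear, every number and cause name below is invented purely for illustration; this is not a claim about any real cause.

```python
# Toy illustration only; all numbers and cause names are made up.

# Expected value of a donation: sum over possible outcomes of
# P(outcome) * value(outcome).
outcomes = [
    (0.70, 0),       # 70% chance the intervention achieves nothing
    (0.25, 1_000),   # 25% chance of a moderate benefit
    (0.05, 20_000),  # 5% chance of a large benefit
]
expected_value = sum(p * v for p, v in outcomes)
print(f"expected value per donation: {expected_value}")  # 1250.0

# ITN-style prioritisation: rank causes by the product of importance,
# tractability and neglectedness (arbitrary 1-10 scales here).
causes = {
    "cause_a": (9, 3, 8),  # important, hard, neglected
    "cause_b": (6, 7, 4),  # moderate on all three dimensions
}
scores = {name: i * t * n for name, (i, t, n) in causes.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: ITN score = {score}")
```

The point of the prediction above is just that this style of explicit, quantified comparison stops being an EA-specific habit and becomes a standard tool in philanthropic settings generally.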
That’s all pretty rushed because I need to go to sleep now (actually 30 minutes ago), but it was interesting to attempt, and I would like to hear thoughts or other perspectives.
Interesting to see that people disagree with me. I’d be curious to hear why, if anyone wants to share.
If I had to guess, I’d point to the long bulleted list of specific predictions about the future as a risk factor for someone registering disagreement.
I think that my earlier attempt to discuss this mostly matches what you’re saying.
A key thing that changed is that I no longer think we should try to “manage things at the EA community level”—and if we’re not attempting that, we should reconceptualize what it means to be good community members and leaders, and what failure modes we should anticipate and address.
The other thing I want is more ambitious: ideally, in 20+ years I want the idea of prioritizing giving part of your income, viewing the future as at least some level of moral priority, and cause neutrality to all look the way women’s suffrage does now, so obviously correct and uncontroversial that it’s not a topic of discussion.