What could principles-first EA look like?

Zachary Robinson recently stated that CEA would emphasize a principles-first approach to EA. Here are my thoughts on the kinds of strategic decisions that naturally synergize with this high-level strategy:
Growth strategy: Less focus on fast growth, more focus on attracting value-aligned talent:
Eternal September effects make it hard to both grow fast and maintain high-fidelity transmission of EA principles.
Recruiting from audiences capable of engaging in nuanced discussion of what these principles imply matters more than maximising reach
Local EA groups: More emphasis on making events attractive for long-term members to attend vs. recruiting new members:
Greater focus on advertising events in ways that bring repeat attendees vs. maximising the throughput of newcomers
Community investment: If the aim is to build a relatively small, high-talent community instead of a mass movement, then it makes sense to shift some amount of resources from outreach to improving the effectiveness of the community.
More emphasis on epistemics improves our ability to pick the right goals and achieve those goals intelligently (rather than throwing people at the problem)
Upskilling programs such as the Introductory/Advanced Fellowship or the Precipice reading group are helpful here
It may also make sense to start some new programs or orgs focused on topics like epistemics, leadership training or conflict resolution (I like that there's an EA mental health organisation, whose name escapes me, that is running training at scale)
Mentorship programs may also help with improving the effectiveness of individuals
The community can be more effective with fewer people if we make progress on long-standing community issues such as the lack of a low-cost EA hub or the limited support for people trying to establish themselves in major hubs like San Francisco or London:
It also makes more sense for the community to fix its own issues now that EA is less on the frontlines of AI safety/x-risk (please comment if you'd like me to explain this in more detail)
Question: How can the EA community recursively self-improve?
Weirdness: A principles-first approach suggests that the community should be more tolerant of weirdness than if we were pursuing a fast-growth strategy:
It also suggests leaning, on the margin, toward being a community rather than a professional group, given that professional groups face strong pressure to make themselves seem more respectable.
It also suggests that avoiding jargon to maximise accessibility is less of a priority
Forum debates: Running debates like this to move the community’s understanding of different cause areas forward becomes more important for the principles-first approach
Additional comments:
Some of these proposals make more sense in light of the rise of cause-specific groups (many groups now focus exclusively on AI safety, Effective Animal Advocates have their own conference, Giving What We Can is doing its own movement-building for those focused on donations, particularly donations to global poverty):
If a particular cause area wants a higher rate of growth, then cause-specific groups can pursue this objective.
Similarly, cause-specific groups can choose to be more professional or more focused on developing respectability.
A lower-growth strategy makes more sense given the pummelling EA has taken in the public relations realm:
Growth would be more challenging these days
Attempting to grow really fast is more likely to spark backlash now
Recruiting top-notch people and developing the knowledge and skills of community members will improve the impression outsiders form of EA
A lower-growth strategy makes more sense given the reduction in available EA funding:
When there was more funding available, it made more sense to bring in lots of people so that we could rapidly proliferate projects and orgs
We also had more funding to support people who joined, so the marginal benefit from adding people was greater
There are many people in EA who either don't have the skills to work directly on high-priority areas or wouldn't enjoy a career in these areas. Some of these people want to do things directly rather than just earning to give:
A greater focus on improving the community would mean that there would be more things for these folks to do.