EA missed:
EA community building.
It might seem odd to say that EA missed EA community building – but even until 2016/17 there was no or minimal support or funding for community building. That is about 4-5 years from EA being a thing to EA community building being widely accepted as valuable. When I talked to people, such as senior CEA staff, about it back at EAG Oxford in 2015, the key question seemed to be: should EA risk reaching out to any significant number of people, or just build a very narrow, small community of super talented people? To get an EA coordinator in London in 2016 we had to raise lots of small donations from individuals in the London EA community. It is now fairly broadly accepted that funding local and especially university outreach is valuable.
Policy careers / longtermist policy work.
I might be wrong, but it was 2016/17 before 80,000 Hours began recommending policy careers as an option that was neither earning to give nor direct work. And there was generally no funding for anyone trying to have an impact in the longtermist policy space (except for a one-off experiment by OpenPhil to help found CSET) until maybe 2020 with the Survival and Flourishing Fund (SFF). Projects such as the APPG for Future Generations, CLTR, the Simon Institute, and various individual EAs in think tanks have all found that funders would not or could not evaluate them, or saw them as too risky. If you talked to most folk at EA orgs about this, they would say: oh, I don’t know about policy, it sounds risky. I think this attitude is currently changing, with the SFF and FTX Future Fund maybe being more inclusive.
A key feature that makes both of these hard to evaluate as “missed” is that the ideas were not in any way unknown to the EA community; rather, the community was averse to taking action because it was risk-averse and saw these projects as too risky. This may or may not have been the right decision. Given that the EA community has not collapsed into infighting or been rocked by scandal (as many similar communities have been), I am tempted to say that EA orgs’ aversion to risk has been justified.
The lesson I would like EA folk to take away from this is that if we don’t want to miss things going forward (and I think we are currently missing many high-impact things), then we should have a much better understanding of how projects can pose risks, what those risks look like, how they can be managed, and so on.
Personal take:
I strongly(?) agree with the high-level texture of both of these points. The first point seems especially egregious ex post. Though I wouldn’t frame the timing quite the same way: 2016(?)-2019(?) feels more dead re: community building than either before or after.
For a while, a) many EA orgs didn’t believe in scale, and b) entrepreneurship was underemphasized in EA advice (so new orgs weren’t created as often as they could have been), which didn’t help.
I feel like most of the years I’ve been in EA have been in “keep EA small” mode, with “don’t do irreversible growth” memes. I’d be interested in whether people who strongly believed this before think that a) reality has changed, b) their beliefs have changed, or c) the current wave of growth is ill-advised.