When I was most involved in in-person EA organizing/outreach/activism around 2015 to 2017, it seemed to me at the time that the focus of the movement was something roughly like this:
Global poverty: 80%
Animal welfare: 15%
AI and existential risk: 5%
Now, organizations like 80,000 Hours and individuals like Will MacAskill are saying they're pivoting to focus exclusively on AGI. Judging from what I see online, the focus of the movement currently seems to be something like:
AGI: 95%
Global poverty and animal welfare: 5%
It seems like the movement has really changed. It started as a movement about charity effectiveness in the cause area of global poverty, and now it's a movement or community devoted to talking about AGI. This is a big change, and it has probably alienated a lot of people who formerly felt an affinity for the EA movement, as well as put off people who would have been interested in what the EA movement used to be but who aren't on board with the current movement's beliefs about AGI.
I recently wrote about why I don't believe in the predictions of AGI within 5 years here. For me, at this point, the EA movement has almost completely killed off its credibility. I don't think there is any way to undo what has been done. The "effective altruism" label is now owned by people who think AGI is coming soon, and as the years tick on and it becomes increasingly clear that AGI is not coming soon, the term "effective altruism" will be seen by more mainstream parts of the world as even more fringe, weird, and dubious than it is today.
I'm not sure what people who don't believe in near-term AGI and who want to focus on global poverty and/or animal welfare should do. Maybe there would be value in creating some kind of spin-off term: one that creates a clear distinction between "effective altruism, the movement about AGI" and "a movement that focuses on charity effectiveness in the cause area(s) of global poverty and/or animal welfare". The benefit of coining and popularizing such a term would be to draw a clear distinction between what EA used to be ten years ago and what EA is now.
Using such a term wouldn't necessarily require disavowing and distancing yourself or your organization from the EA movement or from self-identified EA organizations. Maybe some people would want to do that (I don't know), but the primary purpose would just be to clearly differentiate your beliefs and your focus from the people who believe in a relatively imminent Singularity and who treat that as the most important thing in the world to focus on right now.