I think two other plausible ways include large-scale global catastrophic risks (which are not necessarily existential[1]) and government persecution for actual or perceived wrongdoing (which is correlated with bad press but not the same thing).
Also, I’d be interested in separating out infighting from “reputation failures.” While some of the causal pathways from infighting to breakage run mainly through PR/media stuff, some of it could look more like an implosion (confusing to outsiders), akin to what’s happening within many leftist nonprofits.
I agree large-scale catastrophic failures are an important consideration. Originally I thought all global catastrophic risks would be downstream of some reservation failure (i.e., EA didn’t do enough), but now I think this categorization unrealistically overestimates EA’s current capabilities (i.e., global catastrophic risks might occur despite the realistic ideal EA movement’s best efforts).
In some sense I think large-scale catastrophic risks aren’t super action-guiding, because we fall victim to them despite our best efforts, which is why I didn’t include them. But now I’ll counter my own point: large-scale catastrophic risks could be action-guiding in that they indicate the importance of thinking about things like EA coordination recovery post-catastrophe.
I’m now considering adding a fifth cluster of failure: uncaring universe failures — failures in which EA is crippled by something like a global catastrophic risk despite our best efforts. (I could also call them ruthless universe failures if I really care about my Rs.)
Another example of uncaring universe failures, given slow AGI timelines: either a) the West loses Great Power conflicts or b) gets outcompeted by other powers, AND c) EA does not manage to find much traction in other cultures.
(Note that you can also frame this as a diversity failure)
Which means I’d prefer it if EA survives even if I personally won’t.
This was very well put. These scenarios have always seemed to me to be the most likely ones to take down EA.