This suggests that we might want to focus in particular on both short and fairly long timelines worlds [...]
I’ve recently started thinking of this as a playing-to-your-outs strategy, though without the small probabilities that phrase usually implies. One other factor in favor of believing that long timelines might happen, and that those worlds might be good ones to focus on, is that very recently it has begun to look possible to actually slow down AI. In those worlds, it’s presumably easier to pay an alignment tax, which makes those worlds more likely to survive.
I like this comment a whole bunch.