Thanks for this, Ollie. As you allude to in your post, I think the problem with this is that past collapses of empires and civilisations, or extinctions of species, are not an appropriate reference class for the technological risks from AI, bio and nuclear weapons. What that history shows is that the threats local civilisations faced prior to 1950 tended to cause relatively slow collapse. But the risks we seem to face this century are entirely novel: the destructive power of the tools at our disposal today is orders of magnitude greater than anything the Romans had. We should therefore expect collapse to happen much more quickly. For example, we just couldn't vaporise entire cities until 1945.
To add to this, we now know that most slow collapses probably wouldn't kill us all. We are much more capable of adapting and surviving as a species than the civilisations in the reference class. This leaves disasters that are too strategic or too quick for us to adapt to, and most of that class seems to be rapid.
Thanks, John! Helpful to see that you think my best guess about why I might be wrong might be right.