(1) I don’t think that the engineered pandemics argument is of the same type as the Flat Earther or Creationist arguments. And it’s not the kind of argument that requires a PhD in biochemistry to follow either. But I guess from your point of view there’s no reason to trust me on that? I’m not sure where to go from there.
I’ve got to go with … not just vibes, exactly, but a kind of human approach to the numbers of people who believe things on both sides of the argument, how plausible they are, and so on.
Maybe one question is: why do you think engineered pandemics are implausible?
(2) I agree that you should start from a position of skepticism when people say the End is Nigh. But I don’t think it should be a complete prohibition on considering those arguments.
And the fact that previous predictions have proven overblown is a pattern worth paying attention to (although as an aside: I think people were right to worry during the Cold War — we really did come close to a full nuclear exchange on more than one occasion! The fact that we got through it unscathed doesn’t mean they were wrong to worry. If somebody played Russian roulette and survived, you shouldn’t conclude “look, Russian roulette is completely safe”). Where I think the pattern of overblown predictions of doom risks breaking down is when you introduce dangerous new technologies. I don’t expect technology to remain roughly at current levels; I expect it to be very different in 25, 50, 100 years’ time. Previous centuries have been relatively stable because no comparably dangerous new technologies were invented (nuclear weapons aside). You can’t extrapolate that pattern into the future if the future contains, for example, easily available machines that can print Covid-19 but with 10x the transmissibility and a 50% mortality rate. Part of my brain wants to say “We will rise to the challenge! Some hero will emerge at the last moment and save the day”, but then I remember the universe runs on science, not movie plot lines.
I wouldn’t say that I believe engineered pandemics or AI misalignment or whatever are implausible. It’s simply that I think I’ll get a better handle on whether they are real threats by seeing whether there’s a consensus among respected experts that these things are dangerous than by trying to dive into the details myself. Nuclear weapons are a good example: everyone did agree that they were dangerous, and even during the Cold War the superpowers co-operated to try to reduce the risks (the hotline, arms treaties), albeit after a shaky start, as you say.
I also agree with you that there is no prohibition on considering really bad but unlikely outcomes. In fact, I think this is one of the good things EA has done – encouraging us to look seriously at the difference between very, very bad threats and disastrous, civilisation-destroying threats. The sort of thing I have in mind is: “let’s leave some coal in the ground in case we need to re-do the Industrial Revolution”. Also, things like seed banks. These kinds of ‘insurance policies’ seem like really sensible – and also really conservative – things to think about. That’s the kind of ‘expect the best, prepare for the worst’ conservatism that I fully endorse. It’s just like recommending you get life insurance if your family depends on your income, even though I have no reason to think you won’t live to a ripe old age. Whatever the chances of an asteroid strike or nuclear war or an engineered pandemic, I fully support having some defences against them and/or building the capacity to come back afterwards.
I suppose I’d put it this way: I’m a fan of looking out for asteroids, thinking about how they could be deflected, and preparing a spacecraft that can shoot them down. But I wouldn’t suggest we all move underground right now – and abandon our current civilisation – just to reduce the risk. I’m exaggerating for effect, but I hope you see my point.