That’s an interesting point. There’s a lot of thinking about how we judge the output of experts in other fields (and I’m not an expert in that), but I’ll give you my thoughts. In short, I’m not sure you can engage with all the arguments on the object level. Couple of reasons:
(1) There are lots of people who know more about X than I do. If they are trying to fool me about X, they can; and if they are honestly wrong about X then I’ve got no chance. If some quantum physicist explains how setting up a quantum computer could trigger a chain reaction that could end human life, I’ve got no chance of delving into the details of quantum theory to disprove that. I’ve got to go with … not just vibes, exactly, but a kind of human approach to the numbers of people who believe things on both sides of the argument, how plausible they are and so on. That’s the way I deal with Flat Earth, Creationism and Global Warming arguments: there are guys out there who know much more than me, but I just don’t bother looking at their arguments.
(2) People love catastrophes and apocalypses! Those guys who keep moving the doomsday clock so that we are 2 seconds to midnight or whatever; the guys who thought the Cold War was bound to end in a nuclear holocaust; all the sects who have thought the world is going to end and gathered together to await the Rapture or the aliens or whatever—there are just too many examples of prophets predicting disaster. So I think it’s fair to discount anyone who says the End is Nigh. On the other hand, the civilisation we have behind us has got us to this state, which is not perfect, but involves billions of decently-fed people living long-ish lives, mostly in peace. There’s a risk (a much less exciting one, which people accordingly don’t get worked up about) that if you make radical changes to that then you’ll make things much worse.
Thank you for your comments.
I wouldn’t say that I believe engineered pandemics or AI misalignment or whatever are implausible. It’s simply that I think I’ll get a better handle on whether they are real threats by seeing if there’s a consensus view among respected experts that these things are dangerous than if I try to dive into the details myself. Nuclear weapons are a good example because everyone did agree that they were dangerous and even during the Cold War the superpowers co-operated to try to reduce the risks (hotline, arms treaties), albeit after a shaky start, as you say.
I also agree with you that there is no prohibition on considering really bad but unlikely outcomes. In fact, I think this is one of the good things EA has done – to encourage us to look seriously at the difference between very very bad threats and disastrous, civilisation-destroying threats. The sort of thing I have in mind is: “let’s leave some coal in the ground in case we need to re-do the Industrial Revolution”. Also, things like seed banks. These kinds of ‘insurance policies’ seem like really sensible – and also really conservative – things to think about. That’s the kind of ‘expect the best, prepare for the worst’ conservatism that I fully endorse. Just like I recommend you get life insurance if your family depend on your income, although I have no reason to think you won’t live to a ripe old age. Whatever the chances of an asteroid strike or nuclear war or an engineered pandemic are, I fully support having some defences against them and/or building capacity to come back afterwards.
I suppose I’d put it this way: I’m a fan of looking out for asteroids, thinking about how they could be deflected and preparing a spacecraft that can shoot them down. But I wouldn’t suggest we all move underground right now – and abandon our current civilisation – just to reduce the risk. I’m exaggerating for effect, but I hope you see my point.