Many effective altruists prioritize reducing existential risks. The international security community also works to lower the odds of certain sorts of existential risks, such as nuclear war. But if we want to work together with them, we’ll need to understand their perspective, which may be quite different from that of outsiders. In this talk from Effective Altruism Global 2018: San Francisco, Seth Baum describes the international security perspective on nuclear weapons, and touches on how some of this may apply to risks from artificial intelligence, too.
Below is a transcript of Seth’s talk, which we have lightly edited for clarity. You can also watch it on YouTube and read it on effectivealtruism.org.
The Talk
First, a quick disclaimer: the views in this talk are mine alone and not necessarily those of the Global Catastrophic Risk Institute.
The work that I’ll be presenting today comes from a broader theme in my research, especially these three papers: essentially, for us to make productive change on the issues we care about, we need to really understand the perspectives and opportunities of the people who are positioned to take important actions on them. We can’t just think about what sorts of actions we would ideally like to see on these issues. We need to see what people could actually do in their positions.
In order to do this we need to understand their perspectives, how they think. They don’t necessarily think like we do. They don’t necessarily have the same beliefs or the same values, and even if they did, the institutional context they work in might not allow them to take certain types of actions. We really would do well to internalize that and adjust our own strategies accordingly.
A good place to start is in understanding how they think and talk about these sorts of ideas. So in international security, for example, specifically in intelligence analysis, they have a concept called “mirror imaging” or “mirroring”. Mirroring is the mistake that analysts often make when they assume that the people who they are analyzing think and act like they do. It’s essentially “If I were in that position, this is what I would do”. Well, you’re not in that position: it’s someone else, and in intelligence analysis it’s typically somebody from a different country, with a different cultural and political background and a different institutional context that they work in. It’s a mistake to think that they would do what you would do in that circumstance. Avoiding mirroring works well for the intelligence community, and it’s also an important message for us.
A pretty classic example of this in global catastrophic risk comes from the early 1980s and the early studies of nuclear winter by Carl Sagan and colleagues. They were really concerned about nuclear winter for essentially the same reasons that we might be concerned about global catastrophic risk and existential risk: a nuclear winter would be a very severe event, potentially including the loss of future generations. They were concerned about this and concluded that we should therefore reduce the size of nuclear arsenals down to a level that would be too small to cause nuclear winter.
They tried promoting this idea in international security policy circles and met with the response that no, actually, to the extent that nuclear winter was a concern, it was a concern that only reinforced our existing nuclear weapons policy; it didn’t suggest any changes. The policy was based on nuclear deterrence: the idea that nuclear war would be so catastrophic that the only sensible policy was to avoid war in the first place. Nuclear winter just makes nuclear war even more catastrophic than it already was, reinforcing the deterrence policy. This is really where the debate over nuclear winter ended. It was a stalemate; it didn’t get much farther and didn’t change the actual nuclear weapons policy that we had, largely because Sagan and colleagues never really reconciled their proposal with this reaction from the nuclear weapons policy community.
For a more recent example, a high-profile video on lethal autonomous weapons, titled “Slaughterbots,” was put out recently. You can agree or disagree with what the video itself was actually saying; I personally am somewhat skeptical. But it’s still the case that the video was fairly poorly received by a lot of important people in international security policy communities, and because of that it has become more difficult for the people behind the video to get their message out to these very important audiences.
Then finally, an example with my own work, because I am not above making this mistake myself. A few years back I published some research on a concept I called “winter-safe deterrence”. Now this was actually an attempt to reconcile concerns about the risk of nuclear winter with the belief in nuclear deterrence as a policy construct, and I said, “Okay, if we are going to reduce our nuclear arsenals down to a level that would not cause nuclear winter, are there other things we can do using other types of weapons to achieve a successful deterrence policy, so that we can have the best of both worlds?” For this, I looked at a number of different types of weapons, including biological weapons, and it was the attention to biological weapons in this research that prompted these responses that you see here. This is actually just a small portion of the very negative reaction that this part of the work generated.
I was not intending to do this. I was not trying to be contrarian or anything like that. This was a mistake. I really did not appreciate how strongly the international security community would come out against even the suggestion of a positive role for biological weapons.
I mean, how I thought about it myself, in ethical terms, thinking basically as a consequentialist: I would look at a lot of different types of weapons, where for small amounts of the weapon, carefully used in very select circumstances, there could be some benefits to society, and then you get more problems as the arsenal size increases.
Well, what I failed to realize is that a lot of people think about it more like this, where there is a very large benefit to not having any of the weapon in the first place. It might even be larger than what’s shown here, in that they think the weapon as a class simply should not exist. This is why we see bans of a number of different types of weapons: biological weapons, chemical weapons, land mines, and several others. Okay, some of that I disagree with. Part of the argument that is made for banning weapons is, in ethics terms, deontological.
They will argue that certain types of weapons are fundamentally immoral. Personally I disagree with that. If I’m going to be killed on the battlefield, I don’t actually care what type of weapon it is that kills me; I’m dead anyway. Likewise, if I’m going to be hurt, I don’t care which type of weapon hurts me; I care about the pain that I have to suffer. But at the same time, there are a lot of people who think deontologically about weapon types, and that’s something that I need to internalize and factor in.
But there’s more to it than that. Banning a class of weapons is just a simpler approach to policy. I mean, “ban biological weapons”—that’s three words. You could put it on a bumper sticker, versus something like “craft a carefully optimized small quantity of biological weapons to be used in specific circumstances.” That’s a lot more complicated, it’s harder to rally political support behind it, and in consequentialist terms that matters. So this is something that I have, since then, been trying to grapple with, internalize, and factor into my own approach to weapons policy.
Now there are still some parts of it where I really do think the international security community gets it wrong. In particular, in the last few years there has been really massive international attention paid to a fairly small handful of chemical weapons attacks in Syria, in comparison to the much larger number of attacks using guns and conventional explosives, weapons that just aren’t quite as salient, for whatever reason. This seems wrong to me: we make such a big deal out of the people who are hurt and killed by chemical weapons, but for the much larger number of people who just have the misfortune of being hurt and killed by the wrong weapons, we don’t actually care so much.
There are some people who do care about it, like the organization Action on Armed Violence. They are focused specifically on this issue, and superficially it seems like they’re doing good work. I haven’t looked into them closely, so I can’t make an endorsement, but it looks like they do pretty good work on this, and for good reason. I’m actually thinking of getting involved in this myself, to really help drive home the point that all these lives are being harmed and lost to something that we’re essentially neglecting. I think this is a good way to bring more quantitative reasoning into international security conversations, which could be important in general.
Essentially, it kind of looks like this. You have your high-frequency, low-severity types of weapons, like conventional explosives; on the opposite end you have your low-frequency, high-severity weapons, like nuclear weapons, which is where most of my work is; and then you have chemical weapons, which are in this kind of sad bottom-left corner, where we don’t use them that often, and when we do, the magnitude’s not that large. Okay, sure, you can cause a lot of harm with chemical weapons, but aside from World War I trench warfare, 100 years ago, for the most part that hasn’t really been the case.
Now with respect to nuclear weapons we should be careful in talking about the international security community perspective, because there’s not just one perspective. There are a lot of different perspectives and we should recognize that.
This chart here summarizes the main perspectives on this, where starting from the top you have a small minority of people who actually argue for more nuclear proliferation. These are people who really believe in nuclear deterrence, and there is a certain logic to this.
If nuclear deterrence brings peace, and peace is good, then we should spread nuclear weapons around to more countries so that they can have peace. I might not believe in the effectiveness of nuclear deterrence quite as much as they do, but there is at least a logic to it. That is a small minority, though. A more common view among the hawkish folks is to essentially maintain the status quo: we will keep our weapons because we see benefits from them, but we don’t want other countries to have them, especially the countries that we don’t trust so much.
Another prominent view is gradual disarmament on timescales of decades or so, as international conditions permit. This is like Obama and people with similar views; the NGO Global Zero is big in this space. And then finally, another very prominent view calls for much more rapid nuclear disarmament. ICAN, the International Campaign to Abolish Nuclear Weapons, just won the Nobel Peace Prize for this. And quite a lot of the non-nuclear-weapon states (NNWS), the countries that don’t have nuclear weapons, also support this view on nuclear disarmament.
Now I’ve been studying the risk of nuclear war for many years now. And I’ve got to be honest, I’m actually not sure which of these camps I support. It’s tricky because okay, nuclear weapons obviously make war more severe. That much is pretty clear, but with nuclear deterrence they have some potential to reduce the frequency of major war, such that if you switched over to conventional deterrence it’s possible that you would have major wars that occur more frequently.
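To make the structure of that frequency-versus-severity tradeoff concrete, here is a minimal expected-harm sketch in Python. This is my own illustration, not something from the talk or from any published risk analysis, and every number in it is a hypothetical placeholder; the only point is that the comparison flips depending on parameters we don’t know well.

```python
# A minimal, purely illustrative sketch of the frequency-severity tradeoff.
# All numbers are hypothetical placeholders, not estimates from the talk or
# from any published study of nuclear war risk.

def expected_annual_fatalities(p_war_per_year: float, fatalities_if_war: float) -> float:
    """Expected fatalities per year = probability of major war * severity if it happens."""
    return p_war_per_year * fatalities_if_war

# Hypothetical regime A: nuclear deterrence (rarer major wars, far more severe).
nuclear = expected_annual_fatalities(p_war_per_year=0.005, fatalities_if_war=1e9)

# Hypothetical regime B: conventional deterrence (more frequent wars, less severe).
conventional = expected_annual_fatalities(p_war_per_year=0.02, fatalities_if_war=5e7)

print(f"nuclear deterrence regime:      {nuclear:,.0f} expected fatalities/year")
print(f"conventional deterrence regime: {conventional:,.0f} expected fatalities/year")

# With these made-up inputs the nuclear regime looks worse, but small changes to
# either the war probability or the severity flip the comparison. That parameter
# sensitivity is exactly why the tradeoff is hard to resolve.
```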
And so you have this tradeoff between the frequency and severity of war. And it’s actually not really obvious to me how that tradeoff is resolved. And so because of that, I have tended to shy away from advocating specific policy positions on nuclear disarmament—it’s not quite clear which policy position we should be pushing for. Instead, at least a few years ago, what I would do is push for a different type of policy, which is to improve relations between the different countries, especially the nuclear weapons countries that don’t get along with each other, starting with the United States and Russia, which have the two major nuclear arsenals.
This is something that would clearly reduce the risk of all types of war, and it’s something that people across all the different disarmament camps would agree with. Everybody agrees that if countries get along with each other better, they’re less likely to have wars, and that makes the world safer. And there are things that could be done on this.
Something that I like to point to is a really great project by Dorothie and Martin Hellman. Martin Hellman, a professor at Stanford, did early work on cryptography, and then some work on the risk of nuclear war, that my own risk analysis is built off of. In a really amazing project, the two of them analyzed their own marriage, and what they did to overcome their marital struggles, and used that as a starting point for thinking about how to improve relationships between different countries around the world. Really brilliant concept.
And this is the sort of thing that I would point to as having good potential to make progress on this front. However, within the last year or two, things have changed. We now have, as I’m sure you’re all aware, a fairly different political environment, such that if you were to talk about trying to improve the relationship between the United States and Russia right now, it’s more complicated. I think that’s a fair way of putting it.
And in this political environment, I actually don’t have great ideas. I don’t know what the best way forward is, because so many of the issues now are within the space of domestic politics, especially within the US. That’s really emerged as a big thing. It used to be that there was a lot of consensus within the US about, say, how to approach the relationship with Russia; now it’s just gotten a lot more complicated. And the important actors for domestic politics are not technical policy analysts like myself; they’re more like journalists, grassroots political organizers, campaign managers, and of course the politicians themselves.
And so I have not felt like I’ve had really good opportunities to make a positive difference on this front. My own strategy has actually mostly been to lay low, do some research, and do some quiet networking with policy communities, sharing ideas with them. That doesn’t get to the root issues, but it is still something where I feel we can make constructive progress.
What is clear, though, is that for those of us working on international security issues, we need to understand this, internalize it, and factor it into our plans and our strategies, because this really matters. And who knows how this story is going to play out? It’s too soon to tell.
And that goes for nuclear weapons and a lot of other issues. Recently, those issues have come to include artificial intelligence. This is just a small portion of the attention that major national governments have paid to AI as a security issue within the last year or two, and I would expect there will be more and more of this as time goes on.
Now, most of this is for narrow, near-term AI, not the more dramatic transformative AI that we could have in the future. However, shown here is a map from a survey of artificial general intelligence projects that I published last year. What you see is that they’re all over the world. Maybe half of them are based in the United States, but there are a lot in Europe, a lot in China, and a few in Russia and other countries around the world. Now, most of these are private; they’re either at private companies or in academia. But some of them have government ties, and the governments of the countries these projects are based in may be paying attention to them anyway. I think we can expect to see more and more of that as this field progresses. And so for this as well, it’s important for us to understand these international security dynamics, and also the perspectives of the people who are taking relevant actions. If we do understand their perspectives, we can customize our own efforts on these issues accordingly.
Q&A
Question: What does it take to create a nuclear winter, what is that threshold? There’s thousands and thousands of these missiles, how many have to go off?
Seth: That’s a good question because if you have just one nuclear weapon it’s probably not going to cause anything close to a nuclear winter. We had two used in WWII; that didn’t really change things that much.
What I did in my survey was to take a cautious approach, which is to say: well, we don’t know exactly what it would take, so let’s err on the safe side and aim low. The number that I proposed in this study was 50. This was based on other research that had studied nuclear war scenarios involving 100 nuclear weapons and found substantial effects. So I thought, okay, let’s take a number that’s lower than that. But I have to say that this figure of 50 was somewhat arbitrary; it’s not something that we really have a precise understanding of.
Question: That would be enough though, to certainly destroy any… I mean I guess you have to still factor in how many of them will actually hit targets and go through defenses and what not, but there’s no rival that can survive 50 nuclear strikes. So what is the argument that that’s not enough?
Seth: As far as the number of nuclear weapons that a country needs to have successful deterrence, opinions vary quite widely. I’ve seen anything from as few as 5 to as many as, like, 1,500. The high numbers come from the idea that we may need weapons for a variety of contexts: different adversaries that we may need to simultaneously deter, as well as defenses and so on that these adversaries may have. So in order to have a reliable deterrent, we might need that many nuclear weapons.
I’m skeptical that we would need so many, but the important thing to understand is that ultimately deterrence is psychological. Deterrence succeeds when the other side makes a decision not to attack, and so ultimately it’s not about the number of weapons; it’s about the human reaction to them.
Question: And we have, the United States has 7000 or something?
Seth: It’s in the thousands; it depends on how you count it. And there is some interest in reducing the size of our arsenal; certainly in the Obama administration there was interest in that. My sense is that the main factor for progress on nuclear disarmament right now is the relationship between the United States and Russia. I think we are unlikely to unilaterally make significant cuts to our arsenal, and that we would only do that in tandem with cuts from Russia.
Question: What do you think the trajectory is that we’re on right now? It seems like, well obviously we have a summit coming up in two days, it seems like things are not going that great. But how do you assess it?
Seth: It’s not clear that where we are right now can be readily described as a trajectory. Yeah, I think that’s how I would sum it up.
It’s a volatile situation. I live in New York, that’s one of the target cities, San Francisco is another target city. Sometimes I have my fingers crossed. But that’s, I guess we’ll see.
Question: I know there have been major high level arms reduction agreements in the past. Has there been any breakthrough with a situation even remotely like the North Korea situation? I’m kind of struggling to come up with one. People would maybe say Libya but now that’s like the warning to anybody else.
Seth: The Libya example is a good one, because a key attribute of the situation with North Korea is that they are a much smaller country than we are. So it’s a very uneven dynamic, in contrast with Russia. The US considers Russia and China “near peer” adversaries; they might consider themselves peer adversaries. That’s the term that gets used. From North Korea’s perspective, I think it makes sense that they would recall the Libya example. The United States has not been trustworthy on these matters in the past, and I would expect North Korea to understand that and factor it into its plans. I would be surprised if North Korea were to agree to give up its nuclear weapons. Very surprised, to be honest. There is progress that can nonetheless be made in talks like this, but that gets into more subtle technical details of the nuclear weapons enterprise. We’ll see.
Question: You spoke for a minute about branding mistakes that you and others have made as you engage with this content. Can you offer any advice about how to engage without projecting Snowden-style risk, if somebody really wants to get involved?
Seth: That’s a good question, and really the answer is pretty simple: it’s just to understand their perspectives. So, for example, with my own paper on this, I probably could have done a better job of sharing early drafts with people from that community. I mean, there were international security people who saw the drafts. The paper went through peer review in an international security journal, and the reviewers actually did not flag the thing that ended up getting all the attention, so you never know. But that’s the sort of thing I could have done more of, and I think it applies more generally. The more that we are of that world and of that community, the better we can predict how they would react to things, and the better we can get people’s reactions to test ideas out before putting them out more widely.
Question: How much of what goes on in this space of kind of managing these international catastrophic risks happens behind the scenes and is sort of invisible to the public, versus kind of gets discussed in the open?
Seth: A lot of it is behind the scenes, there’s no question about that, but it’s a mix, and the public context is important. National leaders, especially in democracies, do need to care about the domestic reaction to what they’re doing. And not just in democracies. In China, for example, there is a sense that since Xi was made president for life, or whatever the term is, he actually now has more flexibility to pursue international initiatives because he is in a more comfortable place domestically. So it goes both ways; there’s always a bit of a negotiation on two levels, internationally and then also nationally.
Question: Why are there so few women in the audience right now for this talk? Do you notice that this is sort of a gendered topic in some way?
Seth: To some extent. It varies; the imbalance is more pronounced in nuclear weapons than in, say, biological weapons. In the biological weapons space the gender breakdown is more balanced. As far as why, I don’t think I have any particularly clever or insightful answers. What I can say is that there are some people in international security who are aware of this sort of imbalance and try to address it. Especially, I think, when there is discrimination against women, which probably happens. That definitely should be recognized and countered, because we lose talent, and that’s a problem. I’m glad that there are at least some women in the audience, and hopefully more can be in this conversation. There is actually some sense that women bring different perspectives that add something very important to these conversations. For example, we talked about the marital struggles project; that sort of contribution is kind of a traditionally female thing, though of course men can do it too, such as Martin Hellman, who was an equal partner on that project. I think there’s an important role there, and it’s something good to pay attention to.
Question: A clarification question on this map. What do these things represent? Are they technical projects, trying to crack AGI? Are they kind of ethical think tank type stuff?
Seth: Yes, that’s a good question. These are R&D projects, projects that are trying to build AGI, and you see it says lead and partner. The lead is the country the project is based in, and the partner is for projects that have collaborators from other countries.
Question: Okay. What possible benefit might there be to a madman style of managing such situations? Do you see any benefit to that?
Seth: So the story is that if the other side thinks you’re crazy they’re more likely to yield. So the classic analogy in the literature on deterrence theory is to imagine that you and someone who you really don’t like are fighting each other while tied to each other by rope and right by the edge of a cliff. So what you could do is you could try to just beat up the other person and win, or you could do something that’s kind of crazy, like start dancing and shimmying closer to the cliff where they start to think maybe this guy is a little crazy, maybe he actually wants to go over the cliff—or she wants to actually go over the cliff—and maybe that will induce them to yield. That’s kind of the basic concept. Does it actually work in practice? Maybe it does, maybe it doesn’t. Maybe it depends on the context, but that’s the idea.
Question: What is your outlook for the future, especially as the AGI dynamics kind of begin to develop perhaps along similar, perhaps along different lines to the nuclear dynamics? Do you think we have the prospect for kind of new age of international cooperation or are we in kind of a new dark age maybe?
Seth: There’s a prospect for it, but there’s a lot of work to be done. This will all play out in the context of the international relations that we’re going to be having anyway, because of all this other stuff going on. So a lot of it depends on what is going on with the rest of the world. Now, with all types of AI, it’s important to recognize that it’s not just international in the sense of being between one nation and another, because so much of the work is being done privately. So a lot of it might look less like, say, nuclear weapons and more like climate change, where it’s mostly the private sector that are the important actors: the fossil fuel companies, the energy industry, all of that. So we can learn both from the nuclear weapons experience, as far as how to get certain types of cooperation, especially on really high-stakes threats, and also from climate change and similar issues, as far as how to handle international cooperation on issues that are driven by the private sector.