This paper on the concern that using nuclear explosives for asteroid deflection could increase the risk of nuclear war is relevant.
I find this post very interesting. However, I don’t think the dual-use risk should worry us much. I cannot estimate exactly how much harder it is in general to divert an asteroid toward Earth than away from it, but I can confidently say the factor is several orders of magnitude greater than 10x (the precision needed would be staggering). In addition, to divert an asteroid toward Earth, one needs an asteroid, and the closer the better. The fact that the risk of a big-enough asteroid hitting the Earth is so low indicates that there are not many candidates. This factor has to be taken into account as well.
But even if diverting an asteroid toward Earth were only 10 times harder than diverting it away, dual-use need not be a big concern. To actually manage to divert an asteroid toward Earth, one does not only need to divert it; one also needs to prevent the rest of humanity from diverting it away in time, which is much easier. So, as long as a handful of independent institutions are able and ready to divert asteroids, dual-use does not seem like a concern to me.
Thanks, these are great points.
Interestingly enough, the importance of asteroid size might be overestimated compared to impact angle and impact site. The asteroid that killed the dinosaurs wouldn’t have been nearly as deadly had it not struck one of the worst possible places at one of the worst possible angles.
This 2017 paper used computer models to test whether the rock composition of the impact site could have made a difference. The models calculated the amount of soot and sulfates that would be ejected into the atmosphere, and what that would mean for our planet, since both soot and sulfates can block sunlight. The blocked-out sun started a global winter that lasted years, and this, not the impact of the asteroid directly, is what killed the dinosaurs. The researchers found that the composition of the impact site was especially unlucky. And since the Earth is constantly spinning and moving in space, this means that if the asteroid had arrived just a couple of minutes later, it wouldn’t have hit such a problematic piece of land, or might even have hit the ocean, where much of its impact (in terms of the amount of rock ejected into the atmosphere) would have been lessened.
This 2020 paper concluded that the asteroid hit from a fairly steep angle, about 45-60 degrees. This vaporized more rock than a shallow strike would have and released more climate-changing gases than other angles: 2-3 times as much carbon dioxide as a vertical impact, and 10 times as much as a shallow impact.
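To make those quoted multipliers concrete, here is a toy normalization (the baseline value is arbitrary and purely illustrative, not a figure from the paper; the 2.5 is just the midpoint of the quoted 2-3x range):

```python
# Toy illustration of the relative CO2 release by impact angle,
# using only the multipliers quoted above. Units are arbitrary.
shallow = 1.0            # shallow strike, normalized to 1 (hypothetical baseline)
steep = 10 * shallow     # the ~45-60 degree impact: 10x a shallow one
vertical = steep / 2.5   # steep impact is ~2-3x a vertical one (midpoint 2.5)

print(f"shallow:  {shallow:.1f}")
print(f"vertical: {vertical:.1f}")
print(f"steep:    {steep:.1f}")
```

So on this normalization the actual ~45-60 degree strike sits well above either alternative geometry, which is the point of the comparison.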
Seeing how unlucky the impact timing was suggests that asteroids probably aren’t as big a risk as they are imagined to be. And even if we don’t develop the technology to completely deflect an asteroid, changing its angle or delaying it so that it hits a different impact site might be enough to turn a mass extinction into a mere disaster.
If I could give more than a Strong Upvote for your bringing up the dual-use issue as a crucial consideration for working on asteroid deflection capabilities, I would. I was considering doing a write-up on this as well. It is a wonderful example of second-order considerations making the effort to reduce risk actually increase it.
I think this factor is strong enough that I now update to the position that reducing our exposure to natural extinction risks by increasing our knowledge of, and capability to control, those risks is actually bad, and we should not do it. Maybe this generalizes to work on all existential risks...
Thank you for the kind words!
I would feel a bit wary about making a sweeping statement like this. I agree that there might be a more general dynamic where (i) natural risks are typically small per century, and (ii) the technologies capable of controlling those risks might often be powerful enough to pose a non-negligible risk of their own, such that (iii) carelessly developing those technologies could sometimes increase risk on net, and (iv) we might want to delay building those capabilities while other competences catch up, such as our understanding of their effects and some measure of international trust that we’ll use them responsibly. Very ambitious geoengineering comes to mind as close to an example.
Perhaps I’m misunderstanding you, but I’m very hopeful that it doesn’t. One reason is that (it seems to me) very little existential risk work is best described as “let’s build dual-use capabilities whose primary aim is to reduce some risk, and hope they don’t get misused”; but a lot of existential risk work can be described as either (i) “some people are building dual-use technologies, ostensibly to reduce some risk or produce some benefit, but we think that could be really bad, so let’s do something about it” or (ii) “this technology already looks set to become radically more powerful, so let’s see if we can help shape its development so it doesn’t end up doing catastrophic harm”.
I think the meme of x-risk and related ideas will spread and degrade beyond careful thinkers such as the readers of this forum, and a likely subset of responses to a perception of impending doom is to take drastic action to gain perceived control, exacerbating risk. The concept of x-risk is itself dual-use.
I don’t know how I’m supposed to interpret this statistic without a time frame. Is this supposed to be per century?
Thanks for the pointer, fixed now. I meant for an average century.