Hi Stephen! Thanks for the post. What frameworks do you typically use to think about existential threats? For instance, we sometimes use probabilities to describe the chance of, say, nuclear Armageddon, though that seems a bit off from a frequentist philosophical perspective: that type of event either happens or it doesn't. We can't run 100 high-fidelity Earth simulations, count the various outcomes, and then calculate the probability of each catastrophe. I work with data in my day job, so these types of questions are top of mind.
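For what it's worth, the "run many Earths and count outcomes" procedure described above can at least be sketched for a toy model. In the sketch below the per-year catastrophe probability is a made-up input, not something measured, which is exactly the point: for one-off events the frequentist machinery only runs once you have already assumed a probability.

```python
import random

random.seed(42)

# Purely illustrative, hypothetical parameter: assumed per-year
# probability of catastrophe. Not an estimate of any real risk.
P_EVENT_PER_YEAR = 0.01
YEARS = 100
N_SIMULATIONS = 100_000

def simulate_century() -> bool:
    """Return True if the catastrophe occurs at least once in YEARS years."""
    return any(random.random() < P_EVENT_PER_YEAR for _ in range(YEARS))

# "Run many simulated Earths and count the outcomes."
hits = sum(simulate_century() for _ in range(N_SIMULATIONS))
estimate = hits / N_SIMULATIONS

# Closed-form check: P(at least one event) = 1 - (1 - p)^n
exact = 1 - (1 - P_EVENT_PER_YEAR) ** YEARS

print(f"estimated P(catastrophe within {YEARS} years) = {estimate:.3f}")
print(f"closed-form value                            = {exact:.3f}")
```

The simulation converges on the closed-form answer, but only because the per-year probability was stipulated up front; for the real world that number is a subjective (Bayesian) credence, not a frequency anyone has observed.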
Hi Dem, I don't really have a defined framework for thinking about existential threats. I have read quite a lot around AI, nuclear weapons (Command and Control is a great book on their history) and climate change. I tend to focus mainly on the likelihood of something occurring and the tractability of preventing it. At a very high level I've concluded that the AI threat is unlikely to be catastrophic, and that until a general AI is even invented there is little research or useful work that can be done in this area. I think the nuclear weapons threat is very serious and likely underestimated (given the history of near misses, it seems amazing to me that there hasn't been a major incident), but it is deeply tied up in geopolitics and seems highly intractable to me. That leaves climate change, which has ever-stronger scientific evidence supporting the idea that it will be really bad, and enough political support to make it tractable, which is why I have chosen to make it my area of focus. I also think economic development for poorer countries (or the failure to achieve it) is a huge issue on a similar scale, but again I believe it's too bogged down in politics and national interests to be tractable.
Yes, that makes sense and aligns with my thinking as well. Do you have a sense of how much the EA community gives to AI vs. nuclear vs. bioweapon existential risks? Or how to go about figuring that out?
Until recently, the vast majority of EA donations came from Open Philanthropy, so you can look at their grants database to get a pretty good sense.
Do the Doomsday Clock and the Bulletin of the Atomic Scientists come up much in EA? I'm a bit new to this scene. https://thebulletin.org/
Jerry Brown's warnings about nuclear Armageddon and the slowly building climate tidal wave have definitely turned me on to that organization.
Where do you see the opportunity to make a difference in the decarbonization effort?
Hi Locke, I'm not 100% sure how seriously nuclear Armageddon is taken in the EA community, as I'm also pretty new myself. I'm just starting a piece of research to try to highlight where the most promising decarbonisation opportunities are to be found (focused on a specific country, in my case Canada). Even though I haven't started, I strongly suspect the answer will be agriculture: it accounts for a very large proportion of emissions, there are many proven, scalable, low-cost solutions, and it seems very neglected from a funding point of view (I say that based on some brief research I did on the UK) compared to areas like electric vehicles and renewable energy.