Thanks for this database! I’m currently working on a project for the Foresight Centre (a think-tank at the Estonian parliament) about existential risks and the EU’s role in reducing them. I cover risks from AI, engineered pandemics, and climate change. For each risk, I discuss possible scenarios, probabilities, and the EU’s role. I’ve found a couple of sources from your database on some of these risks that I hadn’t seen before.
That sounds like a really valuable project! Glad to hear the database has been helpful for it.
I cover risks from AI, engineered pandemics, and climate change.
Do you mean that those are the only existential risks the Foresight Centre’s/Estonian Parliament’s work will cover (rather than just that those are the ones you’re covering personally, or the ones that are being covered at the moment)? If so, I’d be interested to hear a bit about why? (Only if you’re at liberty to discuss that publicly, of course!)
I ask because, while I do think that AI and engineered pandemics are probably the two biggest existential risks, I also think that other risks are noteworthy and probably roughly on par with climate change. I have in mind in particular “unforeseen” risks/technologies, nuclear weapons, nanotechnology, stable and global authoritarianism, and natural pandemics. I’d guess that it’d be hard to get buy-in for discussing nanotechnology, authoritarianism, and maybe unforeseen risks/technologies in the sort of report you’re writing, but not too hard to get buy-in for discussing nuclear weapons and natural pandemics.
Existential risks are not something the Foresight Centre has worked on before, so my project is a new addition to their portfolio. I didn’t mention this, but I intend to include a section on other risks, space permitting. The reason climate change gets prioritized in the project is that the EU arguably has more of a role to play in climate change initiatives than in, say, nuclear risk reduction.
Makes sense! I imagine having climate change feature prominently could also help get buy-in for the whole project, since lots of people already care about climate change, and one could highlight that the same basic reasoning arguably suggests they should care about other existential risks as well.
FWIW, I’d guess that the EU could do quite a bit on nuclear risks too. But I haven’t thought much about that specific question yet, and I’d agree that the EU can tackle a larger fraction of the climate change problem.