That sounds like a really valuable project! Glad to hear this database has been helpful in that.
I cover risks from AI, engineered pandemics, and climate change.
Do you mean that those are the only existential risks the Foresight Centre's/Estonian Parliament's work will cover (rather than just that those are the ones you're covering personally, or the ones that are being covered at the moment)? If so, I'd be interested to hear a bit about why? (Only if you're at liberty to discuss that publicly, of course!)
I ask because, while I do think that AI and engineered pandemics are probably the two biggest existential risks, I also think that other risks are noteworthy and probably roughly on par with climate change. I have in mind in particular "unforeseen" risks/technologies, nuclear weapons, nanotechnology, stable and global authoritarianism, and natural pandemics. I'd guess that it'd be hard to get buy-in for discussing nanotechnology, authoritarianism, and maybe unforeseen risks/technologies in the sort of report you're writing, but not too hard to get buy-in for discussing nuclear weapons and natural pandemics.
Existential risks are not something they have worked on before, so my project is a new addition to their portfolio. I didn't mention this, but I intend to include a section on other risks, space permitting. The reason climate change gets prioritized in the project is that arguably the EU has more of a role to play in climate change initiatives compared to, say, nuclear risks.
Makes sense!
I imagine having climate change feature prominently could also help get buy-in for the whole project: lots of people already care about climate change, and it could be pointed out to them that the same basic reasoning arguably suggests they should care about other existential risks as well.
FWIW, I'd guess that the EU could do quite a bit on nuclear risks. But I haven't thought about that specific question very much yet, and I'd agree that they can tackle a larger fraction of the issue of climate change.