Revolutionising National Risk Assessment (NRA): improved methods and stakeholder engagement to tackle global catastrophe and existential risks
Co-authored with Nick Wilson
Summary/TLDR
In a recently published paper, we identified two major shortcomings of National Risk Assessment (NRA) processes:
Lack of transparency around foundational assumptions
Exclusion of the largest scale risks
We demonstrate the potential problems and ambiguities that arise in NRA due to these shortcomings.
We identify the exclusion of global catastrophic risks (GCRs) and existential risks (x-risks) from NRAs as a critical process error.
Even when only considering people alive today, and with a time horizon of just one year, the consequence in expectation of several existential risks is higher than all other risks commonly included in NRAs.
A longtermist perspective is not needed to prioritise existential risk mitigation through NRA, and potentially detracts from getting such risks onto the agenda for assessment.
We propose the development of a freely available, open-access, risk communication and engagement tool to facilitate stakeholder discussions on NRAs.
This post is a partial and high-level summary of our research paper on national risk assessment (NRA) published in the academic journal Risk Analysis in March 2023. This post also places our work in the context of another recent report on NRA, identifying common ground between the two. Consider reading our full paper for complete details of our thinking on NRA as it applies to global catastrophe and existential risk.
Introduction
Many countries undertake National Risk Assessment (NRA) to evaluate risks of national significance, assessing, for example, natural hazards, infectious diseases, industrial accidents, terrorist attacks, cyberattacks, organised crime, or institutional failure. The NRA process is complex and cross-sectoral, often excludes risks of low probability, and often has a short-term focus of less than five years. The outputs of NRA tend to be communicated in some form of National Risk Register (NRR) and/or consequence-probability (C,P) risk matrix.
However, NRAs and NRRs can be criticised, particularly where the common practice of presenting a two-dimensional risk matrix obscures uncertainties, stakeholder disagreements about values, biases, and systematic errors. Critically, the exclusion of large-scale (and cross-border) risks such as global catastrophic risks (GCRs) and existential threats to humanity (x-risks) is another limitation of NRAs.
The aim of NRA should be to develop a common understanding of risks and priorities across stakeholders, stimulate local authorities to build capacity and capability, and identify common consequences across multiple risks. Prioritisation of risks is sometimes explicitly intended through the NRA process, but methods for prioritisation depend on foundational assumptions of the NRA process that are not always clearly articulated.
Aim of our paper
Our paper sought to demonstrate some shortcomings of existing NRA processes and outputs, namely:
How the choice of fundamental NRA process assumptions makes a material difference to the NRR output and any subsequent deliberation on risk.
The weaknesses and ambiguity of risk matrices for communicating NRAs.
A major class of risks often neglected by NRA (namely GCRs and x-risks).
The difficulties that uncertainty poses.
We then suggest how those undertaking NRA could enter a productive dialogue with stakeholders, supported by an interactive communication and engagement tool, to overcome some of these difficulties (details of that are in the paper, not the post below).
We note that another report, by Kevin Kohler, titled National Risk Assessments of Cross-Border Risks, was published in February 2023, shortly before our paper. Throughout this post we also highlight some of its key points.
Important Assumptions of National Risk Assessments
In our paper, we introduce a hypothetical set of six risks A–F (which vary by probability and consequences) to illustrate some key issues when undertaking NRA and when using NRAs and risk matrices to communicate national risk or inform prevention and mitigation.
We demonstrate how changing fundamental analysis assumptions changes the ordinal prioritisation of the risks. The importance of this is that the basis of the assumptions is often opaque to end users, or has not been authorised by public debate and stakeholder input (noting that future generations are also stakeholders).
The assumptions we systematically alter are: the scenario of choice (challenging scenario vs worst case), the time horizon of interest (one year vs 50 years), the discount rate on future value (0% vs 3%), and the decision rule.
We demonstrate how different assumption combinations alter the ordinal priority of the risks A–F (when considering just expected fatalities for simplicity). We show that varying the evaluation assumptions leads to different prioritisation of risks in 7 out of 8 analyses, thereby emphasising the critical importance of agreeing on process assumptions.
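As a rough illustration of this sensitivity, the following minimal Python sketch varies the same kinds of assumptions over a set of invented risks. These are not the A–F values or the model from our paper; the probabilities, fatality figures, and probability-growth rates are placeholders chosen only to show that rankings can flip.

```python
from itertools import product

# Purely illustrative risks (NOT the A-F values from the paper): a starting
# annual probability, an assumed annual growth in that probability, and
# fatalities under a "challenging" vs a "worst case" scenario.
risks = {
    "R1": {"p0": 0.05,  "growth": 0.00, "challenging": 200,    "worst": 2_000},
    "R2": {"p0": 0.02,  "growth": 0.00, "challenging": 5_000,  "worst": 500_000},
    "R3": {"p0": 0.001, "growth": 0.08, "challenging": 50_000, "worst": 5_000_000},
}

def expected_fatalities(risk, scenario, horizon_years, discount_rate):
    """Discounted sum of annual expected fatalities over the time horizon."""
    total = 0.0
    for t in range(horizon_years):
        p_t = min(1.0, risk["p0"] * (1 + risk["growth"]) ** t)
        total += p_t * risk[scenario] / (1 + discount_rate) ** t
    return total

# Eight assumption combinations: 2 scenarios x 2 horizons x 2 discount rates
# (the paper also varies the decision rule, omitted here for brevity).
for scenario, horizon, rate in product(["challenging", "worst"], [1, 50], [0.00, 0.03]):
    order = sorted(risks, reverse=True,
                   key=lambda r: expected_fatalities(risks[r], scenario, horizon, rate))
    print(f"{scenario:>11} scenario, {horizon:>2}-year horizon, {rate:.0%} discount -> {order}")
```

With these placeholder numbers, risks with flat annual probabilities keep their relative order as the horizon stretches, whereas a risk whose probability is assumed to grow can leapfrog them once a 50-year horizon is adopted, which is exactly why choices of horizon and discount rate need to be made explicit.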
Probability-consequence Risk Matrices
The next section of our paper reiterates some criticisms of probability-consequence risk matrices in the context of NRA. We note that such matrices are fairly arbitrary constructions. Risk matrices generally look something like the following figure. Risks are placed in categories according to likelihood and expected impact. Darker regions (purple, red, orange) allegedly represent more salient risks than lighter regions (yellow, green, blue).
Figure 1: A probability-impact risk matrix
We dispense with the colours and simply plot our demonstration risks A–F on axes representing likelihood and impact. A concrete example of the misleading nature of risk matrices (if categories are used) can be seen in the following figure. Risks ‘F’, ‘D’, and ‘B’ all appear to cluster in one region, towards the ‘upper right’, ie, the highest priority area of the risk matrix. Yet the numerical consequence in expectation (fatalities) of risk D is 20x that of risk B. This may be somewhat apparent when the logarithmic axes are labelled and the risks are plotted in a scatterplot, but it would be completely obscured in the coloured matrix above.
Figure 2: Risks with vastly different consequences in expectation can cluster in risk matrices
We provide further examples in the paper illustrating how risks with the highest consequence in expectation can end up being equated with minor common events due to the heat-map nature of some risk matrices.
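To make the clustering effect concrete, here is a minimal sketch in the same spirit; the probabilities and fatality figures are made up for illustration and are not the paper's data for risks D and B. Two risks whose expected fatalities differ 20-fold land in the same cell once likelihood and impact are binned into order-of-magnitude categories.

```python
import math

# Made-up numbers in the spirit of risks D and B (not the paper's actual data):
# annual probability and fatalities if the event occurs. Expected fatalities
# differ by a factor of 20, yet both risks fall in the same matrix cell.
risks = {
    "D-like": {"p": 0.08, "fatalities": 50_000},  # expectation: 4,000/yr
    "B-like": {"p": 0.02, "fatalities": 10_000},  # expectation:   200/yr
}

def matrix_cell(p, fatalities):
    """Bin likelihood and impact into order-of-magnitude categories,
    as a typical 5x5 probability-consequence matrix does."""
    likelihood_bin = min(4, max(0, math.floor(math.log10(p)) + 4))           # 10^-4 .. 10^0
    impact_bin     = min(4, max(0, math.floor(math.log10(fatalities)) - 1))  # 10^1 .. 10^6
    return likelihood_bin, impact_bin

for name, r in risks.items():
    cell = matrix_cell(r["p"], r["fatalities"])
    print(f"{name}: cell {cell}, expected fatalities/yr = {r['p'] * r['fatalities']:,.0f}")
```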
Global catastrophic and existential risks
Not only do fundamental assumptions and communication choices bias the assessment of national risks, but cross-border risks, and in particular global catastrophic and existential risks, are seldom included in NRAs.
Our analysis of five NRAs (and Kohler’s 2023 analysis of nine) shows that no NRA appears to include many, if any, GCRs or x-risks. Surprisingly, the Norwegian NRA mentions in one sentence that a large volcanic eruption could ‘cool the earth by several degrees’, but then never mentions the global consequences of what could be the single most catastrophic impact contemplated by any NRA.
In our paper, we consider only the existential risks among a set of GCRs and ignore the more likely but non-existential manifestations of the same risks. Simple estimates reveal that several of these risks harbour annualised consequences in expectation greater than all typically occurring natural hazards combined.
Even when only considering people alive today, and with a time horizon of just one year, the consequence in expectation of several existential risks appears higher than all other risks commonly included in NRAs. We identify the exclusion of GCRs and x-risks from NRAs as a critical process error.
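A back-of-the-envelope sketch shows why this can hold even on a one-year horizon. The annual probability used below is a placeholder, not an estimate from our paper, and the natural-hazard figure is an order-of-magnitude average only.

```python
# Illustrative one-year comparison, counting only people alive today.
# The x-risk probability below is a placeholder for whatever annual figure
# an NRA process might agree on; it is NOT an estimate from our paper.
population = 8_000_000_000

annual_xrisk_probability = 1 / 10_000  # placeholder: 0.01% per year
xrisk_expected_fatalities = annual_xrisk_probability * population  # = 800,000

# Rough global annual average deaths from "conventional" natural disasters
# (order of magnitude only: tens of thousands in most years, with rare spikes).
typical_disaster_deaths_per_year = 50_000

print(f"Expected fatalities from the x-risk, one-year horizon: {xrisk_expected_fatalities:,.0f}")
print(f"Typical annual natural-disaster deaths:                {typical_disaster_deaths_per_year:,}")
print(f"Ratio: {xrisk_expected_fatalities / typical_disaster_deaths_per_year:.0f}x")
```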
A longtermist perspective is not needed to prioritise existential risk mitigation, and potentially detracts from getting such risks onto the agenda for assessment. Indeed, it appears that standard risk assessment processes and standard government cost-effectiveness analyses should be enough to reveal the overwhelming priority of GCRs and x-risks in NRA (Shulman & Thornley have recently posted at length on the EA Forum in agreement with this point).
We argue in the paper that deliberation over such risks, and whether they ought to be prioritised for mitigation, can only happen if they are included in the NRA, characterised, communicated to stakeholders, and put forward to resource prioritisation processes for prevention or mitigation.
Kohler’s new paper notes that the European Commission specifically recommends that NRAs include risks (no matter how rare) whose likely impact exceeds 0.6% of gross national income, and that the time horizon of interest should ideally be at least 25–35 years. These instructions mean that all GCRs and x-risks should be assessed in NRAs.
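As a quick sense check of that criterion (using an illustrative GNI figure and an assumed impact, not official numbers):

```python
# Illustrative only: a mid-sized economy's gross national income in USD.
gni = 250e9                        # $250 billion (placeholder figure)
inclusion_threshold = 0.006 * gni  # European Commission's 0.6% criterion

# Even a "mild" global catastrophe scenario costing a fraction of GNI
# clears the bar by an order of magnitude or more.
assumed_gcr_impact = 0.10 * gni    # placeholder: 10% of GNI lost

print(f"Inclusion threshold: ${inclusion_threshold / 1e9:.1f}B")
print(f"Assumed GCR impact:  ${assumed_gcr_impact / 1e9:.1f}B "
      f"({assumed_gcr_impact / inclusion_threshold:.0f}x the threshold)")
```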
Indeed, the US has recently passed a world-leading Global Catastrophic Risk Management Act, which mandates exactly this kind of systematic assessment of GCRs and x-risks, along with response plans to ensure basic necessities are available after the event (we have blogged about this Act). There is no good reason why all countries can’t replicate this legislation, or at least empower the United Nations to do it for all countries/regions (you can read a recent 2023 discussion of existential risks and the UN Office for Disaster Risk Reduction by the Simon Institute here).
Example: Pandemics
Pandemics are an interesting case, and although we don’t dwell on them specifically in our paper, the Covid-19 pandemic illustrates the clear shortcomings of NRAs. We argue in our paper that national risks should not be presented in a risk matrix, but should be communicated quantitatively in ordinal fashion according to the consequence in expectation of agreed scenario types, across an agreed timeframe, under an agreed discount rate.
A standard national risk assessment presents the risk of pandemics something like this:
Figure 3: Human pandemic as a relatively likely & catastrophic risk (source: DPMC publication ‘NZ’s National Security System’, Sept 2011).
However, Kohler points out that the Covid-19 pandemic has already exceeded the most severe pandemic scenario in most NRAs, even though it ‘only’ had an infection fatality ratio of less than 0.6%. Even the conservative official death toll from Covid-19 accounts for 95% of the deaths from disasters in the 21st Century. The remaining 5% includes all deaths from the 2010 Haiti earthquake, the 2004 Indian Ocean tsunami, and the 2008 Myanmar cyclone (about 200,000 deaths each).
If the risk from human pandemics in the first two decades of the 21st Century were presented in a treemap chart (rather than a risk matrix) it might look something like this, thereby revealing the real salience of human pandemics:
Figure 4: Gestural treemap chart showing scale of pandemics in the 21st Century vs other major disasters
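For readers who want to sketch a chart like Figure 4 themselves, the following minimal example uses the third-party squarify package with rounded, illustrative death tolls (not the precise data behind the figure):

```python
import matplotlib.pyplot as plt
import squarify  # third-party treemap helper: pip install squarify

# Rounded, gestural figures for illustration only (not the data behind Figure 4).
deaths = {
    "Covid-19 (reported deaths, ~2023)": 7_000_000,
    "Haiti earthquake 2010": 200_000,
    "Indian Ocean tsunami 2004": 200_000,
    "Cyclone Nargis 2008": 200_000,
}

labels = [f"{name}\n{toll:,}" for name, toll in deaths.items()]
squarify.plot(sizes=list(deaths.values()), label=labels, alpha=0.8)
plt.axis("off")
plt.title("Disaster deaths, 21st century (illustrative treemap)")
plt.show()
```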
Indeed, Kohler found that only Switzerland and the Netherlands have chosen risk impact categories whose upper end roughly corresponds to the impact of Covid-19, and even these categories would not discriminate between Covid-19 and a worse pandemic in the risk matrix.
In our own experience, even when the Swiss NRA method was applied in a workshop on the nuclear war/winter hazard risk to a non-combatant nation, these upper impact categories proved seriously inadequate.
The reality is that if NRAs were actually presented as treemap charts, or in some other form than risk matrices, and if the suite of GCRs and x-risks were included, then the picture of risk communicated would look very different. Over longer periods of time, most (almost all?) expected disaster deaths come from a few worst-case scenarios.
However, any presentation of a chart or graph is packed with foundational assumptions and can obscure uncertainties.
Uncertainty and Assumption
We acknowledge that the probability of GCRs and x-risks is highly uncertain. But this appears to be the case with many risks already included in NRAs. For example, Kohler reports that the likelihood of a −1600 nano-tesla (nT) solar storm was cited as 1:80 per annum in the 2015 Swiss NRA, but 1:1700 in the 2020 version of the same analysis. The explanation was that a mathematical analysis concluded that the intensity of solar storms decreases with the time since an event. Yet research post-dating that analysis suggests that tree-ring radiocarbon evidence indicates large solar storms might be much more common than previously thought. More expert input appears to be needed.
Similarly, for volcanic eruptions, the probability of a volcano affecting Switzerland was estimated at 1:70,000, whereas the UK’s analysis cited 1:20 to 1:4. Kohler notes an annualised baseline probability of 1:3000 for a Volcanic Explosivity Index (VEI) 6+ eruption in Europe. However, neither NRA mentions the 1:625 probability of a VEI 7 eruption somewhere else in the world, which, like the Mt Tambora eruption of 1815, could have devastating consequences for global agriculture (we discuss the Mt Tambora eruption as it impacted potential island refuges in a separate 2023 paper).
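To see how directly this kind of uncertainty feeds into prioritisation, the following sketch applies a fixed, invented consequence figure to the two Swiss solar-storm probability estimates quoted above; the probability revision alone shifts the expected consequence, and hence any ranking weight, roughly 20-fold.

```python
# The impact figure is an invented placeholder; only the probabilities come
# from the Swiss NRAs as reported by Kohler. The point: a probability revision
# alone shifts the expected consequence -- and hence any priority ranking.
assumed_storm_fatalities = 10_000  # placeholder consequence for a -1600 nT storm

estimates = {"Swiss NRA 2015": 1 / 80, "Swiss NRA 2020": 1 / 1700}
expectations = {src: p * assumed_storm_fatalities for src, p in estimates.items()}

for src, e in expectations.items():
    print(f"{src}: expected fatalities/yr = {e:,.1f}")
print(f"Ratio between the two estimates: "
      f"{expectations['Swiss NRA 2015'] / expectations['Swiss NRA 2020']:.0f}x")
```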
In the present paper, our discussion then proceeds through other issues of uncertainty, including the problem posed by strength of knowledge (eg, risks judged equally likely but with very different strength of knowledge underpinning the data), the problem of dealing with different scenarios of a single hazard, the difficulty of probabilities that change across time, and how all these factors point towards the need for public engagement.
Ultimately, NRAs are a social construction, built upon allegedly reasonable assumptions (about time frame of interest, discount rate, scenarios of choice, and decision rules), and including agreed choices about risk communication methods. All of this needs to be debated openly.
Stakeholder Engagement
Most NRA processes involve little public consultation and in some instances overt secrecy. There is a documented lack of awareness of NRRs, even among the local authorities to whom they are in part directed. This is despite the UN advocating ‘increased access to risk information’ and noting that ‘low risk awareness is one of the main challenges’.
It is also unclear if citizens understand the foundational assumptions underpinning NRAs and whether they would authorise them if they did.
In the paper we identify a range of arguments that would support wider public and expert engagement, including the risks of potential groupthink, politicisation, or uncertainty.
We note that scrutiny must logically first be applied to the underlying process assumptions, then to the resulting empirical claims, and finally deliberative prioritisation (for prevention, mitigation or further research) can take place. We propose the development of a freely available, open-access, risk communication and engagement tool to facilitate discussions on NRAs. Aspects of such a tool could be tailored to experts and other aspects to the general public.
In our paper we lay out the rationale for expert and public engagement, and describe in some detail the sort of interactive online tool that could be deployed to support such engagement.
Conclusions
In our paper we identified two shortcomings of National Risk Assessment (NRA) processes: lack of transparency around foundational assumptions, and exclusion of the largest scale risks.
We discuss the importance of agreeing on key assumptions before conducting an NRA. These assumptions include methodological and normative choices that determine which risks are included, how they are characterised over time, and how uncertainties are expressed in risk communication.
We used a hypothetical demonstration set of risks to show how choices around time horizon, discount rate, and impact estimation affect risk characterisation. We highlighted the potentially dominating importance of global catastrophic and existential risks, which are often omitted from NRAs, and suggested using standard risk assessment and cost-effectiveness analyses to address them.
Given the array of possible assumptions, uncertainties and inclusions, it is crucial that those undertaking NRA engage the public and a broad array of experts in the NRA process through a transparent and two-way risk communication process. This could help legitimise key assumptions, avoid omitting important risks, and provide robust critique of risk characterisations and the knowledge underpinning them.