Governance Strategies for Dual-Use Research of Concern: Balancing Scientific Progress and Global Security
Diane Letourneur, Patrice Binder, and Jean-Claude Dupont
Introduction
During the 20th century, humanity achieved remarkable scientific and technological advances that revolutionized our understanding of the world and enabled spectacular progress in many fields. In the health sector, Alexander Fleming's discovery of penicillin in 1928 opened the antibiotic era and added roughly ten years to life expectancy. In biology, the determination of the structure of DNA by James Watson, Francis Crick, and Rosalind Franklin in 1953 laid the foundations of modern molecular biology, opening new perspectives in genetics, medicine, and biotechnology. Albert Einstein's theory of relativity, formulated at the beginning of the century, radically changed our understanding of space, time, and gravity, paving the way for new research in theoretical physics and astronomy. Lastly, advances in computing, such as the creation of the personal computer and the development of the Internet, transformed the way we communicate, work, and access information, marking the beginning of the digital era in which we live today.
However, these discoveries all came with challenges related to ethics, safety, and security. With the development of the atomic bomb, humanity became fully aware of the destructive potential of certain technologies: for the first time, it was evident that a human creation could trigger a catastrophe on a global scale. How, then, can we continue technological progress while limiting the associated risks? In this report, we explore different avenues to address this question.
Definitions
Dual-use research of concern
Scientific research is conducted either to pursue knowledge for its own sake or toward predetermined ends such as public health, industrial development, environmental protection, defense, or technological innovation. The risks associated with research are of two types: the potential consequences of incidents or accidents during research activities, and the potential diversion of research outcomes for malicious purposes, termed "misuse". These risks involve not only tangible assets (materials, equipment, devices, microorganisms, cells, molecules, etc.) but also intangible assets (processes, technologies, knowledge, ideas resulting from research, publications). In the case of intangible assets, the dissemination of information could cause direct or indirect harm through its content, or enable malicious organizations or individuals to exploit it for harmful purposes. These are information hazards, or "infohazards", directly linked to the inappropriate circulation of information.
Dual-use research primarily refers to research that can have both civilian and military applications. For example, nuclear fission produces energy that can be converted into electricity or used in the development of nuclear weapons. Dual-use research is categorized either by its subject matter or by its applications: in the first case, it involves "dual-use goods" and is therefore regulated by international agreements; in the second, it has legitimate outcomes with applications in both civilian and military contexts, even though military applications may not be the primary objective. Moreover, in fields like biology and medicine, "dual-use research" or "dual-use technologies" refer to legitimate research that can potentially be diverted for illegitimate purposes.
In this report, we focus on "dual-use research of concern". Such research serves a legitimate purpose but carries intrinsic risks. These include safety risks: incidents or accidents in the laboratories where the research is conducted could have unacceptable consequences, endangering human, animal, or environmental health. There are also security risks arising from potential misuse (terrorism, chemical or biological weapons, abuse of trust or position, improper control and diversion of goods and people). For example, facial recognition algorithms can enhance access control to restricted areas (military bases, P4 laboratories), but if misused they can also feed abusive surveillance technologies and lead to unacceptable restrictions on freedoms.
The risks of misuse are not limited to the hard sciences, technology, and biological and medical research. They increasingly affect the social, behavioral, communication, and political sciences, among others. The rise of artificial intelligence opens new research avenues but also introduces new risks. Rapid access to ever-expanding and often poorly curated information universes raises concerns about "infodemics", a paradoxical phenomenon in which increased accessibility of information is accompanied by a flood of data, making it difficult to distinguish reality from speculation or manipulation. Dynamics of disinformation and misinformation are thus now integral to dual-use research of concern.
It is impossible to establish precise boundaries around, or exhaustive lists of, the scientific domains that could generate dual-use research of concern. The concept is not a fixed category; it should be understood as a subset of dual-use research that evolves over time, particularly with technological, social, and geopolitical developments. This difficulty of classification and definition poses a challenge for establishing robust governance of these studies, that is, effective mechanisms to regulate research activities, manage risks, and prevent misuse while preserving the curiosity and inventiveness inherent in scientific endeavor.
However, one should not conclude that "all research can be dual-use" and therefore either do nothing or ban everything. While it may be difficult to delineate dual-use research in general, the picture is more nuanced within specific domains. In certain disciplines, it is possible to establish a list of research areas requiring traceability and specific regulation. For instance, certain pathogens and toxins presenting particular risks for defense (both military and civilian) or public health have been listed and are regulated under the French Public Health Code. This is the Micro-Organisms and Toxins (MOT) regime, which controls their possession, use, and transfer.
Internationally, there are conventions aimed at combating the proliferation of so-called "weapons of mass destruction". This is, for example, the objective of the Biological and Toxin Weapons Convention (BWC) and the Chemical Weapons Convention (CWC). The latter lists chemical warfare agents and their chemical precursors, whose possession requires declaration and monitoring. States parties to these conventions have also committed to destroying existing stocks, old weapons discovered, and weapons abandoned after past conflicts. These conventions do not directly address research or chemical and biological terrorism. However, a United Nations Security Council resolution has established a framework specifically addressing terrorism, particularly terrorism attributable to non-state actors (Resolution 1540 of 2004). This resolution calls on states to establish legal frameworks against the misuse of technologies and promotes inter-state cooperation in combating nuclear, radiological, biological, and chemical terrorism.
Research that is potentially dual-use and of concern must therefore undergo a risk assessment. This assessment should balance the risks against the expected benefits: even a relatively low risk can call into question the ethics of a project that has no real expected benefits, while research with significant, documented benefits may justify, ethically in particular, taking risks, provided they are measured and controlled.
Assessing the benefits and risks
A key challenge in governing dual-use research of concern is assessing both the benefits and the risks associated with such research. This evaluation must not only consider the current context but also be forward-looking, striving to anticipate benefits and risks in light of technological advancement. The assessment is therefore not static but subject to change over time. Risk assessment, for instance, is closely tied to the availability of technology. Dual-use research of concern is particularly worrying when the associated technologies are rare and hard to access; conversely, if these technologies are already widely disseminated, the overall danger may be greater, but the marginal risk the research adds is smaller.
While there are no universal tools or methodological frameworks for this benefit-risk analysis, there are methods for determining an acceptable residual risk that take into account the consequences of taking a risk. These methods address project management and discrete events, as well as organizational contexts and systems.
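To make this concrete, the sketch below shows one common form such methods take: a likelihood-impact scoring of risks, reduced by a mitigation factor and compared to an acceptability threshold to estimate residual risk. It is a minimal illustration only; the scales, the mitigation model, the threshold, and the example risks are hypothetical assumptions of ours, not drawn from any standard cited in this report.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    likelihood: int    # hypothetical scale: 1 (rare) to 5 (almost certain)
    impact: int        # hypothetical scale: 1 (negligible) to 5 (catastrophic)
    mitigation: float  # fraction of risk removed by controls, 0.0 to 1.0

    def inherent_score(self) -> int:
        # Classic likelihood x impact scoring, before any controls
        return self.likelihood * self.impact

    def residual_score(self) -> float:
        # Residual risk: what remains once mitigation measures are applied
        return self.inherent_score() * (1.0 - self.mitigation)

ACCEPTABLE_RESIDUAL = 6.0  # hypothetical acceptability threshold

risks = [
    Risk("laboratory accident during experiments", 2, 5, mitigation=0.7),
    Risk("misuse of published methods", 3, 4, mitigation=0.4),
]

for r in risks:
    verdict = ("acceptable" if r.residual_score() <= ACCEPTABLE_RESIDUAL
               else "needs further review")
    print(f"{r.description}: inherent={r.inherent_score()}, "
          f"residual={r.residual_score():.1f} -> {verdict}")
```

Even in this toy form, the structure reflects the point made above: the verdict depends on the mitigation measures and on a threshold that stakeholders must agree on, and both are judgment calls rather than universal constants.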
It is also important to note that different stakeholders may hold drastically different views on the expected benefits and therefore use different methodological evaluation tools. The industrial sector, for instance, often focuses on economic objectives, while competition between teams or states may motivate certain risk-taking. Sometimes this gamble can be justified, when the prospect of losing scientific leadership or falling behind technologically is judged worse than the risks of developing sensitive research.
Finally, the notion of risk can vary according to the values specific to each culture. This point, like the previous one, helps explain the difficulty of achieving international coordination and standards. The genetic modification of human embryos by a Chinese laboratory in 2018 illustrates this divergence in values: the team used CRISPR technology to deactivate CCR5, the gene encoding a co-receptor that HIV uses to enter cells. The justification of this experiment in the name of medical interest collided with strong ethical convictions, triggering a public debate.
Due to the diversity of perspectives and societal stakes involved, effective governance of dual-use research of concern must encompass not only technical and scientific aspects but also a broader reflection on risks, expected benefits, and their fair distribution. Risk assessment is thus intrinsically linked to ethical considerations to determine the acceptability of risks considering anticipated benefits.
The notion of responsibility
The evaluation of benefits and risks is necessarily a collective effort and cannot solely rely on researchers. Various stakeholders may participate in conducting dual-use research of concern and can contribute to either limiting or increasing associated risks.
Scientists propose research projects that must align with the scientific policy and strategic directions of the institution hosting the team. These projects also need approval from funders to proceed. Furthermore, the results of research, especially academic research, are intended for publication. The editorial and peer-review process represents a final step at which scientific results are verified and validated, alongside a check of ethical compliance. At this stage, the risk of disseminating dual-use information may lead to requests for revision or, in extreme cases, rejection of the publication.
The choice can thus be made to steer research toward lower-risk alternatives, even before experiments begin. Funders and publishers can exert pressure in this direction by refusing to finance or publish projects deemed too dangerous, or more precisely, projects that represent a disproportionate risk. The evaluation steps for funding or publication thus serve as checkpoints to identify situations, including information, that pose dual-use risks. This control is not incompatible with the principles of academic freedom or open access: academic freedom cannot be exercised without regard for serious harm, and the principles of open science call for research to be "as open as possible, as closed as necessary" (per the recommendations of the European Commission). The aim is maximum openness combined with protection of sensitive data, so as to preserve international scientific cooperation, avoid the centralization of power through inappropriate control of information, and enable scientific progress without unintended consequences.
These considerations on dual-use research of concern, and the constraints it entails, make it necessary to cultivate a risk culture within the scientific community, and in particular a good understanding of dual-use issues. Institutions and educational centers should actively promote this risk culture and raise scientists' awareness of these issues.
The question of researchers' responsibility in this regard is far from resolved. The techniques and technologies used in research are not necessarily ends in themselves. They must be contextualized and distinguished from the objective of the research project in which they are employed and from the consequences of their implementation. For example, genome editing (using techniques such as CRISPR) is distinct from gain-of-function research, an application often aimed at vaccine development or therapy. Gain-of-function research, however, can increase pathogen virulence and pose risks in case of laboratory accidents. It can also be misused for bioterrorism, exemplifying the diversion of legitimate therapeutic research. The extent to which researchers are responsible for anticipating and preventing such misuse remains a matter of debate without consensus.
When it comes to military and defense research, states are responsible for implementing measures against the proliferation of weapons of mass destruction in accordance with their international commitments. These include the 1972 Convention on the Prohibition of the Development, Production, and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on their Destruction, which entered into force in 1975, and the 1993 Chemical Weapons Convention, which entered into force in 1997, both mentioned above. These same states are also expected to ensure the scientific and economic competitiveness of the civilian research sector in a particularly competitive environment. This dual responsibility is exercised by upholding the integrity and ethical standards of scientific research, which are necessary for states' international credibility and for that of their scientists and research institutions.
Finally, journalists, influencers, and, more broadly, generalist and specialized media play a particular role: they can sound the alarm about dual-use research of concern, for better or worse. Depending on their intentions, ethics, or beliefs, they can promote certain research projects but also criticize, rightly or wrongly, the conduct of experiments deemed dangerous. For instance, when two independent teams applied selection pressure to the H5N1 virus, making it transmissible among mammals, the scientific controversy over the justification of this research was widely debated in the public sphere. While media involvement can prompt action and stimulate debate beyond the circle of scientific and public health specialists, it can also create "infohazards", risks associated with the misuse or manipulation of scientific information. Highlighting potential dangers or disclosing sensitive information increases the likelihood that malicious actors will exploit these ideas, especially as techniques for manipulating living organisms are now accessible outside complex technical facilities (so-called "do-it-yourself science"). With social networks serving as an alternative source of information to traditional media, this issue now concerns not only journalists but also the public, indeed every user of these communication channels.
Governance Strategies
Several governance strategies, not mutually exclusive, can be considered to support the control of dual-use research of concern and ensure that it is conducted securely.
First, access to premises can be secured and controlled, with authorized personnel undergoing screening and tracking. In France, the General Secretariat for Defense and National Security (SGDSN), responsible for protecting the nation's scientific and technical potential, has regulatory tools through the classification of certain laboratories as Restricted Regulatory Zones (ZRR). Access to these zones is subject to procedures that include the declaration of personnel upon entry and the traceability of their movements.
This screening process goes hand in hand with restricting the number of sites concerned. On a case-by-case basis, it can also limit the number of personnel authorized to conduct dual-use research of concern when Restricted Regulatory Zones are involved.
A regulatory process with a "top-down" approach, in which a specific classification follows risk identification, is sometimes proposed. It can "secure" the responsibility of researchers and organizations under the cover of prior authorization by a decision-making third party. Certain specific domains are already regulated in this way, as mentioned earlier. Internationally, this approach appears, for example, in the Chemical Weapons Convention, whose verification process focuses mainly on the development, production, and storage of weapons: verification operations essentially involve spot-checking states parties' compliance with their declaration obligations. The Biological Weapons Convention, by contrast, lacks a verification mechanism, and declarations of activities that could directly or indirectly contribute to proliferation are not binding, even though they are strongly encouraged. This situation stems from the impossibility of establishing a list of biological weapons and from the veto of certain states against proposals to adopt a verification process. One limitation of this "top-down" approach is that it can become an additional, deterrent administrative burden, especially since it can be slow or difficult to adapt to evolving risks and scientific practices.
In general, even if a list of dual-use research of concern existed, measures based on it could not encompass unpredictable risks and would inevitably lag behind technological developments. This approach alone would therefore not effectively counter the risks, and deploying overly strict restrictions could even be counterproductive. An intuitive idea might be to systematically raise containment levels and strengthen security measures as a precaution. While this could mitigate residual risks, it imposes significant constraints and does not guarantee the elimination of security or safety failures. Moreover, applying uniformly heavy measures to both low and high risks distorts the perception of risk and increases the likelihood of non-compliance. The principle of "justified need" is crucial for tailoring security measures to the actual level of risk. This argues not for a systematic "list-based" approach but for an approach tailored to each project and the context in which it is conducted.
Administrative control is often perceived as a source of unbearable constraints and as interference with the principle of academic freedom, especially when its justification is insufficient or poorly understood. These constraints can discourage researchers from pursuing certain lines of research; if not applied equitably, they can also place those subject to them at a disadvantage relative to competitors who are not. In this regard, discernment and the principle of "justified need" must remain at the heart of regulatory objectives. Moreover, the reflex of abandoning research in response to increased regulation can have unintended consequences: loss of knowledge and expertise in sensitive areas, increased vulnerability of states, and inadequate responses to crisis situations. The practical outcomes of research should therefore be considered over the long term. For example, research conducted on anthrax in the 1990s proved beneficial during the anthrax attacks of 2001. Do the benefits of retaining a pool of experts working on sensitive subjects not justify taking risks, based on trust in personnel who are aware of their responsibilities and of the need to maintain their skills? These benefits could be lost under an approach that fosters distrust of scientific personnel through punitive legal measures.
In the 1980s, Ryan and Deci formulated the paradox of external regulation, observing that the use of rewards or external constraints diminishes individuals' intrinsic motivation to perform a task. To establish and maintain an environment conducive to a culture of safety and responsibility around dual-use research of concern, the National Council for Biological Security (CNCB), a French institution jointly overseen by the National Academy of Sciences and the SGDSN, proposed that relevant institutions establish a "committee for monitoring dual-use research of concern". This committee operates on a principle of self-regulation by researchers. In this "bottom-up" governance model, the projects in question are evaluated and monitored throughout their development, ideally at the initiative of the researchers leading them. The committee, composed of peer scientists, evaluates the balance of benefits and risks using a simple analytical framework and discusses each case individually. These exchanges aim to propose less risky alternatives where they exist.
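As an illustration of the kind of record such a "simple analytical framework" might keep for each case, here is a hypothetical sketch of a committee case file. The fields, categories, and decision logic are our own assumptions for the sake of illustration, not the CNCB's actual framework.

```python
from dataclasses import dataclass, field

@dataclass
class DualUseCaseReview:
    """One case file: what the committee weighs for a single project."""
    project: str
    expected_benefits: list[str]
    identified_risks: list[str]
    lower_risk_alternatives: list[str] = field(default_factory=list)
    decision: str = "pending"

def review_case(case: DualUseCaseReview) -> DualUseCaseReview:
    # Each case is discussed individually; if credible lower-risk
    # alternatives exist, the committee proposes them first.
    if case.lower_risk_alternatives:
        case.decision = ("revise: consider "
                         + ", ".join(case.lower_risk_alternatives))
    else:
        case.decision = "weigh benefits against risks in committee discussion"
    return case

case = review_case(DualUseCaseReview(
    project="transmissibility study of a respiratory pathogen",
    expected_benefits=["pandemic preparedness", "vaccine target identification"],
    identified_risks=["accidental release", "publication of enabling methods"],
    lower_risk_alternatives=["attenuated strain", "in silico modelling"],
))
print(case.decision)
```

The point of keeping such a structured record is less the automation than the discipline: benefits, risks, and alternatives are made explicit and comparable across cases, which supports the individual discussion of each project described above.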
The role of the institution is crucial to the effectiveness of self-regulation in this "bottom-up" model. The institution must foster a culture of reflection and risk management around research through training and awareness-raising. It should also establish efficient internal processes to identify and report risks, building the relationship of trust that encourages researchers to engage with the monitoring committee. The transparency of such a system encourages exemplary behavior, driven by ethical values and by awareness of one's own reputation and that of the institution or scientific community, especially in a context of societal pressure that should not be underestimated. The "bottom-up" approach relies on self-regulation for several reasons: individual researchers are in constant contact with their research and can detect emerging risks; a collective of peers or colleagues represents an optimal level of competence and expertise in the relevant field; and a scientific institution, like each scientist within it, has a stake in maintaining trust in science, whether driven by intrinsic motivation, ethical considerations, or external constraints tied to regulation, funding, or publishing. This combination of factors aims to make the process efficient, whereas a "top-down" approach based on increased regulatory oversight faces two limitations: difficulty in adapting and relative ease of circumvention.
To ensure the quality of monitoring, committee members must possess both the scientific and the ethical expertise necessary for the task. Additionally, to foster a relationship of trust, the selection of members should take potential conflicts of interest into account: the fear of sharing information with competitors should not discourage scientists from seeking guidance from the committee. The overall success of peer review in scientific publishing offers a promising model to adapt for the monitoring of dual-use research of concern.
In the self-regulation model, as with the other governance strategies discussed, certain limitations remain hard to overcome. Researchers, like all humans, can be swayed by financial motivations, personal ambition, or the desire to push the boundaries of their work. Ego or intellectual and technical challenge may likewise drive individuals toward risky endeavors with limited societal benefit. Even if the majority of the community acts with integrity, the actions of a single malicious or negligent person can cause significant harm. Comprehensive awareness of dual-use issues among all stakeholders in the scientific community is therefore necessary. Such awareness can enable funders and publishers to erect additional barriers that discourage imprudent or deviant behavior.
The governance of dual-use research of concern is thus a complex challenge that requires a comprehensive and concerted approach. It is imperative that the various stakeholders involved acknowledge their responsibility, both individually and collectively, in managing these risks while promoting scientific progress. By consistently emphasizing transparency, international collaboration, and ethics, it is possible to reconcile security imperatives at all levels, including internationally, with the fundamental need for advancing knowledge.
Note: The Pugwash Movement
The development of the atomic bomb raised awareness about the potentially catastrophic impact of certain research endeavors. This realization led to the founding of the Pugwash Movement (or "Pugwash Conferences on Science and World Affairs") in 1957 by Joseph Rotblat and Bertrand Russell, following the publication of the Russell-Einstein Manifesto in 1955. The manifesto highlighted the dangers of weapons of mass destruction (nuclear weapons at that time) and called upon scientists to participate in a conference addressing these dangers. The Pugwash organization was awarded the Nobel Peace Prize in 1995.
We warmly thank Raphaël Bouganne and Tom David for their review.
Original version of the report: https://www.effisciences.org/en/blog/strategies-gouvernance-recherches-duales-a-risques
References:
Principal reference:
Round table of March 25 organized by EffiSciences at the philosophy department of the ENS Ulm, with Jean-Claude Dupont (Institut Pasteur), Tom David (Institut Montaigne and PRISM EVAL), Raphaël Bouganne (SGDSN), and Patrice Binder (CNCB).
Any inaccuracy or misinterpretation is solely our responsibility.
Académie des sciences (Academy of Sciences), 2008, "Les menaces biologiques, biosécurité et responsabilité des scientifiques" (report).
CNCB, 2018, "Recherches duales à risques, recommandations pour leur prise en compte dans les processus de conduite de recherche en biologie" (report).
INRAE, "Données de la recherche et science ouverte : le principe d'un accès aussi ouvert que possible, aussi fermé que nécessaire" (article).
Csillik A., Fenouillet F., "Chapitre 13. Edward Deci, Richard Ryan et la théorie de l'autodétermination", in: Philippe Carré (ed.), Psychologies pour la formation, Paris, Dunod, "Éducation Sup", 2019, pp. 223-240.
McDermott W., Rogers D.E., "Social ramifications of control of microbial disease", The Johns Hopkins Medical Journal, 1982, 151:302-312.
Bostrom N., "Information Hazards: A Typology of Potential Harms from Knowledge", Review of Contemporary Philosophy, 2011, Vol. 10, pp. 44-79.
Raposo V.L., "The First Chinese Edited Babies: A Leap of Faith in Science", JBRA Assist Reprod, 2019, 23(3):197-199.