Governments’ weakening control over innovation calls for updated methods of arms control and oversight. Universities must play their part in forming security policy and enforcing good practice, argue Brett Edwards and David Galbreath.
David Galbreath and Brett Edwards presenting at the meeting ‘Biological and Chemical Security in an Age of Responsible Innovation’, held at the Royal Society, London, and hosted by the Biochemical Security 2030 Project. Image: R. Guthrie.
In October, the United States government announced a “funding pause”—since partially lifted—on gain-of-function research into three viruses, including influenza. These studies attempt to make a pathogen more deadly or infectious in order to improve understanding of how to treat and vaccinate against the disease. The safety and security implications of such work are obvious, and this latest policy development followed a number of emergency meetings at national and international level and a voluntary moratorium among scientists.
But governments cannot, as they once might have done, hope to exercise absolute control over who does what to pathogens. Developments in biotechnology point to a future in which techniques seen as cutting-edge today are increasingly accessible and widespread. Technological changes in areas such as gene sequencing, synthesis and editing promise to speed up laboratory procedures that for now are time-consuming and arduous. And economic and scientific globalisation continue to make technologies more accessible worldwide.
This will influence not only where innovation happens but also who is involved. The emergence of amateur biologist groups, sometimes called biohackers, who operate outside traditional research environments, has already become a symbolic focal point in this regard. Their capacity may be limited for now, but some claim that their emergence foreshadows an increasingly nimble and decentralised model of biotechnology innovation.
The histories of the modern state, technological progress and security are tightly interwoven. But biohacking shows how systems of innovation and security are moving beyond the scope of government control.
The consequences for arms-control regimes will be profound. In the 20th century, states were the main developers and the main targets of cutting-edge weapons. The most devastating of these were developed primarily through state programmes. Countries sought to curtail the use of certain weapons on the battlefield and against civilians primarily through international agreements, out of mutual self-interest and on humanitarian grounds.
At the international level, we remain heavily dependent on 20th-century treaty systems. Biological and chemical weapons, for example, are addressed through two separate treaties: the Biological Weapons Convention, which came into force in 1975, and the Chemical Weapons Convention, which came into force in 1997.
As with all international treaties, these are products of their time, as well as of politically fraught negotiations. They are imperfect: the treaty on biological weapons, for example, lacks a formal system for the verification of state compliance.
The treaty on chemical weapons, in contrast, boasts a legally binding system of verification through on-site inspections of facilities. However, there are ambiguities about non-lethal chemical weapons, which the treaty permits for use in law enforcement. Advances in science and technology are making such weapons increasingly versatile and sophisticated and, as such, potentially more attractive. For both biological and chemical weapons, there is concern that such failings will undermine state confidence in treaty regimes, potentially resulting in international arms races.
Changes in security, science and technology are making these issues increasingly significant, and might force a transformation in how we think about preventing the development and use of biological and chemical weapons. International security, for example, is no longer solely a question of state-on-state military threats: it has come to include public health, transnational crime, international terrorism and environmental issues.
At the University of Bath we are working on the Biochemical Security 2030 project, which is funded by the Economic and Social Research Council and the Defence Science and Technology Laboratory. It involves studying technologies that challenge existing modes of oversight, and emerging sites of policy development that lie outside international treaty negotiations.
An interesting example of both comes from synthetic biology and its implications for security. This interdisciplinary field has its roots in academic efforts in the US, dating from about 2003, to bring engineering into biology. Several products of synthetic biology have already reached the market, and it is claimed that the field could yield breakthroughs in areas such as drug development, chemical and fuel production, and public health.
Aspects of synthetic biology have also become symbolic of the security challenges posed by 21st-century biotechnology. This stems from a long history of countries weaponising scientific advances, and the concern that the increasing accessibility of synthetic biology may facilitate illegal drug development and terrorism. In most fields, such concerns have followed scientific advances; synthetic biology is unique in that concerns were present from the start, and in some senses have run ahead of its concrete achievements. A number of horizon-scanning, risk-assessment and educational initiatives have been carried out, revealing many of the conceptual, practical and political challenges involved in pre-emptively discussing and addressing concerns related to single pieces of research.
To investigate what UK universities can do and are doing to adapt to this changing security environment, we have sought to develop an understanding of the obstacles and opportunities for policy-making in this area, starting at a local level. We examined the feasibility of forming a network of universities with chemistry and biology departments in south-west England, and hosted a meeting that brought together researchers in relevant fields, staff responsible for biological and chemical safety, and policy-shapers at national and international levels. In particular, we were interested in how such a network might contribute to technology foresight, the education of scientists about misuse, and the development and sharing of good practice in terms of ethics and laboratory security.
Regional Innovation Biochemical Security Meeting, held by the Biochemical Security 2030 Project, University of Bath, 1 May 2014. Image: R. Guthrie.
The first thing we found, perhaps not surprisingly, was that people dealing with international biochemical security and people working in laboratories often used different terms to discuss similar issues. At the local level, for example, laboratory safety is not always thought of as being a security issue.
As a consequence, safety practices are not always conceived of and communicated in security terms. This is significant for two reasons. First, outsiders might be led to underestimate security at an institution. For example, many activities that take place at a local level under the label of health and safety, particularly those related to physical containment, are actually already performing security functions.
Second, preventing certain threats requires going beyond normal biosafety infrastructures. Think, for example, about the development of a novel pathway that would make it easier to synthesise a controlled substance, or the possibility of a terror threat from inside an institution, or the theft and diversion of research data. Managing such threats requires work with researchers and their support staff and managers to develop a fuller understanding of what biochemical security is, and how to practise it in the workplace.
A second observation was that there were already a number of local and national networks devoted to developing and sharing best practice in biosafety at universities. Given the right resources and incentives, these networks could facilitate the development of a culture of biochemical security in universities.
Finally, during our meeting and follow-up, it became clear that universities varied in how they treated the relationship between the processes of ethical review and health and safety. It is not always clear who has the responsibility or expertise to flag up concerns about the potential for research to be misused, or at what point of the research process, from grant funding through to publication, such concerns should be raised.
These issues, particularly in relation to pathogen research, are probably set to receive more attention in the future. Recent biosecurity reviews in the US and the Netherlands were followed up by one in Germany, where a new legal framework for pathogen research was recommended. And The Guardian reported in December that UK labs handling human and animal pathogens had “reported more than 100 accidents or near misses to safety regulators in the past five years”.
Although we often think of security as a brake on innovation, providing biosecurity also requires continued innovation to develop the necessary technologies and workable practices. The south-west alone has many university health and safety professionals and bench researchers who could help create educational initiatives and improved university policies. And even though the UK is already part of international initiatives on biological and chemical security, there is untapped potential to innovate in this area—particularly in shaping good practice linked to the objectives for responsible innovation laid out by Research Councils UK.
Universities alone cannot fully negate the risks posed by innovation. But they can raise researchers’ awareness of security concerns and choose to deal only with companies that have appropriate ethical, safety and security accreditation. Some synthetic biologists at American universities have already contributed to the emergence of security screening by adopting this approach when working with companies that produce DNA sequences.
Finally, universities can facilitate and encourage the creation of testbeds to improve foresight and responsiveness to the potential security concerns raised by scientific advances. Several such efforts already exist, such as the Synthetic Biology Engineering Research Center in the US and the Centre for Synthetic Biology and Innovation at Imperial College London, as well as our more modest initiative at Bath. Given the breadth and seriousness of the issues, there is plenty of scope, and need, for more.
By Brett Edwards and David Galbreath
Brett Edwards and David Galbreath work in the department of politics, languages and international studies at the University of Bath.
This article also appeared in Research Fortnight
This article was published in Research Professional, the UK’s leading independent source of news, analysis, funding opportunities and jobs for the academic research community.