Imagine how different the world would be if, decades ago, when we were developing the internet, we could have anticipated the types of cyber security threats we face every second of every day today. What would we have done differently then to ward off the future seizure of hundreds of thousands of individuals’ credit card and social security numbers or foreign interference in political affairs?
That pivotal moment in history is exactly where we are today when it comes to the development of yet another powerful technology with the potential to change life as we know it: biotechnology.
Biotechnology can provide the cures to debilitating disease, feed millions, and build a more sustainable planet. But, it can also be used for incredible harm. We’ve already seen the backlash that results when a well-meaning scientist flies in the face of current ethical and social standards and applies a gene-editing technology not yet ready for prime time in humans.
We are still in the early days of biotechnology. We have the chance, right now, to think ahead, to establish policies and protocols that will enable us to increase our chances of a secure future. Biotechnology is here to stay; now is the time for everyone, from graduate student to seasoned CEO, to think about biosecurity.
Ensure the intended, avoid the unintended
The definition of biosecurity varies widely from country to country (ask a scientist in Australia, and they will probably define it as ensuring that new pests and diseases don't enter the country and become established there). But the biosecurity everyone needs to be thinking about is simple in concept yet broad in scope: doing everything you can to maximize your technology's intended consequences while minimizing its unintended ones.
But what, exactly, does that look like?
Participants in the iGEM competition get an idea of what that looks like early on: the program emphasizes the importance of thinking about the social and security implications of a project — in fact, for a team to receive a Silver or Gold medal in the competition, they must demonstrate that they have “thought carefully and creatively about whether [their] work is responsible and good for the world.” It was this experience that got Tessa Alexanian, an undergraduate participant in the iGEM competition who is now a Software and Automation Engineer at a Bay Area biotech company, really thinking about biosecurity and trying to encourage others to do the same. She is now a member of the Safety and Security Committee and an executive member of the Human Practices Committee for iGEM and an active member of the East Bay Biosecurity Group.
“The way I think about biosecurity is that it’s the sphere of all of the other concerns [beyond biosafety] you have to think about as you’re carrying out science,” says Alexanian. “How do you make sure that those bacteria don’t escape from the lab and get into the environment, how do you make sure that when you’re publishing something about your research you’re not inadvertently publishing information that could be used by a bad actor? How do you make sure that the outcomes of your experiments are biased toward good outcomes?”
Building a culture around biosecurity
Getting people to think about their science in this way requires building a more proactive culture around biosecurity. Too often today, says Alexanian, we approach biosecurity reactively — something bad happens and then we push for policy changes, rather than enacting policy changes before it's too late.
Some of the most active players pushing toward a more proactive biosecurity culture are in academia, such as Megan Palmer at Stanford and Gigi Gronvall and others at the Johns Hopkins Center for Health Security. Alexanian thinks it may be harder for industry players to get involved because "it can feel more risky to be one of the first groups that decides to be public about a biosecurity opinion, since [a group's] commercial work might get overshadowed by biosecurity engagement." This is different from the academic landscape, where such involvement is essentially part of the job description.
Nevertheless, some are breaking the mold: Ryan Ritterson at Gryphon Scientific has done a considerable amount of risk assessment for the synthetic biology industry, while Mammoth Biosciences is carefully considering the ethical implications of democratizing diagnostics using technologies like CRISPR. And Twist Bioscience, Ginkgo Bioworks, Battelle, and One Codex have also been active players in the biosecurity space, working together on IARPA's Functional Genomic and Computational Assessment of Threats (Fun GCAT) program, which aims to develop cutting-edge computational and laboratory-based approaches to assess the threat potential of genetic sequences.
Though it might not be the first thing you associate with biosecurity, many of these experts also emphasize the importance of considering ethics and public opinion. We only need to remember the Jesse Gelsinger case, or the recent CRISPR twins story, to see how important public opinion is for the success of new, controversial technologies. "Going too fast, without adequate public ethical and private security considerations, means that you run the risk of one of those really tragic events [happening]," says Alexanian.
Public opinion matters — if you don’t involve the public early on, how will you know if you’re doing science that will serve the public good? To that end, Kevin Esvelt, an assistant professor at the MIT Media Lab working on gene drives, has put a lot of effort into determining novel ways for making a technology like gene drives, which will affect a large group of people, broadly acceptable. Esvelt describes it as “responsive science.”
It will also help to avoid sensationalizing stories, like that of Josiah Zayner injecting himself with CRISPR onstage, that breed fear and negatively impact public opinion of some of the technologies that could help people the most. Zayner is an active and opinionated member of the DIY bio community, but most "garage biohackers" aren't as controversial as he is. Instead, they are doing things like decoding the genetics of kombucha, purifying soil with mushrooms, and creating vegan cheese. And, says Alexanian, most are acutely aware of the importance of public perception and abide by rigid safety rules.
In fact, says Alexanian, “most of the community biologists have done a lot of really smart thinking about how to set norms to reduce the risk of something bad happening and impacting the community at large. There’s a lot I would like to learn from [them] about how to do that kind of norm setting, and whether it would be possible to extend that to industry.”
Putting on the brakes — or accelerating science?
Many may fear that focusing on improving biosecurity policies could have a negative, restrictive impact on science — hamstringing researchers and preventing companies from taking the risks that could be necessary for developing the drug that finally cures Alzheimer's, or for building the fermentation process that could help eliminate malnutrition in developing countries.
But, building a proactive culture around biosecurity — through enforcing thought exercises early among iGEM students, by learning from the DIY Bio community, and by connecting those researching biosecurity with those who need to implement it — will accelerate biotechnology, not limit it.
Referencing something she once heard Renee Wegrzyn say during a talk on the importance of security for mitigating risk while advancing science, Alexanian sums it up: “You don’t put brakes on a car so that you can go slow, you put brakes on a car so it is possible to go fast.”
Learn more about the thought leaders, entrepreneurs, and academics who are leading the conversation on biosecurity at SynBioBeta 2019, October 1-3 in San Francisco. Extend your stay in the City for a day and make your voice heard at Catalyst: a collaborative biosecurity summit.