“We didn’t take a broad enough view of our responsibility, and that was a big mistake.”
Before his grilling on Capitol Hill this week, Facebook co-founder Mark Zuckerberg issued an apology.
In a scripted opening statement, Zuckerberg explained that since its founding, Facebook has been an “idealistic and optimistic company”. He and the other early developers just wanted to connect people. For years, leadership at the company — himself included — failed to adequately consider how the tools they were developing could lead to bad outcomes. As CEO, said Zuckerberg, that was his mistake. But now the company is going through “a broader philosophical shift” in how it approaches its responsibility:
“It’s not enough to just connect people, we have to make sure those connections are positive. It’s not enough to just give people a voice, we have to make sure people aren’t using it to hurt people or spread misinformation. It’s not enough to give people control of their information, we have to make sure developers they’ve given it to are protecting it too. Across the board, we have a responsibility to not just build tools, but to make sure those tools are used for good. It will take some time to work through all the changes we need to make across the company, but I’m committed to getting this right.”
It is not unusual for a chief executive to have to own up to their company’s mistakes. That Zuckerberg is now aware of his years-long oversight is a good thing. But it didn’t have to be this way.
Sitting in their dorm room at Harvard, the Facebook founders likely found it difficult to envision what they were actually up to. They were building a platform that would soon reach more than two billion users (and, by less transparent means, countless more). Their social network would quickly become a dominant source of news for many Americans (despite its lack of fact-checkers). And, one day, their software would begin to negatively impact elections around the globe.
Negative outcomes were not emphasized under Zuckerberg’s leadership, and strategies for containing damage at scale were apparently never developed. As he put it to Congress, “For most of our existence, we focused on all of the good that connecting people can do.” The 33-year-old billionaire insisted he would do things differently now if given the chance.
Though not an exact analogy, synthetic biology is in something like its dorm-room phase. The field is young, idealistic and optimistic. Thanks to the plummeting cost of DNA synthesis and sequencing, many people are now building better tools for manipulating living systems. This has obvious upsides, but, unlike with Facebook, catastrophic outcomes are also easy to imagine.
You don’t need to be a trained molecular biologist or a bioethicist to understand that tinkering with life comes with ethical concerns. Melinda Gates, whose massive charitable giving supports both applied scientific research and global healthcare, is in a position to know. When asked what she sees as the biggest threat facing our species in the next decade, Gates replied with certainty: “A bioterrorism event. Definitely.”
Synthetic biology is not without its ethical pioneers. Jennifer Doudna is perhaps chief among them. When asked why she spends so much time trying to engage with the public about biotechnology, Doudna told The Atlantic:
“There aren’t that many people who know the technology deeply and [are] willing to talk publicly about the societal and ethical issues. I have many science colleagues who don’t want to get involved.”
The social, ethical and other non-technological issues surrounding synthetic biology should not be put off until after billions of people have been affected. The time to emphasize safety, containment and accountability is right now. As Samuel Weiss Evans, former Associate Director for Research at UC Berkeley’s Center for Science, Technology, Medicine, and Society, put it:
“[T]he point of supporting synthetic biology is not about making sure that science can go wherever it wants: it is about making the type of society people want to live in.”