This year’s World Economic Forum is a wrap. Presidents, chief executives, policy wonks and futurists from around the world made their annual pilgrimage to Davos, Switzerland, to discuss the state of the economy, global governance, and — in a panel discussion that should be of particular interest to the synthetic biology community — “future shock”.
The phrase comes from the influential twentieth-century futurist Alvin Toffler. Future shock refers to the overwhelming feeling that too much change is happening in too short a period of time.
Five panelists, themselves influential futurists, scientists, global leaders and chief executives, gathered in Davos to discuss how some of today’s hottest technologies (CRISPR, gene drives, artificial intelligence, self-driving cars, low-orbit satellites and drones) are giving rise to future shock.
Is future shock warranted? Are world-changing technologies being developed without foresight? Who’s in charge — and who do we blame — if technology goes rogue?
It’s not every day that you get to see CRISPR co-inventor and MIT professor Feng Zhang on stage with a robotics guru, a billionaire CEO, the lead scout for innovation for a major South American country and the former President of the General Assembly of the United Nations – but that’s exactly what the WEF is for.
The tone of conversation oscillated between a gleaming enthusiasm for our collective future (engineering the ocean microbiome “sounds great”, exclaimed the moderator early on) and a collective sense of, well, future shock (“it also sounds terrifying.”)
Several key themes and questions emerged from the discussion:
World-changing technology is coming – how should we relate to it?
The panel coalesced around a single answer: it should augment us.
“Technology is just a tool,” stressed Duke University roboticist Mary Cummings. “Bring it in as a tool and not as a panacea.”
Salesforce CEO Marc Benioff agreed, noting that for more than a year, the board discussions at his software company — one of the largest and most disruptive on Earth — have increasingly been augmented by artificial intelligence in the form of a proprietary system they call Einstein. “Each and every time I ask it [what it thinks], it always has an insight about an executive or territory or product that I would never have seen. There’s too much data for me to understand what’s going on.”
In another example of how technology could augment ongoing work, Benioff and UN Special Envoy for the Ocean Peter Thomson imagine a future in which low-orbit satellites capable of collecting real-time global imagery could be harnessed by environmentalists to help spot illicit whaling or trawling vessels in some of the most inaccessible parts of the high seas. And perhaps no one will need to pore over the data — if AI can drive a car, then surely it can spot a malicious boat and alert the appropriate authorities.
“There’s more data than ever,” Benioff went on to say. “And there’s more data coming from all these sensors. Not just for aspects of the environment — we were talking about DNA sequencing — there are massive amounts of new types of data available. Through that we have the ability to have analytics and insights that we’ve really never had before. We’re in a data revolution. I think coupling it with artificial intelligence – that’s where it can get exciting.”
Proceed with caution
It’s easy to slip into a futurists’ fantasy in a setting as idyllic as Davos. The panelists insisted we shouldn’t.
“I do think we need to be careful about engineering microorganisms and releasing them out into the environment,” said Feng Zhang. Even with the best of intentions, said Zhang, it’s hard to predict how even one change to a protein will interact with the rest of biology. “Biology is really complicated.”
“For biological systems,” Zhang explained, “it would be important to engineer containment mechanisms. Rather than doing something that is irreversible, we engineer a circuit in these biological organisms so that once we release it into the environment — and just in case something goes wrong — we can switch it off. We can recall it.”
Cummings also urged caution, noting that some of the safety precautions Zhang was suggesting are already common practice in conventional engineering. But complications are still possible even with more developed technologies.
“It turns out for driverless cars, for example, one of the things that was recently ‘discovered’ in the last six months is that very easy passive hacking techniques (i.e. just putting a few stickers on a stop sign) can prevent computer vision from seeing a stop sign and make it see a 45-mile-per-hour speed limit sign instead. We had no idea that these things were possible until just the last six months. And so as a researcher, what I would worry about is: if we’re still finding out these emergent properties in these technologies — CRISPR, AI — yet there are many companies and agencies that want to take these technologies and start deploying them in the real world, it’s still so nascent that we’re not really sure what we are doing. I do think that it needs to be more of a collaborative arrangement between academia, governments and companies to understand what’s really mature and what is still very experimental.”
Who sets the policy if technology affects the whole world?
“When it comes to the ocean, there’s only one ocean,” said Thomson. “Fish don’t recognize national boundaries.”
How then can synthetic biology and other promising technologies be regulated?
“We can be augmented through artificial intelligence, as human experts, and also [through] global governance,” said Benioff. “Global governance, I think, gets heightened in this environment.”
“One of the challenges for us in the government,” said Marcos Souza, Brazil’s Secretary of Innovation, gesturing toward Zhang, “when a professor is developing a technology, is the impacts.”
“How do we simulate different scenarios more accurately? What are the impacts that [will be generated] from the technology? This is a long process. It involves a lot of scientific data. How can we use technology, use scientific data — evidence-based decisions — faster and more precisely? Because it’s important to try to separate what is ideology (because there’s a lot in this field) from evidence-based decisions.”
Peter Thomson offered another way of framing the issue of regulation: as a conversation about ethics. In his previous job as President of the UN General Assembly, Thomson tried to stimulate discussion about the ethics of innovation, even inviting Silicon Valley pioneers to take part. But he feels the global conversation is still missing. “I’m sure the scientists would appreciate it as much as the governments would.”
Technology — and the public’s perception of it — takes time to evolve
“I’m a futurist and a technologist, so clearly I’m going to be positive about the future,” said Cummings. “I think it’s worth looking back though. Ten years ago this community, even here at the World Economic Forum, but more globally, was very anti-drone.”
Cummings has been developing drone technology since 2001. In those early years, public perception of the nascent technology was gloomy. But following Jeff Bezos’ announcement in 2013 that Amazon would start making deliveries using drones, opinions changed. “Wait wait wait – this technology that’s going to kill us all is now going to be used in a different way?”
Now, Cummings notes, one-third of all non-military drones are being used in conservation efforts. “I work with companies who are using drones to track elephants, for example.”
Time for tempered optimism
In the end, it seems that remedies for future shock are prudence, conversation and time. Those eager to invent a brighter future should balance their optimism with healthy doses of humility, openness and patience. But there are good reasons to be hopeful.
“Rather than seeing rogue technology as something that is going to get out of hand and destroy all the life in the ocean,” said Thomson, “what I think we can do is use technology in helping us in reversing the cycle of decline. I’m very confident about that.”
“I’m glass half full on this.”