October 31, 2018
Engineered cell therapies for the 21st century
“The 20th century was the century of small molecules, but the 21st century is going to be the century of biologics.”
Thus began Ryan Cawood, CEO and founder of Oxford Genetics, at a special session of SynBioBeta 2018 devoted to engineered cell therapies. His pronouncement summarized the consensus of the session’s four speakers, who each described the myriad ways in which synthetic biology paradigms will be essential for understanding, developing, and exploiting the next generation of clinical therapeutics.
Leveraging cellular computation to treat disease
“Cell therapy is one of the most exciting fields of medicine,” said the first speaker of the session, Alec Nielsen, CEO of Asimov. Asimov is dedicated to programming cells using synthetic biology, artificial intelligence, and design automation. Echoing Cawood’s premise about the current evolutionary trajectory of therapeutic technologies, he elaborated:
“If you look at the past couple of millennia of medicine, it’s moving towards increasing control. In the beginning, we could crush up plants and isolate molecules. Later, synthetic chemistry gave rise to small molecule development. There, if something fails, you have some recourse, but there’s only so much functionality you can add using chemistry. The next major leap came with large molecule development at Genentech in the 80s.”
Today, Nielsen argued, we are witnessing yet another revolution in progress: “This idea that we can now use genetics to scaffold the construction of atomically precise large molecules like antibodies and use them in the human body—that’s transformational and so now we’re at the stage where we’re zooming out one more layer; we’re striving for even greater levels of control.”
In his view, the next step is to manipulate not drugs or proteins but whole cells. The idea, in short, is to take advantage of the panoply of biochemical pathways that have arisen over the course of evolution and repurpose them to yield clinical benefits.
Take for example a T cell, which Nielsen sees as “one of the most exquisite sensor actuator packages in the human body. It circulates and detects disease, whether that be cancer or pathogens, and it’s able to eradicate it. And if you were to peer under the hood of this cell, you would see a sort of chaotic molecular milieu that’s performing really sophisticated computation. As synthetic biologists, we can envision building on top of those endogenous computational capabilities to tackle new types of disease.”
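Nielsen’s picture of a cell as a sensor-actuator package performing computation can be made concrete with a toy model. The sketch below is ours, not Asimov’s; every name, parameter, and threshold is illustrative. It models a two-input genetic AND gate in which a therapeutic output fires only when a cell senses both of two disease markers:

```python
# Toy model of a two-input genetic AND gate: a therapeutic output is
# activated only when BOTH disease markers are sensed. All names,
# parameters, and thresholds are illustrative, not real circuit designs.

def hill_activation(ligand: float, k: float = 1.0, n: int = 2) -> float:
    """Fraction of promoter activity at a given ligand level (Hill function)."""
    return ligand**n / (k**n + ligand**n)

def and_gate_output(marker_a: float, marker_b: float, threshold: float = 0.25) -> bool:
    """Fire the output only if both sensed markers drive their promoters strongly."""
    activity = hill_activation(marker_a) * hill_activation(marker_b)
    return activity > threshold

# A healthy cell presents only one marker; a diseased cell presents both.
print(and_gate_output(marker_a=5.0, marker_b=0.1))  # -> False: only one marker
print(and_gate_output(marker_a=5.0, marker_b=5.0))  # -> True: both markers present
```

Requiring two independent signals before acting is exactly the sort of logic that could make an engineered T cell more selective than one triggered by a single antigen.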
Expanding the toolkit with synthetic biology
To achieve these goals, biotech and pharma companies will require a new set of methods to manipulate cellular therapeutics and cope with their inherent complexities.
“Can synthetic biology provide the control and precision necessary to significantly enhance cell therapy?” asked the session’s second speaker, Helge Bastian, vice president and general manager at Thermo Fisher Scientific, a biotechnology product development company. “From my perspective, the answer is a very clear yes.”
Bastian went on to describe various ways in which Thermo Fisher is helping to enrich the methodological armamentarium of synthetic biology, focusing especially on novel ways of synthesizing, delivering, and manipulating nucleic acids. “Synthetic biology at Thermo Fisher is all about engineering nucleic acids in cells, but also in vitro, and the technology portfolio we have goes all the way from amidites to oligos to whole genes, making pathways based on larger pieces of genomes, and then eventually just generating cells.”
The efficiency of transfection or gene editing is of vital clinical importance in situations where successful treatment requires that the vast majority of cells receive a modification of interest. Accordingly, a key focus at Thermo Fisher is improving the ways in which we deliver nucleic acids and nucleases to target cells.
The company is pursuing such improvements on multiple fronts: “We come up with enzyme designs that bring the active material straight into the nucleus; the whole formulation and design of the enzyme make a difference. But before we even bring it to the nucleus, of course you also want to make sure that it’s efficiently going into the cell. And this works together with a third thing, which is the transfection agent. So, we have a lot of smaller steps that together allow efficiencies of up to 90 percent.”
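Bastian’s point about “a lot of smaller steps” can be illustrated with simple arithmetic: if each delivery step succeeds independently in a given cell, the overall efficiency is roughly the product of the per-step efficiencies. The step names and numbers below are hypothetical, not Thermo Fisher figures, chosen only to show why every stage must be near-perfect to reach the roughly 90 percent he describes:

```python
# Hypothetical per-step success rates for delivering a gene editing
# payload to a cell's nucleus; the figures are illustrative only and
# serve to show the multiplicative effect of chained steps.
steps = {
    "cell entry (transfection agent)": 0.97,
    "nuclear delivery (enzyme design and formulation)": 0.96,
    "cutting and repair at the target site": 0.97,
}

overall = 1.0
for name, efficiency in steps.items():
    overall *= efficiency

print(f"overall efficiency ~ {overall:.2f}")  # ~ 0.90
```

The takeaway: three steps at 96 to 97 percent each already cap overall efficiency near 90 percent, which is why every stage must be optimized together rather than in isolation.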
In addition to methods for introducing gene editing reagents into living cells, Thermo Fisher is also refining the payloads themselves. Unlike many players in the field who have gone all in on CRISPR, Thermo Fisher continues to develop transcription activator-like effector nuclease (TALEN) technology, an alternative gene editing method that demands more design effort up front but offers greater targeting flexibility than CRISPR, because it does not depend on the presence of a specific recognition motif (such as the PAM sequence that CRISPR requires) near the cut site.
“What it means is that you can target any sequence in the genome, and this is the benefit over CRISPR in areas where it can’t cut,” Bastian explained. He went on to emphasize that TALENs are already battle-tested: “For those who have forgotten, the first CAR T-cell treatment that was done on the planet was with TALEN, not CRISPR.”
Building viruses with viruses
Construction of a genetic tool is only the first challenge that must be overcome. To be effective, we have to find ways to get these tools into the relevant cells.
The delivery of nucleic acid reagents to living cells — whether for gene expression or gene editing — continues to be a challenge throughout the industry. In the laboratory, researchers often use engineered viruses to insert nucleic acids of interest into their target cells, and biopharma hopes to adopt this approach in clinical therapeutics. However, virus production is far less straightforward than synthesizing a small-molecule drug.
“Making a virus is much more complicated, and manufacturing is a major bottleneck for the industry,” said Cawood. “When people are getting to phase III clinical trials, they can’t manufacture enough of these materials to be able to run the trial.” Hence, it is essential to develop new ways to generate unprecedentedly large quantities of viral vectors for clinical use. To introduce Oxford Genetics’ solution to this problem, he first explained the basic biology of an important viral vector.
“The most successful of the gene therapy vehicles at the moment is AAV, which stands for adeno-associated virus. AAV is a tiny little virus that sits in your cells and waits for a second virus, adenovirus, to come along. When that infects the same cell, it activates AAV, which can then replicate.”
From this standpoint, the critical advantage of AAV is that it can’t replicate on its own, yet it can still infect cells and deliver nucleic acid payloads. The critical disadvantage is…the same: AAV can’t replicate on its own. Making AAV at large scale therefore requires either plasmids, which are expensive and low-yield, or adenovirus, which is great at making AAV but even better at making itself, so any adenovirus-assisted AAV prep ends up massively contaminated with unwanted (and potentially hazardous) adenovirus.
Fortunately, Oxford Genetics has come up with a way to manipulate the adenovirus life cycle to take advantage of its ability to replicate AAV while avoiding contamination. “I’m pleased to say it really does work,” said Cawood. “We can completely get rid of the contaminating adenovirus from our AAV preps. There’s no other solution on the market that does that.”
Even better, the method for eliminating contamination actually increases yield of the useful virus: “It’s as if the adenovirus being there in the cell, but not killing the cell, allows the AAV to package more of its particles. You get five times more virus compared to a plasmid system. So it really is a next-generation system for manufacturing AAV.”
A ‘full stack’ approach to harnessing complexity
Running throughout each of the previous speakers’ talks was the theme of complexity. It is clear that the next generation of cellular therapeutics will pose enormous challenges to their creators, from conception to implementation. “The interconnectivity of these molecular networks is staggering,” said Asimov’s CEO Alec Nielsen. “It’s above anything the human mind can conceive of, and so we need techniques like machine learning and other statistical inference to start to actually guide the design.”
It was fitting, then, that the final speaker described ways to cope with this complexity. Kevin Holden is head of synthetic biology at Synthego, which sponsored the session. As such, he leads the team responsible for integrating the synbio workflow — including aspects such as CRISPR and genome engineering — into novel automation platforms for cell engineering.
“Biology is actually becoming a data science,” Holden argued. “So in order for us to get better data and develop better models of disease, and then eventually have better outcomes including the development of gene therapies, we really need to be able to generate better methods and come up with more consistent results. And so, as a company, Synthego became very interested in understanding how we can automate the cell engineering process.”
What does this mean in practice? Synthego, which considers itself a general engineering company, has assembled a team of more than 100 scientists, engineers, and computational biologists who take an engineering approach to answering biological questions. To this end, they have developed a platform that allows them to scale up genome engineering, automate the process, and make it smart and consistent.
“We’d like to call this approach ‘full stack genome engineering’,” he continued. “We’re not focusing just on one part, but we’re developing an entire process that allows us to apply this type of thinking to all of genome engineering. This has allowed us to develop not only reagents for people to use and bioinformatics tools that we’ve built internally and provide to the research community, but also to engineer cells themselves.”
Specifically, Synthego has created a suite of methods that enable CRISPR-based gene editing at extremely high throughput. The data they collect in these experiments can in turn be used to make gene editing more efficient and effective in the future.
“Because we’re doing tens of thousands of these genome-engineering edits over several months at a time, it allows us to layer in a computational biology approach and develop models that help us predict exactly what happens in a genome editing outcome. And so we can predict in silico what’s going to happen, and this allows us to more efficiently conduct our cell engineering projects.”
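As an illustration of the kind of in-silico scoring that can be layered onto large editing datasets, here is a minimal guide RNA ranking sketch. Real outcome-prediction models are trained on thousands of measured edits; the features and weights below are simple, widely used heuristics (favor mid-range GC content, penalize long homopolymer runs) and are in no way Synthego’s actual models:

```python
# A minimal sketch of heuristic guide RNA scoring. The weights and
# the 0.5 penalty are arbitrary illustrative choices.

def gc_fraction(seq: str) -> float:
    """Fraction of G/C bases in the sequence."""
    seq = seq.upper()
    return sum(base in "GC" for base in seq) / len(seq)

def longest_homopolymer(seq: str) -> int:
    """Length of the longest run of one repeated base."""
    best = run = 1
    for prev, cur in zip(seq, seq[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

def guide_score(seq: str) -> float:
    """Higher is better: peak score at 50% GC, penalty for long runs."""
    score = 1.0 - abs(gc_fraction(seq) - 0.5)
    if longest_homopolymer(seq) >= 4:  # e.g. TTTT can terminate Pol III
        score -= 0.5
    return score

candidates = ["GACGTTAGCATCGGATCCAA", "TTTTAAGGCCAATTGGCCAA"]
ranked = sorted(candidates, key=guide_score, reverse=True)
print(ranked[0])  # the guide without the TTTT run ranks first
```

A production model would replace these hand-written rules with parameters fitted to thousands of observed editing outcomes, which is precisely the data advantage Holden describes.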
Thus, by accumulating experience and familiarity with challenging biological systems, companies like Synthego are getting a handle on the inherent complexity of the systems being manipulated.
These efforts — along with the methodological advances described by Nielsen, Cawood, and Bastian, and in the works at dozens of other companies around the world — are catalyzing the coming century of biologic therapeutics, in which synthetic biology strategies will play a vital role.