
Synthetic biology and reproducibility meet on the Internet (of Things)

How the Internet of Things could help synthetic biology in its quest towards standardization.

Every engineering discipline relies on repeatable measurements and consistent ways to catalog and convey information. Standardized parts, for example, are what enable electrical engineers to read a blueprint and, with the necessary power supply, wires and switches, build circuits that behave predictably. As synthetic biology matures into a fully fledged engineering discipline, it must tackle current inconsistencies in measurements and devise new methods to make experiments more reproducible.

Perhaps it is time to look outwards for answers.

A (not so) novel solution to reproducibility: the Internet of Things

Biology is already hard to engineer – data should not be hard to collect and interpret. In a data-driven field like synthetic biology, there is an ever-increasing desire to test, record and standardize data. But despite persistent efforts to make biological engineering more consistent, new ‘parts’ are constantly created, exciting new methods are developed, and even the best experimental practices are sometimes overlooked. Research laboratories often do things their own way, so rather than relying on individuals to conform, a better solution may be to improve the ways that data is collected and analyzed.

The key to achieving these goals: The Internet of Things (IoT).

The concept behind the IoT is simple: Internet-enabled sensors can be integrated with, or connected to, almost any piece of equipment, from vending machines to car-assembling robots, to collect vast amounts of data and automatically store it in the cloud. In synthetic biology, this means that lab equipment can be monitored to ensure that experimental parameters between runs are consistent. The digital data can also be readily accessed by members of the lab or shared with external collaborators. By seamlessly connecting laboratory equipment and pooling data in a single, online database, users can always go back after an experiment and pore through the data to determine sources of inconsistency in measurements.
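The idea can be sketched in a few lines of code. The snippet below is a toy illustration, not any vendor's API: a plain Python class with an in-memory list standing in for cloud storage, the run and parameter names invented for the example. It logs timestamped instrument readings per experimental run and flags runs whose average reading drifts from the overall mean – exactly the kind of post-hoc troubleshooting the pooled data enables.

```python
import statistics
from datetime import datetime, timezone

class SensorLog:
    """Minimal stand-in for a cloud-backed log of instrument readings."""

    def __init__(self):
        # Each entry: (timestamp, run_id, parameter, value)
        self.readings = []

    def record(self, run_id, parameter, value):
        self.readings.append((datetime.now(timezone.utc), run_id, parameter, value))

    def by_run(self, parameter):
        """Group logged values for one parameter by experimental run."""
        runs = {}
        for _, run_id, param, value in self.readings:
            if param == parameter:
                runs.setdefault(run_id, []).append(value)
        return runs

    def inconsistent_runs(self, parameter, tolerance):
        """Return IDs of runs whose mean reading strays from the overall mean."""
        runs = self.by_run(parameter)
        overall = statistics.mean(v for vals in runs.values() for v in vals)
        return [run for run, vals in runs.items()
                if abs(statistics.mean(vals) - overall) > tolerance]

log = SensorLog()
for temp in (37.0, 37.1, 36.9):
    log.record("run-1", "incubator_temp_C", temp)
for temp in (36.9, 37.0, 37.1):
    log.record("run-2", "incubator_temp_C", temp)
for temp in (35.0, 35.1, 34.9):
    log.record("run-3", "incubator_temp_C", temp)

print(log.inconsistent_runs("incubator_temp_C", tolerance=0.8))  # ['run-3']
```

Here the third run's incubator ran two degrees cool – the sort of quiet deviation that, without a shared log, would only surface as an unexplained failed experiment.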

But bringing IoT to the lab is not without its challenges. For one, using IoT-enabled devices to troubleshoot experimental inconsistencies requires that many machines be connected to the cloud. That generates loads of data – and more data is not necessarily better unless there are corresponding analytics tools to expedite deciphering it all. It can also be difficult to augment existing equipment with IoT-enabled sensors, because scientific manufacturers do not necessarily build freezers, incubators, and plate readers with this functionality in mind.

Closing the Gaps: Towards IoT-Equipped Labs

Many companies are pursuing the ambitious aim of bringing the IoT to research laboratories, but two have gone the extra mile in addressing existing challenges.

TetraScience, a Boston-based ‘IoT platform for science’, offers an all-in-one service that monitors all IoT-enabled equipment in the lab, streaming their experimental data and relevant parameters 24/7 and then seamlessly storing the data online. The company also offers lab monitoring, which means that important pieces of equipment can be continuously monitored and users alerted in the case of unexpected deviations. But their most useful product is perhaps their Data Integration Platform, which centralizes and standardizes data from the entire lab and couples it with visualization tools so that anybody can easily access and analyze information.
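The alerting piece of such monitoring reduces to a simple check. The sketch below is a generic illustration of the concept (not TetraScience's actual product or API): scan a stream of readings and flag any that stray beyond a tolerance band around the setpoint, which a real monitoring service would turn into user alerts.

```python
def check_setpoint(readings, setpoint, band):
    """Flag readings more than `band` away from the setpoint.

    Returns (index, value) pairs for out-of-band readings - the raw
    material a monitoring service would turn into alerts.
    """
    return [(i, v) for i, v in enumerate(readings) if abs(v - setpoint) > band]

# A -80 C freezer trace that briefly warms, e.g. during a door-open event:
freezer_trace = [-80.1, -79.8, -80.0, -72.5, -74.0, -79.9]
alerts = check_setpoint(freezer_trace, setpoint=-80.0, band=2.0)
print(alerts)  # [(3, -72.5), (4, -74.0)]
```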

TetraScience helps companies eliminate data silos.

Elemental Machines, a Cambridge, Massachusetts-based start-up, is also rewriting the way that synthetic biologists collect and analyze data. Element-D, one of their hallmark products, is ‘an IoT data collection device’ that pulls information from equipment, aggregates the data and stores it on a cloud-based dashboard. These plug-and-play sensors also stream data continuously and can be attached to almost anything. The output data includes fluctuations in equipment settings or parameters and experimental results from every single use of the machine. Elemental Machines also offers equipment monitoring, which tracks the performance of instruments over time and can predict when a machine should be repaired or replaced.

Sridhar Iyengar, CEO & Founder of Elemental Machines, at SynBioBeta SF 2017.

Importantly, both companies have developed integrated platforms that hold the potential to dramatically enhance reproducibility of measurements. They are also addressing the common IoT-related challenges in unique ways; by expanding the number of parameters monitored during an experiment – including temperature, oxygen and any relevant deviations from the expected settings – researchers will be better equipped to understand data anomalies and standardize experimental protocols.

The Elemental Machines platform delivers powerful insights that help customers optimize processes and maximize utilization.

TetraScience and Elemental Machines also understand that more data is not necessarily better unless powerful analytics tools are built into the platform. To address this dilemma, they have developed analytics solutions that help users glean as much information as possible from a given data set. The physical sensors sold by each company are also relatively equipment-agnostic – Element-D can be retrofitted to almost any piece of equipment and is completely wireless, while TetraScience works directly with equipment manufacturers to provide integrated, commercially supported IoT solutions. The wide range of equipment that the sensors can be connected to, coupled with the ability to store all data in one place, means that experiments can be analyzed faster and more consistently.
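Centralizing data from many instruments hinges on one unglamorous step: mapping each vendor's output format onto a shared schema. The sketch below illustrates that idea with invented payloads and field names – it does not reflect either company's actual data model.

```python
# Hypothetical vendor payloads; real instruments each export their own format.
plate_reader_row = {"well": "A1", "OD600": 0.42, "ts": "2017-06-01T12:00:00Z"}
incubator_row = {"temperature_celsius": 37.0, "timestamp": "2017-06-01T12:00:00Z"}

def normalize(instrument, payload):
    """Map a vendor-specific payload onto one shared record schema."""
    if instrument == "plate_reader":
        return {"instrument": instrument, "time": payload["ts"],
                "measurement": "OD600", "value": payload["OD600"]}
    if instrument == "incubator":
        return {"instrument": instrument, "time": payload["timestamp"],
                "measurement": "temperature_C", "value": payload["temperature_celsius"]}
    raise ValueError(f"no adapter for {instrument}")

# Once normalized, records from any instrument can live in one database
# and be queried, plotted, and compared side by side.
records = [normalize("plate_reader", plate_reader_row),
           normalize("incubator", incubator_row)]
```

Adding a new instrument then means writing one small adapter rather than teaching every analysis script a new format.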

Biology is not simple, and cells do not yield their secrets so readily. In a discipline where genetically identical organisms can give rise to different phenotypes, even scientific papers with the most rigorous documentation do not guarantee experimental reproducibility. Fortunately, companies are devising and implementing new tools to make biology, if not easier to engineer, simpler to understand. Biological measurements must be consistent for the field to progress. Thanks to the Internet (of all things), we are well on our way.

Niko McCarty


Science | Tech | Bioengineering PhD Student @ Caltech | Formerly SynBio at Imperial College London
