Zachary N. Russ

As readers of a technology trade magazine, you have probably heard of Moore’s Law. You may even be sick of hearing about it, and for good reason—the exponential growth of technological capabilities describes many fields, including biotechnology.

The practice of manipulating life became a technology with a common currency—where MIPS or transistor count might serve as measures of computer sophistication, sequencing cost and construct length became some of the measures for synthetic biology.

Talk of biotech has since become inundated with computing analogies: DNA as computer code, gene clusters as circuits, etc. If the analogies hold, we are currently writing the binary code by reverse-engineering poorly documented hardware held together with duct tape and gum, operating in an area with frequent brownouts, over an extraordinarily slow modem. Even so, we do see Moore’s Law-style advancements in the tools to engineer life. As a result, biotech needs to look closely at another prominent feature of the computing world: obsolescence.

Obsolescence is a concept that affects us at every level: products become obsolete, as do skills, technologies, and mindsets. Even governments and species can find themselves pushed into the pages of history or the fossil record. We adapt as best we can, learning new skills when demand for our current set declines; even politicians have to heed the call of reality now and then.

Technological development moves faster than it used to—more scientists and engineers mean more minds thinking, developing, creating. Advancements in communications and information management further amplify this effect—reinventing the wheel has been almost invented out of existence. What happens with your computer—a new version every year, faster, more efficient, more powerful—is now the rule for biotech research.

Sequencing of single genomes and proteins, once an appropriate subject for doctoral theses, is now a service done inexpensively in a few weeks of work. The move from groundbreaking to routine seems to be happening faster every day, and this means that the value of time is ever-increasing, each minute more valuable than the last.

The same amount of time buys you more and more information the longer you wait, just as waiting gets you more features on your new laptop for the same price. There’s just one catch: you can’t afford to wait, especially as the competition continues to grow.

In research, as in anything else, we want to maximize results generated per unit of effort expended. We can’t know beforehand whether a particular avenue of research will yield valuable results or not, but there are some considerations that may help:

1. Literature search. This is why communications technology and data repositories are so valuable: Every piece of information acquired is another experiment that doesn’t have to be done—if you trust the results.

2. Quick experiments. If there’s a fast way to rule out a particular idea, do that first. With so many alternatives, erroneously excluding a project is less harmful than investing heavily in a dead end.

3. Develop alternatives. Genome sequencing and DNA synthesis continue to get faster and cheaper. The life cycles of organisms and physical laws remain the same. Clinical trials don’t get shorter, either. A new technique to circumvent the fundamental limitations of an assay is both useful for a project and a valuable discovery in itself.

Examples include copying DNA without waiting for replication (PCR), highlighting cells without using antibodies or stains (GFP), and finding a model system (human cytochrome panels) or a measurable proxy (biomarkers). Making a process more amenable to automation and parallel processing is another good idea.

4. Upgrade and reevaluate regularly. New techniques and data appear at such a rate that it’s worthwhile to check whether your approach is using the latest and greatest ideas. This is especially true in synthetic biology—upgrading to brighter fluorescent proteins gave me more definitive answers in experiments, and switching plasmids to newer assembly methods promises to be quite a time-saver when it comes to building multipart libraries.

Keeping pace with the ever-quickening world of research is a challenge at every level—from the individual scientist, who must keep up to date on new techniques and develop generalized, transferable skills (and avoid being stuck with methods that are superseded in a few years), to the research groups, corporations, and institutions struggling to generate timely, relevant results to secure continued funding.

Everyone benefits from time saved by identifying dead ends and staying current, but ways around fundamental bottlenecks are supremely valuable. Portable computing worked around sluggish battery development by dropping power usage—how can we work around the life cycle of organisms? The duration of filings and clinical trials?

Just like in computing, each workaround increases the value of others—dropping the power usage of chips might bring a laptop’s battery life from two hours to four; doubling its battery capacity at this point means a workday’s worth of power on a single charge. Similarly, automation and parallel processing were two ways to address the cell growth time—but what if you didn’t need to wait for so many generations? Ultimately, the best way to outrun obsolescence is to be driving it.
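To make that compounding arithmetic concrete, here is a minimal sketch in Python. It is an illustration only: the laptop figures come from the example above, while the biotech cycle time and the gains attributed to automation, parallel processing, and shorter generation counts are assumed numbers, not data from any experiment.

```python
# Minimal sketch: independent workarounds compound multiplicatively.
# All numbers are illustrative assumptions.

def apply_factors(baseline, factors):
    """Scale a baseline quantity by each independent improvement factor."""
    result = baseline
    for factor in factors:
        result *= factor
    return result

# The laptop arithmetic from the text: a 2-hour baseline, halved power draw
# (2x runtime), then doubled battery capacity (2x again) yields roughly a
# workday on a single charge.
print(apply_factors(2, [2]))        # 4 (hours)
print(apply_factors(2, [2, 2]))     # 8 (hours)

# Hypothetical biotech analog: a 10-day build-and-grow cycle shortened by
# assumed gains from automation (x0.5), parallel processing (x0.33), and
# needing fewer generations (x0.5). Each workaround multiplies the others.
print(f"{apply_factors(10, [0.5, 0.33, 0.5]):.1f}")  # 0.8 (days)
```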
