Next-generation sequencing (NGS) is increasingly employed to dive deeper into complex biological mechanisms in a myriad of disciplines. But as with most methodologies, the quality of the input material determines the quality of the resultant data. High-quality sample preparation is the foundation of NGS, and the choices made for this step are of paramount importance to those seeking to maximize the utility of this technology.
First, the nucleic acid of interest must be extracted from the sample material, which can range from fresh samples, such as blood or tissues, to formalin-fixed, paraffin-embedded (FFPE) blocks that have been in storage for decades. The sample type can impact not only the quality of the nucleic acid but also the amount. Next comes library preparation, which readies the DNA or RNA for sequencing.
Many different commercial and noncommercial extraction and library prep solutions are available. The crucial element is choosing the right one for the sample, the molecule type, and the desired analysis to avoid introducing bias into the experiment. Fortunately, kit suppliers continue to streamline these starting steps—extraction and library preparation—to simplify workflows and reduce the risk of human error.
Established protocols, consistent results
NGS users hoping to obtain consistent results face several challenges. For example, there is a lack of standardization in various NGS sample prep tasks. These include the isolation of RNAs from liquid biopsy samples, says Robert J. Yamulla, PhD, senior marketing manager, cancer research, Illumina. He notes that this task can be carried out with various commercial kits or with noncommercial protocols such as phenol-chloroform extraction. However, the identities of the RNAs that are isolated can vary greatly depending on which kits and protocols are used. Some kits tend to bias results toward smaller or larger RNAs, whereas others may capture DNA or other unintended contaminants.
Another challenge is that following nucleic acid isolation, few library preparation solutions are linked to analysis. Different analysis methods can create variability in results and data quality.
“It is important to understand the underlying biology of your system and to consider, measure, or mitigate how a particular nucleotide isolation method or library prep kit may bias results,” Yamulla says. “My graduate school advisor stressed that if you put ‘garbage into your experiment, expect garbage results to come out.’ Controls and high-quality measurements should always be part of an experimental design, and steps like quantification via qPCR or fragment size analysis via a bioanalyzer should be considered.”
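Yamulla’s point about quantification can be made concrete with a bit of standard arithmetic. The sketch below is illustrative only, not drawn from the article or tied to any vendor’s kit; it converts a measured concentration (for example, from qPCR or a fluorometer) and an average fragment size (for example, from a bioanalyzer trace) into library molarity, and the input values are hypothetical.

```python
# Minimal sketch of a common library QC calculation (illustrative assumptions,
# not from the article): convert a measured concentration and an average
# fragment size into dsDNA library molarity before sequencing or pooling.

def library_molarity_nM(conc_ng_per_ul: float, avg_fragment_bp: float) -> float:
    """Approximate dsDNA library molarity in nM, assuming ~660 g/mol per base pair."""
    return (conc_ng_per_ul * 1e6) / (660 * avg_fragment_bp)

# Hypothetical example: 2.5 ng/uL library with a 350 bp average fragment length
print(round(library_molarity_nM(2.5, 350), 2), "nM")  # ~10.82 nM
```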
According to Yamulla, well-established workflows and kits can offer a comprehensive solution—one that starts at library prep and ends with software analysis. Such a solution can lessen the need for inference when interpreting results, increasing consistency between experiments. Yamulla adds that when the wet and dry bench steps are designed together, the benefits include workflow optimization and increased accuracy.
NGS sample prep is accommodating an expansion of sample types and use cases. The former include biofluids, FFPE blocks, and environmental samples, and the latter include single-cell, spatial, and long-read sequencing. Sample prep workflows are also becoming more efficient, with less pipetting time and simpler analyses. Overall performance is improving, but because these advances build on a foundation of well-established and accessible workflows, NGS retains versatile entry points for researchers who are new to the technology.
Automation and simplification
“Getting reliable NGS data ultimately depends on the sample you put in—especially when evaluating yield and quality,” advises Angie Cheng, director of R&D, sample preparation, Thermo Fisher Scientific. Many different sample types are used, ranging from fresh frozen tissues and blood to FFPE tissue, the most common of these. Each of these sample types poses unique yield and quality challenges.
“As we push the limits of NGS and challenge it with smaller sample inputs, for example, we will need to prepare the samples accordingly, either providing a more efficient extraction process or streamlining the back end through library prep while ensuring not to introduce bias,” Cheng says. Because sample prep processes can be manual and time-consuming, automation will be key to reducing bottlenecks and providing consistent and reliable workflows.
Many extraction and library prep solutions are available. Choosing the right one depends on a number of factors such as analysis type, sample type, and molecule type (DNA or RNA). Researchers should weigh their options carefully, especially with real-world samples, and carve out the best workflow for their needs. “Also, select the right vendor to partner with for support, troubleshooting, and a reliable supply chain,” Cheng emphasizes.
Thermo Fisher Scientific provides complete workflow solutions with automated sample extraction offerings that are validated with Ion Torrent sequencing solutions such as the Ion Torrent Genexus System. This system can automate sample and library prep, sequencing, analysis, and reporting for a seamless end-to-end workflow. In addition, Thermo Fisher Scientific has validated the KingFisher Purification System and MagMAX kits to serve the general sample preparation audience by providing solutions for a number of different analytes, sample types, and downstream applications, as well as by supporting different throughput needs.
A specialized approach for difficult samples
Sample prep results may suffer if factors such as sample quality, quantity, purity, and preservation technique are unfavorable. The capture of unique DNA/RNA molecules may be inadequate, resulting in sequencing libraries that are lacking in complexity. Ultimately, coverage depth and variant-calling sensitivity may be reduced.
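To see how these factors interact, consider the back-of-the-envelope estimate below. It is a simplified sketch using the standard Lander-Waterman relationship, not a calculation from the article, and the read counts and duplicate rate are hypothetical; it shows how duplicate reads from a low-complexity library erode effective coverage depth.

```python
# Simplified sketch (hypothetical numbers, not from the article): expected mean
# coverage from the Lander-Waterman relationship, discounted by the duplicate
# rate that a low-complexity library typically produces after deduplication.

def mean_coverage(n_reads: int, read_len_bp: int, target_bp: int,
                  duplicate_rate: float = 0.0) -> float:
    """Mean depth over the target after removing duplicate reads."""
    usable_reads = n_reads * (1.0 - duplicate_rate)
    return usable_reads * read_len_bp / target_bp

# 400 million 150 bp reads over a ~3.1 Gb human genome
print(round(mean_coverage(400_000_000, 150, 3_100_000_000), 1))        # ~19.4x
print(round(mean_coverage(400_000_000, 150, 3_100_000_000, 0.35), 1))  # ~12.6x at 35% duplicates
```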
“Choosing the right NGS sample library preparation is key,” says Ellie Juarez, PhD, NGS business unit manager, Integrated DNA Technologies (IDT). “While there are general library preps that use enzymatic or mechanical DNA fragmentation, there are also specialized library preparation workflows specifically developed to overcome a plethora of issues in sample input.”
One specialized approach is IDT’s xGen cfDNA & FFPE DNA library preparation kit. It was designed to help overcome DNA inputs that are low in quality and/or quantity—inputs such as FFPE, cfDNA, or ancient DNA. Powered by novel ligases and sequential splint ligation adapters, the kit enables the capture of the highest number of unique DNA molecules from degraded and low-abundance samples. This, coupled with adapter modifications and in-line UMIs (unique molecular identifiers), provides reduced dimer formation and error correction for diverse NGS libraries and ultralow-frequency variant calling.
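The general principle behind UMI-based error correction can be sketched in a few lines. The example below is purely illustrative of the concept and does not represent IDT’s xGen chemistry or analysis pipeline; the UMI family key and read sequences are hypothetical. Reads sharing a UMI (and alignment position) are treated as copies of one original molecule, so a per-base majority consensus suppresses PCR and sequencing errors.

```python
# Illustrative sketch of UMI-based consensus calling (not IDT's xGen workflow):
# reads that share a UMI and start position are assumed to derive from a single
# original molecule, so a per-base majority vote corrects isolated errors.
from collections import Counter

def umi_consensus(families: dict[str, list[str]]) -> dict[str, str]:
    """families maps a UMI/position key to a list of same-length read sequences."""
    consensus = {}
    for umi, seqs in families.items():
        columns = zip(*seqs)  # transpose: all observed bases at each position
        consensus[umi] = "".join(Counter(col).most_common(1)[0][0] for col in columns)
    return consensus

# Hypothetical UMI family in which the third read carries a sequencing error
family = {"ACGTTGCA:chr1:1000": ["ACCTGA", "ACCTGA", "ACCTGT"]}
print(umi_consensus(family))  # {'ACGTTGCA:chr1:1000': 'ACCTGA'}
```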
According to Juarez, most major companies have some version of a target enrichment workflow. Nevertheless, the market demands faster, better, and more sensitive approaches to complex questions in fields such as oncology, immunology, and infectious disease.
“Development of new, faster, and more flexible library prep and hybridization capture workflows will continue to power the genomics industry to make novel important discoveries well into the future,” Juarez maintains. “Whether your research focuses on decoding the genomic sequences of complex microbial communities, or identifying inherited germline SNPs from degraded samples, IDT has a portfolio of specialized xGen DNA and RNA library prep kits for epigenetics research, difficult samples, and transcriptomic experiments for a broad range of inputs.”
Kit options and improvements
“There are more choices than ever to get started with NGS,” says Betsy Young, PhD, product marketing manager for next-generation sequencing (NEBNext), New England Biolabs. “This is a dream come true for seasoned pros, but novices and infrequent users can get overwhelmed with the abundance of options.”
The trick is to choose the right library prep kit tailored to your nucleic acid of interest and available input amount. This will help to prevent issues with under- or overamplification. “Be sure to follow the kit manufacturer’s guidelines,” Young advises. “If you need to call for technical support, that is the first question you will be asked.”
A further word of caution from Young, specific to bead cleanups, is to ensure the beads do not become too dry before eluting. Different atmospheric conditions can affect how quickly beads dry, and it can be difficult to elute nucleic acids away from dried beads. Pay attention to any videos provided for a visual reference of when to elute.
The NEBNext product line offers a broad range of NGS DNA and RNA library prep kits and modules for a “right-sized approach” to library prep. The library prep kits include reagents that are needed to go from input to library, and the modules offer reagents for individual steps in larger workflows. Modules can be used in concert with each other, or to custom tailor a complete workflow.
The NEBNext ARTIC SARS-CoV-2 Library Prep Kits—available for both Illumina and Oxford Nanopore Technologies—are based on the ARTIC workflow, which is a multiplexed, amplicon-based whole-viral-genome sequencing approach out of the University of Birmingham lab led by Joshua Quick, PhD. Upstream and downstream of library prep, NEB offers Monarch nucleic acid purification products, NEBNext rRNA depletion and mRNA enrichment solutions, enzymatic fragmentation solutions, and the NEBNext qPCR-Based Library Quant Kit.
“Although the instruction manual for making an NGS library has changed little, the enzymatic toolset has changed significantly,” Young observes. “The newer enzymes and enzyme mixes have been designed for faster, more precise, and more robust activities, increasing library yields by an order of magnitude and enabling a proportional decrease in input amounts, overall workflow, and hands-on time.”
Simplifying the process
“The quality of the material and libraries that go into a sequencer directly dictates the quality of what comes out,” says Joe Mellor, PhD, co-founder and chief scientific officer of SeqWell. Challenges include human error, sensitive steps in protocols, and cost. Sequencers are expensive technologies, so laboratories seek to ensure that every run is as effective as possible, which depends on the control, quality, uniformity, and robustness of the library prep process.
SeqWell kits aim to reduce complexity and allow laboratories to implement library prep efficiently, with minimal preparation, and at a higher scale. “All the steps—from sample to sequencer—must work in conjunction with each other and work well enough on a wide variety of samples,” Mellor insists. “You do not want to repeat experiments.”
Auto-normalization is at the core of SeqWell’s three main product lines—ExpressPlex, PlexWell, and PurePlex—to minimize the number of steps needed to control sample variability. If laboratories require more autonomy or need to design their own assays, a fourth product line, Tagify, provides for custom-designed multiplexing assays.
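For context, the sketch below shows the manual pooling arithmetic that auto-normalizing chemistries are meant to eliminate. It is illustrative only and does not represent SeqWell’s method; the library names, concentrations, and target amount are hypothetical.

```python
# Sketch of manual library normalization (hypothetical values, not SeqWell's
# method): compute the volume of each library needed to contribute an equal
# molar amount to a sequencing pool. Since 1 nM = 1 fmol/uL, the volume in uL
# is the target amount in fmol divided by the concentration in nM.

def pooling_volumes(conc_nM: dict[str, float], fmol_per_library: float = 10.0) -> dict[str, float]:
    """Volume (uL) of each library to pool for equal molar representation."""
    return {name: round(fmol_per_library / c, 2) for name, c in conc_nM.items()}

libraries = {"sample_A": 12.0, "sample_B": 4.5, "sample_C": 8.0}  # concentrations in nM
print(pooling_volumes(libraries))  # {'sample_A': 0.83, 'sample_B': 2.22, 'sample_C': 1.25}
```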
As technology matures, features become more integrated. As an example, ExpressPlex takes several different reaction or reagent components and combines them in a single tube to eliminate traditional sequential processing. “You put your DNA in, close up the plate, and get a finished library out,” Mellor remarks.
The emergence of new sequencing platforms and the development of long-read sequencing technologies create a need for new library tools that take advantage of these platforms’ capabilities. “The library tools to support those platforms are in their infancy,” Mellor says. “We are also transitioning to an era where the next generation of library prep tools will be more purpose built. Rather than a one-size-fits-all approach, genomics products are being designed in a holistic way to address key application areas. The fit for ExpressPlex in synthetic biology is a great example of this.”
MaryAnn Labant focuses on the advances of science and their effects on future generations.