Feature Articles : Sep 1, 2007
Use and Support of Forensics Flourishes
Increasing Ease-of-Use and Automation Reduces Backlog
The biggest change in DNA forensics may be the level of support it is gaining. Yet automation, new techniques, and tie-ins to drug development are also bolstering its advancement, as labs work to whittle away a case backlog that, in the U.S., is estimated at 500,000 for rape cases alone.
Differential Extraction Revolution
One of the most revolutionary advances, a way to process vaginal swabs from sexual assault cases without centrifugation or vacuum filtration, is being developed by Alex Garvin, Ph.D., director of molecular diagnostics at Bureco (www.bureco.com). This method relies only on pipetting and has been confirmed, according to Dr. Garvin, as just as effective on postcoital swabs by the FSS lab in Birmingham, U.K., the lab that developed the traditional method of separating the DNA of the victim and the suspect.
Like the standard method, the process begins by digesting the epithelial cells, but not the sperm, using proteinase K. “At this point, the digest consists of two components, the unwanted victim’s DNA in solution and the suspect’s DNA in sperm heads,” Dr. Garvin elaborates. The standard method requires separation and involves centrifugation and washing.
Dr. Garvin’s method requires only a pipetting step and destroys more than 99% of the victim’s DNA while leaving the sperm DNA intact, he says. “Multiple experiments indicate that this method yields a dominant male short tandem repeat (STR) profile from postcoital swabs collected up to 40 hours later, when sperm levels are quite low. Since most women go to the police within 24 hours, most swabs that police collect should have enough sperm,” Dr. Garvin adds. He currently is developing a prototype kit for beta testing. “With this kit and a simple robotic workstation, it is now possible to go from swab cutting to PCR-ready DNA without operator input.”
In the next month or two, Promega (www.promega.com) plans to launch an automated version of Differex™, a 96-well platform differential extraction method that separates sperm from epithelial cells. “This method can extract up to 40 sexual assault samples in as little as four hours, instead of the one to two days that labs need to extract only a few samples manually,” notes Curtis Knox, product manager of Promega’s genetic identity group.
“The trend in forensics is toward automation,” Knox says. “The President’s DNA Initiative put out significant funding (more than $1 billion over five years) to crime labs to increase throughput and reduce their backlogs,” Knox notes, but “labs are rarely allowed to increase their headcount. So whenever a lab can increase throughput with instruments or chemistries, it allows them to refocus their personnel.”
Promega, therefore, has put many of its chemistries on 96-well platforms for high-throughput extraction at large database labs.
Most of the labs, though, are smaller casework labs. They deal with fewer, but more varied, samples. “They need an automated, low- to medium-throughput system,” Knox says. Promega introduced the Maxwell® 16 about a year ago to crime labs, after earlier introductions to clinicians and academic researchers. It features preloaded protocols for DNA, RNA, and protein and processes 1–16 predispensed cartridges per run, “so you’re not wasting reagents,” Knox adds. “Processing time is 18–30 minutes, depending on what you’re doing, with virtually no hands-on time, and it runs in forensics or research mode.”
Another advance is the real-time quantitative PCR kit called Plexor™ HY, which combines quantitative assays for total human and male DNA in one kit. “Forensic DNA samples regularly contain mixtures of female and male DNA,” Knox explains. Rather than running two qPCR assays to get all the information you need, Plexor HY provides the information in one assay.
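The payoff of quantifying total human and male DNA in a single assay is the simple arithmetic it enables downstream. As a rough illustration only, not Promega’s actual chemistry or analysis software, the interpretation step might look like the following Python sketch; the function name, thresholds, and concentrations are all hypothetical:

```python
def interpret_quant(total_ng_ul: float, male_ng_ul: float) -> str:
    """Interpret a dual qPCR result (hypothetical logic, for illustration).

    total_ng_ul: total human DNA concentration from an autosomal target
    male_ng_ul:  male DNA concentration from a Y-chromosome target
    """
    if total_ng_ul <= 0:
        return "no human DNA detected"
    if male_ng_ul <= 0:
        return "female DNA only"
    male_fraction = male_ng_ul / total_ng_ul
    if male_fraction >= 0.95:
        return "essentially single-source male"
    return f"mixture: ~{male_fraction:.0%} male DNA"

# Hypothetical sexual-assault extract: mostly victim (female) DNA
print(interpret_quant(total_ng_ul=2.0, male_ng_ul=0.1))  # mixture: ~5% male DNA
```

In a casework setting, a result like this tells the analyst in one pass whether a sample is worth amplifying with autosomal STRs, needs Y-STR typing instead, or contains no male DNA at all, which is the time savings Knox describes.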
Advances like these are great, but, “the biggest change in the past few years is the environment,” says James W. Schumm, Ph.D., vp and CSO at Bode Technology (www.bodetech.com). U.S. states and the federal government are increasing their support of forensics profiling and, he says, labs are getting new technology, training, and in some cases, new facilities.
Despite this, Dr. Schumm reports, “the backlog hasn’t lessened.” His lab has an additional challenge. Unlike the U.K., which has a generally centralized approach, the U.S. has no such standardized approach. “We handle about 200,000 samples per year from about 20 clients and nearly all have different approaches to collecting and interpreting samples.” To combine high-throughput sample-preparation analysis with customized approaches, Bode is developing an artificial intelligence approach that combines automation with information technology.
First, Dr. Schumm says, Bode automated liquid handling, which was complicated by the variables in the types of samples themselves and the materials on which they were found.
“Now, we are looking at pattern recognition and data quality evaluation,” he reports, to help researchers focus on the best quality samples by identifying those that are under- or overamplified, for example.
Bode is also developing a crime scene collector kit to offer “an easy way to store and dry a sample.” It also is developing the Bodepen, which records content as it is written and wirelessly uploads it into a computer to get data from the field to the lab quickly.
The push toward automation extends to the FBI’s forensics lab, along with projects to increase assay sensitivity. Richard Guerrieri, chief of the FBI’s DNA analysis unit I (nuclear DNA), says “There are a lot of parallels between the mtDNA and nuclear DNA units in terms of automation.” His unit is focused on two categories—the national DNA data bank and crime lab functions.
The lab underwent process-reengineering in 2003. It consequently automated DNA quantitation, moving from single samples to 96-well plates. The next step, amplifying the sample in a 96-well format, is being validated now. The last step will introduce the amplified crime lab samples to a 16-capillary genetic analyzer. A 48-capillary format will be used for samples for the national DNA database, Guerrieri says, because of their higher quality.
His counterpart, Alice Isenberg, Ph.D., chief of DNA analysis Unit II (mtDNA), explains, “You can’t get a positive ID with mtDNA,” yet that’s the approach that must be used when nuclear DNA isn’t available.
“We’re focused on increasing the speed of analysis, but degradation is the limiting factor. We’re hoping in the near future to analyze mixtures,” Dr. Isenberg says. Right now, her unit is validating a method to determine base compositions of DNA using mass spectrometry. With that methodology, two DNA profiles can be separated at a cost of about $100 per sample, versus the $800 the current method costs for the materials alone. “We hope to have it online in the next few months.”
The downside is that the method doesn’t map gene sequences, “but,” Dr. Isenberg says, “that doesn’t negatively impact the outcome.”
Work undertaken by Phillip Danielson, Ph.D., associate professor of biology at the University of Denver, may help. Dr. Danielson and his team are developing two promising methods of mitochondrial DNA (mtDNA) analysis. The most advanced, Flipars®, is a software prototype that has elicited interest from Transgenomic (www.transgenomic.com). It has just completed its forensic validation study. It allows mixed mtDNA to be separated with “more than 99.99 percent accuracy,” Dr. Danielson says.
Early tests in Dr. Danielson’s lab showed that only 25–30% of mixed mtDNA samples could be separated with 100% accuracy, he says. “Therefore, we looked at partial separations to determine the linkage phase.” In comparing two mtDNA samples, the mixed bases are enriched or depleted together, so samples with only two differences in their bases have overlapping electrophoretic peaks. By teasing them apart, one can look at the subtle shifts in overlapping peaks.
In analyzing 27,000 mixtures, comparing changes in known DNA versus changes in electropherogram peak heights, accuracy was “99.99 percent or higher, in most cases,” Dr. Danielson says. “For the method to be effective, a minimum seven percent change is needed, but we usually see a change of at least 30 percent.”
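The core inference described above—that linked variants are enriched or depleted together—can be sketched in a few lines. This is a toy illustration, not Dr. Danielson’s Flipars software; the function, input format, and peak fractions are invented, and only the seven percent minimum shift comes from the article:

```python
MIN_SHIFT = 0.07  # minimum fractional change the method needs (per the article)

def linkage_phase(before: tuple, after: tuple) -> str:
    """Infer whether two mixed mtDNA positions are linked, from the shift
    in their minor-variant peak fractions after a partial separation.

    before/after: minor-variant fraction at (position 1, position 2).
    """
    d1 = after[0] - before[0]
    d2 = after[1] - before[1]
    if abs(d1) < MIN_SHIFT or abs(d2) < MIN_SHIFT:
        return "inconclusive: shift below detection threshold"
    # Variants on the same molecule are enriched or depleted together,
    # so their peak fractions shift in the same direction.
    return "linked (same molecule)" if d1 * d2 > 0 else "unlinked (different molecules)"

# Hypothetical ~30% shifts that co-vary -> same contributor's mtDNA
print(linkage_phase(before=(0.40, 0.45), after=(0.70, 0.78)))  # linked (same molecule)
```

The point of the partial-separation trick is visible here: even when the two contributors’ sequences cannot be fully resolved, the sign of the correlated shift is enough to assign each variant base to one contributor or the other.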
The other advance provides fast, inexpensive, presumptive mtDNA identity testing. For about $100 and 10–15 minutes, the same technology can determine whether a sample from a crime scene and a reference sample from a suspect actually match. That advance, now in the final stages of forensic validation, will make it possible for police to quickly and inexpensively exclude suspects.
The increased capability and capacity are greatly needed. Since 2005, those arrested on federal charges have had their DNA added to the national database, adding roughly 250,000 samples per year to the approximately 150,000 already collected annually from convicted offenders. As samples from non-U.S. citizens are added, the potential caseload increases by about one million samples per year, along with the opportunities to link criminals to unsolved crimes.
Forensics has always been based on linkages. Now those linkages connect field investigators, labs, biochemistry, and biotechnology. Myriad Genetics (www.myriad.com) is a good example. It is not involved with forensics today, but it was. When Myriad was building its business on hereditary cancer genetics, it used forensics to absorb its lab’s extra capacity and to generate cash flow. Since then, with the primary business established, forensics’ role diminished and was eventually relinquished, spokesman Bill Hockett explains.
Links to Drug Development
DNAPrint Genomics (www.dnaprint.com) sees forensics and drug development as complementary endeavors. Recently, the company began tests with various cancer therapeutics to determine favorable or adverse reactions. The first part of that program was to be able to determine eye color from forensics data and thus identify markers that were relevant to predicting cancer therapeutics’ success.
While the ability to accurately predict eye color from DNA advances DNAPrint’s drug and assay pipeline, it is also proving of immediate value to law enforcement.
The resultant product, called Retinome®, is available only to law enforcement and is used to develop a general profile of a given suspect. Its database contains about 3,000 eye photos and images of 6,000 volunteers, so police can view photos of individuals whose DNA scores match the sample. It is 92% accurate, according to DNAPrint.
In a related project, called DNA Witness, DNAPrint has refined its technology to allow scientists to identify percentages of sub-Saharan, East African, Southeast Asian, and northern European gene lines from a DNA sample and to drill down further to separate northern and southern European and Middle East genetics. For forensics work, DNA Witness uses 176 SNPs to determine the general area from which a suspect genetically hails. If European, a 140 SNP panel, called EuroWitness, is then run to further refine the results.
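Ancestry-informative SNP panels like the ones described work by comparing a sample’s alleles against reference allele frequencies for each population. The following is a deliberately simplified sketch of that idea, not DNAPrint’s proprietary method: a grid-search maximum-likelihood estimate of two-population admixture, with made-up allele frequencies for a four-SNP toy panel (a real panel, per the article, uses 176 SNPs):

```python
import math

# Hypothetical allele frequencies for an ancestry-informative SNP panel:
# frequency of the "1" allele in each reference population (invented numbers).
FREQ = {
    "pop_A": [0.90, 0.10, 0.80, 0.20],
    "pop_B": [0.10, 0.85, 0.15, 0.90],
}

def estimate_admixture(genotypes, pops=("pop_A", "pop_B"), steps=100):
    """Grid-search MLE of the proportion of pop_A ancestry, assuming
    independent biallelic SNPs; genotypes coded as 0/1/2 copies of allele 1."""
    best_p, best_ll = 0.0, -math.inf
    for i in range(steps + 1):
        p = i / steps  # candidate fraction of ancestry from pop_A
        ll = 0.0
        for g, fa, fb in zip(genotypes, FREQ[pops[0]], FREQ[pops[1]]):
            f = p * fa + (1 - p) * fb  # allele frequency under admixture p
            prob = {0: (1 - f) ** 2, 1: 2 * f * (1 - f), 2: f ** 2}[g]
            ll += math.log(prob)
        if ll > best_ll:
            best_p, best_ll = p, ll
    return best_p

# A genotype carrying mostly pop_A-typical alleles
print(f"estimated pop_A ancestry: {estimate_admixture([2, 0, 2, 0]):.2f}")  # 1.00
```

With 176 well-chosen SNPs instead of four, the likelihood surface becomes sharp enough to distinguish continental ancestry proportions, which is what lets a panel like DNA Witness narrow an investigative pool.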
Richard Gabriel, CEO and president at DNAPrint, says this is just one more tool to provide “more quantitative data to help solve a crime.” To illustrate its value, he cited a case in Colorado in which a woman “in a fairly homogenous community was beaten and raped.” Investigators looked at the high school student body and football team and her immediate family as potential suspects.
“They hadn’t looked at Hispanics,” he says, “yet the DNA evidence indicated the suspect was primarily Hispanic, based upon the mixture of Native American, Southeast Asian, and sub-Saharan traits in the DNA evidence. That allowed some suspects to be eliminated immediately and pointed investigators in a direction they hadn’t previously considered.”
Such advances are needed because, as Dr. Schumm says, the backlog is not decreasing and the databases are growing. “Anybody convicted of a felony is in the database, but some states are adding samples from those arrested, but not convicted, for a felony.” At the same time, the industry is moving from STRs toward SNPs. The U.S. National Institute of Justice is now taking both, but “it will be a decade before SNPs replace STRs,” he adds. The challenge is that SNPs are more expensive and need more loci for evaluation. Eventually, Dr. Schumm predicts that SNPs, STRs, and mtDNA can be analyzed on one chip, but that will be a while.
Phillip Danielson, Ph.D., and colleagues at the University of Denver are working on technology for analyzing mtDNA including Flipars®, which separates out mixed mtDNA.
© 2012 Genetic Engineering & Biotechnology News, All Rights Reserved