Advances like these are great, but, “the biggest change in the past few years is the environment,” says James W. Schumm, Ph.D., vp and CSO at Bode Technology (www.bodetech.com). U.S. states and the federal government are increasing their support of forensics profiling and, he says, labs are getting new technology, training, and in some cases, new facilities.
Despite this, Dr. Schumm reports, “the backlog hasn’t lessened.” His lab has an additional challenge. Unlike the U.K., which takes a generally centralized approach, the U.S. has no such standardization. “We handle about 200,000 samples per year from about 20 clients, and nearly all have different approaches to collecting and interpreting samples.” To combine high-throughput sample preparation and analysis with customized approaches, Bode is developing an artificial intelligence strategy that combines automation with information technology.
First, Dr. Schumm says, Bode automated liquid handling, which was complicated by the variables in the types of samples themselves and the materials on which they were found.
“Now, we are looking at pattern recognition and data quality evaluation,” he reports, to help researchers focus on the best quality samples by identifying those that are under- or overamplified, for example.
Bode is also developing a crime scene collector kit to offer “an easy way to store and dry a sample.” It is also developing the Bodepen, which records notes as they are written and wirelessly uploads them to a computer, moving data from the field to the lab quickly.
The push toward automation extends to the FBI’s forensics lab, along with projects to increase assay sensitivity. Richard Guerrieri, chief of the FBI’s DNA analysis unit I (nuclear DNA), says “There are a lot of parallels between the mtDNA and nuclear DNA units in terms of automation.” His unit is focused on two categories—the national DNA data bank and crime lab functions.
The lab underwent process reengineering in 2003. It consequently automated DNA quantitation, moving from single samples to 96-well plates. The next step, amplifying the sample in a 96-well format, is being validated now. The last step will introduce the amplified crime lab samples to a 16-capillary genetic analyzer. A 48-capillary format will be used for samples for the national DNA database, Guerrieri says, because of their higher quality.
His counterpart, Alice Isenberg, Ph.D., chief of DNA analysis Unit II (mtDNA), explains, “You can’t get a positive ID with mtDNA,” yet that’s the approach that must be used when nuclear DNA isn’t available.
“We’re focused on increasing the speed of analysis, but degradation is the limiting factor. We’re hoping in the near future to analyze mixtures,” Dr. Isenberg says. Right now, her unit is validating a method to determine base compositions of DNA using mass spectrometry. With that methodology, two DNA profiles can be separated at a cost of about $100 per sample, versus the $800 the current method costs for the materials alone. “We hope to have it online in the next few months.”
The downside is that the method doesn’t map gene sequences, “but,” Dr. Isenberg says, “that doesn’t negatively impact the outcome.”
Work undertaken by Phillip Danielson, Ph.D., associate professor of biology at the University of Denver, may help. Dr. Danielson and his team are developing two promising methods of mitochondrial DNA (mtDNA) analysis. The most advanced, Flipars®, is a software prototype that has elicited interest from Transgenomic (www.transgenomic.com). The method, which has just completed its forensic validation study, allows mixed mtDNA to be separated with “more than 99.99 percent accuracy,” Dr. Danielson says.
Early tests in Dr. Danielson’s lab showed that only 25–30% of mixed mtDNA samples could be fully separated with 100% accuracy, he says. “Therefore, we looked at partial separations to determine the linkage phase.” When two mtDNA sequences are compared, the mixed bases are enriched or depleted together, so samples that differ at only two bases produce overlapping electrophoretic peaks. By teasing the mixtures apart, one can track the subtle shifts in those overlapping peaks.
In analyzing 27,000 mixtures, comparing changes in known DNA versus changes in electropherogram peak heights, accuracy was “99.99 percent or higher, in most cases,” Dr. Danielson says. “For the method to be effective, a minimum seven percent change is needed, but we usually see a change of at least 30 percent.”
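The linkage-phase idea can be illustrated with a toy calculation. This sketch is not the Flipars algorithm—all function names, data, and thresholds except the quoted seven percent figure are hypothetical—but it shows the underlying logic: if the peak-height fractions at two mixed positions shift in the same direction across partial separations, the variant bases likely belong to the same contributor.

```python
# Illustrative sketch only (not Dr. Danielson's actual method): inferring
# linkage phase from co-varying peak heights at two mixed mtDNA positions.

def base2_fractions(peaks):
    """peaks: list of (height_base1, height_base2) tuples at one position,
    one tuple per partial-separation fraction. Returns the share of the
    signal coming from base 2 in each fraction."""
    return [b / (a + b) for a, b in peaks]

def in_phase(pos1_peaks, pos2_peaks, min_shift=0.07):
    """Call two positions 'in phase' (same contributor carries both
    variants) when their base-2 fractions shift in the same direction
    across fractions by at least min_shift -- the ~7% minimum change
    quoted in the article."""
    f1 = base2_fractions(pos1_peaks)
    f2 = base2_fractions(pos2_peaks)
    d1 = f1[-1] - f1[0]          # net shift at position 1
    d2 = f2[-1] - f2[0]          # net shift at position 2
    return abs(d1) >= min_shift and abs(d2) >= min_shift and d1 * d2 > 0

# One contributor is enriched in the second fraction, so both of that
# contributor's variant bases rise together:
pos1 = [(70, 30), (40, 60)]      # base-2 share: 0.30 -> 0.60
pos2 = [(65, 35), (35, 65)]      # base-2 share: 0.35 -> 0.65
print(in_phase(pos1, pos2))      # True: shifts co-vary, variants linked
```

If the shifts moved in opposite directions, the variants would belong to different contributors and `in_phase` would return `False`; shifts below the seven percent threshold are treated as uninformative.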
The other advance provides fast, inexpensive, presumptive mtDNA identity testing. For about $100 and 10–15 minutes, the same technology can determine whether a sample from a crime scene and a reference sample from a suspect actually match. That advance, now in the final stages of forensic validation, will make it possible for police to quickly and inexpensively exclude suspects.
The increased capability and capacity are greatly needed. Since 2005, those arrested on federal charges have had their DNA added to the national database, adding about 250,000 samples to the roughly 150,000 collected annually from convicted offenders. As samples from non-U.S. citizens are added, the potential caseload grows by about one million samples per year, along with the opportunities to link criminals to unsolved crimes.
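The scale of that growth is easy to tally from the approximate figures quoted above (all numbers are the article's rough estimates, not official counts):

```python
# Rough annual-throughput arithmetic using the approximate figures
# quoted in the article.
convicted_offenders = 150_000    # prior annual sample volume
federal_arrestees = 250_000      # added annually since 2005
non_us_citizens = 1_000_000      # potential additional samples per year

current_load = convicted_offenders + federal_arrestees
print(current_load)                    # -> 400000
print(current_load + non_us_citizens)  # -> 1400000
```

In other words, the database workload has already more than doubled, and could approach 1.4 million samples per year if the potential caseload materializes.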
Forensics has always been based on linkages. Now those linkages connect field investigators, labs, biochemistry, and biotechnology. Myriad Genetics (www.myriad.com) is a good example. It is not involved with forensics today, but it once was. When Myriad was building its business on hereditary cancer genetics, it used forensics to absorb its lab’s extra capacity and generate cash flow. Since then, with the primary business established, forensics’ role diminished and was eventually relinquished, spokesman Bill Hockett explains.