Genetic science has advanced significantly in recent years, and it is changing how crimes are investigated in the US and elsewhere.
DNA analysis was first used to identify criminal suspects about 30 years ago, and the first conviction secured with the help of DNA evidence came in 1986.
It’s easy to assume from watching TV crime dramas that DNA evidence is irrefutable, because that’s how it’s portrayed in fictional courtrooms. The Guardian interviewed Professor Shari Forbes of the Centre for Forensic Science at the University of Technology, Sydney about this:
The problem we find is that juries increasingly expect DNA to be collected from every single crime scene, and when it’s not, either because it can’t be found or it wasn’t required, we end up spending a lot of time explaining why.
Forbes also noted that members of the public who serve on juries often assume that if no DNA was found at a crime scene, the perpetrator was never there.
So it might surprise many that there is a controversy regarding the accuracy of two particular DNA analysis methods.
Until recently, forensic DNA laboratories would only work with DNA samples larger than a few hundred picograms (a picogram is one trillionth of a gram). Smaller quantities of DNA are more difficult to test, especially with older DNA analysis methods.
Beginning in 1999, the UK’s Forensic Science Service used a DNA profiling technique called Low Copy Number, which can analyze DNA samples as small as 100 picograms.
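To get a sense of how little DNA that is, here is a rough back-of-the-envelope sketch. It assumes the commonly cited figure of about 6.6 picograms of nuclear DNA per diploid human cell, which is not a number given in the article:

```python
# Back-of-the-envelope scale of the DNA quantities mentioned above.
# Assumption (not from the article): a single diploid human cell holds
# roughly 6.6 picograms of nuclear DNA.

PG_PER_CELL = 6.6      # approximate picograms of DNA per diploid cell
PG_PER_GRAM = 1e12     # a picogram is one trillionth of a gram

def cells_needed(picograms: float) -> float:
    """Approximate number of cells required to supply this much DNA."""
    return picograms / PG_PER_CELL

# The traditional "few hundred picograms" floor vs. the 100 pg Low Copy Number limit.
for pg in (300, 100):
    print(f"{pg} pg ≈ {pg / PG_PER_GRAM:.0e} g ≈ {cells_needed(pg):.0f} cells")
```

On those assumptions, a few hundred picograms corresponds to the DNA of only a few dozen cells, and 100 picograms to roughly fifteen.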
The DNA laboratory in the office of New York City’s chief medical examiner has introduced two DNA profiling techniques designed to analyze even smaller DNA samples.
Dr Theresa A Caragine, a forensic scientist at the lab, developed the high-sensitivity testing method and implemented it in 2006. After several years of experience with that method, Caragine and Dr Adele A Mitchell created the Forensic Statistical Tool (FST), specialized forensic DNA analysis software. Both methods are still used to test very small DNA samples, as well as samples that may contain genetic material from more than one person. That is a far cry from how forensic DNA testing was done in the 1990s, or even in the first decade of the 21st century.
As the lab’s former director, Dr Mechthild Prinz, put it in 2009:
A couple of years ago, DNA testing was limited to body fluids – semen, blood, and saliva. Now every laboratory in the country routinely receives swabs from guns.
Semen, blood, and saliva provide much larger DNA samples than the traces of skin sebum or sweat left on objects. And tiny DNA traces found on objects are far more likely to be mixed with DNA from other people.
The forensic analysis of very tiny amounts of DNA is a difficult area. According to a report from Promega, a biotechnology firm:
Every lab faces samples with low amounts of DNA. Laboratories and DNA analysts need to choose whether or not to attempt an ‘enhanced interrogation technique’ such as increasing the cycle number, desalting samples or higher CE (capillary electrophoresis) injection. If such an approach is taken, validation studies need to be performed to develop appropriate interpretation guidelines and to assess the degree of variation that can be expected when analyzing low amounts of DNA.
Deciding where to stop testing or interpreting data can be challenging. Some laboratories stop testing based on a certain amount of input DNA, using validation data to underpin a quantitation threshold. Others set stochastic thresholds that are used during data interpretation to decide what STR-typing data are reliable (ie, are not expected to have allelic drop-out at that locus).
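To make the idea of a stochastic threshold concrete, here is a minimal, simplified sketch of the kind of per-locus check such interpretation guidelines describe. The 150 RFU threshold, the loci shown, and the peak heights are illustrative assumptions, not any laboratory’s actual protocol:

```python
# Simplified illustration of a stochastic threshold applied to STR-typing data.
# Peak heights are in relative fluorescence units (RFU); the 150 RFU threshold
# and the sample data below are invented for illustration.

STOCHASTIC_THRESHOLD_RFU = 150

# Hypothetical peaks observed at a few STR loci: (allele label, peak height).
profile = {
    "D3S1358": [(15, 820), (17, 790)],
    "vWA":     [(16, 140)],             # single low peak
    "FGA":     [(22, 610), (24, 95)],   # one peak well below threshold
}

for locus, peaks in profile.items():
    low_peaks = [allele for allele, height in peaks if height < STOCHASTIC_THRESHOLD_RFU]
    if low_peaks:
        # Below the threshold, a partner allele may have dropped out entirely,
        # so the genotype at this locus can't be treated as reliably complete.
        print(f"{locus}: possible allelic drop-out (low alleles: {low_peaks})")
    else:
        print(f"{locus}: all peaks above threshold; typing treated as reliable")
```

The real decision rules are far more involved, but the basic idea is the same: below a validated peak-height cutoff, a lab cannot assume it has seen everything that was in the sample.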
Both the high-sensitivity testing and Forensic Statistical Tool methodologies are now being challenged in court. A group of defense lawyers has asked the New York State inspector general’s office to launch an inquiry into the thousands of criminal cases in which New York City’s DNA lab applied the two methods.
Because the lab uses cutting-edge techniques, it also tests DNA samples provided by police departments across the United States, not just in New York. On September 1, the Legal Aid Society and the Federal Defenders of New York alleged that the medical examiner’s office in New York
… has engaged in negligent conduct that undermines the integrity of its forensic DNA testing and analysis.
Dr Eli Shapiro, the lab’s former mitochondrial DNA technical leader, wrote to The New York Times saying he had retired early because of the stress of having to approve lab reports generated by the Forensic Statistical Tool. He has said in court that he finds the tool’s process “very disturbing”.
The Legal Aid Society and the Federal Defenders of New York have contested two specific criminal cases heard between 2012 and 2014. Both cases involved the Forensic Statistical Tool, and in both, the defense was denied access to the tool’s source code.
Dr Bruce Budowle, who helped design the FBI’s national DNA database, believes that the New York lab’s statistical methods are “not defensible”. According to Budowle, the FST was built on the incorrect assumption that every DNA mixture of a given size loses information, or picks up contamination, in just the same way. He said:
Five-person mixtures can look like three-person. Four contributors can look like two-person mixtures. It’s almost impossible to actually be accurate.
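Budowle’s point about contributor numbers can be illustrated with a quick simulation. Each person adds at most two alleles per STR locus, but contributors often share alleles, so the number of distinct alleles actually observed can understate how many people are in the mixture. The allele set and equal frequencies below are invented for illustration; real loci behave differently:

```python
# Monte Carlo sketch: shared alleles can make a five-person mixture
# look, at a single locus, like it could have come from three people.
import random

random.seed(1)
ALLELES = list(range(12, 20))   # eight hypothetical allele labels at one locus

def distinct_alleles(n_contributors: int) -> int:
    """Distinct alleles visible when n people contribute to a mixture."""
    observed = set()
    for _ in range(n_contributors):
        observed.update(random.choices(ALLELES, k=2))  # two alleles per person
    return len(observed)

TRIALS = 10_000
for n in (3, 5):
    counts = [distinct_alleles(n) for _ in range(TRIALS)]
    at_most_six = sum(c <= 6 for c in counts) / TRIALS  # 6 = max from 3 people
    print(f"{n} contributors: {at_most_six:.0%} of trials show no more than 6 alleles")
```

Even in this toy model, a large share of five-person mixtures show no more alleles than three people could account for, which is exactly the ambiguity Budowle describes.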
The FST’s developers have acknowledged a 30% margin of error in their method of quantifying the amount of DNA in a sample, but they still stand behind the accuracy of their software.
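To put that 30% figure in perspective, here is a trivial arithmetic sketch; the 100 pg estimate is an arbitrary example, not a value from the case:

```python
# What a ±30% margin of error in DNA quantification means in practice.
estimated_pg = 100   # arbitrary example estimate, in picograms
margin = 0.30

low, high = estimated_pg * (1 - margin), estimated_pg * (1 + margin)
print(f"A reported {estimated_pg} pg could really be anywhere from {low:.0f} to {high:.0f} pg")
# An error of that size can move a sample across the low-template
# boundaries discussed earlier (a few hundred pg, or the 100 pg limit).
```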
It seems that New York’s criminal courts may have been too hasty in accepting new DNA forensics methodologies whose accuracy has yet to be reliably proven. If so, there could be thousands of people in American prisons who were wrongly convicted on the strength of forensic DNA technologies that weren’t properly studied before they were deployed.