
Fears raised about accuracy of new forensic DNA techniques

Lawyers are challenging convictions in which DNA analysis played a part, claiming that a new tool isn't reliable

Genetic science has progressed significantly in recent years, and it influences how crime is being investigated in the US and elsewhere.

DNA analysis started being used to identify crime suspects about 30 years ago, with the first conviction based on DNA evidence coming in 1986.

It’s easy to assume from watching TV crime dramas that DNA evidence is irrefutable, because that’s how it’s portrayed in fictional criminal courts. The Guardian interviewed Professor Shari Forbes of the Centre for Forensic Science at the University of Technology, Sydney, about this matter:

The problem we find is that juries increasingly expect DNA to be collected from every single crime scene, and when it’s not, either because it can’t be found or it wasn’t required, we end up spending a lot of time explaining why.

Forbes also mentioned that members of the public who serve on juries will often assume that if no DNA was found at a crime scene, the perpetrator wasn’t there.

So it might surprise many that there is a controversy regarding the accuracy of two particular DNA analysis methods.

Until recently, forensic DNA laboratories would only use DNA samples larger than a few hundred picograms. A picogram is one trillionth of a gram. Smaller quantities of DNA are more difficult to test, especially with older DNA analysis methodologies.

Since 1999, the UK Forensic Science Service has used a DNA profiling technique called Low Copy Number, which can analyze DNA samples as small as 100 picograms.

The DNA laboratory in the office of New York City’s chief medical examiner has introduced two DNA profiling techniques designed to analyze even smaller DNA samples.

The lab’s Dr Theresa A Caragine, a forensic scientist, developed the high-sensitivity testing method and implemented it in 2006. After several years of experience with that method, Caragine and Dr Adele A Mitchell created the Forensic Statistical Tool (FST), specialized forensic DNA analysis software. Both methods are still used to test very small DNA samples, as well as samples that might contain genetic material from more than one person. That’s not how forensic DNA testing was done in the 1990s, or even in the first decade of the 21st century.

According to the lab’s former director, Dr Mechthild Prinz, in 2009:

A couple of years ago, DNA testing was limited to body fluids – semen, blood, and saliva. Now every laboratory in the country routinely receives swabs from guns.

Semen, blood, and saliva provide much larger DNA samples than the traces of skin sebum or sweat left on objects. What’s more, tiny DNA traces found on objects are far more likely to be mixed with DNA from other people.

The forensic analysis of very tiny amounts of DNA is a difficult area. According to a report from Promega, a biotechnology firm:

Every lab faces samples with low amounts of DNA. Laboratories and DNA analysts need to choose whether or not to attempt an ‘enhanced interrogation technique’ such as increasing the cycle number, desalting samples or higher CE (capillary electrophoresis) injection. If such an approach is taken, validation studies need to be performed to develop appropriate interpretation guidelines and to assess the degree of variation that can be expected when analyzing low amounts of DNA.

Deciding where to stop testing or interpreting data can be challenging. Some laboratories stop testing based on a certain amount of input DNA, using validation data to underpin a quantitation threshold. Others set stochastic thresholds that are used during data interpretation to decide what STR-typing data are reliable (ie, are not expected to have allelic drop-out at that locus).
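
To make those two kinds of threshold concrete, here is a minimal sketch in Python. The threshold values, peak heights and locus names are illustrative assumptions only, not figures from any laboratory’s validation studies:

# Illustrative sketch only: all numbers below are made up, not taken from
# any laboratory's validation data.

QUANT_THRESHOLD_PG = 100        # minimum input DNA (picograms) before attempting analysis
STOCHASTIC_THRESHOLD_RFU = 200  # peak height below which allelic drop-out is plausible

def should_attempt_typing(input_dna_pg):
    """Quantitation threshold: decline to test samples with too little DNA."""
    return input_dna_pg >= QUANT_THRESHOLD_PG

def reliable_loci(profile):
    """Stochastic threshold: keep only loci whose every peak is tall enough
    that allelic drop-out is not expected at that locus."""
    return [
        locus for locus, peak_heights in profile.items()
        if all(height >= STOCHASTIC_THRESHOLD_RFU for height in peak_heights)
    ]

# A hypothetical two-locus STR profile in which one locus has a weak peak.
profile = {"D3S1358": [850.0, 790.0], "TH01": [150.0, 620.0]}
if should_attempt_typing(input_dna_pg=120.0):
    print(reliable_loci(profile))   # prints ['D3S1358']; TH01 is set aside as unreliable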

Both the high-sensitivity testing and Forensic Statistical Tool methodologies are now being legally contested. A group of defense lawyers have asked the New York State inspector general’s office to launch an inquiry into thousands of criminal cases that have used the methodologies in New York City’s DNA lab.

Because the lab uses cutting-edge techniques, it also tests DNA samples submitted by police departments across the United States, not just in New York. On September 1, the Legal Aid Society and the Federal Defenders of New York alleged that the medical examiner’s office in New York

… has engaged in negligent conduct that undermines the integrity of its forensic DNA testing and analysis.

Dr Eli Shapiro, the former mitochondrial DNA technical leader in the DNA lab, wrote to The New York Times saying that he had retired early due to the stress of having to approve lab reports generated by the Forensic Statistical Tool. He has said in court that he finds the Forensic Statistical Tool’s process to be “very disturbing”.

The Legal Aid Society and the Federal Defenders of New York have contested two specific criminal cases heard in court between 2012 and 2014. Both cases involved the Forensic Statistical Tool, and in both the defense was denied access to the tool’s source code.

Dr Bruce Budowle, who helped design the FBI’s national DNA database, believes that the New York lab’s statistical methods are “not defensible”. According to Budowle, the FST was designed with the incorrect assumption that every DNA mixture of the same size was missing information or had been contaminated in just the same way. He said:

Five-person mixtures can look like three-person. Four contributors can look like two-person mixtures. It’s almost impossible to actually be accurate.

FST’s developers have acknowledged a margin of error of 30% in their method of quantifying the amount of DNA in a sample. But they still stand behind the accuracy of their software.
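
To see what a 30% quantification error can mean in practice, here is a small, purely illustrative calculation; the measured value is invented, and the 100-picogram working limit simply echoes the Low Copy Number figure mentioned earlier:

# Back-of-the-envelope illustration; the measured value is invented.
measured_pg = 120.0   # what the quantification step reports
error = 0.30          # the acknowledged +/-30% margin of error

low_estimate = measured_pg * (1 - error)    # 84 pg
high_estimate = measured_pg * (1 + error)   # 156 pg

# Against a nominal 100 pg working limit, the same measurement is consistent
# both with having enough DNA and with falling short of it.
print(f"True amount could be anywhere from {low_estimate:.0f} to {high_estimate:.0f} pg")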

It seems that New York’s criminal courts may have been too hasty in accepting new DNA forensics methodologies whose accuracy has yet to be proven reliable. If so, there could be thousands of people in American prisons who were wrongly convicted on the strength of forensic DNA technologies that weren’t properly studied before they were deployed.


10 Comments

What exactly does this 30% error margin apply to? Because that is a very high error margin. I mean, what other things are based on how they quantify a DNA sample? If they get it wrong 3 out of 10 times how did this ever get used to convict someone?


Read it more carefully: “FST’s developers have acknowledged a margin of error of 30% in their method of quantifying the amount of DNA in a sample.”
The 30% applies to their guesstimate of how much DNA they have to work with, not to the results of any DNA analysis.


I did read it carefully. That is exactly why I asked what this margin applies to, and where this “quantification of DNA samples” comes into effect. If it’s just “30% of the time our method doesn’t work, i.e. doesn’t return any results”, then this margin doesn’t matter at all in terms of their accuracy. If, the 3 out of 10 times their method “fails”, they just say “inconclusive” and move on, good. If, the 3 out of 10 times their method fails, it still gives a result that they base other things on and treat like the other 7 times, not good.


There is a 30% error rate in the amount needed. I believe this means that if they need 70 units for a reliable test, then they need to actually have 100 units available to guarantee they have 70 for certain. (Plus some more, to cover the fact that the 30% figure also includes a margin of error.)
If they don’t have 100 units available at the start, then they can’t reliably do the test. (They don’t want to waste the precious little they have if it will produce a contestable set of findings.)


I believe the 30% is the margin of error for quantifying the amount of DNA in the sample. Which means that although the FST can be used for much smaller quantities, there is a 30% chance that it won’t be enough to create a DNA profile. There’s more to it than ‘3/10 DNA convictions are incorrect’ – this article doesn’t really provide any background on the entire procedure of forensic DNA analysis; it focuses on one aspect of it, used in one part of the world, that has an unnaturally high rate of error. Also not quite sure which part of this article is ‘computer security news’.


I commissioned this because DNA is increasingly a strand of discussion about identification, and, as we’re discovering, it isn’t a failsafe way of identifying criminals, as we once thought. And also because more broadly there are issues of privacy around DNA testing, such as we discussed in our piece at the end of last year about the dilemma of twins’ DNA testing. https://nakedsecurity.sophos.com/2016/12/29/guest-post-whose-genes-are-they-anyway/

The science of DNA testing is broadly interesting if you’re of a sciencey bent and it has wide implications for identity, identification and security. Hope you found it interesting; I absolutely did. It’s not a mainstream concern for Naked Security, but I think it’s worth putting on the radar from time to time.


The question is, can they accurately tell it wasn’t “enough” to create a proper profile? Or do they still create one based on false assumptions? If whatever their error margin applies to has no effect on the further process, mentioning it doesn’t make sense on their part, and the criticisms of the other DNA specialists don’t make sense either. If that error margin does have an effect on their further process and can taint their results, then 30% is very high.


As a student I studied the principles involved in DNA forensics, but for use in research. Even then the techniques were incredibly powerful and could in theory amplify and detect a single strand of DNA. But the problem is that any environment is flooded with DNA fragments, so the narrower the focus of your DNA collection, the more noise you will get. A blood sample is going to contain 99% DNA from a single source, but a swab will have DNA from hundreds.


I believe the 30% margin of error applies to the ability to quantify the DNA when using the FST procedure. This doesn’t mean ‘30% of DNA tests are wrong’, rather ‘30% of tests using FST fail to return a quantifiable amount of DNA to generate a profile’. In which case the 30% would not be used as DNA evidence at all. A bit of background, or at least a summary of the DNA analysis procedure and where FST fits into it, would help clarify things for readers. The only takeaway readers will get from this is ‘OMG, 30% margin of error in DNA testing!’. Also quite curious as to where the ‘computer security news’ fits in.


If that is the case then I have no idea how that 30% margin is supposed to relate to the criticisms of the other people in this article. They said the methodology is based on false assumptions, and the answer to that seems to be “30% error margin, but very accurate”. Even if they only got results for 10% of samples, as long as those results are accurate, where is the problem? So I fail to see the connection between all the things in this article. Is there something wrong with this technique? Is it inaccurate? Or is it just not applicable in 30% of cases? Or do they use all their results knowing that 30% of them have no correct basis? That would be an important clarification.

