If you're accused of a crime based on an algorithm's analysis of the evidence, you should have the right to challenge the assumptions, methods, and programming of that algorithm. Building on previous wins, EFF and its allies turned the tide this year on the use of these secret programs in criminal prosecutions.
One of the most common forms of forensic software is probabilistic genotyping software. The prosecution uses it to examine DNA mixtures, samples (such as a swab taken from a weapon) where an analyst doesn't know how many people contributed. These programs are designed to make choices about how to interpret the data and what information to disregard as likely irrelevant, and to compute statistics based on how often the different genes appear in different populations. Each program does all of this differently. These assumptions and processes are subject to challenge by the person accused of a crime. For that challenge to be meaningful, the defense team must have access to the source code and other materials used in developing the software.
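To make the stakes concrete, here is a deliberately oversimplified, hypothetical sketch (in Python, and not any vendor's actual method) of the kind of calculation these tools perform: weighing how well the evidence fits "the defendant contributed" against "an unknown person contributed," using a population frequency table. Every input here, from the frequency table to the simplifying assumptions, is exactly the sort of choice a defense team needs to be able to examine.

```python
def single_locus_likelihood_ratio(allele_freqs, suspect_genotype):
    """Toy likelihood ratio for a single-source sample at one genetic location.

    H1: the suspect left the DNA (the evidence is certain under this hypothesis).
    H2: an unrelated person left it (probability given by Hardy-Weinberg
        genotype frequencies drawn from the chosen population table).
    Real probabilistic genotyping tools model mixtures, allele dropout,
    stutter, and more, and each vendor does so differently.
    """
    a, b = suspect_genotype
    p, q = allele_freqs[a], allele_freqs[b]
    # Genotype frequency under H2: p^2 for a homozygote, 2pq for a heterozygote.
    random_match_probability = p * p if a == b else 2 * p * q
    return 1.0 / random_match_probability

# The same evidence yields very different statistics depending on which
# population frequency table the software was built to use.
table_a = {"allele_10": 0.05, "allele_12": 0.20}
table_b = {"allele_10": 0.15, "allele_12": 0.30}
print(single_locus_likelihood_ratio(table_a, ("allele_10", "allele_12")))  # 50.0
print(single_locus_likelihood_ratio(table_b, ("allele_10", "allele_12")))  # ~11.1
```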
The software vendors claim both that the software contains valuable secrets that must not be disclosed and that their methods are so well-vetted that there's no point letting a defendant question them. Obviously, both can't be true, and in fact it's likely that neither is true.
When a defense team was finally able to access one of these programs, the Forensic Statistical Tool (FST), they discovered an undisclosed function and shoddy programming practices that could lead the software to implicate an innocent person. The makers of FST had submitted sworn declarations describing how they thought it worked, pointing out that it had been subject to 'validation' studies, in which labs test a set of inputs to see whether the results seem right, and so on. But any programmer knows that programs don't always do what you think you programmed them to do, and so it was with FST: in trying to fix one bug, its developers unwittingly introduced another serious error.
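A small, hypothetical illustration (not FST's actual code) of why that matters: a patch meant to fix one bug can quietly break a case that a fixed set of validation inputs never exercises, so the software looks validated while still misbehaving.

```python
def normalize_allele_counts_v1(counts):
    # Original version: crashes with a division-by-zero error whenever a
    # population table happens to be empty.
    total = sum(counts.values())
    return {allele: n / total for allele, n in counts.items()}

def normalize_allele_counts_v2(counts):
    # "Fixed" version: the guard stops the crash, but it also silently returns
    # all-zero frequencies for malformed tables instead of refusing to proceed,
    # so bad input now flows into the statistics without any warning.
    total = sum(counts.values())
    if total == 0:
        return {allele: 0.0 for allele in counts}
    return {allele: n / total for allele, n in counts.items()}

# A validation study built on well-formed inputs like this one passes for both
# versions; only reading the code reveals the new silent failure mode.
print(normalize_allele_counts_v2({"allele_10": 30, "allele_12": 70}))
# {'allele_10': 0.3, 'allele_12': 0.7}
```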
That's why there's no substitute for independent review of the actual code that performs the analysis.
Fortunately, this year saw two very significant wins for the right to challenge secret software.
First, in a detailed and thoughtful opinion, a New Jersey appellate court explained in plain language why forensic software isn't above the law and isn't exempt from being analyzed by a defense expert to make sure it's reliable and does what it says it does.
Then, the first federal court to consider the issue also ordered disclosure.
But that's not the end of the story. In the New Jersey case, the prosecution decided to withdraw the evidence to avoid disclosure. And in the federal case, the defense says that the prosecution handed over unusable and incomplete code fragments. The defense is continuing to fight to get meaningful transparency into the software used to implicate the defendant.
With the battle ongoing, we're also continuing to brief the issue in other courts. Most recently, we filed an amicus brief in NY v. Easely, where the defendant was assaulted by a half dozen people and then arrested and accused of unlawful possession of a firearm, based solely on the fact that he was near the gun and the DNA software reported that the mixture on it likely contained some of his DNA. To make matters worse, the software at issue is closely related to the version of FST that was found to contain serious flaws.
Given the history of junk science being used in courtrooms, we must remain vigilant in protecting the rights of defendants to challenge the evidence used against them. We also fight to protect the public's interest in fair judicial proceedings, and that means no convictions based on the say-so of secret software programs.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2021.