This post was written by EFF legal intern Alexa Chavara.

Black box technology has no place in the criminal legal system. That’s why we’ve once again filed an amicus brief arguing that both the defendant and the public have a right to information regarding face recognition technology (FRT) that was used during an investigation to identify a criminal defendant.

Back in June 2023, we filed an amicus brief along with the Electronic Privacy Information Center (EPIC) and the National Association of Criminal Defense Lawyers (NACDL) in State of New Jersey v. Arteaga. We argued that information regarding the face recognition technology used to identify the defendant should be disclosed, given the fraught process of a face recognition search and the many ways that inaccuracies manifest in the use of the technology. The New Jersey appellate court agreed, holding that state prosecutors must turn over detailed information to the defendant about the FRT used, including how it works, its source code, and its error rate. The court held that this disclosure protects the defendant’s due process rights by allowing them to examine the information, scrutinize its reliability, and build a defense.

Last month, partnering with the same organizations, we filed another amicus brief in favor of transparency regarding FRT in the criminal system, this time in the New Jersey Supreme Court in State of New Jersey v. Miles.

In Miles, New Jersey law enforcement used FRT to identify Mr. Miles as a suspect in a criminal investigation. The defendant, represented by the same public defender as in Arteaga, moved for discovery of information about the FRT used, relying on Arteaga. The trial court granted the discovery request, and the appellate court affirmed. The State then appealed to the New Jersey Supreme Court, where the issue is before that Court for the first time.

As explained in our amicus brief, disclosure is necessary to ensure criminal prosecutions are based on accurate evidence. Every search using face recognition technology presents a unique risk of error depending on various factors, including the specific FRT system used, the databases searched, the quality of the photograph, and the demographics of the individual. Study after study shows that facial recognition algorithms are not always reliable, and that error rates spike significantly for faces of people of color, especially Black women, as well as trans and nonbinary people.

Moreover, these searches often determine the course of investigation, reinforcing errors and resulting in numerous wrongful arrests, most often of Black folks. Discovery is the last chance to correct harm from misidentification and to allow the defendant to understand the evidence against them.

Furthermore, the public, including independent experts, has the right to examine the technology used in criminal proceedings. Under the First Amendment and its more expansive New Jersey Constitution corollary, the public’s right to access criminal judicial proceedings extends to filings in pretrial proceedings, like the information being sought here. That access provides the public meaningful oversight of the criminal justice system and increases confidence in judicial outcomes, which is especially significant considering the documented risks and shortcomings of FRT.