EFF, together with 41 national, state, and local civil rights and civil liberties groups, sent a letter today urging the ethics board of police technology and weapons developer Axon to hold the company accountable to the communities its products impact—and to itself.
Axon, based in Scottsdale, Arizona, makes and sells some of the most widely used police products in the United States, including Tasers and body-worn cameras. Over the years, the company has taken significant heat for how those tools have been used in police interactions with the public, especially given law enforcement’s documented history of racial discrimination. Axon is now considering developing new technologies, such as face recognition and artificial intelligence, and incorporating them into its existing products. It has set up an “AI Ethics Board” of outside advisors and says it wants to confront the privacy and civil liberties issues associated with police use of these invasive technologies.
As we noted in the letter, “Axon has a responsibility to ensure that its present and future products, including AI-based products, don’t drive unfair or unethical outcomes or amplify racial inequities in policing.” Given this, our organizations called on the Axon Ethics Board to adhere to the following principles at the outset of its work:
- Certain products are categorically unethical to deploy.
- Robust ethical review requires centering the voices and perspectives of those most impacted by Axon’s technologies.
- Axon must pursue all possible avenues to limit unethical downstream uses of its technologies.
- All of Axon’s digital technologies require ethical review.
With these guidelines, we urge Axon’s Ethics Board to steer the company in the right direction on all of its current and future products. For example, the Ethics Board must advise Axon against pairing real-time face recognition analysis with the live video captured by body-worn cameras:
Real-time face recognition would chill the constitutional freedoms of speech and association, especially at political protests. In addition, research indicates that face recognition technology will never be perfectly accurate and reliable, and that accuracy rates are likely to differ based on subjects’ race and gender. Real-time face recognition therefore would inevitably misidentify some innocent civilians as suspects. These errors could have fatal consequences—consequences that fall disproportionately on certain populations.
For these reasons, we believe “no policy or safeguard can mitigate these risks sufficiently well for real-time face recognition ever to be marketable.”
Similarly, we urge that Axon’s ethical review process include the voices of those most impacted by its technologies:
The Board must invite, consult, and ultimately center in its deliberations the voices of affected individuals and of those who directly represent affected communities. In particular, survivors of mass incarceration, survivors of law enforcement harm and violence, and community members who live closely among both populations must be included.
Finally, we believe that all of Axon’s products, both hardware and software, require ethical review. The Ethics Board bears significant responsibility for the future of Axon. We hope its members will listen to our requests and hold Axon accountable for its products.[1]
Letter signatories include Color of Change, UnidosUS, South Asian Americans Leading Together, Detroit Community Technology Project, Algorithmic Justice League, Data for Black Lives, NAACP, NC Statewide Police Accountability Project, Urbana-Champaign Independent Media Center, and many more. All are concerned about the misuse of technology to entrench or expand harassment, prejudice, and bias against the public.
You can read the full letter here.
[1] EFF’s Technology Policy Director, Jeremy Gillula, has chosen to join Axon’s Ethics Board in his personal capacity. He has recused himself from writing or reviewing this blog post and the letter, and his participation on the board should not be attributed to EFF.