As government agencies continue to push for the deployment of facial recognition systems, you needn't look far to see why that's bad news. To illustrate the point, the ACLU conducted a test of Amazon's Rekognition software, facial recognition tech currently being used by US law enforcement, in which it incorrectly identified 26 California lawmakers as matches in a criminal database.
We'll pause while you chuckle at the "politicians are criminals" jokes running through your head.
It's the second time the ACLU has run this kind of test. In the first, conducted last year, Rekognition was wildly inaccurate, churning out incorrect and racially biased results when attempting to match members of Congress.
Detailed today, the latest ACLU test ran 120 photos of California lawmakers against a database of 25,000 mugshots. Amazon's Rekognition software produced false positives about 20 percent of the time.
Phil Ting, a San Francisco Assembly member and one of the incorrect matches, used the results to drum up support for a bill that would ban use of the technology in police body cameras. "We wanted to run this as a demonstration about how this software is absolutely not ready for prime time," Ting said during a press conference. "While we can laugh about it as legislators, it's no laughing matter for an individual trying to get a job, if you are an individual trying to get a home."
An Amazon spokesperson told TNW:

The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines. As we've said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human-driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking. We continue to advocate for federal legislation of facial recognition technology to ensure responsible use, and we've shared our specific recommendations for this both privately with policy makers and on our blog.
ACLU attorney Matt Cagle, who worked with UC Berkeley to independently verify the results, pushed back on the criticism. In a comment to Gizmodo, Cagle said the ACLU didn't use a 99 percent confidence threshold because it stuck with the default settings in Amazon's software, which use an 80 percent confidence score.
Amazon disputed the claim, pointing to a blog post in which it notes that Rekognition shouldn't be used with less than a 99 percent confidence level. Of course, this only leads to more questions. Specifically: why isn't 99 percent the software's default setting?
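To make the dispute concrete: Rekognition's face-search API exposes the match threshold as an optional request parameter (`FaceMatchThreshold`), which falls back to 80 when the caller omits it. The sketch below, a hypothetical helper rather than anything from the ACLU's test, builds the request body a `boto3` call like `client.search_faces_by_image(**params)` would send; the collection name and image bytes are placeholders.

```python
# Minimal sketch of how Rekognition's confidence threshold is (or isn't) set.
# The FaceMatchThreshold parameter defaults to 80 if the caller omits it;
# Amazon's guidance for law-enforcement use is 99.

def search_request(collection_id: str, image_bytes: bytes,
                   face_match_threshold: float = 80.0) -> dict:
    """Build the keyword arguments for a SearchFacesByImage call,
    e.g. client.search_faces_by_image(**search_request(...))."""
    return {
        "CollectionId": collection_id,
        "Image": {"Bytes": image_bytes},
        "FaceMatchThreshold": face_match_threshold,
    }

# The service default (what a caller gets without reading Amazon's blog):
default_params = search_request("mugshots", b"...")
# The setting Amazon says sensitive deployments should use:
strict_params = search_request("mugshots", b"...", face_match_threshold=99.0)
```

The gap between the two calls is the whole argument: the safer behavior exists, but only for callers who know to opt in.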