Amazon’s face recognition system mistook 26 California lawmakers for criminals

Tram Ho

Recently, the American Civil Liberties Union (ACLU) ran an experiment with Amazon’s Rekognition software, the face recognition technology currently used by US law enforcement. The result: it misidentified 26 California lawmakers as criminals in a police database.

This is the second time the ACLU has run this type of test. In the first, conducted last year, Rekognition performed poorly, repeatedly producing false matches with a racial bias when identifying members of the US Congress.

In this second test, the ACLU compared 120 images of California lawmakers against a database of 25,000 government booking photos. Amazon’s software produced false matches at a rate of about 20%.

Phil Ting, a state Assembly member representing San Francisco and one of those misidentified, used the test results to voice his support for a bill that would ban the use of face recognition technology in body cameras worn by police on duty.

“We wanted to run this test to show that this software is not ready to be used,” Ting said at a press conference. “While we can joke about it as legislators, it is no joke for people trying to find a job, or trying to find a place to live.”

Amazon’s facial recognition system mistook 26 California lawmakers for criminals - Photo 1.

An Amazon spokesperson addressed the issue as follows:

“The ACLU is once again deliberately misusing and misrepresenting Amazon Rekognition to attract attention. As we have said many times before, when used to support human decision-making at the recommended 99% confidence threshold, face recognition technology can serve many useful purposes, from helping identify criminals, to helping find missing children, to preventing human trafficking. We will continue to call for face recognition technology to be addressed in federal legislation to ensure it is used responsibly, and we have shared specific recommendations on this issue with policymakers privately and publicly on our blog.”

Matt Cagle, an ACLU lawyer who worked with a University of California, Berkeley expert to verify the results independently, disputed that claim. He said the ACLU did not apply the “99% recommended confidence” setting because it simply used the Amazon software’s default threshold of 80%.

Amazon rejected that argument, pointing out in a blog post that Rekognition should not be used below a 99% confidence threshold for such purposes. But Amazon’s own words invite the obvious counter-question: if 99% is the recommended setting, why does the software ship with a default of 80%?

Reference: TheNextWeb


Source: Trí Thức Trẻ