[UK police's facial recognition system has an 81 percent error rate]

According to findings from the University of Essex, facial recognition technology misidentified four out of five innocent people as wanted suspects. The report was commissioned by Scotland Yard; researchers measured the technology's accuracy at six deployment locations and found that only eight of 42 "suspect matches" were correct, an error rate of 81 percent. The report's authors said their findings raised significant concerns. This is not the first time UK police have come under fire for such inaccuracies: in 2018, South Wales Police misidentified 2,300 people as potential criminals.

 

UK police's facial recognition system has an 81 percent error rate

But officials inside the Metropolitan Police say otherwise.

Facial recognition technology is mistakenly targeting four out of five innocent people as wanted suspects, according to findings from the University of Essex. The report -- which was commissioned by Scotland Yard -- found that the technology used by the UK's Metropolitan Police is 81 percent inaccurate and concludes that it is "highly possible" the system would be found unlawful if challenged in court.

The report, obtained by Sky News, is the first independent evaluation of the scheme since the technology was first used at Notting Hill Carnival in August 2016. Since then it has been used at 10 locations, including Leicester Square and during Remembrance Sunday services. Researchers measured the accuracy of the technology from six of these locations and found that of 42 "suspect matches," only eight were correct, giving an error rate of 81 percent.

However, the Met measures accuracy in a different way, by comparing successful and unsuccessful matches with the total number of faces processed by the system. If interpreted in this way, the error rate is just 0.1 percent. In response to the report, the Met's deputy assistant commissioner, Duncan Ball, said the force was "extremely disappointed with the negative and unbalanced tone of this report." The authors of the report, meanwhile, said their findings posed "significant concerns." This is not the first time UK police have come under fire for such inaccuracies -- in 2018 South Wales Police misidentified 2,300 people as potential criminals.
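The gap between the 81 percent and 0.1 percent figures comes down to the choice of denominator: wrong alerts as a share of all alerts, versus wrong alerts as a share of every face scanned. The minimal Python sketch below illustrates the two calculations. The 42 suspect matches and 8 correct identifications come from the report; the total number of faces processed is a purely hypothetical placeholder, since the article does not give that figure.

```python
# Illustrative sketch of the two error-rate definitions described above.
# Figures for matches come from the Essex report; TOTAL_FACES_PROCESSED is a
# hypothetical placeholder chosen only to show how the Met's metric behaves.

SUSPECT_MATCHES = 42        # alerts raised by the system (from the report)
CORRECT_MATCHES = 8         # alerts that corresponded to a genuinely wanted person
FALSE_MATCHES = SUSPECT_MATCHES - CORRECT_MATCHES

# Researchers' definition: what fraction of alerts were wrong?
error_rate_per_alert = FALSE_MATCHES / SUSPECT_MATCHES
print(f"Error rate per alert: {error_rate_per_alert:.0%}")        # ~81%

# The Met's definition: false matches relative to every face the system processed.
TOTAL_FACES_PROCESSED = 34_000   # hypothetical crowd volume, for illustration only
error_rate_per_face = FALSE_MATCHES / TOTAL_FACES_PROCESSED
print(f"Error rate per face scanned: {error_rate_per_face:.1%}")  # ~0.1%
```

Both numbers describe the same 34 false alerts; they differ only in whether the baseline is the alerts the system raised or the full crowd it watched.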

The use of facial recognition technology has skyrocketed in recent years, with systems being installed in public transport hubs and at large events. Despite some apparent "successes" -- such as the identification of an impostor at Washington Dulles airport just three days after the system was launched -- the technology continues to pose a number of ethical and legal dilemmas. In China, for example, facial recognition is being used to monitor ethnic minorities and track children's classroom behavior. Meanwhile, a number of tech giants have made clear their apprehensions about the technology. Microsoft has been outspoken about its desire for proper regulation, while both Apple and Google have expressed similar concerns. As this new report demonstrates, the technology still has a long way to go before it can be considered truly reliable.

https://www.engadget.com/2019/07/04/uk-met-facial-recognition-failure-rate/

