Sunday, May 6, 2018

Facial Recognition Used by Wales Police Has 90 Percent False Positive Rate

Thousands of attendees of the 2017 Champions League final in Cardiff, Wales, were mistakenly identified as potential criminals by facial recognition technology used by local law enforcement.
According to the Guardian, the South Wales police scanned the crowd of more than 170,000 people who traveled to the nation’s capital for the soccer match between Real Madrid and Juventus. The cameras flagged 2,470 people as potential criminals.
Having that many potential lawbreakers in attendance might make sense if the event were, say, a convict convention, but it seems pretty high for a soccer match. As it turned out, the cameras were a little overly aggressive in trying to spot some bad guys. Of the potential criminals identified, 2,297 were wrongly labeled by the facial recognition software. That’s a 92 percent false positive rate.
Wired obtained additional data regarding the Wales system and found it’s been consistently bad. At a boxing match last year, the cameras produced a 90 percent false positive rate. At a rugby match, 87 percent of the supposed criminals they spotted were falsely identified.
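For readers curious how those percentages fall out of the raw counts, here is a minimal Python sketch using the Cardiff alert figures quoted above; the function name and structure are purely illustrative, not anything the police system exposes.

```python
# Minimal sketch: a false positive rate is just wrong alerts divided by total alerts.
# The figures below are the Cardiff Champions League numbers reported in the article.

def false_positive_rate(wrong_alerts: int, total_alerts: int) -> float:
    """Return the share of alerts that turned out to be wrong matches."""
    return wrong_alerts / total_alerts

# 2,297 of the 2,470 alerts at the Cardiff final were false positives.
print(f"Cardiff final: {false_positive_rate(2_297, 2_470):.1%}")
```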
The South Wales Police department, for its part, has defended the technology and issued the following statement regarding its use of facial recognition tools:
“Of course no facial recognition system is 100 percent accurate under all conditions. Technical issues are normal to all face recognition systems, which means false positives will continue to be a common problem for the foreseeable future. However, since we introduced the facial recognition technology, no individual has been arrested where a false positive alert has led to an intervention and no members of the public have complained.”
According to the department, there have been a total of 2,000 positive matches since the force started using the technology nine months ago. Those positive scans have resulted in more than 450 arrests.
Matt Jukes, the chief constable in South Wales, told the BBC the face scanners are essential to law enforcement operations, especially at major events. “We need to use technology when we’ve got tens of thousands of people in those crowds to protect everybody, and we are getting some great results from that,” he said.
While it appears South Wales has been judicious in reviewing the matches made by its surveillance cameras, as the force has yet to wrongly arrest anyone, one has to wonder just how time-saving the technology has been. The facial scanners misidentified more people at one event than they have correctly matched in nine months. Even assuming the software produces slightly better results in less crowded situations, a 90 percent false positive rate creates a pile of extra matches for law enforcement to sift through and verify.
“These figures show that not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool,” Silkie Carlo, director of privacy rights group Big Brother Watch, told Wired. She said the statistics from South Wales show that the facial recognition software “misidentifies innocent members of the public at a terrifying rate, whilst there are only a handful of occasions where it has supported a genuine policing purpose.”
The police insist the cameras help facilitate the identification process and save time, which may well be true. But the technology also turns innocent people into suspects, entirely unbeknownst to them, until police can clear their names.
