In early 2020, the London Metropolitan Police announced that it would begin using a real-time facial recognition system on the streets of the British capital. Since the technology was deployed on February 27 in the Oxford Circus area, one of the busiest in the city, the system has flagged eight matches against the police's criminal records database; only one turned out to be correct.
According to data published by the London Metropolitan Police, in just seven days the real-time facial recognition system scanned the faces of 8,600 passers-by. Of the seven false positives, five led to interventions in which officers personally approached the individuals concerned to verify whether they were on the police's wanted list.
The London police's decision has been strongly contested, with Big Brother Watch among the latest to voice its discontent. On Twitter, the British non-profit organization argued that an 86% false-alert rate undermines the police's case for the system. "This is a human rights disaster, a violation of the most basic freedoms and a disgrace to the capital," it stated in a post.
It should be remembered that, in justifying the use of the technology, the London authorities stated that it would help them fight crime and protect the most vulnerable. Police also indicated that the presence of the cameras would be clearly signposted wherever they were placed. Images collected by the system that did not identify any criminal suspect would later be deleted, and the authorities would act only when necessary.