A challenge to the use of automated facial recognition technology has argued ‘that the software breaches privacy rights and will “radically” alter the way Britain is policed’, and that it is racially discriminatory. The case has been brought by a Cardiff resident who believes that South Wales Police has captured the biometrics of over half a million faces, the vast majority belonging to people not suspected of any wrongdoing.
What is facial recognition technology?
Facial recognition technology is software that uses biological measurements to map the facial features of an individual from a photograph, or in this case, video footage. This information can then be compared with a database of known faces to identify an individual.
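At its core, that matching step reduces each face to a fixed-length feature vector and looks for the closest enrolled entry. The sketch below illustrates the idea only: the names, the tiny four-number vectors, and the match threshold are all invented for illustration, whereas real systems derive much larger vectors from a trained neural network.

```python
import math

# Illustrative sketch: each face is represented as a feature vector,
# and identification means finding the enrolled vector closest to the
# probe. The vectors and threshold here are toy values, not real
# biometric data.

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, database, threshold=0.5):
    """Return the name of the closest enrolled face, or None if no
    entry falls within the match threshold."""
    best_name, best_dist = None, float("inf")
    for name, vector in database.items():
        d = euclidean(probe, vector)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

known_faces = {
    "alice": [0.1, 0.9, 0.3, 0.5],
    "bob":   [0.8, 0.2, 0.7, 0.1],
}

print(identify([0.12, 0.88, 0.31, 0.52], known_faces))  # prints alice
print(identify([0.5, 0.5, 0.5, 0.5], known_faces))      # prints None
```

The threshold is the policy-relevant knob: set it loosely and the system produces more false matches (the ‘erroneous police stops’ raised in this case); set it tightly and it misses genuine ones.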
How is facial recognition commonly used?
There is a wide range of applications for facial recognition technology, much of it in everyday use. Apple phones use facial recognition to unlock, and social media channels will often identify a face in a photo and ask you to tag the individual they believe they have identified. Your phone will automatically create folders of pictures featuring the same individual.
Facial recognition technology is also widely used at airports, at security checkpoints for large buildings, and in shopping centres and retail stores. Beyond security, marketers use facial recognition to assess an audience and target it in a meaningful way.
What are the issues with facial recognition?
The case against South Wales Police has been brought by an individual whose face was scanned on two occasions: once during a shopping trip in Cardiff, and again at a peaceful protest in the city, which he claims caused him ‘distress’. He argues that if the technology is rolled out nationally, it will radically change the way the UK is policed.
He also argued that the ability to identify very large numbers of people, combined with the large number of databases currently operated by police forces and other public bodies, would mean that police forces would very soon hold images of the vast majority of the population.
Furthermore, he highlighted concerns about the potential for discrimination via automated facial recognition with an increased risk of ‘racial bias leading to erroneous police stops’.
The appeal has been crowdfunded and is being heard by the Court of Appeal this week. We await the outcome, as it could have a significant impact on those using such technology from a GDPR (General Data Protection Regulation) perspective.