UK Information Commissioner's Office Finds Police Using Facial Recognition Tech Responsibly

The U.K. Information Commissioner's Office's audit of two law enforcement agencies' use of facial recognition technology (FRT) found they were deploying the privacy-invasive tool responsibly. Meanwhile, an advocacy group asked the ICO to investigate two algorithmic tools the U.K. Home Office uses for processing migrants' personal data, including special category data.

The office audited law enforcement agencies in England and Wales that are early adopters of FRT. A review published Wednesday of South Wales Police and Gwent Police showed a "high level of assurance" that the two forces are complying with data protection law, Deputy Commissioner for Regulatory Policy Emily Keaney blogged.

The two forces use FRT in several ways, including live facial recognition, in which cameras match images against a watchlist in real time, and retrospective facial recognition, in which images are compared against a database, Keaney wrote.

The ICO also gave feedback on a pilot of operator-initiated facial recognition, in which an officer uses an authorized mobile device to compare an image against a database to help identify someone for policing purposes.

This audit was a snapshot in time covering just the two forces, Keaney said. The ICO will audit more police forces, she added.

In a complaint filed Tuesday, advocacy group Privacy International asked the ICO to investigate a pair of algorithmic tools the government uses when processing migrants' data.

The Identify and Prioritise Immigration Cases and Electronic Monitoring Review Tool technologies "appear to often involve limited human involvement," PI said.

Moreover, individuals don't receive meaningful information about what data the tools process or how they're used, and the information that is provided is inconsistent and contradictory, it said. Inconsistent guidance also encourages caseworkers to accept algorithmic recommendations with little scrutiny, PI added.

Specifically, PI claimed, the Home Office is violating the U.K. General Data Protection Regulation and the Data Protection Act 2018 by, among other things, failing to provide transparent, adequate information to data subjects about the nature and extent of the data collection and processing, and failing to provide a clear, foreseeable legal basis for the processing, in violation of the lawfulness principle.