As Frank Sinatra almost sang: “The memory of your face, no they can’t take that away from me.” Well, the ECtHR is doing its best to prove Ol’ Blue Eyes wrong.
In Glukhin v Russia, app no. 11519/20, handed down in July, the ECtHR considered the lawfulness of the police’s use of facial recognition software to process the data of a person participating in a political protest. Mr Glukhin staged a one-man protest in Moscow against the arrest of a political activist. He was identified by facial recognition cameras and later arrested. He was convicted of an administrative offence of failing to give notification of a protest, and his conviction was upheld on appeal.
The ECtHR considered the case under art.10 (freedom of expression), concerning the lawfulness of the conviction, and art.8 (privacy), concerning the lawfulness of the use of facial recognition software. It is the art.8 element of the judgment that is particularly relevant. The Court held that the storage by the police of photographs of a person and the use of facial recognition software to identify that person constitute an interference with the individual’s rights under art.8 (applying Gaughran v the United Kingdom, app. no. 45245/15 at [69]-[70]).
In considering the state’s justification, the Court recognised the legitimate aim of the prevention and detection of crime and accepted that cutting-edge technology such as facial recognition is vital to that effort. It did not ask whether the use of facial recognition software is lawful in general; instead, it focused on the facts of the case (see §85). Notwithstanding that, important points of principle can be drawn:
First, using this technology requires a precise and publicised legal framework setting out the extent to which such software can be used and deployed. The ECtHR expressed “strong doubts” that the interference was in accordance with the law, as the Russian legal regime effectively gave the police unfettered discretion to use this technology as it saw fit. The Court criticised a domestic legal framework that “does not contain any limitations on the nature of situations which may give rise to the use of facial recognition technology, the intended purposes, the categories of people who may be targeted, or on processing of sensitive personal data. Furthermore, the Government did not refer to any procedural safeguards accompanying the use of facial recognition technology in Russia, such as the authorisation procedures, the procedures to be followed for examining, using and storing the data obtained, supervisory control mechanisms and available remedies.”
Second, the ECtHR considered that the retention of images and the use of facial recognition software was a particularly serious interference with Mr Glukhin’s art.8 rights, requiring “the highest level of justification”, especially as the data collected was special category data within the meaning of the GDPR, having arisen from a political protest. The use of facial recognition software to identify a peaceful protester exercising his art.10 rights answered no ‘pressing social need’ that could justify his identification for a minor administrative offence. As such, the use of facial recognition software in that case was not justified.
How does that square with the domestic jurisprudence in the UK?
In R (Bridges) v Chief Constable of South Wales Police & Information Commissioner ([2020] 1 WLR 5037), the Court of Appeal considered the lawfulness of the police’s use of automated facial recognition software and found that the interference with a person’s art.8 rights from the collection and storage of such data was ‘negligible’. That conclusion rested on the capture and storage of all persons’ faces for only a limited period. Had the police used that software in the UK as the Russian police did in Glukhin, it is very likely the interference would have been found to be much greater, requiring a commensurately powerful justification. More fundamentally, following Bridges, the UK has a detailed legal framework giving the requisite quality of law to the police’s use of cutting-edge software.
What Glukhin makes clear, however, is that art.8 does set limits on the extent to which the state can use this software in investigating and prosecuting crime, especially crime that engages art.10/art.11 rights, such as eco-protesting. Expect these principles to be tested both domestically and in Strasbourg as the police use this software more frequently to identify and prosecute members of the public.
A monthly data protection bulletin from the barristers at 5 Essex Chambers
The Data Brief is edited by Francesca Whitelaw KC, Aaron Moss and John Goss, barristers at 5 Essex Chambers, with contributions from the whole information law, data protection and AI Team.