
UK police under fire for use of facial recognition technology

Steve Rogerson
July 11, 2019



Police in the UK are facing calls to halt the public use of facial recognition software to search for suspected criminals, after academics from the University of Essex found matches were correct in only one in five cases and that the system was likely to break human rights laws.
 
The software uses IoT technology to scan crowds in public places, searching for known criminals using facial recognition algorithms.
 
The report by researchers from the Human Rights, Big Data & Technology Project, based at the University of Essex Human Rights Centre, identifies significant flaws with the way live facial recognition (LFR) technology was trialled in London by the Metropolitan Police.
 
This is the first independently funded academic report into the use of LFR technology by a UK police force, and it raises concerns about the Metropolitan Police’s procedures, practices and human rights compliance during the trials.
 
The authors of the report, Peter Fussey and Daragh Murray, conclude that it is “highly possible” the Metropolitan Police’s use of LFR to date would be held unlawful if challenged in court. They have also documented what they believe to be significant operational shortcomings in the trials that could affect the viability of any future use of LFR technology.
 
In light of their findings, Fussey and Murray are calling for all live trials of LFR to be halted until these concerns are addressed, noting that it is essential that human rights compliance is ensured before deployment, and that there be an appropriate level of public scrutiny and debate at a national level.
 
“This report was based on detailed engagement with the Metropolitan Police’s processes and practices surrounding the use of live facial recognition technology,” said Fussey. “It is appropriate that issues such as those relating to the use of LFR are subject to scrutiny, and the results of that scrutiny made public.
 
“The Metropolitan Police’s willingness to support this research is welcomed. The report demonstrates a need to reform how certain issues regarding the trialling or incorporation of new technology and policing practices are approached, and underlines the need to effectively incorporate human rights considerations into all stages of the Metropolitan Police’s decision making processes. It also highlights a need for meaningful leadership on these issues at a national level.”
 
Murray added: “This report raises significant concerns regarding the human rights law compliance of the trials. The legal basis for the trials was unclear and is unlikely to satisfy the ‘in accordance with the law’ test established by human rights law. It does not appear that an effective effort was made to identify human rights harms or to establish the necessity of LFR. Ultimately, the impression is that human rights compliance was not built into the Metropolitan Police’s systems from the outset, and was not an integral part of the process.”
 
 
To compile the report, Fussey and Murray were granted unprecedented access to the final six of the ten trials conducted by the Metropolitan Police, which took place between June 2018 and February 2019. They joined officers on location in the LFR control rooms and engaged with officers responding on the ground. They also attended briefing and de-briefing sessions, and planning meetings.
 
Chief among the concerns raised in the report is that police deployment of LFR technology may well be held unlawful if challenged before the courts. There is no explicit legal authorisation for the use of LFR in domestic law, and the researchers argue that the implicit legal authorisation claimed by the Metropolitan Police – coupled with the absence of clear, publicly available guidance – is unlikely to satisfy the ‘in accordance with the law’ requirement established by human rights law.
 
The report said there was insufficient pre-test planning and conceptualisation, and that the Metropolitan Police’s trial methodology focused primarily on the technical aspects of the trials. As a result, the research process gave insufficient attention to the non-technical objectives that had been identified.
 
The Metropolitan Police did not appear to engage effectively with the ‘necessary in a democratic society’ test established by human rights law, says the report. LFR was approached in a manner similar to traditional CCTV. This fails to take into account factors such as the intrusive nature of LFR, and the use of biometric processing. As a result, a sufficiently detailed impact assessment was not conducted, making it difficult to engage with the necessity test.
 
The mixing of trials with operational deployments raises a number of issues regarding consent, public legitimacy and trust, particularly when considering differences between an individual’s consent to participate in research and their consent to the use of technology for police operations. For example, from the perspective of research ethics, someone avoiding the cameras is an indication that they are exercising their entitlement not to be part of a particular trial. From a policing perspective, this same behaviour may acquire a different meaning and serve as an indicator of suspicion.
 
There were numerous operational failures, says the report, including: inconsistencies in the process of officers verifying a match made by the technology; a presumption to intervene; how the Metropolitan Police engaged with individuals; and difficulties in defining and obtaining consent of those affected.
 
Across the six trials that were evaluated, the LFR technology made 42 matches; in only eight of those can the report’s authors say with absolute confidence that the technology got it right. Criteria for including people on the watchlist were not clearly defined, and there was significant ambiguity over the categories of people the LFR trials intended to identify.
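Those two figures are the basis for the “one in five” accuracy claim quoted throughout the coverage. As a quick sketch, the match precision implied by the report’s numbers works out as follows:

```python
# Illustrative arithmetic only, using the trial figures quoted above.
total_matches = 42      # matches flagged by the LFR system across six trials
verified_correct = 8    # matches the authors could verify with absolute confidence

precision = verified_correct / total_matches
print(f"Match precision: {precision:.1%}")  # roughly 19%, i.e. about one in five
```

Note this precision figure says nothing about how many wanted individuals the system missed entirely; the report does not provide the data needed to estimate that.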
 
There were also issues with the accuracy of the watchlist, and its information was often out of date. This meant, for example, that people were stopped even though their case had already been resolved.
 
The research methodology document prepared by the Metropolitan Police focuses primarily on the technical aspects of the trials. The report says there does not appear to be a clearly defined research plan setting out how the test deployments were intended to satisfy the non-technical objectives, such as those relating to the use of LFR as a policing tool. This complicated the trial process and obscured its purpose.
 
This is not the first time the police’s use of automated facial recognition technology has made the headlines; only a couple of months ago it was reported that a man had taken the South Wales Police to court over images that were taken of him without his consent.
 
Police forces across the UK are facing increasing cost pressures, with direct government funding having fallen 30% in the last eight years, and are turning to facial recognition as a way to increase efficiency while cutting costs. The Metropolitan Police commissioner Cressida Dick commented recently that facial recognition is “very useful” within law enforcement and that the technology needs to make its way towards public acceptance or Britain risks being “really, really, really left behind”.
 
Jason Tooley, board member of TechUK and chief revenue officer at software company Veridium, believes the police would benefit further from biometrics by adopting a more holistic, open approach to the technology. In his view, the police need not scrap facial recognition software entirely, as the issue is more one of public perception and education than of success rate.
 
“Public perception of the maturity of biometrics such as automated facial recognition and their effective usage has strong links back to existing physical processes and public adoption,” said Tooley. “Fingerprint technology has high levels of consumer adoption due to use on mobile devices, and use cases such as airports using flatbed scanners, which is also widely understood and helps immensely with acceptance. There is clearly a need to focus on how biometrics as a whole, as technology matures, can support identity verification at scale and gain nationwide public acceptance as part of a wider digital policing initiative.”
 
He said it was imperative for police forces to take a strategic approach as they trial biometric technologies, and not solely focus on a single biometric approach.
 
“This open multifactor approach will strengthen evidence and decrease the risk of wrongful arrests,” he said. “This should alleviate human rights concerns, but the public need to be reassured that the technology is assisting in crime solving and not just merely being used as surveillance.”
 
With the rapid rate of innovation in the field, an open biometric strategy that gives the police the ability to use the right biometric techniques for the right requirements could accelerate the benefits associated with digital policing, and achieve public acceptance by linking the strategy to ease of adoption.
 
“Instead of only using facial recognition perhaps for a single reason, the police should ensure they have strategically assessed which is the right biometric technique, for the right use case, based on the scenario,” said Tooley.
 
Another issue that has come to the fore, since the revelation that facial recognition as a standalone piece of evidence correctly identified only one in five suspects, is the public’s high expectation of the technology’s effectiveness.
 
“The public expectation is that results should be 100% accurate, as they are used to facial recognition technology on their phones and at airports, for which the results are generally good,” said Tooley. “The public expectation of facial recognition for surveillance being as high as their other experiences in use of facial is unrealistic. Fingerprint is more successful and is a migration of a physical to a digital process, but requires a comparison versus existing data and is a one-to-one exercise. Police in other countries are using this approach successfully; the key is to digitalise the process using technology already embraced by consumers, making the application simpler and easier to use.”
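The one-to-one distinction Tooley draws is the crux: phone unlock compares one probe against one enrolled template, while crowd surveillance compares every passing face against an entire watchlist, multiplying the opportunities for a false match. A minimal sketch of that difference, with invented names, toy templates and an arbitrary threshold (no real biometric system works on raw vectors like this):

```python
def similarity(a, b):
    """Toy cosine similarity between two biometric templates (0..1)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0

def verify(probe, enrolled, threshold=0.95):
    """1:1 verification (e.g. phone unlock): a single comparison
    against the template of the claimed identity."""
    return similarity(probe, enrolled) >= threshold

def identify(probe, watchlist, threshold=0.95):
    """1:N identification (e.g. crowd scanning): the probe is compared
    against every watchlist entry, so the chance of at least one false
    match grows with the size of the watchlist."""
    return [name for name, template in watchlist.items()
            if similarity(probe, template) >= threshold]
```

With a per-comparison false-match rate f, a 1:N search makes N chances to go wrong per face scanned, which is why crowd deployments need far stricter thresholds (and far better data hygiene) than a 1:1 unlock to reach the accuracy the public expects.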