Facial recognition technology outpacing the law
The Equality and Human Rights Commission (EHRC) has called for the suspension of the use of automated facial recognition (AFR) and predictive algorithms in policing in England and Wales, until their impact has been independently scrutinised and laws are improved.
In evidence submitted to the UN on a range of civil and political rights issues, the EHRC has highlighted concerns about how the use of AFR is regulated, and has suggested that AFR may not comply with the UK’s obligation to respect privacy rights under the International Covenant on Civil and Political Rights (ICCPR). The report also raises questions about the technology’s accuracy and points to evidence that many AFR algorithms disproportionately misidentify Black people and women, and therefore could be discriminatory.
The EHRC has also expressed concerns over the use of predictive policing programmes, which use algorithms to analyse data and identify patterns, suggesting that such programmes could replicate and magnify discrimination in policing. Predictive technologies also rely on ‘big data’, which encompasses large amounts of personal information, and the EHRC has warned that this may infringe on privacy rights and result in self-censorship, having a chilling effect on freedom of expression and association.
Rebecca Hilsenrath, Chief Executive at the Equality and Human Rights Commission, said:
“With new technology comes new opportunities and new, more effective ways of policing crimes and protecting UK citizens. We welcome this opportunity and recognise the priority that everyone is kept safe. But these also bring new challenges and new risks which we need to meet in order to use any such technology effectively for the good of the community as a whole.
“It is essential that our laws keep pace with our evolving digital world so that new techniques to protect us don’t infringe on our rights in the process, and damaging patterns of discrimination that we already know exist are not reinforced.
“The law is clearly on the back foot with invasive AFR and predictive policing technologies. It is essential that their use is suspended until robust, independent impact assessments and consultations can be carried out, so that we know exactly how this technology is being used and are reassured that our rights are being respected.”
Judith Robertson, Chair of the Scottish Human Rights Commission, which has submitted similar evidence to the UN on issues in Scotland, said:
“We share the EHRC’s concerns that police use of new technologies such as ‘cyber kiosks’ and facial recognition is outstripping the adequate protection for people’s rights required from legal frameworks and oversight mechanisms. Our own report to the UN highlighted gaps in relation to these issues in Scotland and we are pleased that some progress is now being made, for example, recent legislation to establish a Scottish Biometrics Commissioner.”
In its submission to the UN, the EHRC has said that any decision to use such technologies must be based on the outcomes of independent impact assessments, and appropriate mitigating action must be taken, including the development of a rights-respecting legal and policy framework.
The EHRC’s wide-ranging report reviews the progress the UK Government has made, and the challenges it still faces, when implementing the ICCPR.
Other issues raised include the inadequacy of the justice system’s response to tackle the persistent and growing problem of violence against women and girls, and the need to ensure properly funded specialist services for all survivors. It also highlights concerns around immigration detention, including the continued practice of indefinite detention and the inadequacy of protections for individuals at risk of particular harm.
The EHRC’s full submission to the UN is available to read.