UK police and companies must stop using live facial recognition for public surveillance, a letter from politicians and campaigners has said.
It involves faces captured on CCTV being checked in real time against watch lists, often compiled by police.
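In outline, that matching step can be sketched as follows. This is a minimal, hypothetical illustration in Python — not any vendor's or police force's actual system: each captured face is reduced to a numeric "embedding", and an alert is raised if it is close enough to an embedding on the watch list. The names, threshold and tiny 3-number embeddings are invented for illustration; real systems use embeddings with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Compare two face embeddings (lists of floats); 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(face, watchlist, threshold=0.8):
    # Return names of watch-list entries similar enough to raise an alert.
    return [name for name, emb in watchlist.items()
            if cosine_similarity(face, emb) >= threshold]

# Toy data: hypothetical 3-dimensional embeddings.
watchlist = {"suspect_a": [0.9, 0.1, 0.2], "suspect_b": [0.1, 0.9, 0.3]}
captured = [0.88, 0.12, 0.22]
print(check_against_watchlist(captured, watchlist))  # prints ['suspect_a']
```

The choice of threshold is the crux of the debate below: set it low and more innocent passers-by are wrongly flagged; set it high and genuine watch-list subjects are missed.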
Privacy campaigners say it is inaccurate, intrusive and infringes on an individual’s right to privacy.
Its makers say it helps protect the public because it can catch people such as terror suspects in ways the police otherwise could not.
The letter, written by privacy campaign group Big Brother Watch, has been signed by more than 18 politicians, including David Davis, Diane Abbott, Jo Swinson and Caroline Lucas. Twenty-five campaign groups including Amnesty International and Liberty, plus academics and barristers also signed.
They argue it is being adopted in the UK before it has been properly scrutinised by politicians.
Director of Big Brother Watch, Silkie Carlo, said: “What we’re doing is putting this to government to say: ‘Please can we open this debate and have this conversation, but for goodness sake while it is going on there is now a surveillance crisis on our hands that needs to be stopped urgently’.”
The King’s Cross estate was recently at the centre of a facial recognition controversy, after it was revealed that its owners were using the technology without the public’s knowledge.
It then emerged both the Metropolitan Police and British Transport Police had supplied the company with images for their database. Both had initially denied involvement.
South Wales Police was taken to the High Court over its trial of the technology by a man whose image was captured on camera. The court ruled the trial was lawful, although that ruling is now being appealed.
Researchers have raised concerns that some systems are vulnerable to bias, as they are more likely to misidentify women than men and darker-skinned people than others.
Areeq Chowdhury, head of the think tank Future Advocacy, said this was due to factors such as poor colour contrast on the faces of people of colour, systems being confused by cosmetics, and some systems not having been trained on sufficiently diverse datasets covering different demographics.
“You could see a situation where you are identifying innocent individuals who are from a particular minority. Which means they’ll be questioned by the police even though they’re innocent and they may even have their details and picture captured on record, despite having committed no crime.”
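The arithmetic behind that warning is simple. In this hypothetical sketch — the rates are invented for illustration and do not come from any measured system — a higher false-match rate for one demographic group means that, for the same number of people scanned, more innocent members of that group are wrongly flagged:

```python
def expected_false_alerts(people_scanned, false_matches_per_10k):
    # Expected number of innocent people wrongly flagged,
    # given a false-match rate expressed per 10,000 scans.
    return people_scanned * false_matches_per_10k // 10_000

# Hypothetical rates: group B is misidentified twice as often as group A.
alerts_a = expected_false_alerts(50_000, 10)  # 10 per 10k -> 50 wrongful alerts
alerts_b = expected_false_alerts(50_000, 20)  # 20 per 10k -> 100 wrongful alerts
```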
Digital Barriers, a worldwide supplier of the technology, says it is an essential tool for counter-terrorism.
Its CEO, Zak Doffman, said: “I know there are five individuals in central London that want to do harm on a massive scale to the public. Would you have public support to use facial recognition to try and intercept that group of individuals before they can do harm? I would suggest almost categorically you would.”
He added that he did not support indiscriminate use of the technology.
“I’ll give you the opposite example, an individual has been kicked out of the pub for drinking too much on a Saturday night. The pub has taken a photo of that individual, should they then be prevented from getting into that establishment or other establishments because of that incident? I think you’ll have very little public consent for that example.
“Unfortunately there’s no clarity. There’s no regulation that governs either case and that is the challenge.”
The UK Surveillance Camera Commissioner, Tony Porter, says there must be a set of strict standards governing how the technology is used before it is formally adopted by police forces.
“There should be a standard around its siting, efficiency and effectiveness,” he explained. “I suppose you might say, ‘What is an appropriate force hit-rate that is tolerable against the totality?’ There needs to be a lot more assurance to the public that any notion of bias through ethnic background is eradicated.”
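The commissioner’s question about a tolerable hit-rate “against the totality” can be illustrated with base-rate arithmetic. The figures below are entirely hypothetical: even a system that rarely misidentifies any individual face can, when scanning a large crowd containing very few watch-list subjects, generate alerts that are mostly wrong.

```python
# Hypothetical deployment figures, chosen only to illustrate the base-rate effect.
scanned = 100_000          # faces scanned during a deployment
on_watchlist = 10          # genuine watch-list subjects among them
true_match_rate = 0.90     # chance a real subject is correctly flagged
false_match_rate = 0.001   # chance an innocent face is wrongly flagged

true_alerts = on_watchlist * true_match_rate
false_alerts = (scanned - on_watchlist) * false_match_rate
precision = true_alerts / (true_alerts + false_alerts)
print(f"{precision:.1%} of alerts concern actual watch-list subjects")
```

Under these assumed numbers, only around 8% of alerts would concern genuine subjects — the rest would be innocent passers-by, which is the trade-off the commissioner is asking forces to quantify.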