
San Francisco bans facial recognition

July 23, 2019 | Read time: 4 mins
San Francisco has become the first US city to ban facial recognition software – and it is a move which has implications for transit agencies as well as police forces worldwide


‘Big Brother is watching you’, goes the famous saying. Well, not in San Francisco he isn’t. Legislators in the Californian city – home to the tech gold rush and embracers of all things forward-looking – have decided that, after all, there should be limits to technology’s hold over us.

By a margin of eight votes to one, the city’s Board of Supervisors has outlawed the use of facial recognition tools, passing the Stop Secret Surveillance Ordinance, a law which was authored by supervisor Aaron Peskin.

It is a move which will have obvious implications for police surveillance: many police forces are already using live facial recognition (LFR) to scan big crowds for potential troublemakers or to target criminals.

But this decision will also resonate with transit agencies: it was reported last year that Bay Area Rapid Transit (Bart) was considering the introduction of face recognition software on its cameras. That is no longer a possibility for Bart – and the decision by San Francisco’s lawmakers may form a precedent which other cities feel bound to follow.

Supporters say this strikes a much-needed blow for individual freedom in the digital age. The American Civil Liberties Union of Northern California applauded the decision for “bringing democratic oversight to surveillance technology, and for recognising that face surveillance is incompatible with a healthy democracy”.

It went on: “By passing this law, the city gave the community a seat at the table and acted decisively to protect its people from the growing danger of face recognition, a highly invasive technology that would have radically and massively expanded the government’s power to track and control people going about their daily lives.”

Not everyone agrees: supporters of LFR say it helps to keep citizens safe and can make crime detection and prevention more effective. In transportation terms, facial recognition is considered to be a useful means of helping to move people more efficiently through busy transport hubs. So-called ‘pay-by-face’ systems could eliminate the need for ticket barriers altogether.

Cubic Transportation Systems, for instance, says its ‘gateless gateline’ prototype system “integrates future ticketing technologies, such as palm vein scanning and facial recognition, including the use of biometric technology for fare validation”.

The fast-track system could help double passenger throughput: as travellers walk through a corridor, their faces are scanned and synced with their smartphones in order to pay.
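Cubic has not published the workings of its prototype, but the basic decision point in any ‘pay-by-face’ corridor can be sketched. The Python snippet below is a minimal, hypothetical illustration only: the names (FareAccount, charge_fare), the cosine-similarity matching and the 0.92 threshold are assumptions for the sake of the example, not Cubic’s API.

```python
# Hypothetical sketch of a gateless 'pay-by-face' fare check.
# A rider enrols a face template linked to their account/smartphone;
# a camera scan in the corridor is matched against enrolled templates
# and, if the match is confident enough, the fare is deducted.

from dataclasses import dataclass
from typing import List, Optional

MATCH_THRESHOLD = 0.92  # assumed similarity cut-off for accepting a match


@dataclass
class FareAccount:
    account_id: str
    face_embedding: List[float]  # enrolled template linked to the rider's phone
    balance: float


def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def charge_fare(scanned: List[float],
                accounts: List[FareAccount],
                fare: float) -> Optional[str]:
    """Match a corridor scan to an enrolled account and deduct the fare."""
    best_account, best_score = None, 0.0
    for acc in accounts:
        score = cosine_similarity(scanned, acc.face_embedding)
        if score > best_score:
            best_account, best_score = acc, score
    if best_account and best_score >= MATCH_THRESHOLD:
        best_account.balance -= fare
        return best_account.account_id  # rider walks straight through
    return None  # no confident match: fall back to a conventional tap or ticket


# Example: one enrolled rider, one corridor scan
rider = FareAccount("rider-001", [0.1, 0.9, 0.3], balance=20.0)
charged = charge_fare([0.11, 0.88, 0.31], [rider], fare=2.75)
print(charged, rider.balance)  # rider-001 17.25
```

The design point the prototype trades on is the same as in this toy version: a confident match keeps the corridor open and charges the fare, while anything below the threshold falls back to a conventional ticket or tap.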

One potential area for facial recognition involving transport would be to scan car drivers as they pass through sections of road – not for tolling purposes but to look for criminals. Yet an early attempt to do this on New York’s Robert F. Kennedy Bridge was apparently not a success. There are technical issues here, as well as privacy concerns.

Given well-publicised worries over the way that increasing amounts of personal data are being used by online behemoths such as Facebook, it is a sensitive area all round. In this atmosphere, the New York Metropolitan Transportation Authority’s wheeze to scare fare-dodgers by using a video feed in one of its stations attracted negative comment from travellers on social media. The agency put a monitor, urging people to pay and carrying the words ‘Recording In Progress’, in the Times Square subway station – but not everyone was impressed.

Problems with facial recognition

Civil liberties group Big Brother Watch said last year that the London Metropolitan Police’s use of LFR in public spaces was “98% inaccurate – it identified people correctly only 2% of the time”. Elsewhere in the UK, South Wales Police’s LFR “was inaccurate 91% of the time and had resulted in the misidentification of 2,451 people”.
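To put those percentages in context, a rough back-of-envelope reading of the South Wales figures looks like this. The calculation below is an illustration based only on the two numbers quoted above (91% inaccuracy, 2,451 misidentifications), not on official force data.

```python
# Implied totals behind the South Wales Police figures quoted above
# (illustrative arithmetic only, derived from the two reported numbers).

false_matches = 2451        # people misidentified, per Big Brother Watch
inaccuracy_rate = 0.91      # share of flagged matches that were wrong

total_matches = false_matches / inaccuracy_rate
true_matches = total_matches - false_matches

print(f"Implied total flags:   {total_matches:,.0f}")   # roughly 2,690
print(f"Implied correct flags: {true_matches:,.0f}")    # roughly 240
```

In other words, for every person correctly identified, the quoted figures imply around ten people were flagged in error.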

In an interim report earlier this year, the UK government’s Biometrics and Forensics Ethics Group concluded: “There are a number of questions about: the accuracy of LFR technology; its potential for biased outputs and biased decision-making on the part of system operators; and an ambiguity about the nature of current deployments.”

