The Wall Street Journal reports that companies are using the UK’s omnipresent security cameras as cultural permission to bring facial-recognition tech to semi-public spaces, tracking not only criminal history but also ethnicity and other personal traits. “Retailers, property firms and casinos are all taking advantage of Britain’s general comfort with surveillance to deploy their own cameras paired with live facial-recognition technology,” writes Parmy Olson for the Journal ($). “Companies are also now using watch lists compiled by vendors that can help recognize flagged people who set foot on company property.” For example:

Some outlets of Budgens, a chain of independently owned convenience stores, have been using facial-recognition technology provided by Facewatch Ltd. for more than a year. Facewatch charges retailers for the use of a computer and software that can track the demographics of people entering a store, including their ethnicity, and screen for a watch list of suspected thieves through any modern CCTV camera. The system works by sending an alert to a staff member’s laptop or mobile device after detecting a face on the watch list. Retailers then decide how to proceed.

Why this matters

  1. Assumptions about appropriate (or even inevitable) uses of tech become normalized quickly. As constant surveillance becomes the everyday, it’s all too easy to become resigned or indifferent as that surveillance deepens. Once the cultural foundation for a new technology sets, it’s difficult to change the associated expectations and assumptions—or see the status quo as anything other than inevitable, “just the way things work.” We see it in the decades-long expectation that online content is free and ad supported. We see it in the assumption that giving up personal data is just table stakes for using the internet. And now, with surveillance cameras—at least in the UK—we may be settling into a new expectation that simply moving through the world means that we are seen, tracked, monitored in a very granular, personal way.

    The Journal suggests that the UK’s “comfort” with surveillance cameras makes it ripe for this. A 2013 survey found that Britain had the highest density of surveillance technology outside of China. Since then, the number of surveillance cameras in the UK has grown from six million to 10 million—one camera for every seven people.

  2. This anti-theft surveillance affects more than just the guilty. Facial recognition is still pretty iffy in real-world conditions, and the false positives these systems generate could lead to harassment for no good reason except that you walked into the store.

    James Lacey, a staff member at one Budgens store in Aylesbury, southern England, said the system can ping his phone between one and 10 times a day. People have been known to steal large quantities of meat from the store’s refrigeration aisle when staff members are in the stock room, he said. The new system has helped, he said, though about a quarter of alerts are false. A spokesman for Facewatch said a maximum of 15% of alerts are false positives, based on its own analysis.

    (Related: an ACLU study in 2018 found that Amazon’s facial-recognition service incorrectly matched the faces of 28 members of Congress to criminal mugshots.)
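    Even a small per-face false-match rate can make false alerts common, because the vast majority of faces a camera sees are innocent. Here’s a back-of-envelope sketch of that base-rate effect—all the numbers are illustrative assumptions, not figures from the article:

    ```python
    # Base-rate sketch: why a tiny per-face false-match rate still
    # yields a large share of false alerts. All values below are
    # hypothetical assumptions for illustration.

    daily_visitors = 1000        # assumed store foot traffic per day
    false_match_rate = 0.001     # assumed 0.1% chance an innocent face triggers a match
    true_matches_per_day = 3     # assumed genuine watch-list visits per day

    false_alerts = daily_visitors * false_match_rate  # expected false alerts/day
    total_alerts = false_alerts + true_matches_per_day
    share_false = false_alerts / total_alerts

    print(f"Expected false alerts per day: {false_alerts:.1f}")
    print(f"Share of alerts that are false: {share_false:.0%}")
    ```

    With these made-up inputs, a quarter of all alerts are false—roughly in line with the staffer’s experience quoted above—even though only one visitor in a thousand is misidentified.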

  3. Automated identification has implications beyond crime prevention. What’s OK for these corporate systems to track in the first place? Gender? Race and ethnicity? Income? Browser history? Social relationships? Voting record? Sexual preference? The folks at Facewatch promise vaguely that tracking ethnicity “can help retailers understand their marketplace.” This smacks of a shrugging sensibility that “we can do it, so why wouldn’t we?” And that’s the worst reason to use a technology.

  4. Regulation is evolving, but remains vague and often unenforced. Europe’s well-intentioned privacy regulation, the GDPR, puts facial and other biometric data in a special category that requires a company to have a “substantial public interest” in capturing and storing it. That’s fuzzy enough that it arguably allows companies to use the technology to fight crime. Tracking ethnicity to “help retailers understand their marketplace” seems like less of a slam dunk. There is also a gray area around how long businesses can hold on to such footage, or use it for other business purposes.

We should adopt a position on this stuff both culturally and civically. If we don’t, the technology will decide for us. What will your company’s position be? And how about you? What’s your stance as a practitioner designing the technology that will set the behaviors and expectations of the next generation?