Are Promises to Halt Police Facial Tracking Just a Tech Sleight of Hand?

Calls to stop, or at least severely regulate, police use of facial recognition software are all well and good, but we can’t let the tech companies play sleight of hand with us: agreeing to bow out of police surveillance while at the same time ramping up a multitude of other forms of invasive tracking and spying with the same technologies. Facial recognition software poses far deeper threats to our lives, values, and democracy than just its use by the police.

Law enforcement’s use of facial recognition cameras and software is attracting a lot of attention right now from the media, the public, and politicians. We’re hearing about innocent people being falsely identified as criminals. We’re hearing about police using neighbors’ front-door cameras to track criminals, causing us to wonder whether we’re being watched as we innocently walk our dogs. Lately we are even hearing about big tech companies like IBM and Microsoft suddenly refusing to provide police with facial tracking software because of concerns about its accuracy and potential for abuse.

In 2018, the American Civil Liberties Union (ACLU) found that facial recognition software Amazon sold to police incorrectly matched 28 members of Congress to faces picked from 25,000 public mugshots. Now the ACLU is demanding lawmakers take legislative action after a black man in Detroit was wrongfully arrested for shoplifting, because facial recognition can’t tell black men apart.

I’ve written here previously about how facial recognition algorithms are written by white men who can’t seem to imagine or model people of color, or women either – a cause for alarm on multiple fronts. Study after study reports biases in the algorithms (formulas) that power this software. And it’s not just people of color and women the software stumbles on: studies show errors tracking Asians, children, and the elderly as well. Basically, anyone who is not a 20-to-40-something white male.
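What do those studies actually measure? At bottom, something simple: how often the software wrongly declares two different people a match, broken down by demographic group. Here is a minimal sketch in Python of that bookkeeping – the group labels and results below are made-up placeholders for illustration, not data from any real audit.

```python
# Minimal sketch of a demographic bias audit for a face-matching system.
# Every group label and record below is hypothetical, for illustration only.
from collections import defaultdict

# Each record: (demographic_group, system_said_match, truly_same_person)
results = [
    ("white_male", False, False), ("white_male", False, False),
    ("black_male", True, False), ("black_male", False, False),
    ("asian_female", True, False), ("asian_female", False, False),
]

false_matches = defaultdict(int)  # wrongful "same person" calls per group
comparisons = defaultdict(int)    # different-person pairs tested per group

for group, said_match, same_person in results:
    if not same_person:           # only different-person pairs can yield a false match
        comparisons[group] += 1
        if said_match:            # the software wrongly declared a match
            false_matches[group] += 1

for group in comparisons:
    rate = false_matches[group] / comparisons[group]
    print(f"{group}: false-match rate {rate:.0%}")
```

When audits run this arithmetic on real systems, the false-match rate is consistently and dramatically higher for the groups the developers apparently never modeled.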

A while back I also wrote about a car slamming into the side of a white truck, killing the driver, because the car’s cameras couldn’t distinguish the white of the truck from the bright sky. But that was a car-piloting camera, not facial recognition, you might protest. Well, the fact is, all this stuff is related: if a camera can’t distinguish the white side of a gargantuan hunk of moving metal from rays of sunshine, how can we expect it to detect the often fine differences between one nose and another, one blue eye and another, a woman and a man? What makes a woman a woman, or a person of color a person of color, after all? Which stereotypes are the programmers folding into the algorithms they write to power this facial recognition software? Does she have to be wearing a dress to elicit “gender: F”? How light or dark does her skin have to be to register as a person of color to an algorithm?

“Lawmakers need to stop allowing law enforcement to test their latest tools on our communities, where real people suffer real-life consequences,” declared ACLU senior legislative counsel Neema Singh Guliani. “It’s past time for lawmakers to prevent the continued use of this technology. What happened to the Williams family [when Robert Williams was incorrectly ID’d by facial software and arrested], should never happen again.”

So, the guy got released, and they apologized. No big deal, it was a mistake, right? Wrong. Listen to Mr. Robert Williams’s own words:

Even if this technology does become accurate (at the expense of people like me), I don’t want my daughters’ faces to be part of some government database. I don’t want cops showing up at their door because they were recorded at a protest the government didn’t like. I don’t want this technology automating and worsening the racist policies we’re protesting. I don’t want them to have a police record for something they didn’t do—like I now do.

In response to all this, IBM CEO Arvind Krishna announced in a June 2020 letter to Congress that the company will no longer offer general-purpose facial recognition or analysis software, nor will it continue to develop or research the technology. Following IBM’s lead, Microsoft and Amazon have also backed away from supporting police use of facial tracking.

– – –

Today over a quarter of law enforcement agencies are using facial recognition systems and databases. How many men and women have already been falsely arrested, or sit in police databases? The call from public groups, citizen activists, and the ACLU is justified and must be heeded. We are not supposed to be a police state, but a democracy, and democracy includes the rights to privacy, free speech, and assembly, and freedom from harassment.

Yes, we need to ban the use of facial recognition, but once the public realizes what else facial recognition is being used for, their ban list will get longer. And once they read between the lines of the big tech facial recognition banning announcements, longer still . . . .

A careful read of the “bold” statements by IBM and the others about banning facial recognition software uncovers the fact that they make no mention of abandoning facial recognition for purposes other than police use.

IBM CEO Krishna said in his letter to Congress: “We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.” (the emphasis on “domestic law enforcement agencies” is mine)

Krishna did not mention overseas use of the technology. He did not mention use by private businesses of facial technology. He did not mention retail business use of facial technology.

Remember way back in January 2020, when Facebook agreed to pay $550 million to settle a class action suit over its illegal use of facial recognition software on its own platform? Meanwhile, another company, Clearview AI, is under investigation because it built a facial recognition database – to sell to both police and private industry – from three billion images it scraped from social media sites. What other platforms and websites are you using that are collecting images of you, what are they collecting them for, and who are they giving (selling?) them to?

When you post a photo of yourself on a social media platform, you often do so under a Creative Commons license – meaning that, while you keep the copyright, you are granting other users on that platform broad rights to reuse said image of yourself. But did you also know that a third-party company – say, Amazon, with its Rekognition service (creepy name, huh? but it is real) – could come along and scoop up your photo to put in its own database, one it is using to “train” computers to recognize faces? My guess is you did not know this.

IBM explained to tech media outlet The Verge that, due to the Creative Commons licenses on the photo-sharing platform Flickr, it was legally entitled to help itself to millions of pictures of you and your fellow citizens, and that, if you didn’t like it, you had the right to opt out. But did you know that when you uploaded your selfies – either that they might be used by another company, for any reason, or that you had the opportunity to opt out of such use? Apparently it was written somewhere in the 5 trillion lines of fine print that you never read. Shame on you.

So, yeah, let’s talk about some of the “other uses” of facial recognition – the ones outside of law enforcement. IBM and the others aren’t just tracking you for the police, you see. And it’s important that you know this, and that we have a public dialogue about this, too – that we not let tech companies distract us with promises to stop law enforcement’s mass surveillance of us while they are simultaneously tracking us in other, just as dangerous and insidious, ways.

On New Year’s Day 2020, tens of thousands of college football fans poured into the Rose Bowl stadium in Pasadena to watch Oregon and Wisconsin vie for the coveted annual prize that kicks off every new year in the US. What those fans did not realize was that while they were there as the spectators, someone else was spectating them.

As the “spectators” entered the stadium, a large video screen, carefully designed to attract their gazes, gathered data on their faces, gender, and age, and on whether they matched a list of suspicious persons – using facial recognition as well as other data-analytics software. Even if the sole purpose of this scan had been to identify terrorists in the crowd (which it was not), it may have been illegal, especially due to the surreptitious way the information was gathered. The cameras were hidden within the video screen, and no one was notified that they were being scanned. But take a look at the name of the company running this New Year’s Day facial invasion, VSBLTY, and the picture begins to come into better focus.

VSBLTY is an advertising tech company, or “ad-tech” as they call it – because coolness in a tech name always helps smooth any potentially ruffled feathers and makes it feel more, you know, “non-techy.” According to the news outlet OneZero, the cameras hidden in this “welcome” video collected 30,000 points of data on each person who gazed into its screen. That is a LOT of data – gender, age, race, hair color, on-any-terrorist-list, what else? 30,000 bits of data per person.
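To demystify the mechanics a little (this is not VSBLTY’s actual system, which is proprietary): a camera behind the screen runs face detection on every video frame, then hands each detected face to attribute models that guess gender, age, and so on. Here is a rough Python sketch using OpenCV’s stock face detector; the estimate_attributes function is a hypothetical placeholder for the kinds of models such a system would really run.

```python
# Sketch of the mechanics behind a camera-equipped "smart" ad screen.
# Face detection uses OpenCV's bundled Haar cascade; estimate_attributes
# is a hypothetical stand-in for the proprietary gender/age/watchlist
# models a system like this would bolt on.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def estimate_attributes(face_img):
    """Placeholder: a real deployment would run demographic and
    watchlist-matching models here, producing thousands of data points."""
    return {"gender": "unknown", "age": "unknown", "watchlist_hit": False}

cap = cv2.VideoCapture(0)  # stand-in for the camera hidden in the screen
profiles = []

for _ in range(300):       # sample roughly ten seconds of video at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1,
                                                  minNeighbors=5):
        profiles.append(estimate_attributes(frame[y:y+h, x:x+w]))

cap.release()
print(f"Logged {len(profiles)} face sightings")
```

A couple of dozen lines of off-the-shelf code, in other words, gets you from “video screen” to “face-by-face profiling log.” The expensive part is the attribute models – and that is exactly the part ad-tech companies are selling.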

To hear VSBLTY, the Rose Bowl, or any other customer of this tracking “service” talk, secretly tracking human beings and compiling tens of thousands of data points on each of them is the greatest thing since sliced bread. This tech innovation will allow advertisers to know exactly how to target each individual with the perfect ads, once they know enough about each of us. There is even a new name for this in advertising and marketing – “personalization.”

That is what VSBLTY was doing at the Rose Bowl last New Year’s Day – a demonstration, a proof of concept, a test of this new chapter in facial recognition: the merging of facial recognition and personal datapoint tracking and analysis, in order to provide businesses and customers with the absolutely most perfect, most personalized advertising experience possible. Drum rolls and violins, please.

Future shopping experiences will not need to offer messy, confusing choices, or shelf upon shelf of abundant selection, because VSBLTY already knows who you are and what you want – and will just tell you, and have it waiting at the exit for you. And, by the way, as an extra bonus, you’ll be able to walk through the event, museum, shopping mall, store, mega concert . . . . fill in the blank of your favorite public venue . . . with no fear of terrorists, thanks to the fact that the facial tracking software is also linked to a robotic police force, which will be weeding the terrorists out as you make your way through the venue.

Our faces are being tracked, our location is being tracked, what we buy is being tracked, when and where we buy it is being tracked, our habits are being tracked, what we watch is being tracked, even what we look at is being tracked.

The police, FBI, and other government agencies may not legally be able to track us, facially or otherwise, when we are lawfully gathered, and may require a warrant when we are not. Not to be deterred, they now often rely on third-party businesses to track our faces, our locations, even our tweets and Facebook posts. And the businesses that police rely on are, in most instances, not regulated. When San Francisco recently banned its police from using facial tracking software, what went largely unnoticed was that the ban did not extend to private businesses. We certainly can’t rely on these third-party businesses that build or use facial tracking to self-regulate, nor can we assume that they are going to put the public interest above their own business interests.

The Intercept reported that since the recent wave of street demonstrations, FBI agents on the Joint Terrorism Task Force have questioned protest organizers at their homes – sometimes within hours of their having posted an event on social media. How do we think the FBI got their addresses and knew of their organizing roles – did agents really get a warrant that quickly? No, they use third-party apps – unregulated third-party apps. Who needs messy regulations?

IBM’s CEO Krishna is correct – we need deliberative dialogue, we need public oversight, and we need laws to protect the privacy and civil rights of all Americans when it comes to facial or any other kind of tracking. But the difference between what Krishna and big tech are telling you and what I am telling you is this: we need all of that for tracking by the police and everyone else!
