Racist AI: New Podcast Episode Exposes Horrors of Surveillance Tech in Policing
Mounting evidence proves surveillance tech is racist. So why won't anyone stop it?
News Beat is a multi-award-winning podcast that melds hard-hitting journalism with hip-hop to inform, educate, and inspire.
Welcome back! Our latest episode is a disturbing, yet critically important examination of how police departments nationwide are adopting rapidly expanding surveillance technologies for investigative purposes, despite glaring flaws.
We feature the stories of two Black men from the greater Detroit area caught up in America’s ever-growing cyber-dragnet—cases made exponentially worse by a troubling combination of shoddy police work and discriminatory tech. Robert Williams and Michael Oliver were arrested within six months of one another. They were both innocent, yet a computer declared them guilty.
The phone call was so disturbing that Robert Williams, a father of two from a Detroit suburb, simply hung up. The police? Demanding he meet them at the police station? In Detroit? It was bizarre and, in Williams’ calculation, certainly a prank.
When Williams arrived home from work at 5:22 p.m., he found officers from the Detroit Police Department waiting outside. After he pulled into his driveway, police wasted little time arresting him in connection with a jewelry store robbery.
The case unraveled faster than a poorly tied ball of yarn. The so-called “evidence” police used to justify the arrest was a facial recognition scan based on a grainy surveillance photo taken from the robbed store. The image hardly resembled Williams—and that’s what he told the police.
“I hope you do not think all Black people look alike,” he scoffed.
It was so egregious that an officer purportedly admitted the error.
“Oh, I guess the computer got it wrong,” he told Williams, according to court documents in the Michigan father’s lawsuit against the Detroit Police Department.
It’s difficult to track how many wrongful arrests occur each year. However, experts estimate that thousands of innocent people are charged annually simply due to the scale of mass incarceration in America. Those who are wrongfully convicted are disproportionately Black. Now we’re learning that facial recognition software also has a race problem.
Why We Covered This Topic
Williams’ wrongful arrest is unique because it puts him at the center of a brewing debate over the use of facial recognition software and other surveillance tools. Opponents of this technology say they’re concerned about its privacy implications, its accuracy, and the extent to which it exacerbates real-life racial biases.
All three known wrongful arrest cases tied to facial recognition share something in common: The falsely accused are all Black men—and we spoke to one of them directly.
Six months before Williams’ arrest, then-26-year-old Michael Oliver, also from the Detroit area, was arrested for an alleged phone theft while driving to work. As in Williams’ case, a facial recognition scan was used to justify Oliver’s apprehension. But unlike Williams, it took more than a month for the police to drop the charges. For Oliver, who was supporting his family, the damage was immediate and severe.
“I lost my job. I lost my car, and I was behind on rent for months,” Oliver tells News Beat in an interview for the podcast. “So it was really, really kind of hard for me.”
Flawed technology combined with shoddy police work can have serious and potentially long-lasting implications for people who are wrongfully arrested or convicted.
But that’s not the only reason we produced this episode.
The emergence of facial recognition technology in society—especially in policing—raises very serious privacy concerns for everyone.
Here’s how one of our guests, Andrew Ferguson, described being in New York City:
“Even in places like the financial district, you are under the eye of the Domain Awareness System. The Domain Awareness System is a product of Microsoft and the New York Police Department, NYPD, where they have basically built a network, system of cameras that can literally track you place to place. They also can roll back the tape to find out where you've been. Imagine you're wearing, you know, a bright pink shirt and a red hat, and you’re carrying a black bag, well, they can roll back the tape and find all the people wearing those pink shirts and red hats.
They can find out where you came from, they can find out where you went. If you got into a car, they can run an automated license plate reader. If you are carrying a cell phone, your cell phone is giving up cell phone signals that are going to track you. If there are apps on your phone, which there are, they can use those apps to do it. If you are connected through WiFi—maybe you want to plug into one of the free WiFi’s or even use another WiFi—it's another avenue into not only the information in your cell phone, but also your contacts and the way you can get connected.”
What You’ll Learn in This Episode
Facial recognition and other surveillance technologies pose a significant risk to our civil liberties and expectation of privacy.
There’s a greater risk of misidentification of Black and Brown people because of inherent biases within the technology itself.
Experts are growing increasingly concerned that Black and Brown communities will be disproportionately targeted by this technology, simply because of the history of policing in America.
Anyone with a government-issued ID, such as a driver’s license or passport, can already be in a facial recognition database. Law enforcement leverages mug shot databases when trying to ID someone using this technology.
Facial recognition technology can be used in conjunction with real-time surveillance to track a person and their associates and learn their behaviors.
Who We Interviewed & What They Said
Andrew Ferguson, a law professor at American University, Washington College of Law, and author of the book “The Rise of Big Data Policing: Surveillance, Race and the Future of Law Enforcement”
“You take that sort-of technology that is filled with errors and bias. And you apply it in the real world to a policing system that is also filled with errors and bias. And you couple a broken or a problematic technology with a broken and problematic policing system in America, and you're just asking for trouble. You're asking for injustice. You're asking for mistaken identifications.”
Clare Garvie, a senior associate with the Center on Privacy and Technology, a think-tank based at Georgetown University Law Center
“Face recognition is a unique technology in a couple of ways. The first and main way is that it enables remote biometric surveillance—this is something that's never been possible before, the ability to monitor people's identities and whereabouts and associations, using where their face shows up on camera. Law enforcement or proponents of face recognition like to say, ‘This is not a problem, we've always been able to monitor or take videos and photographs at protests.’ But imagine if instead of taking a video at a protest, officers were somehow able to walk through a protest and access everybody's driver's licenses, look at who they are, who they're talking to, maybe who's the organizer. That would not be allowed under the First Amendment. And so face recognition falls into this gray area where it very much may start impacting our right to privacy, our right to free speech and free association under the First Amendment in ways that we really have not contemplated before.”
Phil Mayor, senior staff attorney at the American Civil Liberties Union of Michigan
“We have a moment here at the relative dawn of this technology to say ‘Enough, not in our name, not to watch us.’ Let's spend our resources supporting and developing our communities rather than surveilling them. And if anybody thinks that it's stopping now, with the states that they currently do it and the communities that currently do it, if we don't lay a strong marker and say, ‘We do not wish to be watched everywhere we go, identified everywhere we go, evaluated everywhere we go,’ then, again, it's going to be very difficult to turn back later.”
Michael Oliver’s lawyer, David Robinson, a Detroit-based attorney for the law firm Robinson & Associates
“We're talking in Michael’s case and Robert’s case a loss of freedom. Yes, you can imagine facial recognition and this technology going to applications on the street, in real time, circumstances where it feeds into images at the scene of a traffic stop. And, you know, John Law goes over this and believes based on this flawed technology that he has some real desperado. It literally could lead to a greater deprivation than freedom. It could lead to a deprivation of life.”
Michael Oliver, a Michigan man wrongfully arrested for a crime based on facial recognition technology
“I was working, going to work every day. I was paying bills. I was a big help to my little circle, my little family that I got, that was dependent on me. I was paying bills. I had a car, you know, going to work every day. I lost my job. I lost my car, and I was behind on rent for months. So it was really, really kind of hard for me.”
The one-and-only Silent Knight, a prolific hip-hop recording artist, News Beat’s artist-in-residence, and—thanks to a previous episode he performed on—an Ambassador for the nonprofit Innocence Project, which uses DNA evidence to exonerate the wrongly convicted
Silent Knight has a reputation as one of the hardest-working artists in independent hip-hop. And it’s easy to see why—from serving as emcee of The Band Called FUSE, to hosting and curating their Line Up showcase in NYC, to releasing almost a dozen albums independently in less than 10 years. Charismatic yet humble, SK’s genuineness comes across in his music and his energetic, engaging stage show.
Resources Mentioned in This Episode
Study finds gender and skin-type bias in commercial artificial-intelligence systems
“Garbage In, Garbage Out” report by Clare Garvie examining flawed data and facial recognition technology
MIT researcher Joy Buolamwini’s viral TED Talk “How I’m Fighting Bias in Algorithms”
“The Rise of Big Data Policing: Surveillance, Race and the Future of Law Enforcement” by Andrew Ferguson
Before You Go!
Subscribe to our Substack to ensure you receive information about episodes, relevant stories, and more.
You can listen to News Beat on your favorite podcast app. The button below will enable you to subscribe wherever you listen to pods.
Please share our Substack to help build this community and support independent media.
NEWS BEAT TEAM
Producer / Audio Editor / Host: Michael "Manny Faces" Conforti
Producer / Editor-In-Chief: Christopher Twarowski
Producer / Managing Editor: Rashed Mian
Executive Producer: Jed Morey
News Beat is a Morey Creative Studios and Manny Faces Media production.