Why Video Surveillance Is Not a Good Crime-Fighting Tool

The documentary film “Coded Bias” warns of the racial and gender biases in facial recognition software and the dangers of its use.

In February, the City of Minneapolis joined a growing number of cities and states that have banned their law enforcement and other government entities from using facial recognition software (FRS). Virginia's legislature also passed a bill in February banning FRS use by local and campus police (but not the state police).

Meanwhile, in Brooklyn Park, Minnesota, two of the seven candidates running for mayor in a special primary election on April 13 — City Council Member Boyd Morson and Benjamin Osemenam — are advocating for widespread video surveillance as a crime-fighting tool.

Morson, for example, has said in two virtual forums that he wants a surveillance system modeled after Project Green Light, a Detroit program in which businesses and property owners pay for cameras that send 24-hour-a-day video to a police monitoring center. There are over 700 Project Green Light camera locations throughout Detroit. Street intersection cameras are being added to the system, funded by taxes.

Project Green Light employs facial recognition software. In 2019 and 2020, the Detroit Police Department arrested and jailed two innocent Black men based on false FRS identifications. According to a 2020 Vice.com article, Detroit's chief of police admitted that the facial recognition software used by Project Green Light, provided by DataWorks, is inaccurate “96 percent of the time.”

A BuzzFeed News investigation found that Clearview AI, another FRS application, has sneaked past bans into law enforcement departments and government agencies from the local to the federal level. From 2018 to 2020, Clearview AI offered a free trial period of unlimited searches, during which organizations and individual government employees — some of whom may not have had approval from their superiors — performed about 340,000 searches.

These organizations included the Minneapolis Police Department, the University of Minnesota Police Department, the Minnesota Commerce Fraud Bureau, and the Minnesota Fusion Center. During Clearview AI’s free trial period, employees at those organizations performed an estimated 174 to 700 cumulative FRS searches.

The University of Minnesota Police Department told BuzzFeed News: “While some individual officers may have been offered trials of the software in the past, use of the program was not and is not part of regular business operations.” The Minnesota Commerce Fraud Bureau said it “evaluated the software, but did not purchase and did not implement it.”

The Minneapolis Police Department and the Minnesota Fusion Center did not respond to BuzzFeed News’s inquiry about their use of Clearview AI’s free trial.

The documentary film “Coded Bias” — released in 2020, aired on PBS this year, and now available on Netflix — warns of the racial and gender biases in facial recognition software and the dangers of pairing it with mass surveillance by governments and private businesses, a pairing often justified as crime prevention.

Using FRS technology when some police chiefs concede its inaccuracy, and when that inaccuracy leads to innocent people being arrested, indicates to me that public safety is not its true purpose. I am not the only one who has come to this conclusion: “Coded Bias” shows a man in London, England — a city with half a million CCTV surveillance cameras — being stopped and fined for “disorderly behavior” after covering his face to keep it from being recorded by an FRS-augmented camera. A member of the organization Big Brother Watch UK, who was protesting against FRS surveillance, tried to defend the man’s right not to be recorded.

Big Brother Watch UK is among a growing number of activist groups, scholars, scientists, and others who assert that FRS paired with surveillance is less a crime deterrent than a means of privacy-violating information gathering for social control and profit. The data created from FRS surveillance is purchased by both government and commercial entities to feed gender- and racially biased algorithms that profile people. These algorithms not only can lead to false arrests, such as those in Detroit, but also can be used to deny someone credit or a job.

Furthermore, research reports indicate that claims that mass surveillance systems like Project Green Light reduce crime are dubious. Crime in the U.S. overall has been decreasing during the same period in which the Detroit Police Department credits Project Green Light with reducing crime, and no studies have compared Detroit to other cities without the same level of mass surveillance. A 2011 Urban Institute study of live-streamed surveillance systems in Baltimore, Chicago, and Washington, D.C., found mixed results, with crime declining in some neighborhoods and areas but not in others.

A surveillance video camera that passively records and stores footage for later viewing can be useful for solving a crime after it has been committed. However, mass deployment of live-streamed, 24/7 video surveillance has not been proven to prevent or reduce crime.

Surveillance is premised on someone watching, but Project Green Light does not have the personnel to monitor every live feed from every camera in the program, and law enforcement agencies generally lack the staff to watch live video from hundreds of cameras 24 hours a day. The “solution” to this dilemma is monitoring by facial recognition software and algorithms that track, identify, and profile people — criminals and law-abiding citizens, innocent or guilty.

This creates an environment of policing by artificial intelligence, programmed with all the racial, gender, and class biases of our society. Everyone is subject to search and seizure of their image and information without a warrant. And, as with the Londoner who tried to hide his face from the video camera, refusal to comply is swiftly punished. 

