“Coded Bias”

Film shows that technology is not free of bigotry

Thanks to the Women’s Foundation of Minnesota for supporting stories about reproductive justice. The Women’s Foundation has been listening, advocating, and supporting leaders working for racial and gender justice since 1983.

A decade ago, as a Ph.D. candidate at the Massachusetts Institute of Technology, Joy Buolamwini found that most facial recognition software (FRS) does not accurately recognize brown-skinned or feminine-featured faces. She discovered this by testing the software on her own African American female face, which FRS would not acknowledge unless she wore a white mask.

Buolamwini’s discovery is the launching point for “Coded Bias,” a film that begins by uncovering racial and gender bias in FRS and goes on to document the growing threat of invasive technologies powered by algorithms and data riddled with the prejudices of the almost entirely white, cis-male people who create and control them.

In November, “Coded Bias” had its theatrical premiere at the Metrograph in New York City. The online screening was preceded by a video panel discussion about the film featuring director Shalini Kantayya, Joy Buolamwini, mathematician and author Cathy O’Neil, anti-FRS activist Tranae Moran, and teacher Daniel Santos. The panel was moderated by WIRED magazine senior staff writer Sidney Fussell.

Buolamwini said that when she first published her findings in a thesis in 2018, there was very little public awareness of FRS, let alone of the possibility that the technology might be racially or gender biased. However, “Now people are starting to see it in the real world. A major shift in 2020 was that IBM, Microsoft, and Amazon stepped back from selling facial recognition technologies, and this is in no small part because of the ongoing movement for Black liberation. The cold-blooded murder of George Floyd, coupled with the advocacy and the research, [is] all coming together in [this] moment.”

Tranae Moran became an anti-FRS activist when, in 2018, the landlord of the Brooklyn, New York, apartment complex where she lived planned to implement FRS as a method of building security. “I had already started to question technology. This is when Snapchat started and when the iPhone X had just come out. I had recently gotten that phone with this facial recognition feature on it, and I read the terms on the iPhone, and [it] read that it would track my face ‘through time.’

“[So] when I got this notice that my building was going to get facial recognition systems, I automatically thought about the Apple terms for the iPhone on facial recognition systems. And I thought, ‘This is not going to be good at all.’”

Moran eventually organized other tenants in her complex to oppose and stop the landlord from using FRS.

In 2014, Houston, Texas, middle-school teacher Daniel Santos was informed that an educator-evaluation algorithm, developed by a private company contracted by his school district, had deemed him ineffective, threatening his continued employment. This happened a week after he was named teacher of the month at his school.

“It felt frustrating because I could see the injustice. I could witness daily the impact on my colleagues, on teachers, the demoralization,” Santos said. “It radicalized me. I experienced the injustice. I was faced with the notion [of] believing this [algorithm that said] I’m a bad teacher, when I could see all that I was able to contribute to my students and their lives.”

Santos and other teachers, along with their union, sued the school district, which paid a settlement in 2017 and ended the use of the algorithm.

Cathy O’Neil is the author of “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.” She contends that before FRS, algorithms, and other technologies are released for use on the public, they need to be vetted to ensure they don’t break any existing laws — especially anti-discrimination laws — or cause any harm. “Just as drugs have to be [proven as] safe and effective before the FDA approves them, I claim that algorithms should have to prove that they are safe and effective.”

“Coded Bias” director Shalini Kantayya praised the scholarship, activism, and heroism of her fellow panelists. “Because of the work of the people on this panel and the people in my film, and so many others, we have a real ‘moonshot moment’ to call for more ethics in technology.”

“Coded Bias” is screening online through a variety of theaters across the country. For more information, go to codedbias.com.


Stephani Maari Booker (she/her), author of “Secret Insurrection: Stories from a Novel of a Future Time,” writes nonfiction, speculative fiction, erotic fiction, and poetry. goodreads.com/athenapm


Editor’s Note

We are featuring Stephani Maari Booker’s “Judie-Junkie Blues” story in monthly installments. Find the first installment here.

Related Reading

“The Prison Cell Is Everywhere,” by Stephani Maari Booker, about a Walker Art Center exhibition
