Review of Shalini Kantayya’s Coded Bias


Review by Mitch Carr
Developmentally Edited by Alexandra Hidalgo

Copy Edited and Posted by Iliana Cosme-Brooks

Coded Bias (2020). 1 h 30 min. Written, Directed, and Produced by Shalini Kantayya. Starring Joy Buolamwini, Cathy O’Neil, and Meredith Broussard.

An image of Buolamwini sitting in front of a screen and keyboard, holding up a white mask in front of her face. The text "face detected" is visible at the bottom of the screen. A constellation-esque map of blue lines and red dots is superimposed on top of her masked face.

Where is the line between technology that helps and technology that harms? Depending on your physical appearance, the answer may be different for you. The documentary Coded Bias, directed by Shalini Kantayya, tells the story of MIT Media Lab researcher Joy Buolamwini, who discovers a racially biased algorithm during her research. Racial biases are a form of implicit bias, which can be defined as attitudes, stereotypes, and actions that affect an individual's understanding and decisions in an unconscious (or conscious) manner. Coded Bias argues that finding these biases within coded technology means that coders embed their own implicit biases in the technology they build. Because of the (sub)conscious biases encoded within technology, people of different races and genders experience technology differently. For Buolamwini, it was facial recognition software that failed to recognize her face because of her dark skin. Using Buolamwini's unsettling discovery as a thread throughout the film, Kantayya knits together numerous national and international experiences of bigoted technologies to push audiences toward second-guessing technology.

Buolamwini's startling discovery of racial bias in facial recognition algorithms puts a spotlight on an unseen defect in modern technologies. Interview shots piece together a narrative of Buolamwini living her childhood dream of being a researcher at MIT. However, when she tried to create an Aspire Mirror, a mirror that would superimpose a famous person's face over yours to offer you inspiring quotes, she realized it might have been more of a nightmare. Kantayya shows the audience how the facial recognition algorithm did not detect Buolamwini's dark-skinned face when she looked into the mirror. The shot continues as Buolamwini puts on a white face mask, and the mirror's algorithm kicks in as it was programmed to do. Something was wrong. Kantayya and other fierce, social-justice-driven women lead the charge for ethical uses of technology. This fast-moving film makes sharp use of classic sci-fi visual concepts to depict how we are living in the technology-driven world that, for decades, so many of us dreamed of but also feared. The music score (from composer Katya Mihailova) is meant to metaphorically (we hope) raise the audience's blood pressure, reminding us that the struggle between humans and machines grows messier the larger the role technology plays in our lives.

This documentary pushes for legislative change to bring social justice and equity to our technologies. Through Buolamwini's discovery and Kantayya's approach to telling her story, social justice for machine learning technologies and algorithms feels attainable. As Devika Girish notes in the New York Times, Kantayya's eye for detail shows that action is possible. Through this lens, the documentary focuses on people's failings and vulnerabilities, and ultimately on how they (hopefully) use their powers for good.

A Houston school district implemented a new way to conduct teachers' annual reviews. The process went from an in-person committee review to machine-evaluated classroom statistics and scores. A teacher featured in Coded Bias recounts receiving an arbitrarily poor algorithmic evaluation during his most recent review. The school district ignored his many years of experience and awards and fired him based on the algorithm's suggestion.

A watchdog group in London challenged the police's use of A.I.-based facial recognition technology that often misidentifies and racially profiles pedestrians. This watchdog group fought the London police's unjust use of technology when the police detained a 14-year-old Black student in an alley near a London tube station to be searched and interrogated. When the watchdog group intervened, the police fingerprinted the student and learned that the technology had misidentified him.

After presenting these examples, Kantayya returns to Buolamwini's experience of biased technology. Buolamwini's efforts achieved the first legislative changes to fight biased artificial intelligence. The political implications of technology being used in elections appear briefly in the film, just enough to resonate with the audience. Kantayya shows how technology is not immune to prejudice and discriminatory practices. While watching, you will (re)think your use of technology. Buolamwini's discovery of racist algorithms makes the audience feel as if they're living through an episode of Black Mirror. However, in this episode, justice is attainable. Despite the discomfort the film will put you through, this is a must-watch documentary for understanding the gatekeeping power of (un)conscious technical biases.

Coded Bias is a documentary written and directed by a woman. Not only does the film promote tangible change, it also amplifies women and their voices. In a world growing ever more dependent on technology, it is crucial to understand the roles digital technologies play in our lives. Coded Bias is a trailblazing documentary that helps us face the consequences of the evolving world of technology we are all so deeply enmeshed in.

You can stream Coded Bias on Netflix. Connect with Mitch by visiting his profile.