“Coded Bias” Documentary Reflection
Coded Bias illustrates the issues of personal information and privacy, as well as the racism and sexism embedded within the evolving industry of artificial intelligence and facial recognition software. MIT researcher Joy Buolamwini opens by demonstrating how a facial recognition program quite literally requires her to wear a white mask before it will fully detect her face. Seeing this, I was surprised by how blatantly racist the system was, but her explanation makes full sense: artificial intelligence learns from the data it is given, and since the technology industry is dominated by white men, women, people of colour, and other minority groups are recognized with the lowest accuracy.
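The mechanism Buolamwini describes can be pictured with a small, purely hypothetical sketch: train a single model on data where one group vastly outnumbers another and the two groups follow slightly different underlying patterns, then compare accuracy per group. Everything below (the synthetic data, the group labels, the logistic regression model) is an illustrative assumption, not the actual software shown in the film.

```python
# A minimal, synthetic illustration of how skewed training data can produce
# unequal accuracy across groups. All data and names here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, boundary):
    """Generate 2-D points; labels follow a group-specific decision rule."""
    X = rng.normal(size=(n, 2))
    y = (X @ boundary > 0).astype(int)
    return X, y

# Group A dominates the training set (90%); group B is underrepresented (10%),
# and its underlying pattern differs slightly from group A's.
Xa_train, ya_train = make_group(9000, np.array([1.0, 1.0]))
Xb_train, yb_train = make_group(1000, np.array([1.0, -1.0]))

X_train = np.vstack([Xa_train, Xb_train])
y_train = np.concatenate([ya_train, yb_train])

model = LogisticRegression().fit(X_train, y_train)

# Evaluate on fresh samples from each group.
Xa_test, ya_test = make_group(2000, np.array([1.0, 1.0]))
Xb_test, yb_test = make_group(2000, np.array([1.0, -1.0]))

print("accuracy on majority group A:", accuracy_score(ya_test, model.predict(Xa_test)))
print("accuracy on minority group B:", accuracy_score(yb_test, model.predict(Xb_test)))
# The model fits the majority pattern and misclassifies far more of group B,
# mirroring how underrepresentation in training data lowers accuracy for minorities.
```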
In another real-life example, a high-tech company using AI as a first pass over resume applications was shown to actively reject female applicants from continuing in the hiring process. This goes to show how a system that learns from society’s flawed patterns becomes skewed itself and continues the cycle. I don’t believe these systems are inherently bad, but without societal change in how we view roles within the industry, the “coded bias” will continue to discriminate.
Coded Bias also addresses the profile of personal information gathered from everyone on the web. Before watching the documentary I was aware that many large companies, such as social media platforms and Google, collect personal information, but I think it’s important to recognize how these companies share your information for things like targeted advertising. The film definitely highlighted how little I know about companies’ data policies and their terms and conditions.