In her new documentary, Coded Bias, director Shalini Kantayya raises issues with biases embedded in technology, particularly in computer programs. Her main argument is that computers have built-in biases that reflect those held by their programmers, who are usually men. Her film explores the A.I. facial recognition experiments of Joy Buolamwini, who discovered that these programs have more trouble registering women's faces than men's. Buolamwini also investigates potential race-based bias, suggesting that coders may have neglected to consider their personal prejudices when programming their A.I. algorithms. These algorithms are then used by authorities, such as law enforcement agencies, to identify criminal suspects. Yet, according to the documentary, the faces of 117 million Americans are in facial recognition networks that have never been audited for accuracy. As Kantayya explains, a small number of corporations create facial recognition software for profit, while the people using the technology do not even know how accurate it is. The U.S. Congress has acknowledged the danger this coding bias poses to personal privacy. Ultimately, police could misidentify people as criminals before they even have a chance to prove their innocence.