Citation

Coded Bias

Author:
Kantayya, Shalini
Year:
2020

Modern society sits at the intersection of two crucial questions: What does it mean when artificial intelligence (AI) increasingly governs our liberties? And what are the consequences for the people AI is biased against? When MIT Media Lab researcher Joy Buolamwini discovers that most facial-recognition software does not accurately identify darker-skinned faces and the faces of women, she delves into an investigation of widespread bias in algorithms. As it turns out, AI is not neutral, and women are leading the charge to ensure our civil rights are protected.

Director Shalini Kantayya illuminates our mass misconceptions about AI and emphasizes the urgent need for legislative protection. From facial scanning used for policing and surveillance to automated HR systems that mirror and magnify workplace prejudices, these technologies are created with fundamentally biased building blocks. Emboldened by these remarkable and troubling discoveries, Buolamwini charts a way forward by joining ranks with other concerned experts to form a justice league committed to increasing awareness of the biases that underlie the technology that shapes our lives yet is largely free from legislative and public scrutiny.

Year:
2020
Category:
U.S. Documentary Competition
Country:
U.S.A./United Kingdom/China
Run time:
90 min
Company:
7th Empire Media
Website:
http://7thempiremedia.com
Email:
shalini@7thempiremedia.com
Phone:
(646) 942-7444