“The biases embedded in technology are more than mere glitches; they’re baked in from the beginning,” argues Broussard (Artificial Intelligence), a data journalism professor at New York University, in this scathing polemic. Telling the stories of individuals from marginalized communities who have been wronged by technology, the author shows how design and conceptual failures produce unfair outcomes. She describes how a Black man was arrested by Detroit police because a facial recognition algorithm incorrectly flagged him as a match for a shoplifter, reflecting the tendency of such programs to produce false matches for people of color, who are underrepresented in the images used to train those programs.