Algorithmic bias, like human bias, can result in exclusionary experiences and discriminatory practices.

MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn’t detect her face — because the people who coded the algorithm hadn’t taught it to identify a broad range of skin tones and facial structures.

Now she’s on a mission to fight bias in machine learning, a phenomenon she calls the “coded gaze.” It’s an eye-opening talk about the need for accountability in coding … as algorithms take over more and more aspects of our lives.

ALGORITHMIC JUSTICE LEAGUE

The Coded Gaze: the algorithmic bias was last modified: November 22nd, 2017 by admin

Also published on Medium.

CC BY-NC-SA 4.0 The Coded Gaze: the algorithmic bias by admin is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
