Algorithmic bias, like human bias, can result in exclusionary experiences and discriminatory practices.
MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn’t detect her face — because the people who coded the algorithm hadn’t taught it to identify a broad range of skin tones and facial structures.
Now she’s on a mission to fight bias in machine learning, a phenomenon she calls the “coded gaze.” It’s an eye-opening talk about the need for accountability in coding … as algorithms take over more and more aspects of our lives.
Collaborative Blog since 2012.
Anecdotes about history, politics, finance,
education, computer science, activism
and freedom-enabling technologies.
It's an independent platform for people, companies and associations
to get their voices heard.