A chatbot turns hostile. A test version of a Roomba vacuum captures images of users in private moments. A Black woman is falsely identified as a suspect by facial recognition software, a technology that tends to be less accurate at identifying women and people of color.
Are tomorrow’s engineers ready to face AI’s ethical challenges?