Human-centric photo dataset aims to help spot AI biases responsibly

A database of more than 10,000 human images for evaluating biases in artificial intelligence (AI) models for human-centric computer vision is presented in Nature this week. The Fair Human-Centric Image Benchmark (FHIBE), developed by Sony AI, is an ethically sourced, consent-based dataset for evaluating human-centric computer vision tasks, allowing biases and stereotypes to be identified and corrected.
