{"id":943,"date":"2016-09-11T13:13:35","date_gmt":"2016-09-11T10:13:35","guid":{"rendered":"http:\/\/artificialbrain.xyz\/?p=943"},"modified":"2016-11-20T13:23:27","modified_gmt":"2016-11-20T10:23:27","slug":"a-beauty-contest-was-judged-by-ai","status":"publish","type":"post","link":"https:\/\/www.newworldai.com\/a-beauty-contest-was-judged-by-ai\/","title":{"rendered":"A beauty contest was judged by AI"},"content":{"rendered":"

A beauty contest was judged by AI, and the robots didn’t like dark skin. The first international beauty contest decided by an algorithm has sparked controversy after the results revealed one glaring factor linking the winners.<\/p>\n

The first international beauty contest judged by \u201cmachines\u201d was supposed to use objective factors such as facial symmetry to identify the most attractive contestants. After Beauty.AI launched this year, roughly 6,000 people from more than 100 countries submitted photos in the hope that artificial intelligence would determine that their faces most closely resembled \u201chuman beauty\u201d.<\/p>\n

Out of 44 winners, nearly all were white, a handful were Asian, and only one had dark skin.<\/p>\n

Winners of the Beauty.AI contest in the category for women aged 18-29.<\/p>\n

\"ai-beauty-contest\"<\/p>\n

Alex Zhavoronkov, Beauty.AI\u2019s chief science officer, said: \u201cIf you have not that many people of color within the dataset, then you might actually have biased results. When you\u2019re training an algorithm to recognize certain patterns \u2026 you might not have enough data, or the data might be biased.\u201d<\/p>\n

The humans who create algorithms have their own deeply entrenched biases, which means that, despite the perception that algorithms are uniquely objective, they can often reproduce existing prejudices.<\/p>\n