Amazon Rekognition and Facial Recognition Error Rates

The new study, released late Thursday, warns of the potential for abuse and threats to privacy and civil liberties from facial-detection technology. The researchers said that in their tests, Amazon's technology labeled darker-skinned women as men 31 percent of the time, while lighter-skinned women were misidentified 7 percent of the time, according to The Japan Times. Darker-skinned men had a 1 percent error rate, while lighter-skinned men had none. Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop out of fear that it makes Amazon vulnerable to lawsuits. Artificial intelligence can mimic the biases of its human creators as it makes its way into everyday life. (news.financializer.com). As reported in the news.

The content, information, trademarks and multimedia posted on this blog belong to their original copyright owners and are reproduced here in good faith for the purposes of commentary, speech, opinion and debate.

financializer news

A weblog highlighting financial topics making news in the international media.