AI facial recognition technology shows serious skin-color and gender bias

A joint research team from MIT and Stanford University studied three different facial-analysis technologies and found that all of them perform differently depending on the subject's gender and skin color. The report breaks down each system's accuracy by the subject's gender and race: a facial recognition system that a major U.S. technology company claimed was more than 97 percent accurate was assessed on a data set composed of more than 77 percent male and more than 83 percent white faces. As a result, the systems' error rates rise sharply for faces that are not white or male. The full report will be released later this month. http://www.looooker.com/?p=53218


AI facial analysis demonstrates both racial and gender bias


Researchers from MIT and Stanford University found that three different facial analysis programs demonstrate both gender and skin color biases. The full article will be presented at the Conference on Fairness, Accountability, and Transparency later this month.

Specifically, the team looked at the accuracy rates of facial recognition as broken down by gender and race. "Researchers at a major U.S. technology company claimed an accuracy rate of more than 97 percent for a face-recognition system they'd designed. But the data set used to assess its performance was more than 77 percent male and more than 83 percent white." This narrow test base results in a higher error rate for anyone who isn't white or male.

In order to test these systems, MIT researcher Joy Buolamwini collected over 1,200 images that contained a greater proportion of women and people of color and coded skin color based on the Fitzpatrick scale of skin tones, in consultation with a dermatologic surgeon. After this, Buolamwini tested the facial recognition systems with her new data set.
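The core of this kind of audit is disaggregated evaluation: instead of one overall accuracy number, error rates are computed separately for each demographic subgroup. A minimal sketch of that idea (not the study's actual code; the subgroup names, counts, and results below are hypothetical) shows how a skewed benchmark can report a low overall error rate while one subgroup fares far worse:

```python
def error_rates_by_group(records):
    """records: list of (subgroup, true_label, predicted_label) tuples.

    Returns the fraction of misclassified records per subgroup."""
    totals, errors = {}, {}
    for group, truth, pred in records:
        totals[group] = totals.get(group, 0) + 1
        if pred != truth:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

# Hypothetical gender-classification results on a data set dominated
# by one subgroup (100 vs. 10 records):
records = (
    [("lighter_male", "male", "male")] * 97
    + [("lighter_male", "male", "female")] * 3
    + [("darker_female", "female", "female")] * 6
    + [("darker_female", "female", "male")] * 4
)

rates = error_rates_by_group(records)
overall = sum(1 for _, t, p in records if p != t) / len(records)
print(rates["lighter_male"])   # 0.03
print(rates["darker_female"])  # 0.4
print(round(overall, 3))       # 0.064 -- the aggregate hides the gap
```

Because the majority subgroup dominates the aggregate, the overall error rate (6.4% here) looks respectable even though the minority subgroup's error rate is more than ten times the majority's.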

The results were stark in terms of gender classification. "For darker-skinned women . . . the error rates were 20.8 percent, 34.5 percent, and 34.7 percent," the release says. "But with two of the systems, the error rates for the darkest-skinned women in the data set . . . were worse still: 46.5 percent and 46.8 percent. Essentially, for those women, the system might as well have been guessing gender at random."

There have certainly been accusations of bias in tech algorithms before, and it is well known that facial recognition systems often perform worse on darker skin tones. Even so, these figures are staggering. Companies that build this kind of software should account for the full diversity of their user base, rather than limiting their test data to the white men who often dominate their workforces.
