Facial recognition technology has improved in recent years, but issues remain before it can be considered reliable across people of different races and genders.
According to a study by researchers from the MIT Media Lab and Stanford University, facial recognition technology currently works best for white men but performs far worse for dark-skinned women.
Experiments undertaken by the researchers showed that facial recognition systems misidentified gender at rates that varied with an individual’s skin color.
The results show that biases in the real world can seep into the artificial intelligence systems that power facial recognition, the New York Times reported.
The study involved facial recognition systems developed by Microsoft, IBM and China’s Megvii.
When it came to determining the gender of light-skinned males, the error rates of the three systems were never higher than 0.8 percent. But for darker-skinned women, the error rates rose to over 20 percent in one system and over 34 percent in the other two.
The darker the subjects’ skin, the more errors there were.
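The study’s core method is to measure error rates separately for each demographic subgroup rather than reporting a single overall accuracy figure. A minimal sketch of that kind of disaggregated evaluation is below; the function name and the sample records are hypothetical illustrations, not the researchers’ actual code or data.

```python
# Sketch of disaggregated error-rate evaluation, assuming records of the
# form (group, true_label, predicted_label). Illustrative only.
from collections import defaultdict

def error_rates_by_group(records):
    """Return {group: error_rate}, where error_rate is the fraction of
    records in that group whose prediction differs from the true label."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical predictions for illustration only.
sample = [
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("darker_female", "female", "male"),    # misclassified
    ("darker_female", "female", "female"),
]
print(error_rates_by_group(sample))
```

An aggregate accuracy over all four records here would look respectable, while the per-group breakdown exposes the disparity, which is exactly the gap the study highlighted.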
Joy Buolamwini, an MIT Media Lab researcher who was involved in the study, sent the results of the research to the companies that developed the facial recognition systems, the New York Times reported.
Responding to the findings, IBM said in a statement it had steadily polished its facial analysis software and that it was “deeply committed” to “unbiased” and “transparent” services.
The company said it would launch an improved service with a nearly 10-fold increase in accuracy for darker-skinned women.
Microsoft said that it had taken steps to raise the accuracy of its facial recognition technology, and that it was investing in research “to recognize, understand and remove bias.”
In another study, a widely used facial-recognition data set was found to be more than 75 percent male and more than 80 percent white, according to the New York Times.
In 2015, Google had to apologize after its image-recognition photo app labeled African Americans as “gorillas” — an example of what the Times described as “computer vision miscues” that occasionally serve as evidence of “discrimination”.
This article appeared in the Hong Kong Economic Journal on Feb 13
Translation by Jonathan Chong with additional reporting