White male dominance in the technology industry impacts the effectiveness of facial recognition software that is increasingly a part of our lives. For White men, it works to near perfection.
MIT and Stanford University researchers found skin color and gender bias in three commercially released facial analysis programs, according to a report released on Feb. 11. A “major U.S. technology company” in the study claimed a 97 percent accuracy rate for its face recognition software. However, it used a data set composed of 83 percent White people and 77 percent men. When the researchers diversified the data set to include people of color and women, accuracy decreased sharply. The failure rate exceeded 46 percent for Black women.
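The arithmetic behind that headline number is worth spelling out: when one group dominates the test set, strong performance on that group props up the overall average even if another group fails nearly half the time. A minimal sketch in Python (the group shares and per-group accuracy figures below are hypothetical, chosen only to mirror the reported skew of the data set):

```python
# Hypothetical illustration: a test set skewed toward White men can
# yield a high overall accuracy despite a >46% failure rate for
# Black women. Shares and accuracies are made up for illustration.

groups = {
    # name: (share of test set, accuracy on that group)
    "white_men":      (0.64, 0.99),
    "white_women":    (0.19, 0.97),
    "nonwhite_men":   (0.13, 0.92),
    "nonwhite_women": (0.04, 0.54),  # failure rate above 46 percent
}

# Overall accuracy is just the share-weighted average of group accuracies.
overall = sum(share * acc for share, acc in groups.values())

print(f"overall accuracy: {overall:.1%}")  # prints "overall accuracy: 95.9%"
for name, (share, acc) in groups.items():
    print(f"{name:>14}: {acc:.0%} accurate on {share:.0%} of the data")
```

Because the worst-served group makes up only a sliver of the test data, its 46-percent-plus failure rate barely dents the aggregate score, which is why a single headline accuracy figure can hide large disparities.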
The accuracy of facial recognition software will continue to be skewed in favor of White men because of the lack of diversity in Silicon Valley. Software engineers and programmers design their products based on their own needs and test the products on themselves. Implicit biases guide every step of their decision making. The consequences for facial recognition systems are far-reaching.
“The same data-centric techniques that can be used to try to determine somebody’s gender are also used to identify a person when you’re looking for a criminal suspect or to unlock your phone. And it’s not just about computer vision. I’m really hopeful that this will spur more work into looking at [other] disparities,” said Joy Buolamwini, an MIT researcher and co-author of the study.
This problem will likely persist since Silicon Valley has a long way to go to achieve racial diversity. Blacks, Latinos and Native Americans are underrepresented in the industry by 16 to 18 percent, according to Wired.