Research shows facial recognition software struggles with gender identity

Date: 30 October 2019 Author: Leila Stein

A new study reveals that while facial recognition software is fairly accurate at distinguishing between cisgender men and women, it struggles to classify transgender and non-binary people accurately.

A research team from the University of Colorado collected 2,450 images of faces from Instagram, each labelled by the poster with a hashtag indicating their gender identity. The researchers divided the images into seven groups (#woman, #man, #transwoman, #transman, #agender, #genderqueer, #nonbinary) and ran them through four of the largest facial analysis services (IBM, Amazon, Microsoft and Clarifai).

“We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders,” lead author Morgan Klaus Scheuerman, a Ph.D. student in the Information Science department, said in a statement. “While there are many different types of people out there, these systems have an extremely limited view of what gender looks like.”

On average, the systems identified cisgender women (those assigned female at birth who identify as female) correctly 98.3% of the time, and cisgender men correctly 97.6% of the time.

But trans men were wrongly identified as women up to 38% of the time, and those who identified as agender, genderqueer or nonbinary—indicating that they identify as neither male nor female—were mischaracterised 100% of the time.
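These per-group accuracy figures come down to a simple calculation: for each gender-identity group, count how often the service's predicted label matched the poster's own label. The sketch below illustrates that calculation with made-up data (the group names, labels and numbers are hypothetical, not the study's dataset); it also shows why a strictly binary classifier scores 0% on non-binary labels, since no correct output exists for it to produce.

```python
from collections import defaultdict

def per_group_accuracy(records):
    """Compute classification accuracy per gender-identity group.

    records: iterable of (group, true_label, predicted_label) tuples.
    Returns a dict mapping each group to its fraction of correct predictions.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, true_label, predicted in records:
        total[group] += 1
        if predicted == true_label:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Hypothetical records: a binary male/female classifier can never match
# a non-binary label, so that group's accuracy is necessarily 0%.
sample = [
    ("cis woman", "female", "female"),
    ("cis woman", "female", "female"),
    ("trans man", "male", "female"),       # misclassified as female
    ("trans man", "male", "male"),
    ("nonbinary", "nonbinary", "female"),  # no correct option exists
]
print(per_group_accuracy(sample))
```

Running this prints accuracies of 1.0 for the cis-woman group, 0.5 for the trans-man group and 0.0 for the nonbinary group, mirroring the pattern the study reports at scale.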

The results of this research matter because the technology is becoming increasingly prevalent in society, from dating apps and face-unlock features to airport security and law enforcement surveillance.

“These systems run the risk of reinforcing stereotypes of what you should look like if you want to be recognized as a man or a woman. And that impacts everyone,” said Scheuerman.

The authors say they’d like to see tech companies move away from gender classification entirely and stick to more specific labels like “long hair” or “make-up” when assessing images.

These results come out after the American Civil Liberties Union (ACLU) released a statement showing that Amazon’s facial recognition software had incorrectly matched the faces of major athletes to criminal mugshots.

Image: Pixabay
