Three leading face-recognition systems correctly identified the gender of white men 99 per cent of the time, but performed far worse for women with darker skin

Face-recognition software can guess your gender with amazing accuracy – if you are a white man.

Joy Buolamwini at the Massachusetts Institute of Technology tested three commercially available face-recognition systems, created by Microsoft, IBM and the Chinese company Megvii.

The systems correctly identified the gender of white men 99 per cent of the time. But the error rate rose for people with darker skin, reaching nearly 35 per cent for darker-skinned women. The results will be presented at the Conference on Fairness, Accountability, and Transparency in New York later this month.
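Disparities of this kind are typically measured by breaking classification errors down by demographic group. The sketch below is purely illustrative, with made-up labels and data rather than Buolamwini's actual benchmark, but it shows the sort of per-group tally that produces figures like a 1 per cent error rate for one group and 35 per cent for another.

```python
# Illustrative sketch of a per-group error-rate audit.
# The records and group labels are hypothetical, not the study's data.
from collections import defaultdict

# Each record: (skin group, true gender, predicted gender)
predictions = [
    ("lighter", "male", "male"),
    ("lighter", "female", "female"),
    ("darker", "male", "male"),
    ("darker", "female", "male"),   # misclassification
    ("darker", "female", "male"),   # misclassification
]

errors = defaultdict(int)
totals = defaultdict(int)
for group, true_gender, predicted in predictions:
    key = (group, true_gender)
    totals[key] += 1
    if predicted != true_gender:
        errors[key] += 1

for key in sorted(totals):
    rate = 100 * errors[key] / totals[key]
    print(f"{key[0]} skin, {key[1]}: {rate:.1f}% error ({errors[key]}/{totals[key]})")
```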

Face-recognition software is already used in many settings, from police scanning crowds for suspects to the automatic tagging of photos.

That means inaccuracies could have serious consequences, such as systematically entrenching bias in police stop-and-search practices.

Biases in artificial intelligence systems tend to come from biases in the data they are trained on. According to one study, a widely used facial-image data set is around 75 per cent male and more than 80 per cent white.
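Skews like that are straightforward to detect if a data set's demographic metadata is available. The following sketch is a minimal, hypothetical example of such an audit; the records and attribute names are invented for illustration, not taken from the study's data set.

```python
# Hypothetical composition audit of a face data set's metadata.
from collections import Counter

# Made-up metadata records, one per image
metadata = [
    {"gender": "male", "skin": "lighter"},
    {"gender": "male", "skin": "lighter"},
    {"gender": "male", "skin": "lighter"},
    {"gender": "female", "skin": "darker"},
]

for attribute in ("gender", "skin"):
    counts = Counter(record[attribute] for record in metadata)
    total = sum(counts.values())
    for value, n in counts.most_common():
        print(f"{attribute}={value}: {100 * n / total:.0f}% of {total} images")
```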

Read more: Is tech racist? The fight back against digital discrimination

This article appeared in print under the headline “Face recognition’s biases on show”
