International Women’s Day: how can algorithms be sexist?

Even though the first person to write an algorithm, in the 19th century, was a woman — Ada Lovelace — artificial intelligence may now be discriminating against women.

Two centuries on from that first example, algorithms “have the ability to push us back decades” in gender parity, explains Susan Levy, a researcher at University College Dublin who is part of a project to prevent artificial intelligence algorithms from learning gender bias.

“They can exacerbate toxic masculinity and the attitudes we have been fighting for decades in society,” she adds.

The burden of history

Artificial intelligence (AI) learns from whatever data it is given, and much of that data is biased, says Levy.

The problem is that machines learn from data produced over the last 10 to 20 years, which can unwittingly reproduce the prejudices of the past. And because the language and phrasing in that data predate more recent social advances on gender and attitudes, models trained on it can perpetuate out-of-date stereotypes.
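The mechanism is simple to illustrate. Below is a minimal sketch, using a hypothetical toy corpus (not real training data): a naive model that predicts a pronoun for an occupation ends up echoing whatever skew its historical text contained.

```python
# Minimal sketch with invented toy data: a model learns whatever
# associations its training text happens to contain.
from collections import Counter

# Hypothetical "historical" corpus reflecting decades-old role stereotypes.
corpus = [
    "he is an engineer", "he is an engineer", "he is a doctor",
    "she is a nurse", "she is a secretary", "she is a nurse",
]

# Count which pronoun co-occurs with each occupation.
pairs = Counter()
for sentence in corpus:
    words = sentence.split()
    pairs[(words[0], words[-1])] += 1  # (pronoun, occupation)

def predict_pronoun(occupation):
    """A naive 'model': return the pronoun seen most often with the occupation."""
    he = pairs[("he", occupation)]
    she = pairs[("she", occupation)]
    return "he" if he >= she else "she"

print(predict_pronoun("engineer"))  # reproduces the corpus skew: "he"
print(predict_pronoun("nurse"))     # and likewise: "she"
```

Real systems use far larger datasets and more complex models, but the principle is the same: nothing in the training process distinguishes a genuine pattern from an inherited prejudice.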

For example, most AI systems have never heard of the global feminist movement #MeToo or the Chilean anthem “a rapist in your path.”

“We continue repeating the mistakes of the past,” says the researcher.

And this bias in algorithms has an impact on the daily lives of all women: from job searches to security checkpoints at airports.