'Hardwired subservience': Siri and Alexa reinforce sexist stereotypes, says UN

A UNESCO report has asked why AI assistants are always female - and what effect this is having on how we speak to women and how we expect women to act.

A UN report has found AI assistants, usually female, are promoting a subservient image of women to tech users around the world.


A UN report has found female AI assistants are reinforcing gender stereotypes and promoting gender-based verbal abuse. 

Be it Apple's Siri, Amazon's Alexa, Microsoft's Cortana, or Google's Assistant, the vast majority of automated assistants have a female voice.

While voice-command technology may be the way of the future, UNESCO said it promotes an image of women from the dark ages to the hundreds of millions of people using the technology.
Amazon's electronic home assistant Alexa is one of the popular voice-operated assistants accused of promoting unhealthy gender stereotypes. Source: SBS
"It sends a signal that women are obliging, docile, and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’," the report states.

"As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase dramatically."
The report found voice command software promotes acceptance of sexual harassment and verbal abuse in how automated assistants respond to their users.
UNESCO's Saniye Gülser Corat said voice assistants like Siri and Alexa are teaching people how to speak to women.
For example, when a user calls Siri a "b---h", she responds, "I'd blush if I could".

When given the same insult, Alexa replies, "Well, thanks for the feedback".
It also concluded AI technology "makes women the face of glitches and errors" and forces the female personality to defer questions to a higher "and often male" authority.

UNESCO's director of gender equality Saniye Gülser Corat said this "hardwired subservience" was showing people how to speak to women and teaching women how to respond.
"Obedient and obliging machines that pretend to be women are entering our homes, cars and offices," Ms Corat said.

"To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them."

The UN is now calling on governments and tech giants to stop making digital assistants default to female voices and explore the possibility of developing a "neutral machine gender" that sounds neither male nor female.


Published 23 May 2019 8:05am
Updated 23 May 2019 8:09am
By Claudia Farhart
Source: SBS News

