Alexa, Siri, and Google Assistant promote sexist attitudes towards women, says UN
Should AI assistants be genderless?
A report by UNESCO has suggested that the default use of female-sounding voice assistants in our smart home gadgets and smartphones perpetuates sexist attitudes towards women.
The report, titled I'd Blush if I Could, takes its name from Siri's former default response to being called a bitch by users – and criticizes the fact that Apple's Siri, Amazon Alexa, Google Assistant, and Microsoft's Cortana are "exclusively female or female by default, both in name and in sound of voice".
Why is this a problem? Well, according to the report, the default use of female-sounding voice assistants sends a signal to users that women are "obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’".
The report also highlights the fact that these voice assistants have "no power of agency beyond what the commander asks of it" and respond to queries "regardless of [the user's] tone or hostility".
According to the report, this has the effect of reinforcing "commonly held gender biases that women are subservient and tolerant of poor treatment".
Worrying implications
This subservience is particularly worrying when these female-sounding voice assistants give "deflecting, lackluster or apologetic responses to verbal sexual harassment".
With at least 5% of interactions with voice assistants being unambiguously sexually explicit, such harassment isn't exactly uncommon, either – and the assistants' responses are troubling.
According to a report by Quartz in 2017, when asked 'who's your daddy?', Siri responded with 'you are' – and when Alexa was told 'you're hot', the assistant responded with 'that's nice of you to say'.
With voice assistants sounding more lifelike all the time, it's not a huge leap to suggest that these evasive responses could "reinforce stereotypes of unassertive, subservient women in service positions".
Since then, Alexa has been updated to disengage from verbal harassment, instead saying “I’m not going to respond to that”, or “I’m not sure what outcome you expected”.
Girls to the front
Why does it matter if voice assistants sound female as default? Well, it can affect the way we behave towards women and girls in real life.
As the report says, University of Southern California sociology professor Safiya Umoja Noble found that "virtual assistants produce a rise of command-based speech directed at women's voices".
"Professor Noble says that the commands barked at voice assistants – such as ‘find x’, ‘call x’, ‘change x’ or ‘order x’ – function as ‘powerful socialization tools’ and teach people, in particular children, about ‘the role of women, girls, and people who are gendered female to respond on demand’."
So, how has this been allowed to happen? Why are female-sounding voice assistants so ubiquitous? According to UNESCO, the problem lies in the lack of women in the room when tech companies design their AI voice assistants, and in STEM (science, technology, engineering, and maths) industries as a whole.
With women generating just 7% of ICT patents across G20 countries, these disparities provide "a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education".
As well as recommending that the digital gender gap be closed by "recruiting, retaining and promoting women in the technology sector", the report also recommends that more voice assistants have male-sounding voices by default, ending the "practice of making digital assistants female by default".
According to CNET, Amazon and Apple didn't respond to its requests for comment, while Microsoft declined to comment following its coverage of the report.
Google, on the other hand, says that it's "developed a variety of 10 voice offerings in the US and that when customers set up a Google Home device, they have a 50-50 chance of getting either a traditionally female sounding voice, or a traditionally male sounding voice".