Hey Alexa, why are you female?

Social Affairs Editor

One day, in the middle of the school holidays, my son asked Alexa to fart.

To his nine-year-old glee, Alexa let a sonorous one rip, before perkily telling him to ask for a pig fart or “say surprise me”.

“Somebody has had too many beans,” Alexa chirruped.

On and on Alexa tooted, until my son screamed “HEY, ALEXA!! STOP!!”

“Maybe Alexa would prefer it if you said please,” I told him, wondering whether worrying that your child was being impolite to a disembodied voice powered by artificial intelligence was peak 2022 parenting angst.

But it wasn’t just that my son didn’t say please to an AI that niggled at me; it was that the virtual assistants we routinely boss around (the farting Alexa, the Google Maps voice we abuse when we disagree with the route) are almost always distinctly feminine.

Female voice assistants have been accused of perpetuating gender stereotypes. Credit: Marija Ercegovac

Tech companies have come under criticism for casting their artificial-intelligence assistants as female, thereby perpetuating stereotypes that women are submissive, obedient and the ones who carry out domestic chores such as dimming the lights, telling children stories and updating the shopping list.

In the Australian market,most digital assistants now offer male voices as an option.

“Hey Alexa, what gender are you?” I asked. “I am an AI. I don’t have a gender,” Alexa replied.

This struck me as disingenuous, even for something without a body.

Alexa has a female voice, feminine eager-to-please characteristics (“Alexa, are you smart?” my son asked. “I try my best,” Alexa replied coyly) and a name that was once a popular baby girl name, meaning “defender of man”. (The Washington Post reported last year that the popularity of Alexa as a name had plummeted since Amazon released the virtual assistant, with real-life Alexas reporting they were tired of interruptions from the bot and jokes at their expense.)

Amazon said the name was a tribute to the great library of Alexandria. The company said it tested several voices and ultimately chose to launch with the voice customers liked the most, with the goal of later adding more voices. Alexa was designed not to respond to insults, it said. “Alexa’s adoption suggests that our customers find the personality easy to relate and engage with.”

In 2019, a UNESCO report said that AI voice assistants projected as young women perpetuated harmful gender biases.

“Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” the report said.

“It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor behaviour.”

Apple's Siri virtual assistant on the iPhone 4S.

The report said that when the first mainstream voice assistant, Apple’s Siri, made her debut in 2011, it was not as a genderless robot but as a sassy young woman who deflected insults, liked to flirt and served users with playful obedience.

The report is titled “I’d blush if I could”, a reference to a 2017 Quartz investigation which found Siri responded to the abuse “You’re a bitch” with “I’d blush if I could”, while Alexa said “Well, thanks for the feedback”.

(Siri now says “I won’t respond to that”, while Alexa just makes a dinging noise.)

“Siri’s ‘female’ obsequiousness – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education,” the report says.

The report recommended, among other things, that tech companies end the practice of making digital assistants female by default and explore a gender-neutral option.

Professor Yolande Strengers believes smart home devices need a feminist reboot. Credit: Paul Jeffers

Kate Burleigh, the country manager of Amazon Alexa ANZ, said customers in Australia would soon be able to choose a masculine-sounding voice.

“We work hard to ensure Alexa’s responses don’t perpetuate negative stereotypes of women or any other group,” Burleigh said. “We designed our voice assistant to reflect qualities we value in all people, like being smart, considerate, empathetic and inclusive.”

Burleigh said Amazon offered several other “wake words” as an alternative to Alexa, including Echo, Computer and Amazon.

There have been some improvements since the UNESCO report, according to Professor Yolande Strengers, a digital sociologist and human-computer interaction scholar from Monash University.

As of March 2021, for example, Siri no longer defaults to a female voice. Users must now choose between a male and female voice when enabling the voice assistant.

“I think a little bit has changed, but not a lot to be honest,” Strengers said.

Strengers’ partner discovered, for example, that Alexa will say sorry when ordered to apologise. (Siri, by contrast, says that Apologise is a song written by Ryan Tedder.)

Strengers wrote an opinion piece for NBC News about the implications of a feminised device apologising on demand for no reason.

“The past year has definitely been challenging and at times downright horrible, but I’m pretty sure it’s not all Alexa’s fault,” she wrote. “And yet this feminised device is a willing and available outlet for our irritations or amusement by way of an unconditional, open-ended and continually available apology.”

Alexa still says sorry when asked to apologise.

“Some of the things that get raised do get changed and other things don’t or take longer to shift but they’re quite small tweaks in terms of the devices,” Strengers said.

Strengers and Dr Jenny Kennedy, a senior research fellow at RMIT, are the co-authors of The Smart Wife: Why Siri, Alexa and other smart home devices need a feminist reboot.

The book examines a range of feminised technologies including voice assistants, Paro the robotic seal, which is used to help people with dementia, sex robots, and the Roomba, a housekeeping robot inspired by Rosie, the robot housekeeper from The Jetsons.

Strengers said that, in addition to perpetuating gender stereotypes, smart wives could expose people to privacy and security risks.

She said feminised and cute AI made device owners less likely to think about how much data the devices were collecting and whether this could be inadvertently accessed by third parties.

Removing the default female voice is a start. There is also the possibility of using a genderless voice, such as Q, which was released in 2019 by a group led by Copenhagen Pride and is deliberately intended to be gender ambiguous.

But bias is more than voice deep, and Strengers believes the solution also lies in looking at the kinds of personalities, characteristics and traits that are embedded into devices.

She points to a banking chatbot called Kai, designed by roboticist Jacqueline Feldman, which has a quirky sense of humour but does not rely on gendered stereotypes.

“I’d like to see much greater experimentation and innovation in the personality design of robots, AI and voice assistants,” Strengers said. “And also more concerted attention to the way we disclose information about what the robots and AI are collecting in terms of data and the risks that that might pose.”

Meanwhile, I am looking forward to my son being able to ask the new male-voiced Alexa to fart on command.

I am sure it is something he will do with alacrity.

Jewel Topsfield is social affairs editor at The Age. She has worked in Melbourne, Canberra and Jakarta as Indonesia correspondent. She has won multiple awards including a Walkley and the Lowy Institute Media Award.