Using AI Voice Assistants Fuels Sexist Gender Stereotypes: UN Report
Wed, April 21, 2021


Voice assistants essentially make tasks easier and more manageable for people, not only inside their homes but wherever they go. / Photo by: Zapp2Photo via Shutterstock

 

The presence of Siri and Alexa is nothing new in our technology-driven society. These digital voice assistants have become more and more popular in smart homes in recent years. A 2018 study by research firm Juniper Research projected that eight billion digital voice assistants will be in use by 2023, up from about 2.5 billion in 2018.

Over the following five years, the fastest-growing voice assistant categories are expected to be smart TVs (121.3 percent CAGR), smart speakers (41.3 percent CAGR), and wearables (40.2 percent CAGR). The report also projected that voice commerce will exceed $80 billion per annum by 2023. “We expect the majority of voice commerce to be digital purchases until digital assistants offer truly seamless cross-platform experiences. Connected TVs and smart displays are vital here, as they can provide a visual context that is lacking in smart speakers,” research author James Moar said.

A recent report by eMarketer, an online publication that provides data and research on digital trends for business professionals, projected that 111.8 million people in the US will use a voice assistant at least monthly in 2019, an increase of 9.5 percent from the previous year. While the use of voice assistants is growing among all age groups, the report found that millennials are the heaviest users.

Today, voice assistants are being used in a variety of ways. They essentially make tasks easier and more manageable for people, not only inside their homes but wherever they go. However, the way these assistants are designed is also having a deeper effect on society.

Reinforcing Sexist Gender Stereotypes

If you have interacted with an AI-enabled voice assistant such as Apple’s Siri, Amazon’s Alexa, or Microsoft’s Cortana, you've probably noticed that you are speaking to what sounds like a woman. At first, using a woman’s voice may not seem like a problem. However, society’s equating of women with voice assistants is problematic and could have some worrying societal implications.

Earlier this year, a study conducted by UNESCO revealed that designing female voice assistants to be submissive and servile can fuel gender bias and even normalize sexist harassment. According to Tech Radar, an online publication focused on technology, the report, entitled “I’d Blush if I Could,” takes its name from Siri's former default response to being called a "bitch" by users. It criticizes the fact that today’s major voice assistants are "exclusively female or female by default, both in name and in the sound of voice.” 

According to the study, using female voices for voice assistants reinforces the notion that women exist to be "assistants," rather than the assisted. For instance, a smart assistant's female voice obliges every question asked or command issued to it. This takes us back to the old-fashioned notion that women hold more subservient roles than men. Gender biases could continue to spread across the world as voice-powered technology reaches more communities.


However, tech companies chose female voices for their voice assistants because they are believed to be more effective. Research has shown that women’s voices tend to be better received by consumers, and tests revealed that most people prefer listening to female voices rather than male ones. Daniel Rausch, the head of Amazon’s Smart Home division, stated that the company found a woman’s voice to be more sympathetic.

According to Vox, an American news and opinion website, the study reported that voice assistants give disturbingly docile responses when users verbally abuse them or direct sexual innuendo at them. The researchers discovered that a large share of ‘early-on enquiries’ on Microsoft’s Cortana assistant asked about its sex life. Aside from that, Robin Labs, a company that develops digital assistants to support drivers and others involved in logistics, found that at least 5 percent of interactions were unambiguously sexually explicit.

Nonetheless, this still reinforces gender stereotypes that are harmful to women. Julia Kanouse, CEO of the Illinois Technology Association, explained that this fuels the notion that we prefer to tell a woman what to do, rather than a man. “Only recently have we started to see men move into what were traditionally viewed as female roles, and, conversely, see women fight to ensure these roles (such as flight attendants, nurses, paralegals, executive administrators) are seen as more than ‘just an assistant’,” she added. 

 

According to a study by UNESCO, using female voices for voice assistants reinforces the notion that women exist to be "assistants," rather than the assisted. / Photo by: Kaspars Grinvalds via Shutterstock

 

Exploring Gender Neutral Options

According to the World Economic Forum, an independent international organization committed to improving the state of the world by engaging business, political, and academic institutions, the UNESCO report urges tech companies to take action. This includes exploring gender-neutral options, avoiding making digital assistants female by default, and programming assistants to discourage gender-based insults and abusive language.

One proposed solution is ‘Q’, billed as the world’s first gender-neutral voice assistant. Julia Carpenter, an expert on human behavior and emerging technologies who worked on the project, stated that the team recorded “dozens of people,” including those who identify as male, female, transgender, and nonbinary. They then chose one voice and pitch-altered it so that it sounded neither male nor female. Unfortunately, tech companies have so far seemed uninterested in adopting it.
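To give a sense of the kind of processing the Q team describes, here is a minimal sketch of pitch-shifting a single recording toward a perceptually gender-ambiguous range. It is not the project's actual pipeline: the librosa and soundfile Python libraries, the input file "speaker.wav," and the 160 Hz target are all assumptions made for illustration.

```python
# Minimal sketch: shift a recorded voice toward a gender-ambiguous pitch range.
# Assumes librosa and soundfile are installed; "speaker.wav" is a hypothetical file.
import numpy as np
import librosa
import soundfile as sf

# Load the recording at its native sample rate.
y, sr = librosa.load("speaker.wav", sr=None)

# Estimate the speaker's median fundamental frequency with librosa's pYIN tracker.
f0, voiced_flag, voiced_prob = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
median_f0 = np.nanmedian(f0)

# Roughly 160 Hz sits between typical adult male and female speaking ranges
# (an illustrative target, not a value taken from the Q project).
target_f0 = 160.0
n_steps = 12 * np.log2(target_f0 / median_f0)  # required shift, in semitones

# Apply the shift and write out the altered recording.
y_shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=n_steps)
sf.write("speaker_neutral.wav", y_shifted, sr)
```

This only illustrates the pitch-altering step mentioned above; in practice, designing a voice that listeners consistently hear as neither male nor female also depends on timbre, intonation, and extensive listener testing.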

According to Dr. Matthew Aylett, Chief Scientific Officer at speech technology company Cereproc, this only shows that brands like Apple, Google, and Amazon are choosing a default, neutral, well-spoken female voice without even considering the repercussions. 

Overall, voice assistants should not only help people with their daily tasks but also avoid perpetuating deeply rooted gender stereotypes. This means that tech giants should start exploring options to diversify the voices they use. Although doing so may be a great challenge, it would help counter sexism and harmful gender norms.