|In 2018, research firm Canalys estimated that approximately 100 million smart speakers were sold across the world, which means more and more users are interacting with voice assistants / Photo by: Andriy Popov via 123RF|
In 2018, research firm Canalys estimated that approximately 100 million smart speakers were sold worldwide, which means more and more users are interacting with voice assistants, and people are expected to have even more conversations with them in the coming years. TechCrunch, an American online publisher focusing on the tech industry, reported that the use of voice assistants will roughly triple over the next few years: estimates showed that the 2.5 billion assistants in use in 2018 would grow to 8 billion by 2023.
Reports also showed that 94 percent of users consider voice technology easy to use. According to them, voice assistants not only save them time but also improve their quality of life. About 70 percent of users are reportedly satisfied with the technology’s ability to carry on conversations. Ultimately, voice assistants will continue to transform the way we approach our interactions with personal technology and brands.
Reinforcing Gender Stereotypes
While this sounds like good news for a lot of people and tech companies, a recent UN report revealed that voice assistants have been fueling harmful gender stereotypes. The study, titled “I’d blush if I could,” explored biases in artificial intelligence. The researchers concluded that AI-powered voice assistants are reinforcing negative gender stereotypes while also failing to properly manage violent and abusive language from users.
The paper suggested that tech companies are fueling negative and regressive gender ideas, mostly harmful to women, by giving their assistants female names and default female voices. Some of these companies stated that they have no intention of reinforcing such stereotypes. They added that research shows most people prefer to hear a female-sounding voice in many situations, particularly when the tasks associated with that voice are assistive.
|While this sounds like good news for a lot of people and tech companies, a recent UN report revealed that voice assistants have been fueling harmful gender stereotypes / Photo by: Iurii Golub via 123RF|
However, this reinforces the idea that women exist only for assistive roles. According to Digital Trends, a technology news and reviews website, it places them in a position where they are expected to do whatever is asked of them. Female voice assistants are portrayed as “obliging and eager to please,” which fuels the idea that women are “subservient.”
"Because the speech of most voice assistants is female, it sends a signal that women are... docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK,’” the report said.
The study also focused on how owners are using voice assistants and how the assistants respond. The researchers discovered that AI voice assistants respond with “catch-me-if-you-can flirtation” when presented with abusive language. They stated that part of the problem lies in how the engineering teams of most tech companies are staffed overwhelmingly by men.
The First Genderless AI Voice
The UN study concluded that addressing this harmful stereotype means creating a gender-neutral voice that AI assistants can use. Thus, creative studio Virtue Nordic and the human rights festival Copenhagen Pride, in collaboration with scientist Julie Carpenter, developed ‘Q,’ the world’s first genderless voice for AI systems.
According to Fast Company, a business media brand focused on innovation in technology, leadership, and design, the researchers stated that Q addresses the problem that arises when technology fails to represent everyone. In a statement, Carpenter said that they acknowledged that social representation is important in influencing social values. While the project has no client yet, it is a significant step forward.
Q will not only address existing bias and harmful gender stereotypes but will also make technology more inclusive, because it gives representation to people who identify as non-binary. Kristina Hultgren, a linguist who was not part of the project, said, “It’s because Q is likely to play with our minds that it is important. It plays with our urge to put people into boxes and therefore has the potential to push people’s boundaries and broaden their horizons.”
According to Quartz (QZ), an online business news site, the researchers recorded 24 people who identified as male, female, transgender, and gender fluid, then layered their voices and averaged them. The sound engineers needed to find a voice whose fundamental frequency fell in a gender-neutral range: between 145 Hz and 175 Hz.
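To make the 145–175 Hz criterion concrete, here is a minimal, illustrative Python sketch, not the project’s actual audio pipeline: it estimates a signal’s fundamental frequency with a simple autocorrelation method and checks whether it falls inside the gender-neutral band cited above. The function names and the synthetic test tone are assumptions for demonstration only.

```python
import numpy as np

# Gender-neutral pitch band reported for the Q project (Hz)
NEUTRAL_RANGE_HZ = (145.0, 175.0)

def estimate_pitch(samples: np.ndarray, sample_rate: int) -> float:
    """Estimate fundamental frequency (Hz) via autocorrelation."""
    samples = samples - samples.mean()
    # Autocorrelation for non-negative lags only
    corr = np.correlate(samples, samples, mode="full")[len(samples) - 1:]
    # Skip past the zero-lag peak: find where the correlation starts rising,
    # then take the strongest peak after that point.
    d = np.diff(corr)
    start = int(np.argmax(d > 0))
    lag = start + int(np.argmax(corr[start:]))
    return sample_rate / lag

def is_gender_neutral(pitch_hz: float) -> bool:
    """True if the pitch lies inside the 145-175 Hz band."""
    low, high = NEUTRAL_RANGE_HZ
    return low <= pitch_hz <= high

# Demo: a synthetic 160 Hz tone should land inside the neutral band.
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 160 * t)
pitch = estimate_pitch(tone, sr)
print(round(pitch), is_gender_neutral(pitch))
```

In practice, real voice pipelines use more robust pitch trackers than raw autocorrelation, but the same threshold test applies once a fundamental frequency has been estimated.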
|Q will not only address the existing bias and harmful gender stereotypes but also will make technology more inclusive. This is because it can recognize people who identify as non-binary / Photo by: Kaspars Grinvalds via 123RF|
After that, the researchers tested the voice on more than 4,600 people identifying as non-binary from several countries such as the UK, Denmark, and Venezuela. They wanted to know if people would perceive the voice as non-binary. The results showed that 50 percent of the participants perceived the voice as neutral, 26 percent as masculine, and 24 percent as feminine. Thus, Q was born.
Carpenter added that one of the major goals with Q was to contribute to a global conversation about gender, technology, and ethics, and to explore how technology can be inclusive of people who identify in all sorts of different ways.
As of now, the team is promoting Q publicly, and they have received interest from companies in the tech industry that might want to adopt the genderless voice in their platforms. “The dream is that it’s implemented as a third option for Siri and Alexa. We’re inviting the tech firms to collaborate with us. There’s no price tag on Q,” the researchers said.
Indeed, technology should not only focus on improving lives and making tasks easier; it should also avoid contributing further to problematic societal norms. Q is a huge step toward making sure that voice assistants represent all people, and it opens new opportunities for tech companies to be inclusive.