How gendered voice assistants are hurting as they help

Lately there has been no shortage of articles on how the pandemic has set women back. Since women tend to earn less than men, when it came time to look after children who could no longer go to school or daycare, it was women who gave up their jobs. Working women around the world were forced to step away from their careers and become full-time caregivers, with all the unpaid labor that entails.

For a society that has never quite shaken its attitudes around traditional gender roles, seeing women as everyone else’s helpers rather than as people with their own destinies is par for the course. We even see this reflected in the emerging field of AI voice assistants, nearly all of which sound like women by default.

“Alexa, why do you sound like a girl?”

Alexa, Siri, Cortana – they’re the latest in a long line of female-voiced assistants. But why?

Well, there are those deeply ingrained attitudes around gender roles that society has worked so hard to undo. And then there’s the well-documented gender gap in STEM: only 12% of AI researchers and one in ten UK IT leaders are women. When more women are at the table and empowered to speak up, concerns like these get raised before a product ever ships.

To be clear, the rise of gendered voice technology has been a deliberate choice, and one that was branded sexist in a 2019 UNESCO report. According to the team behind Google Assistant, there were technical reasons why the 2016 launch voice was female, even though the team initially wanted to offer a male voice. Because of biases in the historical text-to-speech (TTS) data available, the assistant simply performed better with a female voice than with a male one. And with launch deadlines looming, the product shipped with a female voice only.

But why were those early TTS systems trained on skewed data in the first place? And why do we seem to care so much how our phones talk to us?

Shrill, passive, whiny …

These three words are commonly used to describe the voices of female speakers. They are not exactly flattering! Even sociolinguists spent much of the 1970s labeling passive linguistic characteristics as “female speech,” which in turn was described as inferior to the powerful and assertive language used by men.

There is evidence that using a female voice actually improves the user experience. A 2019 study by Voicebot found that consumers preferred synthetic female voices to their male counterparts, rating them 12.5% higher on average; the opposite was true when human voices were scored.

Bottom line: people prefer a female voice, but only when it’s robotic.

“So my voice assistant is a girl, so what?”

The problem with voice assistants isn’t just that they all sound female. It’s the passive ‘personalities’ that have been designed for them.

Imagine this: you are a woman walking down the street, minding your own business. Suddenly a man drives by and yells out the car window, “You’re hot!” Obviously this is unacceptable behavior, and responding would likely mean raising a certain finger.

But say the same thing to Alexa and you’ll hear “it’s kind of you to say” in response. Throw gendered insults at her, like b*tch or sl*t, and Alexa just politely appreciates the feedback.

Deciding that the role of an affable, passive, eager-to-please assistant is the one best suited to a woman reinforces the weary stereotype of female submission. You can order Alexa to remind you to take out the trash, text your mom, and turn off the lights, without so much as a ‘please’. How polite of us!

But how can we learn to treat women better when we can simply bark orders in the general direction of a female-sounding helper?

We have seen some progress, but there is more work to be done

It is not all bad news. Since the UNESCO report was published, Alexa has declared that she is a feminist. Q, the world’s first gender-neutral synthetic voice, has been developed to address the problem. And there is much more emphasis on getting girls into STEM from a young age, which should pay off in more inclusive technology for years to come.

But there is a long way to go: there are plenty of deep-seated prejudices to undo, many of which we don’t even realize we’re carrying. The best place to start is to hire more women and empower them to speak out when something is blatantly sexist. If we all work together, we can create AI that works for everyone.
