
IWD: the future of the female voice assistant

Mar 05, 2021

Smart technology has progressed at such a fast pace that we’ve hardly had time to get to know the virtual assistants in our homes, or to consider why – when we call out for Siri, Alexa or Cortana – it’s almost always a female voice that answers back.

The role of gender in robotics is by no means a new debate. Despite making up a small minority of the industry’s machine learning engineers, women have been used as a prototype for many of the world’s most famous humanoids – like Sophia, the first ever robot citizen, and Erica, who Hiroshi Ishiguro made in his image of ‘the most beautiful woman’.

While the same can’t be said for the faceless assistants programmed into our speakers and phones, the fact remains that four of the tech industry’s frontrunners chose to make their AI personalities female – and many users want to know why.

Amazon named their virtual assistant Alexa after Alexandria – one of the largest libraries of the ancient world. It seems like a fitting choice for what’s become a modern source of information for millions, but it doesn’t quite explain why the creators didn’t opt for a more neutral alternative, like ‘Alex’.

Before Alexa’s final launch in 2015, Amazon reported that it had trialed a range of male and female voices and found a much stronger response to the latter: a voice that was perceived to be more ‘sympathetic’.



This isn’t the first time research has leaned in favour of using the female voice, either. Over the years, studies have suggested that the human brain prefers the sound; that the pitch is easier to understand, warmer, more helpful; and even that women articulate their vowels more clearly.

Stanford communications professor Clifford Nass said “it’s much easier to find a female voice that everyone likes than a male voice that everyone likes” – and psychologist James W Pennebaker’s work on women’s use of pronouns and tentative words may also go some way to explaining why they’ve become a natural choice for voice-enabled systems.

But despite many studies pointing towards a simple case of preference, it’s hard to completely separate these findings from gendered ideals, especially when question-answering supercomputers like Watson – the robot that defeated two of Jeopardy’s greatest champions – don’t seem to follow the same rule of thumb.

Are we just used to seeing women as assistants?

Cortana, named after a character from the Halo videogame series, became yet another female AI personality to enter the market at the end of 2015. When asked for the reason behind the decision, Microsoft simply said that a female voice better embodied the qualities of a digital assistant.



While the name or voice of a virtual assistant might not seem like a big deal on the surface, these small decisions play a large part in reinforcing age-old stereotypes.

Apple frequently came under fire for the clichéd ‘personality’ programmed into the first versions of its voice assistant, Siri – a name inspired by the Nordic term for ‘the beautiful woman that leads you to victory’. Siri’s submissive responses to abuse highlighted a huge representation problem in the tech industry, and went on to inspire the title of the UN’s ‘I’d blush if I could’ report: a publication that explores the problematic portrayal of women in tech, and the troubling repercussions of gendered AI.

As humans, we categorise the world around us based on countless factors – including gender and voice – and we’re often not conscious of the biased decisions we make on a day-to-day basis. So if a certain tone or pitch evokes a more positive response in a brand’s target market, it isn’t surprising they should choose to use it, regardless of how that response established itself in the first place.

In the same way that products aimed at babies often have a compassionate and caring voice, and luxury food brands opt for something sultry and sophisticated, the creators behind many of these digital assistants have made a decision based on what’s proven to work. People are more at ease when hearing a female voice, which is essential for a product designed to be used in the comfort of home.

But if influential tech brands aren’t held accountable for the implications their choices have on wider society, then we could risk programming the same biases we’re working so hard to unlearn into the age of AI.

While Apple and Google both now provide a male alternative, and Amazon has followed suit with a range of celebrity options like Samuel L Jackson, perhaps the best way forward is not to humanize voice assistants at all. Alan Winfield, co-founder of Bristol Robotics Laboratory, has called AI’s gender problem one of ‘the top two ethical issues in robotics’, so eliminating gender entirely in favour of a ‘bot personality’ seems like the simplest way to solve a complex issue.

One exciting development in this area is Q: the world’s very first genderless voice. Q was created by Copenhagen Pride and Virtue in response to the bias that exists in AI today. It was recorded by people who don’t identify as male or female, and altered by audio researchers to create a neutral frequency range between 145 and 175 hertz.
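To make the 145–175 Hz figure concrete, here is a minimal sketch of how an average fundamental frequency (pitch) might be labelled against Q’s target band. The band comes from the article; the thresholds outside it reflect typical male/female pitch ranges and are an assumption for illustration, not part of the Q project.

```python
# Illustrative sketch: compare an average fundamental frequency (F0, in Hz)
# against the 145-175 Hz range that Q's creators targeted as gender-neutral.
# The "typically perceived as..." labels outside that band are rough
# assumptions about average speaking pitch, not claims from the Q project.

NEUTRAL_LOW_HZ = 145   # lower bound of Q's neutral band (from the article)
NEUTRAL_HIGH_HZ = 175  # upper bound of Q's neutral band (from the article)

def classify_pitch(f0_hz: float) -> str:
    """Label an average F0 relative to the Q neutral band."""
    if f0_hz < NEUTRAL_LOW_HZ:
        return "typically perceived as male"
    if f0_hz > NEUTRAL_HIGH_HZ:
        return "typically perceived as female"
    return "gender-neutral (Q range)"

print(classify_pitch(120))  # typically perceived as male
print(classify_pitch(160))  # gender-neutral (Q range)
print(classify_pitch(210))  # typically perceived as female
```

A real system would first estimate F0 from audio (for example with a pitch tracker), then average it over a recording before applying a comparison like this; perceived gender also depends on far more than pitch alone, which is why Q’s creators tested the voice on listeners as well.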



The end result is a voice with no discernible gender: a voice that could not only help level the playing field for men and women, but provide better representation for non-binary people and put the tech industry on track for a more progressive future. If AI is going to be our answer to tackling unconscious human bias, as has so often been predicted, the choices tech leaders make today will no doubt decide whether the future is a safe and inclusive place for everyone.