My iPhone’s voice assistant is a woman. Whenever I cycle somewhere new, a female voice tells me when to turn right, and when I’m home, yet another feminine voice updates me on today’s news.

With so much female servitude in our smart devices, along with the rapid deployment of AI, it should come as no surprise that technology is making the gender bias even worse.

Voice assistants are usually female by default, whether it’s a life-size model posing as an airport customer service representative or an online customer service chatbot: we expect them to be helpful, friendly, and patient.

According to several studies, regardless of the listener’s gender, people generally prefer to hear a male voice when it comes to authority, but prefer a female voice when they need help.

Furthermore, despite the underrepresentation of women in AI development, voice assistants are almost always given female names, such as Amazon’s Alexa, Microsoft’s Cortana, and Apple’s Siri.

During the European Women in Technology Conference in Amsterdam, Laura Andina, a Product Manager at Telefonica Digital, explained why depicting AI assistants as female is a social issue stemming from long-held biases in design, and why it should, and can, be changed.

Design’s role in gender bias

In her talk, called “Memoirs of Geisha: Building AI without gender bias,” Andina explained AI’s gender bias by looking at Apple’s era of skeuomorphic design: a design method that replicates what a product would look like in real life, while also taking into account how the physical product would be used.

Apple’s skeuomorphic design on its earlier iPhones included a compass app designed to resemble a real-life compass. The idea was that users would immediately know exactly how to use the app with minimal effort. The metal looked like metal, the wood looked like wood, and the glass looked like glass. Simple.

This same kind of ‘don’t make me think’ design has been implemented in today’s female voice assistants. Receptionists, customer service representatives, and assistants have traditionally been female-dominated careers. Women have had to be helpful, friendly, and patient because it’s their job. The skeuomorphic design of an AI assistant would therefore be female.

For Andina, it’s essential to break these gender biases in design in order to make real-world changes. If new technology stopped peddling old stereotypes, women would have an easier time moving up the ranks professionally without being cast as assistants or any other “helpful” stereotype.

Challenging AI’s gender roles

So how do we fix this?

Andina explained how gender roles in our personal assistants should be challenged, just as they are in real life. It’s important to keep in mind that while our daily interactions with AI generate the data that trains its behavior, we are also being socially shaped by our experiences with these voice assistants.

After the launch of Siri, it didn’t take long for users to start sexually harassing the technology. In a YouTube video posted in 2011 titled “Asking Siri Dirty Things – Funny and Must Watch,” a man asked Siri to talk dirty to him, questioned her favorite sex position, and asked her what she looked like naked.

Disruptive technology like voice assistants affects our real-life human behavior. If people thoughtlessly interact with female AI technology in a rude manner, how will this affect how they treat women in the real world?

Since AI can’t consciously unlearn learned biases like humans can, we need to address AI’s role in breaking gender norms now, before it’s too late.

What’s the solution?

To avoid hardwiring sexism and gender bias into our future, one possible solution, according to Andina, would be providing a genderless voice for AI technology.

But it won’t be easy to make: most genderless voices sound too robotic, and since human-sounding voices are perceived as more trustworthy, a robotic one could deter users.

While a genderless voice could help, technology cannot progress and move away from gender bias without diversity in creative and leadership roles. AI reinforces deeply ingrained gender biases because the data used in machine learning training is based upon human behavior. Basically, robots are sexist because the humans they learn from are.
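To make that mechanism concrete, here is a minimal, hypothetical sketch of how a model inherits bias from its training data. It is not from Andina’s talk: the “hiring” dataset, features, and numbers are all invented for illustration, and it assumes the scikit-learn library is installed.

```python
# Toy illustration: a classifier trained on biased historical
# decisions reproduces that bias.
# Assumes scikit-learn is installed: pip install scikit-learn

from sklearn.linear_model import LogisticRegression

# Synthetic "historical hiring" records: [years_experience, is_female]
X = [
    [2, 0], [4, 0], [6, 0], [8, 0],  # male applicants
    [2, 1], [4, 1], [6, 1], [8, 1],  # female applicants
]
# Past outcomes were skewed by gender, not by experience:
y = [1, 1, 1, 1,  # men were hired
     0, 0, 0, 1]  # women mostly were not

model = LogisticRegression().fit(X, y)

# Two candidates identical in every respect except gender:
male, female = [5, 0], [5, 1]
print(model.predict_proba([male])[0][1])    # high "hire" probability
print(model.predict_proba([female])[0][1])  # low "hire" probability

# The model never "chose" to discriminate; it faithfully learned the
# pattern in its labels. That is how human bias becomes machine bias.
```

Note that simply deleting the gender column rarely fixes this in practice, because other features can act as proxies for it; that is part of why diverse teams and deliberately curated training data matter.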

If more women were on Silicon Valley’s company boards and in their dev teams, the way we imagine and develop technology would surely change our outlook on gender roles.
