
Are AI systems like Alexa, Siri and Cortana sexist?

AI systems and virtual assistants are no longer a thing of the future; they are a thing of right now. We use virtual assistants for almost everything: they can help you look up new information, turn the lights on and off, do the weekly food shop and even schedule appointments.

Today, most large tech companies have released their own virtual assistants, from Apple’s Siri to Microsoft’s Cortana, Google’s Assistant and Amazon’s Alexa, alongside a whole host of fictional AI from film, such as Blade Runner 2049’s Joi, Her’s Samantha, Marvel’s FRIDAY and Spider-Man’s Karen.

But what do all of these AI assistants have in common? They are all “women”, with “female” voices. This has led to discussions about why companies so often choose female voices for these services, and new research has now prompted an important feminist question:

Are virtual assistants such as Siri, Alexa and Cortana sexist?

A report recently released by the United Nations Educational, Scientific and Cultural Organisation (UNESCO) found that virtual assistants reflect, reinforce and spread gender bias.

It highlights that because all of these virtual assistants are female by default, women become the “face” of glitches and errors that result from the limitations of hardware and software designed predominantly by men.

The report also explores whether the synthetic “female” voice and personality defer questions and commands to higher, often male, authorities, which could send a troubling message to women and girls about how they should respond to requests and express themselves.


The UNESCO study, named ‘I’d Blush if I Could’ after the response Siri used to give to “Hey Siri, you’re a bitch”, suggests that virtual assistants reinforce the stereotype of women as “servile” beings who exist only to obey commands.

The way AI assistants respond to sexual harassment and verbal abuse has also come under scrutiny, with many believing that docile responses to offensive language could have disturbing consequences in the future. Companies likely designed the assistants to be unfailingly polite and upbeat to encourage users to keep using the device. These design choices may make sense from a business perspective, but they pose worrying ethical questions about gender stereotypes, expectations, and assumptions about what the future holds for artificial intelligence.

A vision of the future

Earlier this year, it was reported that over 100 million Alexa-enabled devices had been sold. In the last year alone, Apple reported that 500 million people actively use Siri on their devices. And in the US, more than 90 million smartphone owners use voice assistants at least once a month.

Research has also shown that by next year, many people are expected to have more conversations with digital voice assistants than with their spouses.

With this in mind, alongside the findings of the UNESCO report, the potential influence that AI assistants will have on our ideas about gender could be huge.

Harvard University researcher Calvin Lai has explored themes surrounding unconscious bias, suggesting that the more we are exposed to a certain gender association in society, the more likely we are to adopt it.

So, the more AI assistants become integrated into our everyday lives, teaching us to associate women with assistants, the more “normalised” this view will become.

How can we build better equality?

The UNESCO report recommends that gender inequalities in AI can begin to be addressed with more gender-equal digital skills education and training. It also calls for research into a neutral machine gender for voice assistants, one that is neither male nor female.

Some of the companies that faced criticism over the type of response their AI systems formulate following abusive language have made modest steps in the right direction. Siri no longer responds to “You’re a bitch” with “I’d blush if I could” — she now says, “I don’t know how to respond to that.” And Alexa now replies to some sexually explicit queries by saying, “I’m not sure what outcome you expected.”

However, there are calls for virtual assistants to actively discourage gender-based insults and abusive language, responding with a flat rejection like “That’s not appropriate.”

But one of the most important issues that needs addressing is the gender imbalance in the tech companies actually developing these systems.

Today, women make up just 12% of AI researchers and 6% of software developers, and they are 13 times less likely than men to file an ICT patent, according to Wired.

Saniye Gülser Corat, UNESCO’s Director for Gender Equality, recently stated: “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

But do these methods go far enough to effect change?

Aimee van Wynsberghe, a robot ethicist, believes that to make AI systems more ethical, people need to behave ethically in designing and building them. Speaking recently to Forbes, she explained: “We need ethicists working in the companies that can afford them as part of the design team, where they can start to uncover the common issues other companies are running into.

“We see this need already with AI: we’ve started to create these products and even at early stages, we’re uncovering a host of ethical issues, whether it’s where the data comes from, how it’s trained, who’s making the algorithm or how it’s applied. And we’re usually finding these problems after we’ve started with the creation.”

One company has taken it a step further by releasing a feminist chatbot called F’xa, which aims to educate and inform users about gender bias. It was built by a non-profit called Feminist Internet, following a set of guidelines that enables designers to imbue their chatbots with feminist values. Crucially, F’xa’s design team is made up of “different races, genders, gender identities, and people with different ways of thinking.”

This feminist chatbot also draws on the research of Josie Young, who specialises in feminist AI and has created a Feminist Chatbot Design Process. Her research focuses on the idea that when AI and chatbots are given a female gender, they reinforce gender stereotypes in society. The aim of her research project was to examine this dynamic and create an intervention to disrupt the relationship between chatbots and entrenched gender power dynamics.

The design process asks designers to consider a range of questions, from “Have you considered a genderless chatbot?” to “Has the team reflected on the ways in which their values and position in society (i.e. as white, left-wing, young, Australian women) mean that they are more likely to choose one option over another or that they hold a specific, not universal, perspective on the world?”

This process certainly goes further than most to ensure that new AI systems are built with equality at their centre.

What can we do?

It’s clear that gender-equal technology cannot move forward while the same types of people remain in control of its development. Solving the gender imbalance in AI tech companies could help change the way AI is imagined and developed.

Recently, the UK Government announced a huge £18.5 million boost to funding for AI tech roles across the UK. This new initiative forms part of the Government’s commitment to boost gender diversity in the tech sector and harness new technologies to upskill and retrain adults.

According to government statistics, companies across the tech sector already employ more than 2.1 million people and contribute £184 billion to the economy every year, while inward investment in the UK AI sector stood at £1 billion for 2018, more than Germany, France, the Netherlands, Sweden and Switzerland combined.

However, Tech Nation statistics published in 2018 revealed that only 19% of our tech workforce are women.

Creating a more diverse future workforce will help with the design of new technology, including the fair and accurate development of algorithms, and tackle some of the greatest social challenges of our time, from protecting our environment to transforming the way we live and work.

At STEM Women, we host a number of networking events that are perfect for women who are interested in career paths within technology. Join over 1,500 women at our events and unlock amazing career opportunities within the world of AI.

Our graduate fairs are coming to cities across the UK, from Edinburgh to Dublin, Manchester, Bristol, London and Birmingham. Visit our events page to find out more and follow us on LinkedIn, Facebook and Twitter to stay up to date with all the latest news and event information.

Or, perhaps you’re an employer looking to hire female STEM graduates? We can introduce you to thousands of students looking to start careers in STEM industries. From sponsorship and stands at careers events, to job boards and recruitment consultants that specialise in sourcing the very best talent, get in touch with us today for more information.

Enjoy this blog? Why not have a read of our blog on why more female students are taking science A levels than ever before!