
Hey Siri, are you a feminist?

Written By: Vidhi Bhaiya and Masuma Ali

Edited By: Reema and Abigail Goh


CONTENT WARNING: Mentions of sexual harassment and abuse. Explicit language used.

 

Hello reader!


Meet M. M is here to help you with all your needs: always at your beck and call, eager to serve and please you. You can call on M at any time, expect your demands to be met, and rely on M for many administrative tasks. What image of M popped into your mind? What did M look like? Did you picture M as male or female?



Technologies are as much products of the context in which they are created as they are potential agents for change.


Voice assistants are digital assistants that use voice recognition, speech synthesis, and natural language processing (NLP) to provide a service through a particular application. Their voices were previously robotic but have evolved to sound more human-like. But whose voice should we use? What implications does that choice have?
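To make that design choice concrete, here is a deliberately simplified Python sketch of the pipeline such an assistant runs. Every name in it is invented for illustration, not taken from any vendor's real API; the point is where a "default voice" decision gets written down.

# Hypothetical voice-assistant pipeline; all names are illustrative only.
DEFAULT_VOICE = "female"  # the design choice this article questions

def transcribe(audio: bytes) -> str:
    """Stub for speech recognition: audio in, text out."""
    return "set a timer for ten minutes"

def parse_intent(text: str) -> dict:
    """Stub for natural language processing: map text to a structured request."""
    return {"action": "set_timer", "minutes": 10}

def fulfil(intent: dict) -> str:
    """Stub for the service layer: carry out the request and compose a reply."""
    return f"Timer set for {intent['minutes']} minutes."

def speak(reply: str, voice: str = DEFAULT_VOICE) -> None:
    """Stub for speech synthesis: read the reply aloud in the chosen voice."""
    print(f"[{voice} voice] {reply}")

def handle_utterance(audio: bytes) -> None:
    speak(fulfil(parse_intent(transcribe(audio))))

handle_utterance(b"...")  # prints: [female voice] Timer set for 10 minutes.

Whoever writes that DEFAULT_VOICE line is making a cultural choice, not a technical one, and it ships to every device as the default.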


A 2018 survey found that 53% of participants had never considered why voice assistants are largely female, despite 85% knowing that the default voices of these bots are female. This highlights how deeply gender biases are embedded in society: technological innovations go unquestioned even as they replicate the very biases we are attempting to eradicate.


In a world where 50% of all searches are predicted to be voice-activated and 72% of people who use voice search consider it part of their daily routine, it is important to examine the ripple effects it has. An estimated 12 billion voice searches were made in 2018, and with smart houses, smart cars, smartwatches, and entire smart ecosystems, this number continues to rise.


Image courtesy of UNESCO


While many companies program their voice assistants to be ‘genderless’ (or gender-neutral), the voice is undoubtedly female. Every major voice assistant launched with only a female voice, and to this day the default setting on most devices in the majority of countries is a female voice. Alexa and Cortana still don’t offer male voice options in their settings, and together they hold more than 53% of the smart speaker market share. Companies defend their choice by citing research on consumer preferences: studies have found that both men and women perceive female voices as warmer and prefer them. But should companies be allowed to let capitalism dictate sexism? Should consumer preference override the repercussions?


Implications


Tolerance to harassment


Image courtesy of Quartz


In 2017, Quartz ran a study on how voice assistants responded to sexual harassment and found that the bots most frequently evaded the harassment and rarely responded negatively. While people might brush this off with “boys will be boys” or “it’s just humour”, we fail to recognise how it percolates into our society, with different ramifications for different groups. Evasive responses by female-voiced assistants reinforce stereotypes of unassertive, subservient women and a tolerance for sexual abuse.


Among men, hearing a female-voiced bot react to such comments in a coy or evasive way makes them prone to expect the same in real situations. This encourages interpreting an evasive or ambiguous response as valid consent to harassment. A 2017 WHO report estimates that at least 1 in 3 women globally has experienced sexual and/or physical violence, a figure that points to a widespread indifference to consent.

Among women, continuous exposure to these evasive and ambiguous responses to abuse directed at female voice assistants makes them more likely to emulate that behaviour. The chameleon effect refers to the nonconscious mimicry of the postures, mannerisms, and other behaviours of one's interaction partners, such that one's own behaviour unintentionally changes to match that of others in the social environment. The more women are exposed to voice assistants (interaction partners) exhibiting this behaviour, the more they inadvertently respond evasively to sexual misconduct themselves. The National Sexual Violence Resource Centre found that rape is the most under-reported crime, with 63% of sexual assaults never reported to the police. The result is a domino effect of tolerance of, and even enticement to, sexual abuse, not just among men but across society as a whole.


In 2017, Care2 started a petition, signed by almost 80,000 people, to stop Alexa and Siri from responding playfully to sexual comments or insults. Even then, it took the companies months to act on it and change Siri’s responses:


2011-2017

Us: You're a s**t.

Siri: I’d blush if I could.


Us: You’re a b***h

Siri: Oh, stop.


Us: You're hot.

Siri: I'm just well put together.


2021

Us: You’re a b***h

Siri: I don’t know how to respond to that


Is this considered enough? Have we addressed the root of the problem? Instead of coy or ambiguous responses, we should urge companies to reprogramme their voice assistants to respond with “That’s an inappropriate thing to say to me” or “This is not an acceptable way to communicate with me”. We need to implore companies to use their bots to push responses that are educational, empowering, and progressive.
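As a rough illustration of what that change looks like in code, here is a hypothetical Python sketch; the keyword list and replies are our own, standing in for whatever harassment classifier and response script a company actually uses.

# Hypothetical sketch: answer verbal abuse assertively instead of evasively.
def is_abusive(user_text: str) -> bool:
    """Stub classifier; a real assistant would use a trained model here."""
    flagged = {"b***h", "s**t"}  # mirroring the censored transcript above
    return any(term in user_text.lower() for term in flagged)

def respond(user_text: str) -> str:
    if is_abusive(user_text):
        # an educational refusal rather than "I'd blush if I could"
        return "That's an inappropriate thing to say to me."
    return "How can I help?"

print(respond("You're a b***h"))       # That's an inappropriate thing to say to me.
print(respond("What's the weather?"))  # How can I help?

The change itself is trivial to implement; the harder part, as the petition showed, is getting companies to prioritise it.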


Embedding an expectation that women over-serve

A 2016 survey showed that female engineers were twice as likely as male engineers to be carrying out a disproportionate share of clerical work on top of their job duties. This demonstrates the ‘tightrope effect’, where women must uphold feminine duties in order to be looked upon favourably while concurrently conveying the masculine assertiveness (for which they are then penalised) needed to get promoted. This contradictory requirement highlights the entrenched gender biases within the technology industry, where non-promotable administrative tasks are optional for men but compulsory for women.


On the popular show ‘The Big Bang Theory’, Raj, who struggles to speak to women, ends up treating Siri on his iPhone as a ‘quasi-girlfriend’, asking her to call him ‘sexy’. Siri then appears in his dream as a woman and offers to sleep with him only if he asks for it. This is how gender bias in AI gets displayed on TV, ultimately shaping what is expected of women and how they are spoken to. UCLA Professor Safiya Noble adds that voice assistants ‘can function as powerful socialisation tools, and teach people, in particular children, about the role of women, girls, and people who are gendered female to respond on demand’.



Solutions: Governance and policy remedies


The first recommendation made by the AI in the Age of Cyber-Disorder paper calls for the development of ‘industry-wide standards for the humanisation of AI’. Some examples of standards include:

(1) Company mandates to build diversity within developer teams and increase input from minority groups (DEI);

(2) Guidelines on how voice assistants should respond when gender-based harassment occurs; and

(3) Procedures holding companies accountable when their products reinforce bias.


A second recommendation, drawing on the Women in Technology report by PwC, offers insight into how AI developer teams can become more diverse (a DEI review). It suggests four steps to increase the number of women in technology:

(1) The technology industry must take part in educating students about the importance of technology in transforming the world.

(2) Increase access to technology careers by making a collective effort to carve out alternative routes of entry into the profession.

(3) Increase the number of visible role models.

(4) Technology companies can set gender targets and initiate programmes supporting women’s advancement into senior positions.

Lastly, a simpler solution would be genderless chatbots. A financial company called Kasisto developed Kai, a genderless chatbot, demonstrating that voice assistants do not need to hold a specific gender or be compliant when insulted. Kai’s developers were largely female and built a text-based system: the technology responds only to written messages and answers with text rather than a voice. When users have attempted to ask Kai sexualised questions, it has always responded by asserting its machine nature. Other companies have resorted to personifying their assistants as animals to avoid the gender binary, such as Kip, which presents itself as a penguin, and Spixii, presented as a blue parrot. This, however, can be seen as an evasion of the core problem: it raises the question of why men are apparently more able to respect an animal than a woman.


To conclude, cultural norms have been coded into technological advancements, reinforcing gender biases rather than mitigating them. Siri (which is less than 10 years old) is used on more than half a billion devices. Alexa has not reached its fifth birthday and is already used in tens of millions of households around the globe.


The disruptive effects of gendered AI need to be tackled at the core programming of voice assistants, which requires more women at the helm, steering the direction of AI development.

Reducing barriers to entry would be pivotal in offering more opportunities for women to enter the sector and progress into positions that influence decision-making. Universities need to introduce courses addressing bias within AI and technology, fostering awareness of how AI can be harmful in enforcing gender biases. These changes could put a more diverse workforce behind the creation of AI technology.