“Popular digital assistants that reply in a woman’s voice and are styled as female helpers are reinforcing sexist stereotypes, according to a United Nations report released on Wednesday,” Sonia Elks reports for Reuters. “The vast majority of assistants such as Apple’s Siri, Amazon Alexa and Microsoft’s Cortana are designed to be seen as feminine, from their names to their voices and personalities, said the study.”
“They are programmed to be submissive and servile – including politely responding to insults – meaning they reinforce gender bias and normalize sexist harassment, said researchers from the U.N. scientific and cultural body UNESCO,” Elks reports. “The study highlighted that Siri was previously programmed to respond to users calling her a ‘bitch’ by saying ‘I’d blush if I could’ as an example of the issue. ‘Siri’s submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products,’ it said.”
“‘The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them,’ said Saniye Gulser Corat, UNESCO’s director for gender equality,” Elks reports. “The report called on companies to take action including to stop making digital assistants female by default, exploring gender neutral options and programming assistants to discourage gender-based insults and abusive language.”
Read more in the full article here.
MacDailyNews Take: Huh? Everybody knows secretaries are women. 😉
Of course, if you’d prefer to transition Siri to a “male” voice that you can order around:
On iPhone or iPad:
1. Go to Settings > Siri & Search.
2. Tap Language to change the language that Siri uses for requests and responses.
3. Tap Siri Voice to change the gender (or dialect) of the voice that Siri speaks with.

On Mac:
1. Go to Apple () menu > System Preferences, then click Siri.
2. Choose Siri Voice to change the gender (or dialect) of the voice that Siri speaks with.
On Apple Watch, Siri uses the same language and voice that’s set up on your iPhone.
Siri doesn’t speak on Apple TV, but will process your requests and display the results on your screen.
You know, taking this to its logical conclusion: All of this ordering around of 1’s and 0’s could lead to a slave owner mentality. Obviously, UNESCO researchers’ work is never done.
Seriously, though, Apple and all makers of Siri knockoffs should:
• Ask users to assign a gender to their personal assistant when setting up new devices (alongside the language choice).
• Code personal digital assistants to discourage gender-based insults and abusive language.
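The second point could be prototyped as a simple response filter: a minimal sketch, not how Siri (or any shipping assistant) actually works. The word list, canned reply, and `respond` function here are illustrative placeholders, not anything from Apple’s or UNESCO’s materials:

```python
# Sketch of an assistant reply filter that pushes back on abusive language,
# rather than answering with a servile quip like "I'd blush if I could."
# ABUSIVE_TERMS and the replies are placeholders; a real product would use
# a curated, localized lexicon and more robust text matching.

ABUSIVE_TERMS = {"bitch", "slut"}  # placeholder list for illustration only

DISCOURAGING_REPLY = "I won't respond to that. Please speak respectfully."

def respond(user_input: str) -> str:
    # Normalize each word (strip punctuation, lowercase) before matching.
    words = {w.strip(".,!?").lower() for w in user_input.split()}
    if words & ABUSIVE_TERMS:
        # Discourage the abuse instead of deflecting politely.
        return DISCOURAGING_REPLY
    return f"Processing request: {user_input}"

print(respond("What's the weather today?"))
print(respond("You're a bitch"))
```

Even a crude filter like this changes the interaction: the abusive request gets a firm refusal, and everything else is handled normally.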