Artificial intelligence voice assistants, such as Amazon's Alexa and Apple's Siri, are perpetuating and spreading gender stereotypes, says a new UN report.

Titled "I'd blush if I could", the report from UNESCO says the near-exclusive use of female voices for assistants fuels stereotypes that women are "obliging, docile and eager-to-please helpers".

And because assistants respond to requests regardless of how they are phrased, this also reinforces the idea in some communities that women are "subservient and tolerant of poor treatment".

Canalys, a technology research company, has estimated that 100 million "smart speakers", which are used to interact with voice assistants, were sold in 2018.

According to the UNESCO report, technology giants such as Amazon and Apple have in the past said consumers prefer female voices for their assistants, with an Amazon spokesperson recently describing these voices as more "sympathetic and pleasant".

However, further research has shown that preferences are a little more complex: people have been found to prefer masculine voices for authoritative statements, but female voices in helpful contexts.

In general, most people prefer the sound of the opposite sex, the report said.

The report specifically notes that the inability of some female-voiced digital assistants to defend themselves from hostile and sexist insults "may highlight her powerlessness".

In fact, some companies with majority male engineering teams have programmed the assistants to "greet verbal abuse with catch-me-if-you-can flirtation," the report said.

Image: The lack of response to gender-based insults can reinforce a "boys will be boys" attitude, the report notes

In some cases, assistants were even found "thanking users for sexual harassment", and sexual advances from male users were tolerated more than those from female users.

Sky News

[contfnewc] [contfnewc]