AI female voice assistants reinforce gender bias – UN report

by The Editor
May 22, 2019
in Tech

Artificial intelligence voice assistants, such as Amazon's Alexa and Apple's Siri, are perpetuating and spreading gender stereotypes, says a new UN report.

Titled "I'd blush if I could", the report from UNESCO says the almost exclusive market of female voice assistants fuels stereotypes that women are "obliging, docile and eager-to-please helpers".

And because assistants respond to requests however rudely they are phrased, they also reinforce the idea in some communities that women are "subservient and tolerant of poor treatment".

Canalys, a technology research company, has estimated that 100 million "smart speakers", which are used to interact with voice assistants, were sold in 2018.

According to the UNESCO report, technology giants such as Amazon and Apple have previously said consumers prefer female voices for their assistants, with an Amazon spokesperson recently describing these voices as more "sympathetic and pleasant".


However, further research shows that preferences are more complex: people have been found to prefer masculine voices when listening to authoritative statements, but female voices in helpful contexts.

In general, most people prefer the sound of the opposite sex, the report said.


The report specifically notes that the inability of some female-voiced digital assistants to defend themselves against hostile and sexist insults "may highlight her powerlessness".

In fact, some companies with majority male engineering teams have programmed the assistants to "greet verbal abuse with catch-me-if-you-can flirtation," the report said.

Image: The lack of response to gender-based insults can reinforce a "boys will be boys" attitude, the report notes

In some cases, assistants were even found "thanking users for sexual harassment", and sexual advances from male users were tolerated more than those from female users.

Citing a Quartz piece specifically …

Source: Sky News
