
The UN says digital assistants like Siri promote gender stereotypes

The U.N. is not here for Siri's sexist jokes.

The United Nations Educational, Scientific, and Cultural Organization (UNESCO) has published an in-depth report about how women, girls, and the world as a whole, lose when technical education and the tech sector exclude women.

Within the report is a razor-sharp section about the phenomenon of gendered A.I. voice assistants, like Siri or Alexa. The whole report is titled "I'd blush if I could," a reference to the almost flirtatious response Siri would give a user if they said, "Hey Siri, you're a bitch." (Apple changed the voice response in April 2019).

"Siri’s ‘female’ obsequiousness – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education," the report reads.

The report is thorough and wide-ranging in its purpose of arguing for promoting women's educational and professional development in tech. That makes the fact that it seizes on voice assistants as an illustration of this gargantuan problem all the more impactful.

The report analyzes inherent gender bias in voice assistants for two purposes: to demonstrate how unequal workplaces can produce sexist products, and how sexist products can perpetuate dangerous, misogynistic behaviors.


"The limited participation of women and girls in the technology sector can ripple outward with surprising speed, replicating existing gender biases and creating new ones," the report reads.

Many news outlets, including Mashable, have reported on how AI can take on the prejudices of its makers. Others have decried the sexism inherent in default-female voice assistants, compounded when these A.I.s demur when a user sends abusive language "her" way.

Now, even the U.N. is coming for sexism in artificial intelligence, showing that there's nothing cute about Siri or Cortana's appeasing remarks.

It's startling to comprehend the sexism coded into these A.I. responses to goads from users. It's almost as if the A.I. takes on the stance of a woman who walks the tightrope of neither rebuking, nor accepting, the unwanted advances or hostile language of someone who has power over "her."

Coy A.I. responses to abusive language are illustrative of the problem of sexism in A.I., but the report takes issue with the larger default of voice assistants as female, as well. The report details how these decisions to make voice assistants female were wholly intentional, and determined by mostly male engineering teams. These product decisions, however, have troublesome consequences when it comes to perpetuating misogynistic gender norms.

"Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command," the report reads. "The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."

For these reasons, the report argues that it is crucial to include women in the development process of A.I. It's not enough, the report says, for male engineering teams to address their biases — for many biases are unconscious.

If we want our world — that will increasingly be run by A.I. — to be an equal one, women have to have an equal hand in building it.


