
Microsoft's AI makes racist error and then publishes stories about it

Hey, at least Microsoft's news-curating artificial intelligence doesn't have an ego. That much was made clear today after the company's news app highlighted Microsoft's most recent racist failure.

The inciting incident for this entire debacle appears to be Microsoft's late May decision to fire some human editors and journalists responsible for MSN.com and have its AI curate and aggregate stories for the site instead. Following that move, The Guardian reported earlier today that Microsoft's AI confused two members of the pop band Little Mix, who both happen to be women of color, in a republished story originally reported by The Independent. After being called out by band member Jade Thirlwall for the screwup, the AI then published stories about its own failing.

So, to recap: Microsoft's AI made a racist error while aggregating another outlet's reporting, got called out for doing so, and then elevated the coverage of its own outing. Notably, this is after Microsoft's human employees were reportedly told to manually remove stories about the Little Mix incident from MSN.com.


Still with me?

"This shit happens to @leighannepinnock and I ALL THE TIME that it's become a running joke," Thirlwall reportedly wrote about the incident in an Instagram story, which is no longer visible on her account. "It offends me that you couldn't differentiate the two women of colour out of four members of a group … DO BETTER!"

As of the time of this writing, a quick search on the Microsoft News app shows at least one such story remains.

[Image: A story from T-Break Tech covering the AI's failings as it appears on the Microsoft News app. Credit: screenshot / Microsoft News app]

Notably, Guardian editor Jim Waterson spotted several more examples before they were apparently pulled.

"Microsoft's artificial intelligence news app is now swamped with stories selected by the news robot about the news robot backfiring," he wrote on Twitter.

We reached out to Microsoft in an attempt to determine just what, exactly, the hell is going on over there. According to a company spokesperson, the problem is not one of AI gone wrong. No, of course not. It's not like machine learning has a long history of bias (oh, wait). Instead, the spokesperson insisted, the issue was simply that Microsoft's AI selected the wrong photo for the initial article in question.

"In testing a new feature to select an alternate image, rather than defaulting to the first photo, a different image on the page of the original article was paired with the headline of the piece," wrote the spokesperson in an email. "This made it erroneously appear as though the headline was a caption for the picture. As soon as we became aware of this issue, we immediately took action to resolve it, replaced the incorrect image and turned off this new feature."

Unfortunately, the spokesperson did not respond to our question about human Microsoft employees deleting coverage of the initial AI error from Microsoft's news platforms.

Microsoft has a troubled recent history when it comes to artificial intelligence and race. In 2016, the company released a social media chatbot dubbed Tay. In under a day, the chatbot began publishing racist statements. The company subsequently pulled Tay offline, attempted to release an updated version, and then had to pull it offline again.

As evidenced today by the ongoing debacle with its own news-curating AI, Microsoft still has some work to do — both in the artificial intelligence and not-being-racist departments.

Topics: Artificial Intelligence, Microsoft, Racial Justice
