
Tech can help us spot fake news, but there's only one real way to stop it

In the days after the election, apoplectic progressive journalists spent their time writing boiling hot takes, trying to find the one CNN chyron or Nate Silver tweet responsible for handing democracy over to a Putin-loving creamsicle. And while no one could ever agree (or admit that they agree) on the real enemy, nearly everyone pointed a finger at the new guy in town: fake news.

Almost instantly, Chrome extensions appeared that made it easier for users to identify fake news. Last week, Facebook even rolled out some far-too-cautious tools to help stop the onslaught. Yet for all of their efficacy, none of these tools will be able to fully curtail the plague of propaganda.

But every tech solution rolled out so far lacks the crucial ingredient necessary to make it work: human contact.

Ugh.

Fake news has never been more powerful

It's impossible to overstate the role fake news -- or propaganda, as seems increasingly likely -- had in this election. A BuzzFeed post-election analysis found that fake news stories significantly outperformed real news stories in the final three months leading up to the election. The top 20 best-performing fake stories generated 8.7 million shares, likes and reactions, compared to just 7.3 million for the top stories from reputable news sources.

And while both liberals and conservatives shared fake news, Trump supporters, it seems, were particularly susceptible to it: 38% of the fake news shared came from conservative sites, compared to just 20% from liberal sites.


Zuckerberg initially responded to criticism with outright denial, calling the idea that fake news and Facebook influenced the election "pretty crazy." So concerned journalists and software developers stepped in where the platforms didn't. Fake News Alert and B.S. Detector, developed shortly after the election, are both Google Chrome extensions that alert users when the site they're visiting is highly biased, simply clickbait or pure propaganda. Slate also released a Google Chrome extension simply called This is Fake, which helps users identify and report fake news on Facebook.


Facebook itself released its own set of tools last week that make it easier for users to flag fake news, hopefully making it harder for these stories to spread. Content determined to be false by its nonpartisan fact-checking partner Poynter will come with a warning label as well as an explanation. Facebook has also said it will prevent these stories from being advertised. Let's hope.

The tools to help stop fake news are flawed in their design

As powerful as each of these tools may be, none of them will likely go far enough to stop the torrent of fake news -- though they may temper it -- because they all overlook "the human element," and consequently rest on two false premises:

1. The idea that people of different political persuasions are still talking to each other on social media, and therefore capable of spotting and reporting fake news.

2. The belief that people -- the same people who renamed CNN "The Clinton News Network" and screamed "Lügenpresse," an old Nazi term, at the press -- will not project that same hostility onto Facebook, will suddenly soften their criticism of The Washington Post, or will download a Chrome extension from ultra-progressive Slate. (Breitbart is already railing against Facebook's anti-fake-news initiatives.)

In order for Facebook's new tools to properly work, for example, users must first identify suspicious-looking content. But all readers, conservative and progressive alike, are inherently biased towards content that reflects their pre-existing political beliefs and values.


Facebook's algorithms fill your News Feed with familiar faces who are more likely to share stories you like, limiting information diversity and creating echo chambers. In the months surrounding the election, Facebook users unfollowed, and sometimes purged from their feeds, people whose political views didn't align with their own.


So it's hard to imagine how most Facebook users would even be confronted with content they don't like (and that doesn't appeal to their political values) in the first place. If the story fits, people tend to wear it. It remains to be seen whether liberals or conservatives -- who both exist in social media echo chambers -- will be able to identify fake news stories that nonetheless appeal to their political values, and still be willing to report them.

Why would users report stories that look potentially dubious (a skill, researchers found, many readers just don't have) if those stories neatly correspond with their ways of seeing the world?

Parts of our culture are post-fact and post-fact-checking

There's also the inherent danger -- though perhaps an unavoidable one -- that the Infowars, AddictingInfo and Breitbart readers of this world will soon come to distrust Facebook's fact-checking services. Why would the people who regularly read Alex Jones -- who claimed that Hillary was in league with the devil -- or who believe that Comet Ping Pong Pizzeria was home to a child sex ring led by John Podesta suddenly trust Facebook's judgement on The Washington Post? Facebook has been accused of liberal bias before, and voters already paranoid about the mainstream media may simply become even more suspicious of the platform.

Obviously, there are more tools than Facebook's measures that people can use to call out fake news. But even those mechanisms are partisan, and the people who use them, self-selected. Breitbart fans aren't exactly going to go to progressive Slate and download its hottest fact-checking tool. People who spent the months preceding the election actively sharing "Denzel Washington Backs Trump In the Most Epic Way Possible" probably aren't going to read that Mashable article listing the smartest new Chrome extensions for spotting fake news. Those who live and die by The Daily Caller aren't suddenly going to find room in their hearts for little ol' nonpartisan Snopes.

In our (almost) post-fact world, we've come dangerously close to post-fact-checking. And that -- perhaps more than anything else that happened this past month -- should scare you.

People can't just fact-check. They need to have meaningful conversations if they want to see change.

None of this is to say that these tools won't be effective, or aren't deeply important. Not every Facebook user or Trump supporter is an Infowars reader, and there will surely be many users who trust Facebook's judgement and subsequently learn how to become better, more critical consumers. (Facebook could also ban some of the more egregious fake news accounts in the first place, or take far more aggressive measures to stop them.)

Propaganda works and, as we've seen this election, can do lasting, potentially lethal damage.


But if voters, particularly progressive voters, are serious about spotting and stopping fake news, they're going to need to commit a truly painful act -- and actually communicate with the people who believe these stories.

The main reason people believe in fake news, researchers found, is simple: because they want to.


Stefan Pfattheicher, a professor at Ulm University, recently told The Washington Post that people believe in fake news not due to a lack of intelligence but to a lack of will.

"This seems to be more a matter of motivationto process information (or news) in a critical, reflective thinking style than the ability to do so," Pfattheicher said.

If "critical readers" have any hope of stopping the fake news explosion, they'll need to do more than rely on external fact-checking tools or New York Times hyperlinks. They'll need to keep people in their Facebook feeds who they disagree with, and try to have radically empathetic, compassionate conversations with them off of the Internet.

The goal shouldn't end at halting the spread of a news story about "How all Muslims are terrorists" but at curbing people's desire to share that story, or believe in that hate, in the first place. And that means talking to people who disagree with you, people whose politics and values violently clash with your own.

Who wants to do that? No one, of course. (Raises hand.) It's awful, frequently traumatizing and, for anyone who's ever confronted an egg avatar on Twitter, often a waste of time.

Real change happens, however, when core beliefs change too. Since the election, organizers have shared tools people can use to try to "convince" others that their worldviews are distorted, even dangerous. Key to the design of every one of these tools is moving beyond facts and into the realm of the personal. For many, facts have become too partisan. Emotion, first-person stories and relationship-building often change more hearts and minds than a Politifact link or Facebook debunk.

Fake news is here to stay, and platforms, software developers and people will need to imagine even more aggressive ways to kill it. Tools will help. So will Chrome extensions. But the only real way to help people to believe in facts again is to magically, somehow, go beyond them and have a conversation.


