


Facebook thinks showing Wikipedia entries about publishers and additional Related Articles will give users more context about the links they see. So today it’s beginning a test of a new “i” button on News Feed links that opens up an informational panel. “People have told us that they want more information about what they’re reading,” Facebook product manager Sara Su tells TechCrunch. “They want better tools to help them understand if an article is from a publisher they trust and evaluate if the story itself is credible.”

This box will display the start of a Wikipedia entry about the publisher and a link to the full profile, which could help people know if it’s a reputable, long-standing source of news… or a newly set up partisan or satire site. It will also display info from their Facebook Page even if that’s not who posted the link, data on how the link is being shared on Facebook, and a button to follow the news outlet’s Page. If no Wikipedia page is available, that info will be missing, which could also provide a clue to readers that the publisher may not be legitimate.
[Image: the “More Info” button panel, fully expanded]
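Facebook hasn’t said how it assembles these panels, but Wikipedia does expose a public REST API that returns the opening extract of an entry, so a minimal client-side sketch of the lookup described above might look like the following. The PublisherContext shape and the naive name-to-title mapping are assumptions for illustration, not Facebook’s implementation.

```typescript
// Hypothetical sketch: fetch the start of a publisher's Wikipedia entry,
// the way an info panel might. Uses Wikipedia's public REST summary endpoint.
interface PublisherContext {
  wikipediaExtract: string | null; // opening lines of the entry, or null if none
  wikipediaUrl: string | null;     // link to the full profile, or null if none
}

async function fetchPublisherContext(publisherName: string): Promise<PublisherContext> {
  // Naive name-to-title mapping; a real system would need disambiguation.
  const title = encodeURIComponent(publisherName.trim().replace(/ /g, "_"));
  const res = await fetch(`https://en.wikipedia.org/api/rest_v1/page/summary/${title}`);

  // A 404 means no Wikipedia page exists; the panel would show the info as
  // missing, which, as noted above, is itself a clue about the publisher.
  if (res.status === 404) {
    return { wikipediaExtract: null, wikipediaUrl: null };
  }
  if (!res.ok) {
    throw new Error(`Wikipedia API returned ${res.status}`);
  }

  const summary: any = await res.json();
  return {
    wikipediaExtract: summary.extract ?? null,
    wikipediaUrl: summary.content_urls?.desktop?.page ?? null,
  };
}

// Usage: fetchPublisherContext("TechCrunch").then(ctx => console.log(ctx));
```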
Meanwhile, the button will also unveil Related Articles on all links where Facebook can generate them, rather than only if the article is popular or suspected of being fake news, as Facebook had previously tested. Previously, Facebook only showed Related Articles occasionally, and revealed them immediately on links without an extra click. Trending information could also appear if the article is part of a Trending topic. Together, this could show people alternate takes on the same news bite, which might dispute the original article or provide more perspective.

The changes are part of Facebook’s big, ongoing initiative to improve content integrity. “This work reflects feedback from our community, including publishers who collaborated on the feature development as part of the Facebook Journalism Project,” says Su.

Of course, whenever Facebook shows more information, it creates more potential vectors for misinformation. When asked about the risk of the Wikipedia entries that are pulled in having been doctored with false information, a Facebook spokesperson told me, “Vandalism on Wikipedia is a rare and unfortunate event that is usually resolved quickly. We count on Wikipedia to quickly resolve such situations and refer you to them for information about their policies and programs that address vandalism.”

And to avoid distributing fake news, Facebook says Related Articles will “be about the same topic - and will be from a wide variety of publishers that regularly publish news content on Facebook that get high engagement with our community.”
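Facebook hasn’t published the selection logic behind that quote, but as a toy illustration of the stated criteria (same topic, a wide variety of publishers, high engagement), a ranking function might look like the sketch below. The Article fields and the engagement score are assumptions, not Facebook’s actual system.

```typescript
// Toy illustration of the stated Related Articles criteria: same topic,
// a wide variety of publishers, and high engagement. The Article shape and
// engagement score are assumptions; Facebook's real logic is not public.
interface Article {
  url: string;
  topic: string;      // assumed label from an upstream topic classifier
  publisher: string;
  engagement: number; // assumed aggregate of shares, comments, reactions
}

function pickRelatedArticles(original: Article, candidates: Article[], limit = 3): Article[] {
  const seenPublishers = new Set<string>([original.publisher]);
  return candidates
    .filter(a => a.topic === original.topic && a.url !== original.url) // same topic only
    .sort((a, b) => b.engagement - a.engagement)                       // favor high engagement
    .filter(a => {
      if (seenPublishers.has(a.publisher)) return false; // at most one per publisher, for variety
      seenPublishers.add(a.publisher);
      return true;
    })
    .slice(0, limit);
}
```

Capping results at one per publisher is just one way to read “a wide variety of publishers”; a real ranker would likely balance variety against relevance more gradually.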

Facebook doesn’t expect the changes to significantly impact the reach of Pages, though publishers that knowingly distribute fake news could see fewer clicks if the Info button repels readers by debunking the articles. “As we continue the test, we’ll continue listening to people’s feedback to understand what types of information are most useful and explore ways to extend the feature,” Su tells TechCrunch. “We will apply what we learn from the test to improve the experience people have on Facebook, advance news literacy, and support an informed community.”

Facebook initially downplayed the issue of fake news after the U.S. presidential election, where it was criticized for allowing pro-Trump hoaxes to proliferate. Getting this right is especially important after the fiasco this week when Facebook’s Safety Check for the tragic Las Vegas mass shooting pointed people to fake news. If Facebook can’t improve trust in what’s shown in the News Feed, people might click all its links less. That could hurt innocent news publishers, as well as reducing clicks to Facebook’s ads.
