Facebook is cracking down on false articles shared on its platform.
On Thursday, the social media site announced new features to fight the spread of fake news. The rollout follows Facebook’s attempt over the past month to respond to backlash that it didn’t do enough to quash the spread of false articles in the run-up to the 2016 elections.
Adam Mosseri, vice president of Facebook’s news feed, wrote in a company blog post:
We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully. We’ve focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations.
The platform’s chief executive, Mark Zuckerberg, announced the news on his Facebook page, saying the company has “a responsibility to make sure Facebook has the greatest positive impact on the world.”
Facebook is tackling the issue with four new features:
1. Simplified reporting. Though Facebook users could already report fake news, the platform now lists it as an explicit reason for reporting a post. Users can flag the post, message the person who shared it or block that user.
2. Disputed story flags. Facebook is partnering with ABC News, the Associated Press, PolitiFact, Snopes and FactCheck.org—third-party fact checkers that are signatories of Poynter’s International Fact-Checking Network. The platform will continue to add more organizations to its list.
Once Facebook categorizes a source as a news outlet, stories from the domain can be sent to these fact checkers. From there, stories can be flagged—and once they are, they will appear less frequently in users’ timelines and will include caveats about the content.
According to Eugene Kiely, director of FactCheck.org, after Facebook alerts the fact-checkers to potentially fake news, they will send back a link to a story that debunks it, if applicable. Facebook will then append the questionable content with a notice that reads “Disputed by 3rd Party Fact-Checkers,” along with an option to read more about why that specific post was flagged. If users try to share the post anyway, they’ll be met with an interstitial warning: “Before you share this story, you might want to know that independent fact-checkers disputed its accuracy.”
“We believe providing more context can help people decide for themselves what to trust and what to share,” Mosseri said.
3. Sharing warnings. Facebook will warn users who attempt to share a flagged post that its accuracy has been disputed by the third-party fact checkers.
Mosseri also said Facebook will use sharing behavior itself as a signal:

We’ve found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way. We’re going to test incorporating this signal into ranking, specifically for articles that are outliers, where people who read the article are significantly less likely to share it.
4. Blocking spammers’ access to advertising money. Facebook said it “eliminated the ability to spoof domains, which will reduce the prevalence of sites that pretend to be real publications.”
It’s good news for most Facebook users, but bad news for spam sites that published fake stories about President-elect Donald Trump and Democratic presidential nominee Hillary Clinton.
As the BBC reported: “… Many of these false reports, which were overwhelmingly pro-Trump or anti-Clinton, can be traced to a small town called Veles in Macedonia. Teenagers here were crafting sensational stories they knew would get the attention of Americans on Facebook and therefore bring in lots of money from advertisements.”
The crackdown will likely come as a disappointment to the young people of Veles, who, in some cases, have been making thousands of euros a day from advertisements on their fake news sites, according to the BBC report. When asked if he worries these stories may have unfairly influenced the U.S. election, a young man who used to run a fake news site in Veles told the BBC, “Teenagers in our city don’t care how Americans vote. They are only satisfied that they make money and can buy expensive clothes and drinks!”
The platform also said it was “analyzing publisher sites to detect where policy enforcement actions might be necessary,” but it didn’t elaborate on what those actions might entail.
Though the move seems to answer those criticizing Facebook for its earlier inaction—Wired’s headline reads, “Facebook finally gets real about fighting fake news”—it will have to do more to gain people’s trust.
Mosseri said the platform recognizes that the features are only steps in the right direction:
It’s important to us that the stories you see on Facebook are authentic and meaningful. We’re excited about this progress, but we know there’s more to be done. We’re going to keep working on this problem for as long as it takes to get it right.