Last year, Facebook rolled out new features designed to make it easier for its users to keep up with the news by promoting trending content and offering some publications the chance to publish directly to Facebook.
Since then, things have gotten complicated for the social network. First, Gizmodo reported in May that just a handful of editors managed Facebook's trending items and were pressured to promote certain stories over others, and in particular to suppress news items and websites of interest to political conservatives.
Then, last month, Quartz reported that Facebook was trying to correct this bias by firing its editors and letting an unsupervised algorithm curate its trending feed. In its effort to counteract those allegations of bias, Facebook inadvertently opened the floodgates to false information posing as hard news: first came a completely fake story claiming Fox News had fired top anchor Megyn Kelly.
More recently, the site promoted a story alleging that bombs were planted in the Twin Towers before 9/11, lending visibility to unfounded and, for the victims' families, hurtful theories that the U.S. government somehow played a role in the terrorist attacks.
Paired with the recent Pew Research Center finding that nearly half of Americans get their political and government news from Facebook, and that 38 percent cite the internet as a primary source of news, the promotion of false stories from unvetted sites is troubling.
Yet as Brian Feldman wrote for New York Magazine, Facebook has now painted itself into a corner: it cannot discount the validity of any website claiming to be reporting. That puts more responsibility than ever on news consumers.
"To exclude openly mendacious sites like Breitbart for being misleading or false opens the company up to accusations of bias. The responsibility of flagging false articles now falls on users," Feldman wrote.
But Feldman also argued that these issues are related: it's no coincidence that the two false stories promoted both concerned subjects typically important to far-right conservatives. An algorithm can't be entirely blamed for how closely political opinion can resemble news online, he posited.
"Facebook’s problem isn’t that it suppresses 'conservative news' or allows 'fake news,'" Feldman argued. "It’s that those two categories are increasingly indistinguishable."