New report says Facebook will start rating your trustworthiness. No one knows what that means

In this April 4, 2013 file photo, Facebook CEO Mark Zuckerberg walks at the company's headquarters in Menlo Park, California. Facebook said Tuesday, Aug. 21, 2018, that it had removed 652 pages, groups and accounts linked to Russia and Iran for “coordinated inauthentic behavior” that included the sharing of political content amid stepped-up policing of its own platform ahead of the U.S. midterm elections in November.
Marcio Jose Sanchez, Associated Press

SALT LAKE CITY — If you’re not already worried about your place in the world, consider that Facebook will begin ranking its users based on trustworthiness.

The Washington Post reported Tuesday that Facebook users will be assigned a score between zero and one as a way to measure their reputation.

It’s not clear how ratings will be determined or how Facebook plans to use its rating system, according to The Washington Post.

Tessa Lyons, a product manager at Facebook, told the Post the effort will be used to stop people from marking accurate information as inaccurate.

Quartz explained that Facebook weighs a user's flags against the conclusions of its fact-checkers: “If the two align, the user’s future reports will be weighted more than those of someone who indiscriminately flags content. But the company wouldn’t discuss any further details of its assessment system, because of concerns that bad actors could game the system.”

The user ratings are similar to the trustworthiness scores Facebook has assigned to news publishers in the past.

Facebook responded to the report in a statement, according to Mashable.

“The idea that we have a centralized ‘reputation’ score for people that use Facebook is just plain wrong and the headline in the Washington Post is misleading,” the statement reads. “What we’re actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system. The reason we do this is to make sure that our fight against misinformation is as effective as possible.”

Claire Wardle, director of First Draft, a research lab within Harvard’s Kennedy School and fact-checking partner of Facebook, told the Post that the rating system worries her.

“Not knowing how (Facebook is) judging us is what makes us uncomfortable,” she said. “But the irony is that they can’t tell us how they are judging us — because if they do, the algorithms that they built will be gamed.”

The effort joins a long list of ideas Facebook is considering to help curb fake news and misinformation, according to Uproxx. The company also plans to identify users who repeatedly flag content as untrustworthy.