TikTok will now display a warning on videos containing questionable information that fact-checkers have been unable to verify, The Verge reports.
- The warning will appear when people go to share the videos.
- The warning reads: “Caution: Video flagged for unverified content.”
Per The Verge, the warning means that a fact-checker couldn’t verify the information within the video.
- There’s no word from TikTok on how many videos fact-checkers review in a day, or how the company decides which videos to review.
- A spokesperson told The Verge that fact-checking often focuses on topics such as elections, vaccines and climate change, and that a video doesn’t have to reach a certain popularity to qualify for review.
Misinformation and the pandemic
Back in January 2020, Media Matters for America, a media watchdog group, said TikTok was spreading misinformation about the deadly coronavirus, as I wrote about for the Deseret News.
- Videos on the social media app claimed the government created the coronavirus to eliminate the population.
- One video suggested that the U.S. government creates a new disease every 100 years as a means of population control.
What is TikTok’s policy?
TikTok does not tolerate misinformation, according to the company’s community guidelines, which say the platform does not allow propaganda or misinformation to be spread to its users.
- The policy reads: “We do not permit misinformation that could cause harm to our community or the larger public. While we encourage our users to have respectful conversations about the subjects that matter to them, we remove misinformation that could cause harm to an individual’s health or wider public safety. We also remove content distributed by disinformation campaigns.”