- The warning will appear when people go to share the videos.
- The warning reads: “Caution: Video flagged for unverified content.”
- Per The Verge, the warning means a fact-checker couldn’t verify the information in the video.
- TikTok hasn’t said how many videos its fact-checkers review each day, or which videos the company selects for review.
- A spokesperson told The Verge that fact-checking often focuses on topics such as “elections, vaccines, and climate change,” and that a video doesn’t have to reach a certain level of popularity to qualify for review.
Misinformation and the pandemic
- Videos on the social media app claimed the government created the coronavirus to eliminate the population.
- One video suggested that the U.S. government creates a disease every 100 years as a means of population control, as I wrote about for the Deseret News.
What is TikTok’s policy?
- The policy reads: “We do not permit misinformation that could cause harm to our community or the larger public. While we encourage our users to have respectful conversations about the subjects that matter to them, we remove misinformation that could cause harm to an individual’s health or wider public safety. We also remove content distributed by disinformation campaigns.”