YouTube announced Monday that disclosure labels will begin appearing on videos made with generative artificial intelligence. Generative AI can be described as a system “that learns to generate more objects that look like the data it was trained on,” according to MIT.

The change is meant to create transparency between creators and viewers and build trust within their communities. Here’s what you should know about the criteria YouTube has released and which videos you can expect to carry these new labels.

What labels are being added to YouTube?

YouTube acknowledges the ways generative AI has been helping creators make their videos, whether in the early stages of storyboarding or through tools that add creative flair.

But these labels are meant to help users tell whether what they’re watching is real or fake. According to CNN, “Online safety experts have raised alarms that the proliferation of AI-generated content could confuse and mislead users across the internet, especially ahead of elections in the United States and elsewhere in 2024.”

According to YouTube, the labels will be used on videos that have “realistic content,” which it describes as content a viewer could mistake for a real person, place or event but that was actually made with AI.

The labels will appear in the description box, but for videos that touch on sensitive topics (such as elections, global conflicts or natural disasters), a more prominent label will be seen on the video itself. The label will only say “altered or synthetic content” at this time.

What videos require this label (and which ones don’t)?


YouTube’s Help Center has provided examples of videos that will require the new label, though the list does not cover everything a creator might make using AI. We’ve compiled the examples below:

  • “Makes a real person appear to say or do something they didn’t do.”
  • “Alters footage of a real event or place.”
  • “Generates a realistic-looking scene that didn’t actually occur.”
  • “Digitally generating or altering content to replace the face of one individual with another’s.”
  • “Digitally altering a famous car chase scene to include a celebrity who wasn’t in the original movie.”
  • “Simulating audio to make it sound as if a medical professional gave advice when the professional did not actually give that advice.”
  • “Showing a realistic depiction of a missile fired toward a real city.”
  • “Synthetically generating music (including music generated using Creator Music).”
  • “Voice cloning someone else’s voice to use it for voiceover.”
  • “Synthetically generating extra footage of a real place, like a video of a surfer in Maui for a promotional travel video.”
  • “Synthetically generating a realistic video of a match between two real professional tennis players.”
  • “Making it appear as if someone gave advice that they did not actually give.”
  • “Digitally altering audio to make it sound as if a popular singer missed a note in their live performance.”
  • “Showing a realistic depiction of a tornado or other weather events moving toward a real city that didn’t actually happen.”
  • “Making it appear as if hospital workers turned away sick or wounded patients.”
  • “Depicting a public figure stealing something they did not steal, or admitting to stealing something when they did not make that admission.”
  • “Making it look like a real person has been arrested or imprisoned.”

YouTube also provides examples of videos that do not need the disclosure because the AI-assisted edits involved are considered inconsequential:

  • “Someone riding a unicorn through a fantastical world.”
  • “Green screen used to depict someone floating in space.”
  • “Color adjustment or lighting filters.”
  • “Special effects filters, like adding background blur or vintage effects.”
  • “Production assistance, like using generative AI tools to create or improve a video outline, script, thumbnail, title, or infographic.”
  • “Caption creation.”
  • “Video sharpening, upscaling or repair and voice or audio repair.”
  • “Idea generation.”
  • “Applying beauty filters.”
  • “Synthetically generating or extending a backdrop to simulate a moving car.”
  • “Using an AI-generated animation of a missile in a video.”

If creators fail to properly label their videos, YouTube may add the label itself without seeking the creator’s permission. Creators who repeatedly refuse to label their AI-generated content risk having their videos removed or their accounts suspended.

YouTube will roll out the labels first on its mobile app, followed by desktop and TV.
