
YouTube algorithm recommends disturbing and hateful content, according to Mozilla report

The study, which included more than 37,000 volunteers around the world, found that YouTube recommended videos ranging from ‘COVID fear-mongering to political misinformation’

The YouTube app and YouTube Kids app are displayed on an iPhone in New York on April 25, 2018.
Jenny Kane, Associated Press

YouTube’s efforts to curb videos that peddle conspiracies and white supremacy have not yet prevented its own algorithm from recommending hateful content to the website’s users, a new report from the Mozilla Foundation has found, Politico reported.

In a 10-month, crowdsourced study, the Mozilla Foundation — a California-based software nonprofit — found that 71% of the videos volunteers reported as “regrettable,” a category spanning “COVID fear-mongering to political misinformation to wildly inappropriate ‘children’s’ cartoons,” had been recommended to them by YouTube’s own algorithm. The Mozilla report also found that the videos reported by its study’s volunteers as regrettable received 70% more traffic than other videos the volunteers watched.

  • “YouTube’s algorithm is working in amplifying some really harmful content and is putting people down disturbing pathways,” Brandi Geurkink, Mozilla Foundation senior manager of advocacy, said, according to Politico. “It really shows that its recommendation algorithm is not even functioning to support its own platform policies, it’s actually going off the rails.”
  • “Our research confirms that YouTube not only hosts, but actively recommends videos that violate its very own policies. We also now know that people in non-English speaking countries are the most likely to bear the brunt of YouTube’s out-of-control recommendation algorithm,” Geurkink said in a statement from the report.
  • According to Politico, YouTube is the world’s second most popular website — the first being Google — and YouTube users watch 1 billion hours of video a day.

Is YouTube doing anything to remove harmful videos from its website?

In its report, Mozilla said YouTube had removed nearly 200 of the “regrettable” videos. Those videos — which had been flagged by more than 37,000 volunteers in 91 countries — had more than 160 million total views before YouTube removed them, The Wall Street Journal reported in its coverage of Mozilla’s study.

  • YouTube has implemented 30 changes over the past year, and its automated systems can now police videos that violate YouTube’s policies with 94% success, a spokesman for the website said, according to The Wall Street Journal.
  • “The goal of our recommendation system is to connect viewers with content they love,” a YouTube spokesman said, reported Politico, “and on any given day, more than 200 million videos are recommended on the homepage alone.”