SALT LAKE CITY — A mother in Rio de Janeiro, Brazil, grew concerned as a video of her 10-year-old daughter and a friend playing in a backyard pool quickly gained more than 400,000 views. When she learned about research that showed YouTube's automated recommendation system was funneling home videos featuring children in swimsuits to pedophiles, her worst fear was confirmed, according to The New York Times.
It's just one of many stories that have U.S. legislators worried about how children can be exposed to pornography and exploited by pedophiles on common apps like Facebook, Instagram, Snapchat and YouTube.
"Predators really do lurk," Sen. Marsha Blackburn, R-Tenn., said at a Senate Judiciary Committee hearing on July 9. "They are right there on all of these apps that our precious children are using."
The risk of exploitation is growing, according to experts who testified at the hearing, including John F. Clark, president and CEO of the National Center for Missing and Exploited Children. Last year, the center received more than 18 million reports of international and domestic online sexual abuse, up from about 4 million just four years earlier, Clark said.
"This is a crisis affecting our children and it’s not an abstract one; it’s not a rare one," said Sen. Mike Lee, R-Utah.
Legislators and experts agreed that technology companies are not doing enough to shield children from inappropriate content and to protect them from being contacted by strangers. So far, YouTube has made no move to revise the video recommendation system that helps drive the platform’s billions of views, said Sen. Richard Blumenthal, D-Conn.
In March, CNN reported that Instagram is the leading social media platform for child grooming by sexual predators. Christopher McKenna, founder and CEO of Protect Young Eyes, a nonprofit dedicated to helping parents and kids use technology responsibly, decided to test how safe Instagram is for young people. He and his team started an Instagram account with two photos of a young girl and tried to mimic the behavior of an average teen.
“Within a week we had dozens of men sending us pictures of their genitals ... and sending us hardcore pornography through direct messages," said McKenna. "Even after we told all of them that we were only 12. They were relentless."
But app store descriptions of Instagram say nothing about sexual predators, direct message risks, sex trafficking or hardcore pornography. Apple recommends the app for ages 12 and older, and Google for 13 and older, said McKenna.
The federal government is also to blame for failing to hold tech companies accountable, according to Georgetown law professor Angela J. Campbell.
She said the Federal Trade Commission has failed to vigorously enforce the Children’s Online Privacy Protection Act, which provides privacy and safety protections for users under the age of 13, such as requiring parental consent in order to collect a child's personal information.
Since 2000, the Federal Trade Commission has brought 29 actions to enforce the act. It's unclear how many complaints have been submitted during that time, but Campbell said that since 2012, she has personally helped file 14 requests for investigations. None has been answered publicly. One of those requests asked the Federal Trade Commission to look into findings that the Google Play Store was labeling apps in the family section as "family friendly" when they did not meet Google's own standards, she said.
"The big tech companies including Google, YouTube, Facebook and Amazon have felt empowered to ignore the existing safeguards," said Campbell.
Sens. Edward J. Markey, D-Mass., and Josh Hawley, R-Mo., introduced a bill earlier this year to update the children’s online privacy rules. The legislation would let parents and kids delete personal information that has been collected by tech companies and would create a division within the Federal Trade Commission specifically devoted to the privacy and safety of minors.
"I think we can agree that exploiting children online is one of the worst dangers and one of the worst social threats that we face," said Hawley, who added that he thinks "being callous toward children's safety" is part of the business model of some tech companies. Another bill Hawley introduced would prevent video streaming companies like YouTube from recommending videos that feature children unless they meet specific safety criteria.
McKenna agreed with Hawley, saying a "business model focused on reach and engagement absolutely conflicts with protection.”
The "discover news" section of Snapchat is where third parties post articles and advertisements. "That’s where engagement happens," said McKenna. "Again, they are based on an engagement model, they want more people looking at that content and that just feeds that cycle of revenue."
The discover news section contains content about sex, drugs and hiding internet activity from parents, among other topics, and is accessible to users of any age, said McKenna. Like Instagram, the app is marketed for ages 12 and up by Apple and 13 and up by Google.
McKenna called for more transparency and uniformity in app ratings to help parents make informed decisions about how to protect their children. He also called for parental controls that are easier to navigate.
“Parents have a primary responsibility, but they need a more proactive and preventive role from big tech companies,” said Blumenthal.
Stephen Balkam, founder and CEO of the Family Online Safety Institute, recommended that families consult the organization's seven steps to good digital parenting. The first step is educating yourself by trying out the apps, games and sites your kids use. Other recommendations include using parental controls, setting ground rules and being a good digital role model by showing your kids when to unplug.
"Parents must be engaged and knowledgeable about what their children are doing on the internet," Balkam said. "The internet cannot be made 100 percent safe."