Meta announced this week that it’s tackling concerns about sexploitation and intimate image abuse on Instagram by testing new features designed to protect teens and others.

The company said it’s also trying to make it harder for scammers to find potential targets and “testing new measures to support young people in recognizing and protecting themselves from sextortion scams.”

As WFAA, an ABC affiliate in Dallas, Texas, reported, “Sexual extortion, or sextortion, involves persuading a person to send explicit photos online and then threatening to make the images public unless the victim pays money or engages in sexual favors. Recent high-profile cases include two Nigerian brothers who pleaded guilty to sexually extorting teen boys and young men in Michigan, including one who took his own life, and a Virginia sheriff’s deputy who sexually extorted and kidnapped a 15-year-old girl.”

In its blog post, the company asserts that “these updates build on our longstanding work to help protect young people from unwanted or potentially harmful contact.”

Protection from nude images

The new measures include nudity blurring in private messages, which will be a default setting for those under 18, with a separate notification sent to adults encouraging them to use it as well. “When nudity protection is turned on, people sending images containing nudity will see a message reminding them to be cautious when sending sensitive photos and they can unsend these photos if they change their mind,” per the announcement.

Those forwarding a nude image will also be prompted to pause and reconsider — and encouraged not to do it.

If someone does send a nude image, it will be blurred under a warning screen so the recipient can decide whether to look at it. They’ll also have an option to block the sender and report the chat.

Safety tips are included, along with warnings about potential risks: reminders that people can screenshot or forward messages without your knowledge, that profiles may be fake and that a range of resources is available, such as Meta’s Safety Center. Because images are analyzed on the device to detect nudity, “nudity protection will work in end-to-end encrypted chats, where Meta won’t have access to these images — unless someone chooses to report them to us,” Meta said.

Shielding teens from scammers

According to Meta, its response is swift and severe when sextortion is discovered: removing the offending account, preventing the creation of new ones where possible and, in some cases, contacting law enforcement. Meta said its teams “work to investigate and disrupt networks of these criminals” and that it has reported several networks to law enforcement in just the past year.

But the company also said it is expanding its efforts to find accounts that may be running sextortion scams, based on “a range of signals” that might indicate sextortion behavior. “While these signals aren’t necessarily evidence that an account has broken our rules, we’re taking precautionary steps” to stop them from seeking out and interacting with teens. Messages from accounts flagged as potential problems will go directly into the teen’s hidden requests folder, so no message notification will be seen. If a teen is already chatting with one of those accounts, Meta will send safety notices as well, the company said.

“For teens, we’re going even further. We already restrict adults from starting DM chats with teens they’re not connected to, and in January we announced stricter messaging defaults for teens under 16 (under 18 in certain countries), meaning they can only be messaged by people they’re already connected to — no matter how old the sender is. Now, we won’t show the ‘Message’ button on a teen’s profile to potential sextortion accounts, even if they’re already connected. We’re also testing hiding teens from these accounts in people’s follower, following and like lists, and making it harder for them to find teen accounts in search results.”

Support for those interacting with scammers

Meta’s new tests also include pop-up messages for people who have interacted with an account removed for sextortion, directing them to resources such as “the Stop Sextortion Hub, support helplines, the option to reach out to a friend, StopNCII.org for those over 18 and Take It Down for those under 18.”

A history of sexploitation?

Critics say the company has known for a long time that its apps pose safety risks for children. Deseret News reported recently on Utah’s lawsuit against Meta, which claims the apps have “addictive features” that can take young people into “negative spirals.”

The court documents say nearly 7 in 10 teen girls and more than 6 in 10 teens overall use Instagram and other Meta apps that appeal to teens. Per the article, “In addition to alleging that Meta knew about its harm, the state also says that Meta hasn’t made changes it knew would reduce harm. On multiple occasions, Meta has explicitly considered and explicitly rejected design changes after finding that those changes would decrease the danger of harms, but also decrease engagement — and therefore Meta’s profits.”

The lawsuit also claims Meta knows that it serves content to children that is not appropriate for their age. The Wall Street Journal published an investigation in February that concluded Meta has long known Instagram allowed children to be exploited.

“Meta has struggled to detect and prevent child-safety hazards on its platforms. After the Journal last year revealed that the company’s algorithms connect and promote a vast network of accounts openly devoted to underage-sex content, the company in June established a task force to address child sexualization on its platforms,” the Journal reported.

“That work has been limited in its effectiveness, the Journal reported last year. While Meta has sought to restrict the recommendation of pedophilic content and connections to adults that its systems suspect of sexualizing children, tests showed its platforms continued to recommend search terms such as ‘child lingerie’ and large groups that discussed trading links of child-sex material,” the article said.