“While you can’t go back in time and unsend, we can help you move forward.”

That’s the message of a service just launched by the National Center for Missing & Exploited Children that aims to help teens, and those who were once teens, remove explicit images of themselves from the internet.

Called Take It Down, the new service is for “people who have images or videos of themselves nude, partially nude, or in sexually explicit situations taken when they were under the age of 18 that they believe have been or will be shared online.”

Per its website, Take It Down works by assigning a unique digital fingerprint, called a hash value, to the explicit images. Online platforms can use the hash values to detect those images or videos on their services and remove that content. According to Take It Down, it all happens without the image or video ever leaving the user’s device or anyone viewing it. Only the hash value will be provided to the National Center for Missing & Exploited Children.
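The article does not say which hashing algorithm Take It Down uses. As an illustration of the general idea that only a short fingerprint leaves the device, here is a minimal sketch using a standard cryptographic hash (SHA-256); the function name is hypothetical, and a real service of this kind would use an image-aware perceptual hash rather than a cryptographic one:

```python
import hashlib

def fingerprint(path: str) -> str:
    """Compute a hex digest of a file's bytes locally.

    Only this short string would ever be uploaded; the image
    or video itself never leaves the user's device.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

Note that with a cryptographic hash like SHA-256, changing even a single byte of the file produces a completely different digest, which is one reason services that need to match edited or re-encoded copies rely on perceptual hashing instead.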

Meta, the parent company of Facebook and Instagram, is helping fund Take It Down, but according to The Associated Press, only a handful of websites are currently involved in the effort.

As of Monday, the AP reports, participating platforms include Meta’s Facebook and Instagram, Yubo, OnlyFans and Pornhub, owned by MindGeek. If the image is on another site, or if it is sent on an encrypted platform such as WhatsApp, it will not be taken down, according to the service.

Identifying an image for removal also comes with some challenges. For instance, if an image is cropped, has an emoji added, or is turned into a meme, it becomes a new image and thus needs a new hash. Images that are visually similar, such as the same photo with and without an Instagram filter, will have similar hashes, differing in just one character, per the AP.
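The matching behavior the AP describes, where hashes of near-duplicate images differ in just one character, suggests platforms compare hashes by counting differing positions rather than requiring an exact match. A hypothetical sketch of that comparison (the function name and threshold are assumptions, not part of the service's documented design):

```python
def hash_distance(a: str, b: str) -> int:
    """Count the positions at which two equal-length hash strings differ."""
    if len(a) != len(b):
        raise ValueError("hashes must be the same length to compare")
    return sum(x != y for x, y in zip(a, b))

# Under the AP's description, a platform could treat two hashes
# as a probable match when they differ in at most one character:
def probable_match(a: str, b: str, threshold: int = 1) -> bool:
    return hash_distance(a, b) <= threshold
```

This is the same idea as Hamming distance over the hash string: identical images match exactly, lightly filtered copies match within the threshold, and heavily edited versions fall outside it and need a new hash.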

“Take It Down is made specifically for people who have an image that they have reason to believe is already out on the web somewhere, or that it could be,” Gavin Portnoy, a spokesman for the National Center for Missing & Exploited Children, told the AP. “You’re a teen and you’re dating someone and you share the image. Or somebody extorted you and they said, ‘If you don’t give me an image, or another image of you, I’m going to do X, Y, Z.’”

Portnoy said teens may feel more comfortable going to a website than involving law enforcement, which, for one thing, would not be anonymous.

“To a teen who doesn’t want that level of involvement, they just want to know that it’s taken down, this is a big deal for them,” Portnoy said. The center reports it is seeing an increase in reports of online exploitation of children. The nonprofit’s CyberTipline received 29.3 million reports in 2021, up 35% from 2020.

“This issue has been incredibly important to Meta for a very, very long time because the damage done is quite severe in the context of teens or adults,” Antigone Davis, Meta’s global safety director, told CNN. “It can do damage to their reputation and familial relationships, and puts them in a very vulnerable position. It’s important that we find tools like this to help them regain control of what can be a very difficult and devastating situation.”

While teens can engage Take It Down themselves for help in tracking down and removing images, parents or trusted adults can also use the platform on behalf of a young person, per CNN. The effort is fully funded by Meta and builds on a similar platform the company launched in 2021 alongside more than 70 NGOs, called StopNCII, to prevent revenge porn among adults.