A girl posts photos to WeChat in Japan. (Getty Images)
(AP) - "Once you send that photo, you can't take it back," goes the warning to teenagers, often ignoring the reality that many teens send explicit images of themselves under duress, or without understanding the consequences.
A new online tool aims to give some control back to teens, or people who were once teens, and take down explicit images and videos of themselves from the internet.
Called Take It Down, the tool is operated by the National Center for Missing and Exploited Children, and funded in part by Meta Platforms, the owner of Facebook and Instagram.
The site lets anyone anonymously, and without uploading any actual images, create what is essentially a digital fingerprint of the image. This fingerprint (a unique set of numbers called a "hash") then goes into a database, and the tech companies that have agreed to participate in the project remove the images from their services.
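The matching scheme described above can be sketched in a few lines. The sketch below is illustrative only: it uses Python's built-in SHA-256 as a stand-in for the fingerprint, whereas real services use perceptual hashing systems (such as Microsoft's PhotoDNA or Meta's open-source PDQ); the function and database names are hypothetical.

```python
import hashlib

# A shared hash database like the one described: it stores only
# fingerprints, never the images themselves.
hash_database = set()

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's "digital fingerprint".

    Stand-in only: production systems use perceptual hashes,
    not a cryptographic hash like SHA-256.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    # Only the fingerprint leaves the user's device; the actual
    # image is never uploaded anywhere.
    hash_database.add(fingerprint(image_bytes))

def should_remove(image_bytes: bytes) -> bool:
    # A participating platform checks an upload against the
    # shared database before deciding to remove it.
    return fingerprint(image_bytes) in hash_database
```

The key privacy property is that the database holds only irreversible fingerprints, so participants can match images without ever possessing or exchanging the images themselves.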
Now, the caveats. The participating platforms are, as of Monday, Meta's Facebook and Instagram, Yubo, OnlyFans and Pornhub, owned by Mindgeek. If the image is on another site, or if it is sent on an encrypted platform such as WhatsApp, it will not be taken down.
In addition, if someone alters the original image, for instance by cropping it, adding an emoji or turning it into a meme, it becomes a new image and thus needs a new hash. Images that are visually similar, such as the same photo with and without an Instagram filter, will have similar hashes, differing in just one character.
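This is why the distinction between hash types matters. With an ordinary cryptographic hash, even a one-byte edit produces a completely different digest, so an altered image needs its own database entry; perceptual hashes are designed the opposite way, so that near-duplicates land only a small distance apart. A minimal sketch, assuming hex-string digests of equal length:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def hamming_distance(a: str, b: str) -> int:
    # Count differing characters between two equal-length hash
    # strings; perceptual-hash systems compare digests this way.
    return sum(x != y for x, y in zip(a, b))

original = b"original image bytes"
altered = b"original image byte!"  # simulate a tiny edit (e.g., a crop)

h_orig = sha256_hex(original)
h_alt = sha256_hex(altered)

# A cryptographic hash scatters the change: roughly half of the
# 64 hex characters differ, so the edited image no longer matches
# the database entry for the original.
```

Here `hamming_distance(h_orig, h_alt)` comes out large because SHA-256 has no notion of visual similarity; a perceptual hash of the same pair would yield a small distance, which is what lets matching survive minor edits like filters.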
"Take It Down is made specifically for people who have an image that they have reason to believe is already out on the Web somewhere, or that it could be," said Gavin Portnoy, a spokesman for the NCMEC. "You're a teen and you're dating someone and you shared the image. Or somebody extorted you and they said, 'if you don't give me an image, or another image of you, I'm going to do X, Y, Z.'"
Portnoy said teens may feel more comfortable going to a site than involving law enforcement, which wouldn't be anonymous, for one.
"To a teen who doesn't want that level of involvement, they just want to know that it's taken down, this is a big deal for them," he said. NCMEC is seeing an increase in reports of online exploitation of children. The nonprofit's CyberTipline received 29.3 million reports in 2021, up 35% from 2020.
Meta, back when it was still Facebook, tried to establish a similar tool, although for adults, back in 2017. It didn't go over well because the site expected people to, basically, send their (encrypted) nudes to Facebook, not the most trusted entity even in 2017. The company tested the service in Australia for a brief period but didn't expand it to other countries. In 2021, it helped launch a tool for adults called StopNCII, for nonconsensual intimate images, aka "revenge porn." That site is run by a U.K. nonprofit, the UK Revenge Porn Helpline, but anyone around the globe can use it.
But since that time, online sexual extortion and exploitation have only gotten worse, for children and teens as well as for adults. Many tech companies already use this hash system to share, take down and report to law enforcement images of child sexual abuse. Portnoy said the goal is to have more companies sign up.
"We never had anyone say no," he said.
Twitter and TikTok so far have not committed to the project. Neither company immediately responded to a message for comment Sunday.
Antigone Davis, Meta's global head of safety, said Take It Down is one of many tools the company uses to address child abuse and exploitation on its platforms.
"In addition to supporting the development of this tool and having reporting and blocking systems on our platform, we also do a number of different things to try to prevent these kinds of situations from happening in the first place. So, for example, we don't allow unconnected adults to message minors," she said.
The site works with real as well as artificial intelligence-generated images and "deepfakes," Davis said. Deepfakes are made to look like real, actual people saying or doing things they didn't actually do.