Pornhub joins Meta-backed tool to fight ‘sextortion’

Opinion

A predatory stranger convinces a teenager to send explicit photos, then threatens to publish them unless he is paid. Teenagers discover intimate images of themselves posted on pornographic websites.

It’s the stuff of any parent’s nightmare, and tech companies have historically been ill-equipped to police it. But internet platforms are now backing new tools that let users get those images taken down.

Facebook’s parent company Meta is funding a new platform designed to address these concerns by letting young people proactively scan for and remove their intimate images from participating websites. Managed by the National Center for Missing and Exploited Children, the platform, called Take It Down, assigns a “hash value,” or digital fingerprint, to images and videos, which technology companies can use to identify copies of the media across the web and remove them. Participants include technology companies such as Instagram and Facebook, as well as pornography websites including OnlyFans and Pornhub.
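The article doesn’t spell out the hashing scheme, but the basic idea can be sketched in a few lines. Below is a minimal illustration in Python, assuming an exact-match cryptographic hash for simplicity (production systems typically use perceptual hashes that survive resizing and re-encoding); the names `fingerprint` and `shared_hash_list` are hypothetical.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Compute a SHA-256 digest of a media file's bytes,
    serving as the file's 'digital fingerprint'."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical list of fingerprints distributed to participating
# platforms; only the hashes, never the images, are shared.
shared_hash_list: set[str] = set()

def matches_reported_media(upload: Path) -> bool:
    """Return True if an uploaded file is a byte-for-byte copy
    of previously reported media."""
    return fingerprint(upload) in shared_hash_list
```

The appeal of this design, and a contrast with the privacy criticism described later in the piece, is that the image itself never has to leave the user’s device: only the fingerprint is shared with platforms, which can then block re-uploads that match it.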

“Having an intimate image shared with others can be scary and overwhelming, especially for young people,” Antigone Davis, Meta’s global head of safety, said in a press release announcing the effort. “It can feel even worse when someone tries to use those images as a threat for additional images, sexual contact or money, a crime known as sextortion.”

The new tool comes as internet platforms struggle to find and stem the spread of sexually explicit images shared on their sites without the subject’s consent. Experts say the problem has worsened as the use of digital devices swelled during the pandemic.

According to a 2021 report from the Revenge Porn Helpline, reports of intimate image abuse have climbed over the past five years, including a 40 percent increase in cases reported between 2020 and 2021.

“A lot of times, a child doesn’t know that there’s an adult on the other end of that conversation,” Gavin Portnoy, a spokesman for the National Center for Missing and Exploited Children, said in an interview. “They’re asking for more pictures or more videos, and they’re typically threatening to share them with that child’s community, family [and] friends.”

Tech companies that find sexually explicit images of minors on their platforms are required by law to report them, but no such requirement covers explicit images of adults. Dozens of states have passed statutes designed to address nonconsensual pornography, but Section 230 of the Communications Decency Act, which shields tech companies from legal liability for user-generated content, makes those laws difficult to enforce against platforms, said Megan Iorio, senior counsel at the Electronic Privacy Information Center.

Those protections “allow companies not only to ignore requests to take down harmful content, including defamatory information and revenge porn, but also to ignore court orders to remove that information,” Iorio said.

Although Take It Down is open only to people under 18 or their guardians, it follows a similar 2021 effort by Meta to help adults find and remove nonconsensual explicit content of themselves. Meta funded and built the technology for StopNCII, a platform for reporting nonconsensual intimate image abuse run by the Revenge Porn Helpline, a UK-based service operated by the technology policy nonprofit SWGfL. Users submit cases to the helpline, and participating sites, including Facebook, Instagram, TikTok and Bumble, then remove the content.

In 2017, Meta piloted a program that asked users to submit their own explicit images so that the company could find them on its networks and stop them from being shared again. But the move drew criticism from privacy advocates, who said the program could put users’ privacy at risk.
