Online platform will help global fight against child sexual exploitation material


A new online tool will help the global fight against the sharing of children's intimate images online.

The National Center for Missing & Exploited Children (NCMEC) is the largest and most influential child protection organisation in the United States.

It has now launched Take It Down, a new platform designed to proactively prevent young people’s intimate images from spreading online.

Young people, or adults whose intimate images were taken when they were under 18, can submit a case, and the platform will proactively search for those images on Facebook, Instagram, OnlyFans, Yubo and Pornhub.

READ MORE:
* Children as young as 10 are victims of ‘sextortion’ in New Zealand
* Men most likely to be victims of ‘sextortion’, cases on the rise
* Stolen childhoods: The NZ team saving kids from sex abuse at home and overseas
* Stolen childhoods: The men paying to watch child sex abuse and those trying to stop it
* Stolen Childhoods: Sex offender says admitting crimes was just the start of his rehab
* Stolen childhoods: Children sexually abused on camera still living with effects years on

The tool will then assign a unique hash value – a numerical code – to the image or video, so companies like Meta can find any copies, take them down and prevent them from being posted on their apps in the future.
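In practice, platforms use perceptual hashes (such as PDQ or PhotoDNA-style fingerprints) that survive resizing and re-compression; the sketch below uses a plain cryptographic hash, which only catches byte-identical copies, purely to illustrate the matching idea. The function and variable names are hypothetical, not Take It Down's actual API.

```python
import hashlib

def hash_image(data: bytes) -> str:
    """Return a hex fingerprint of the file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist: hashes of images a victim has reported.
# The platform stores only these codes, never the images themselves.
reported_hashes = {hash_image(b"reported-image-bytes")}

def should_block(upload: bytes) -> bool:
    """Flag an upload whose hash matches a reported image."""
    return hash_image(upload) in reported_hashes
```

A key design point the article alludes to: because only the hash is shared with participating companies, the victim never has to upload the image itself to the reporting service.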

Meta New Zealand and Pacific Islands head of public policy Nick McDonnell said this is a dedicated safety tool for young people including in Aotearoa.

Images of children can now be reported through Take It Down.

KATHRYN GEORGE/Stuff


“This world-first platform will offer support and dedicated resources to young Kiwis to prevent the unwanted spread of their intimate images online, which we know can be extremely distressing for victims.

“We’ll continue to work with safety organisations, including NetSafe, New Zealand law enforcement and victims to help to combat this issue for young people online,” McDonnell said.

Stuff previously interviewed a victim whose images had been shared online.

Years after being sexually abused as a child, images of Solomon* were being discovered on computers across the United States and even military bases in Germany and Japan.

Tania* told Stuff she lives in fear the images her stepfather took of her will be released when he gets out of prison.

“I feel like that because everything is so permanent in those photos and videos, I don’t feel like it is ever going to be over.”

The online tool will enable tech companies to find the images and remove them.

KATHRYN GEORGE/Stuff


Stuff asked Meta a number of questions about the end-to-end encryption Facebook now has on its Messenger app.

A recent report by Laura Draper from American University Washington College of Law, titled Protecting Children in the Age of End-to-End Encryption, found the encryption severely complicates investigating child sexual abuse material and makes it more difficult for tech companies to detect these crimes.

End-to-end encryption enables offenders to trade the images with less fear of detection, Draper said.

Offenders no longer need to access the dark web, but can simply download apps with end-to-end encryption such as Facebook Messenger, WhatsApp, Proton and Signal.

Nick McDonnell said Meta has zero tolerance for child sexual exploitation.

“We lead the industry in detecting, reporting, and taking action against this kind of heinous abuse.

“Currently, we employ multiple technologies to detect and remove intimate images. We use machine learning classifiers and photo- and video-matching technology, which allow us to proactively detect images or videos that are shared without permission that violate our Community Standards,” McDonnell said.

McDonnell said Meta had invested in new tools focused on preventing harm by banning suspicious profiles, restricting adults from messaging children they are not connected with, and defaulting under-18s to private or "friends only" accounts.

“We believe that Take It Down will make a difference as a tool for preventing intimate images being shared across platforms,” McDonnell said.

McDonnell said Meta bans more than 300,000 accounts each month suspected of sharing child exploitation imagery, using all available unencrypted information to detect and prevent this kind of abuse.
