New effort hopes to pull back explicit images shared by and of children

Around the world, explicit images of children and teens are circulating on the web.

Sometimes the images are willingly shared with a girlfriend or boyfriend; other times children are tricked into sharing nude photos by a perpetrator, says National Center for Missing and Exploited Children (NCMEC) spokesman Gavin Portnoy.


Portnoy says that can be devastating for a young person, sometimes leading them to take tragic measures. Now there's a new tool to help those victims get photos, videos -- even AI-generated images -- removed.

NCMEC has partnered with Meta and other platforms to create a tool called "Take It Down". To create a case, anyone who is under 18, or who was under 18 when the illicit material was taken, can go to the site and answer a series of questions. "They point it to that image or video, it gets hashed, which is a nerdy way of saying we take a digital fingerprint of it," explains Portnoy. "That digital fingerprint then gets shared with the National Center for Missing and Exploited Children." Those fingerprints are then shared with participating sites, which have agreed to remove any matching images already posted and to block future uploads. He stresses the process is anonymous -- no names or other personal information is collected.
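To give a rough sense of the "digital fingerprint" idea Portnoy describes: a hash function turns a file into a short, fixed-length code that can be shared and compared without ever sharing the image itself. The sketch below is only an illustration using a standard cryptographic hash (SHA-256); it is not the actual scheme Take It Down uses, which is not detailed here.

```python
import hashlib

def fingerprint(path: str) -> str:
    """Return a SHA-256 hex digest of a file's bytes.

    Illustrative only: the digest acts like a fingerprint --
    identical files always produce the same digest, so a platform
    can match an upload against a list of known digests without
    ever seeing the original image.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

A platform holding a list of reported fingerprints could then compare `fingerprint(uploaded_file)` against that list and block any match, without the image itself ever changing hands.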

You can hear the full interview with NCMEC about "Take It Down" in the latest "What the Media?!!?" podcast with KMOX's Megan Lynch and media literacy expert Julie Smith.

The platforms participating include Facebook, Instagram, Pornhub, Mindgeek, OnlyFans, and Yubo (you-boh). NCMEC hopes to add more electronic service providers in the future.

©2023 Audacy (KMOX). All rights reserved.


Featured Image Photo Credit: ChiccoDodiFC/iStock/Getty Images Plus