
PHILADELPHIA (KYW Newsradio) — A recent change to Pennsylvania law that outlaws AI-generated deepfake sexual images also replaces the term “child pornography” with one that advocates say better reflects the seriousness of the crime.
Abbie Newman, CEO of Mission Kids Child Advocacy Center in Montgomery County, served on the state panel that worked to remove the term “child pornography” from various statutes and replace it with “child sexual abuse material.”
“When you hear the term just ‘pornography,’ it implies that there is an element of consent, or there may be an element of consent. A child cannot consent to being sexually abused,” explained Newman.
“That's why child sexual abuse materials, CSAM, or child sexual assault materials [are] much better [descriptions] for what is going on.”
Prosecutor Lauren Marvel handles many of the more serious child abuse cases for the Montgomery County District Attorney’s Office. Marvel says the videos and images in many of these cases drive detectives into therapy or even out of the job entirely. She says that when people call it child sexual abuse material, they are calling it what it really is.
“It is the documentation and exploitation of a child being sexually abused, and then that sexual abuse material being used for someone else's sexual gratification,” she said.
Marvel says it’s important not only for society to understand the seriousness of the crime, but also for survivors to know what was done to them is not being trivialized. “Imagine them navigating the world, hearing what happened to them, being referred to as kiddie porn or child porn,” she said.
“We are telling those victims specifically, this is not something that you are. This is not something that you consented to. This is something that happened to you, something that someone did to you, that you are a victim and not a participant.”