
A Minnesota woman's horrific experience with deepfake pornography is behind a new bill passed by Congress.
The bipartisan "Take It Down Act" criminalizes the publication of non-consensual intimate imagery, including images generated by artificial intelligence.
The measure was introduced by Sen. Ted Cruz, a Republican from Texas, and Sen. Amy Klobuchar, a Democrat from Minnesota, and later gained the support of First Lady Melania Trump.
"The distribution of these images, or even the threat of distribution, has ruined reputation, shattered lives," Klobuchar said Thursday.
Molly Kelley, a mother from Otsego, Minnesota, was one of 85 women who were victimized. She said the anxiety of knowing the images will live online forever is crushing.
"I worry if a client or colleague has come across these fabricated images," Kelley explains. "Where are they? Have they already seen them? how will this affect my career, my reputation, my children's future?"
Kelley added that the offender was a close family friend.
"I was not hacked and my social media was never public," Kelly said. "He's someone I trusted, a close friend of over 20 years. The offender's now ex-wife saw the images he created and notified me immediately."
The bill also requires social media platforms to remove such images within 48 hours of a victim's request, something Klobuchar called a big deal.
"If they don't, they'll be held accountable," the senator adds.
Critics of the bill, which addresses both real and artificial intelligence-generated imagery, say the language is too broad and could lead to censorship and First Amendment issues.
What is the Take It Down Act?
The bill makes it illegal to “knowingly publish” or threaten to publish intimate images without a person's consent, including AI-created "deepfakes." It also requires websites and social media companies to remove such material within 48 hours of notice from a victim, and to take steps to delete duplicate content. Many states have already banned the dissemination of sexually explicit deepfakes or revenge porn, but the Take It Down Act is a rare example of federal lawmakers imposing requirements on internet companies.
Who supports it?
The Take It Down Act has garnered strong bipartisan support and has been championed by Melania Trump, who lobbied on Capitol Hill in March, saying it was “heartbreaking” to see what teenagers, especially girls, go through after they are victimized by people who spread such content. President Trump is expected to sign it into law.
Cruz said the measure was inspired by Elliston Berry and her mother, who visited his office after Snapchat refused for nearly a year to remove an AI-generated “deepfake” of the then-14-year-old.
Meta, which owns and operates Facebook and Instagram, supports the legislation.
“Having an intimate image – real or AI-generated – shared without consent can be devastating and Meta developed and backs many efforts to help prevent it,” Meta spokesman Andy Stone said last month.
The Information Technology and Innovation Foundation, a tech industry-supported think tank, said in a statement Monday that the bill's passage “is an important step forward that will help people pursue justice when they are victims of non-consensual intimate imagery, including deepfake images generated using AI.”
“We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse,” Klobuchar said in a statement after the bill's passage late Monday. “These images can ruin lives and reputations, but now that our bipartisan legislation is becoming law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable."
What are the censorship concerns?
Free speech advocates and digital rights groups say the bill is too broad and could lead to the censorship of legitimate images, including legal pornography and LGBTQ content, as well as of government critics.
“While the bill is meant to address a serious problem, good intentions alone are not enough to make good policy,” said the nonprofit Electronic Frontier Foundation, a digital rights advocacy group. “Lawmakers should be strengthening and enforcing existing legal protections for victims, rather than inventing new takedown regimes that are ripe for abuse.”
The takedown provision in the bill “applies to a much broader category of content — potentially any images involving intimate or sexual content” than the narrower definitions of non-consensual intimate imagery found elsewhere in the text, EFF said.
“The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests. Services will rely on automated filters, which are infamously blunt tools,” EFF said. “They frequently flag legal content, from fair-use commentary to news reporting. The law’s tight time frame requires that apps and websites remove speech within 48 hours, rarely enough time to verify whether the speech is actually illegal.”
As a result, the group said online companies, especially smaller ones that lack the resources to wade through a lot of content, “will likely choose to avoid the onerous legal risk by simply depublishing the speech rather than even attempting to verify it.”
The measure, EFF said, also pressures platforms to “actively monitor speech, including speech that is presently encrypted” to address liability threats.
The Cyber Civil Rights Initiative, a nonprofit that helps victims of online crimes and abuse, said it has “serious reservations” about the bill. It called its takedown provision “unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse.”
For instance, the group said, platforms could be obligated to remove a journalist’s photographs of a topless protest on a public street, photos of a subway flasher distributed by law enforcement to locate the perpetrator, commercially produced sexually explicit content or sexually explicit material that is consensual but falsely reported as being nonconsensual.
The Associated Press contributed to this story.