
Fake sexually explicit AI-generated images of superstar Taylor Swift have been flooding social media, particularly X, and have been shared so widely that they prompted a response from the White House.
At least a couple dozen unique AI-generated images have been viewed millions of times, according to the Associated Press. "The most widely shared were football-related, showing a painted or bloodied Swift that objectified her and in some cases inflicted violent harm on her deepfake persona," the AP noted.
White House Press Secretary Karine Jean-Pierre called the fake images "alarming" and urged social media companies to enforce rules concerning "nonconsensual, intimate imagery of real people."
"We know that lax enforcement disproportionately impacts women and they also impact girls, sadly, who are the overwhelming targets," Jean-Pierre said during a briefing.
X has temporarily blocked searches for Taylor Swift to "prioritize safety" and said it was "actively removing all identified images." It also warned that "posting non-consensual nudity images is strictly prohibited."
Swift has not commented publicly on the images.
Like other forms of image-based sexual abuse, deepfake porn causes irreparable harm, threatening victims' mental health, physical safety, freedom of expression and life opportunities, according to Dr. Mary Anne Franks, president of the Cyber Civil Rights Initiative at George Washington Law School.
As the graphic AI content continues to circulate online, many have renewed calls for laws governing the misuse of AI technology and for legislation that would make sharing and posting AI-generated sexual content, or deepfake pornography, a crime.
An analysis by USA Today identified 10 states that have passed laws banning exploitative deepfake pornography and where victims can take legal action: California, Florida, Georgia, Hawaii, Illinois, Minnesota, New York, South Dakota, Texas and Virginia. There is no federal law against it.
That doesn't mean a legal battle would be easy. Carrie Goldberg, a victims' rights attorney, told USA Today the faked images could result in criminal charges, but victims are more likely to find justice by suing the companies involved in creating, hosting and sharing the deepfake porn.
Swift could always file a lawsuit focusing on the misappropriation of her likeness, even if the perpetrators are from another country, Goldberg added.
"When you're Taylor Swift, there's always going to be recourse," she said. "There are a lot more options for people who have resources like she has, where she can get law enforcement from other countries to care. That's not available to most people."
Rep. Joe Morelle (D-NY) is one lawmaker behind the push for legislation to stop the spread of deepfake pornography generated by AI.
"Try to imagine the horror of receiving intimate images looking exactly like you — or your daughter, or your wife, or your sister — and you can't prove it's not," Morelle said in a statement. "Deepfake pornography is sexual exploitation, it's abusive, and I'm astounded it is not already a federal crime."
While the images are fake, their impacts are very real, Morelle said, adding that deepfakes almost exclusively target women.
"Women are facing a very real danger in our increasingly digital world," he said. "Let's not wait for the next mass incident to make the news. This is happening every day to women everywhere, and it's time to give them back their power."