
A bill introduced at the Minnesota State Capitol would require companies to turn off consumer access to "nudification technology," which allows users to generate nude images of anyone using a simple photo and artificial intelligence.
Larger platforms like ChatGPT or Meta already prohibit nudification, but it's still widely available on smaller websites.
Megan Hurley spoke about her experience as a victim of this "deepfake" technology Monday.
"I have been humiliated," says Hurley. "I have never taken nude pictures or exchanged nudes with anybody, but because of this easily accessible website, there are convincing graphic images and pornographic videos on the internet of me forever."
The bill, authored by Sen. Erin Maye Quade, DFL-Apple Valley, would also allow the state attorney general to sue any company that fails to comply with the new rule.
"It is both a great honor and absolutely devastating to be the lead author of SF 1119 which will impose steep fines for companies that allow consumers to access nudification functions on apps, websites, and platforms," Maye Quade says in a social media post about her bill.
The bill would prohibit a person who owns or controls a website, application, software, or program from allowing users to access, download, or use it to "nudify" an image or video of a person.
A person who violates this section would be subject to a civil penalty of not less than $500,000 for each unlawful access, download, or use.