Minnesota lawmakers are taking steps to combat the misuse of artificial intelligence (AI) technology that creates fake nude images or pornography of real people without their consent. A bill discussed Wednesday in the Senate Judiciary and Public Safety Committee aims to close legal loopholes and hold AI companies accountable for enabling such harmful practices.

The proposed legislation, authored by Sen. Erin Maye Quade (DFL-Apple Valley), would require AI companies to disable features that allow users to generate explicit images or videos of individuals. The bill builds on Minnesota’s existing law banning nonconsensual sexual deepfakes, which currently criminalizes the dissemination of such content but not its creation.

“The harm begins at creation,” Maye Quade emphasized during the hearing. “People are downloading these apps and using them to ‘nudify’ their teachers, classmates, siblings, and friends. This is a violation of privacy and dignity.”

Under the bill, AI platforms that fail to disable these features for users in Minnesota could face a $500,000 civil fine. The proposal would also make it easier for victims to sue for damages and seek restitution for the harm they suffer.

The issue gained urgency after Megan Hurley, a massage therapist, testified about being targeted by AI-generated fake images that depicted her in explicit situations. Hurley described the emotional and financial toll of the experience, including lost income and the ongoing trauma of knowing the images could circulate online indefinitely.

“I cannot overstate the damage this technology has done,” Hurley said. “I’m appalled that companies are profiting from this kind of exploitation.”

While lawmakers agree on the need to address the issue, some, like Sen. Warren Limmer (R-Maple Grove), cautioned that the rapid evolution of AI technology could make it challenging to regulate. “In a few short years, creating these images may not be a high technical challenge,” Limmer noted.

The committee postponed a vote to allow for further discussion on structuring fines and allocating funds to support survivors of deepfake offenses. A corresponding House bill has yet to be introduced.

As AI technology continues to advance, Minnesota’s efforts reflect a growing national conversation about the ethical use of AI and the need to protect individuals from digital exploitation.
