
CONGRESSMAN JOE MORELLE ANNOUNCES RENEWED EFFORT TO COMBAT HARMFUL DEEPFAKE PORNOGRAPHY

March 6, 2025

Morelle re-introduces bipartisan legislation he authored to combat the devastation caused by deepfake intimate images, empower victims and their families

(Washington, D.C.)—Today, Congressman Joe Morelle introduced bipartisan legislation he authored to combat the pervasiveness of non-consensual deepfake pornography. Morelle’s Preventing Deepfakes of Intimate Images Act would create both criminal and civil penalties for those who create and post non-consensual deepfakes on social media.

“AI-generated deepfakes have created a disturbing new avenue for harassment, abuse, and misinformation—causing real-world harm with devastating consequences. Women are disproportionately targeted by this technology, yet there are too few protections in place to defend them from exploitation,” said Congressman Joe Morelle. “I’m proud to be reintroducing the bipartisan Preventing Deepfakes of Intimate Images Act to help us close those gaps and establish necessary safeguards to combat the growing misuse of deepfake technology. I’m grateful to my colleague Rep. Kean for joining me in supporting this critical legislation, and I look forward to working with him to pass this common-sense, long-overdue legislation into law.”

“Artificial intelligence is advancing faster than the necessary guardrails, making it easier for deepfake technology to be used explicitly and disruptively,” said Congressman Kean. “One victim of deepfake pornography is one too many. We must ensure there are serious consequences for the perpetrators. Addressing AI is a bipartisan issue, and I am proud to co-lead this bill with Congressman Morelle and look forward to working together to get it across the finish line—protecting young girls, women, and all victims of deepfakes.” 

With advancements in technology and artificial intelligence has come the emergence of “deepfakes”—digitally altered images that use an individual’s likeness mapped onto a photo or video of someone else. A 2023 study found that 98 percent of deepfake videos online were pornographic in content and 99 percent of them targeted women. While the videos are fake, their impacts are very real—and until now, little has been done to provide women with protection or recourse from this disturbing phenomenon.

In March 2022, President Biden signed the Violence Against Women Act (VAWA) Reauthorization Act, a bipartisan law taking significant steps to address domestic violence. The VAWA Reauthorization included provisions to empower individuals victimized by the non-consensual disclosure of their intimate visual images to seek civil penalties in federal court. However, these provisions did not extend the same protections to those impacted by the disclosure of deepfakes, leaving individuals vulnerable to exploitation through these images.

Over the past three years, there have been numerous instances of non-consensual deepfakes making national headlines, including celebrities like Taylor Swift and Jenna Ortega as well as high school students like Francesca Mani, whom Congressman Morelle hosted in Washington in 2024.

To address this pervasive issue, Congressman Morelle introduced the Preventing Deepfakes of Intimate Images Act, which would:

  • Prohibit the non-consensual disclosure of digitally altered intimate images and make the sharing of these images a criminal offense;
  • Create a private right of action for victims to seek relief;
  • Ensure that an individual’s consent to create the image does not establish consent for the sharing or disclosure of the image;
  • Provide additional protections to preserve a plaintiff’s anonymity in civil actions.

Joining Congressman Joe Morelle in introducing this legislation is Congressman Tom Kean (R-NJ-07).

The Preventing Deepfakes of Intimate Images Act has received endorsements from national advocacy organizations, including: the Cyber Civil Rights Initiative, Rape, Abuse & Incest National Network (RAINN), SAG-AFTRA, the National Organization for Women (NOW), Public Citizen, the Joyful Heart Foundation, and Alecto AI.

Dr. Mary Anne Franks, President and Legislative & Tech Policy Director of the Cyber Civil Rights Initiative, said: "As the nation’s leading organization dedicated to combating image-based sexual abuse, the Cyber Civil Rights Initiative welcomes this much-needed legislation to address the growing epidemic of sexually explicit digital forgeries. Regardless of the motives of the perpetrators, the nonconsensual distribution of digitally manipulated intimate images inflicts severe and often irreparable psychological, financial, and reputational injury on victims. The abuse disproportionately harms women and other vulnerable groups, chilling their freedom of expression and undermining their equal participation in society.”

Stefan Turkheimer, Vice President of Public Policy for RAINN, said: “The Preventing Deepfakes of Intimate Images Act is an important tool for justice, giving survivors of nonconsensual intimate image abuse the legal protection they deserve and the power to fight back. By holding perpetrators accountable and recognizing the profound harm caused by AI-generated exploitation, it sends a clear message: no one has the right to violate another person by distributing intimate images without consent.”

Duncan Crabtree-Ireland, SAG-AFTRA National Executive Director and Chief Negotiator, said: "I applaud Congressman Morelle for introducing the Preventing Deepfakes of Intimate Images Act against sexualized non-consensual digital replicas and deepfakes. Sexual abuse, whether occurring physically or digitally, should never be excused or permitted as ‘personal expression,’ and it should never be tolerated. Deepfakes are violations, objectification and exploitation, and must be made illegal and punishable by law. This bill is a powerful step to ensure that this technology is not used to cause harm, and it will help curb an incredibly destructive practice that strikes at the heart of personal privacy, safety and autonomy. We must take action to ensure that we can discern between what is real and what is not real, or we are looking at a far more dangerous future of sexual exploitation and objectification.”

Christian F. Nunes, President of the National Organization for Women (NOW), said: “The National Organization for Women is proud to endorse the Preventing Deepfakes of Intimate Images Act. This vital bipartisan legislation, introduced by Rep. Joseph D. Morelle (D-NY) and Rep. Tom Kean (R-NJ) makes sharing these images a criminal offense and creates a right of private action that allows victims to take their abusers to court and make them pay for the harm they’ve caused. We’ve seen plenty of outrage over the rise of deepfake images that are used to harass and exploit women—and earn profits for abusers. But outrage is not enough. We need laws like the Preventing Deepfakes of Intimate Images Act, to thwart the creation of deepfakes and hold those who make and share them accountable.”

Craig Holman, Ph.D., from Public Citizen, said: "Deepfake technology has evolved to become so refined, and so inexpensive, as to allow anyone the ability to produce images and voices indistinguishable from reality. It has become a tool for inflicting violence and intimidation, largely against women, by depicting the targeted persons in completely fabricated intimate situations. Rep. Morelle's desperately needed Preventing Deepfakes of Intimate Images Act would provide unsuspecting victims with the tools needed to fight back."

Ilse Knecht, Director of Policy and Advocacy for the Joyful Heart Foundation, said: “The non-consensual sharing of synthetic intimate images, often called deepfake pornography, is an exponentially growing form of abuse that humiliates, degrades, and threatens women and girls, causing real and lasting damage to their mental health and wellbeing. Current federal law does not prohibit this abuse, allowing these perpetrators to inflict this life-shattering violence on survivors with impunity. It's time for the Federal Government to pass legislation that recognizes this harm and allows survivors to seek justice. We urge Congress to quickly enact Representative Morelle’s Preventing Deepfakes of Intimate Images Act to support survivors of this alarming and widespread abuse and bring accountability to their abusers.”

Andrea Powell, Alecto AI's Chief of Impact and Director of the Alecto Foundation, said: “There is nothing fake about the pain and trauma that synthetic or AI-generated intimate deepfake abuse images cause. Without federal legislation that allows for a pathway toward justice and protection from this form of sexual violence, survivors across the United States are left exposed and violated while their abusers and those who facilitate their abuse simply walk away free. We firmly support the introduction of the Preventing Deepfakes of Intimate Images Act to ensure survivors don't have to stand alone in the face of their own injustice."

###