What is the Take It Down Act?
A bill designed to protect people from being victims of the non-consensual sharing of sexually explicit images and deepfake pornography was the focus of a roundtable involving First Lady Melania Trump.
What we know:
The Take It Down Act is legislation introduced by Sen. Ted Cruz of Texas that makes it unlawful to knowingly publish "non-consensual intimate imagery (NCII)," including "digital forgeries" created with AI software (deepfake pornography), and requires social media and similar websites "to have in place procedures to remove such content upon notification from a victim," according to a 2024 release from Cruz’s office.

RELATED: FBI warns against AI-generated deepfake content created for sextortion schemes
Other legislators who support the bill include Democratic Sen. Amy Klobuchar of Minnesota and Rep. Madeleine Dean of Pennsylvania.
What we don't know:
It’s unknown when the legislation will be passed in the House, but it passed the Senate with bipartisan support during the last session of Congress and again in February.
Take It Down Act roundtable
Local perspective:
First Lady Melania Trump hosted a roundtable discussion Monday on Capitol Hill as she lobbied for the Take It Down Act, saying it was "heartbreaking" to see what teenagers, and especially girls, go through after they are victimized by people who spread such content, the Associated Press reported.
She also called on the Republican-controlled House to pass the bill and send it to President Donald Trump to sign into law.
The legislation's chief sponsors are Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., and Reps. Maria Salazar, R-Fla., and Madeleine Dean, D-Pa.
Cruz, who hosted the discussion, told the first lady that her leadership "is incredibly important and I’m confident it’s going to play a critical role in accelerating the passage of this bill and getting it passed into law."
CNN noted that the first lady introduced her "Be Best" platform, which included a focus on online safety during Trump’s first term in office.
AI-generated explicit deepfake images on the rise
Big picture view:
Most states have laws protecting individuals from revenge porn, but only 20 states have explicit laws covering deepfake non-consensual intimate imagery (NCII), per a release from Congresswoman Maria Elvira Salazar of Florida.
Researchers tell the Associated Press that the number of explicit deepfakes has increased in the last several years, as the technology used to create these explicit images has become more accessible and easier to use.
Dig deeper:
Over the past few years, several celebrities have reportedly been victims of deepfake pornographic images that circulated online and on social media.
Last year, fake sexually explicit and abusive images of singer Taylor Swift were disseminated on the social media platform X, with some shared on Meta-owned Facebook and other social media platforms, according to the Associated Press.
RELATED: Taylor Swift AI-generated explicit photos spark outrage
Meanwhile, fake and sexualized images of actors Miranda Cosgrove, Jennette McCurdy, Ariana Grande, Scarlett Johansson, and former tennis star Maria Sharapova were shared across multiple Facebook accounts, amassing large numbers of likes and shares on the platform, CBS reported.
The Source: Information for this story was provided by a release from Sen. Ted Cruz’s office, which has background on the Take It Down Act, previous LIVENOW from FOX reporting, the Associated Press, CBS News, and CNN. This story was reported from Washington, D.C.