In a major legislative step aimed at reining in the abuse of artificial intelligence, U.S. President Donald Trump has signed into law a sweeping bill that makes the creation and distribution of nonconsensual AI-generated explicit imagery a federal crime. The legislation, the TAKE IT DOWN Act — short for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks — marks a significant federal response to a rising tide of digital exploitation.
The bill, signed on May 19, 2025, comes with strong backing from First Lady Melania Trump, who has been a vocal advocate against online exploitation, particularly when it involves emerging technologies like AI. The law targets one of the darker uses of AI — the creation of hyper-realistic deepfake pornography — and imposes strict obligations on platforms and websites to remove such content swiftly.
A Federal Crime With Teeth
Under the new law, publishing — or even threatening to publish — nonconsensual sexually explicit imagery that has been artificially generated is now a federal offense. This includes deepfakes of both adults and minors. The intent to harm, harass, or intimidate is central to how the law will be enforced. Offenders could face substantial fines or imprisonment depending on the severity of the case.
Moreover, the law requires all digital platforms — including social media sites, web forums, and apps — to remove such content within 48 hours of a valid takedown request. The obligation to implement a formal takedown process applies across the board, from large tech firms to small online communities.
Trump addressed the new legislation during a press conference at the White House Rose Garden, also sharing remarks on Truth Social, his preferred communication platform. “This bill takes aim at AI-generated forgeries — deepfakes — which are becoming more convincing and dangerous every day,” he said.
Melania Trump’s Role and Advocacy
Melania Trump played a significant role in rallying support for the TAKE IT DOWN Act. In a prepared statement, she called the legislation a “national victory” and a critical first step toward protecting individuals — particularly women and children — from technological abuse.
“Artificial Intelligence and social media are the digital candy of the next generation — sweet, addictive, and engineered to have an impact on the cognitive development of our children,” she warned. “But unlike sugar, these new technologies can be weaponized, shape beliefs, and sadly, affect emotions and even be deadly.”
A Bipartisan Effort
The TAKE IT DOWN Act enjoyed rare bipartisan support. First introduced in June 2024 by Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN), it passed both chambers of Congress by April 2025 — a sign of growing consensus across party lines that the misuse of AI technology must be addressed urgently.
Rising Global Concern Over Deepfakes
The U.S. is not alone in taking legal action against AI-generated explicit content. The United Kingdom implemented similar regulations through its Online Safety Act in 2023, which also criminalizes the distribution of deepfake pornography.
Incidents such as the January 2024 spread of deepfake pornographic images of pop icon Taylor Swift served as a wake-up call for lawmakers. Swift’s name was temporarily blocked from being searchable on X (formerly Twitter) due to the widespread dissemination of doctored images, prompting public outcry and political momentum.
According to cybersecurity firm Security Hero, deepfake content is not only spreading rapidly but remains overwhelmingly exploitative. The firm's 2023 report found that 99% of the individuals targeted in AI-generated explicit deepfakes are women — a sobering statistic that underscores the urgent need for stronger legal protections.
The Road Ahead
With the TAKE IT DOWN Act now enshrined in federal law, the United States takes a critical step in curbing the misuse of artificial intelligence for malicious and abusive ends. However, experts warn that enforcement and public education will be key to the law’s success.
As deepfake technology continues to evolve and become more accessible, lawmakers, tech companies, and advocacy groups will need to remain vigilant. While the TAKE IT DOWN Act marks a powerful beginning, the broader battle to secure digital spaces — especially for women and vulnerable populations — is just getting started.