
New federal legislation aims to combat the publication of non-consensual deepfakes and mandates notice-and-removal procedures for covered platforms.
By Sy Damle, Andy Gass, Ghaith Mahmood, and Michael H. Rubin
On May 19, 2025, President Trump signed into law the “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act” (the Take It Down Act or the Act). The Act, which received widespread bipartisan support, prohibits any person from knowingly publishing “intimate visual depictions” of minors and of non-consenting adults. The prohibition extends to deepfakes: images or videos that have been edited or generated by artificial intelligence to realistically depict an identifiable individual.
The Act gives certain websites and other online services one year to implement notice-and-removal procedures that enable victims to request the removal of unlawful images. A covered platform subject to those requirements must remove non-consensual intimate images within 48 hours of receiving a valid removal notice. Failure to comply will be treated as an unfair or deceptive act or practice in violation of the Federal Trade Commission Act, subject to Federal Trade Commission enforcement.
This Client Alert focuses on the obligations of, and implications for, covered platforms, and briefly addresses the Act’s imposition of criminal liability for the publication of non-consensual intimate visual depictions.