
Take It Down: Trump Signs Bill Targeting the Removal of Intimate Deepfakes

By Content Team on May 26, 2025

In May, U.S. President Donald Trump signed into law the Take It Down Act, a bill designed to make it easier to remove nonconsensual explicit and intimate content published on online platforms, especially social networks and other channels primarily fueled by user-generated content.

The law applies to both authentic images and digitally altered or generated ones, including those created with artificial intelligence. That makes it a key step in the fight against deepfakes: hyper-realistic, AI-generated content that can be nearly indistinguishable from authentic images of identifiable individuals.

Trump called attention to this issue during the bill signing. “With the rise of AI image generation, countless women have been harassed with deepfakes and other explicit images distributed against their will. This is ... wrong, and it’s just so horribly wrong,” said the President.

The most important update: platforms are now required to remove reported content within 48 hours of receiving a valid removal request. Compliance will be monitored by the Federal Trade Commission (FTC).

The new law also makes it a federal crime to threaten the publication of intimate images, whether they’re authentic or AI-generated.

The Take It Down Act passed in the House with near-unanimous support: 409 votes in favor, only 2 against. This strong backing could open the door for future regulations in the same direction.

What are the penalties?

The FTC is in charge of monitoring online platforms and enforcing the law. As long as platforms follow the rules, they won’t face penalties.

But the law clearly defines penalties for individuals who publish this type of material. The sentences vary depending on the victim’s age (adult or under 18) and whether the image is authentic or generated.

For the publication of intimate images (authentic or not), or for threatening to publish authentic intimate images, penalties are:
• Up to 2 years in prison if the victim is an adult;
• Up to 3 years in prison if the victim is under 18.

For threats involving digital forgeries (digitally manipulated or AI-generated images):
• Up to 18 months in prison if the victim is an adult;
• Up to 30 months in prison if the victim is under 18.

All penalties may also include fines.

The law includes exceptions: there’s no violation if the image is shared for the purpose of reporting a crime or for medical reasons. If the person depicted shares the image themselves, or gives explicit permission to do so, it’s not considered a violation either.

Still, the law clarifies that distribution requires specific consent. In other words, even if someone consents to the creation of an intimate image using digital means, that does not imply permission to distribute or share it.

The takedown process

Platforms subject to the Take It Down Act must display a clear notice or link informing users about the process to request the removal of unauthorized intimate content. This notice must outline the platform’s responsibilities and explain how to file a takedown request.

A valid takedown notice must include the following (see the code sketch after this list):
• A physical or electronic signature from the person submitting the request or their authorized representative;
• Enough information for the platform to locate the infringing content (such as a copy of the material, links, or other references);
• A brief explanation of why the content is unauthorized;
• Contact information so the platform can follow up with the requester or their representative.
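
For teams building an intake form or API around these requests, the four required elements map naturally onto a simple data structure. Below is a minimal sketch in Python; the class and field names are illustrative assumptions, since the law prescribes the elements of a notice but not a schema:

```python
from dataclasses import dataclass, field

@dataclass
class TakedownNotice:
    """Hypothetical model of a removal request under the Take It Down Act.

    Field names are illustrative; the statute lists required elements
    but does not define a data format.
    """
    signature: str                # physical or electronic signature of the requester or representative
    content_references: list[str] = field(default_factory=list)  # links, copies, or other locators
    explanation: str = ""         # brief statement of why the content is unauthorized
    contact_info: str = ""        # how the platform can follow up with the requester

    def is_complete(self) -> bool:
        """Check that all four statutory elements are present."""
        return bool(
            self.signature.strip()
            and self.content_references
            and self.explanation.strip()
            and self.contact_info.strip()
        )
```

In practice, a platform would also want to timestamp each notice on receipt, since the 48-hour removal window runs from the request.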

These requirements may change how takedown processes are handled. Although most platforms already have content removal procedures in place, it’s rare for reports to include verifiable signatures (physical or electronic).

In addition to removing the reported content within 48 hours, the law also requires platforms to make a “reasonable effort” to identify and remove any known identical copies. With millions of new images uploaded every day, platforms will need technical solutions to locate these visually related files at scale.
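
One widely used technique for this kind of matching is perceptual hashing, which turns an image into a compact fingerprint that stays stable under re-encoding, resizing, and minor edits. The law doesn’t mandate any particular method, so the following is only a minimal sketch using the open-source Python Pillow and imagehash libraries:

```python
from PIL import Image   # pip install pillow imagehash
import imagehash

# Hamming-distance cutoff below which two hashes are treated as the same
# image. 5 is an illustrative value; a real platform would tune it against
# its own false-positive/false-negative tolerance.
MATCH_THRESHOLD = 5

def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a 64-bit perceptual hash of an image file."""
    return imagehash.phash(Image.open(path))

def is_known_copy(candidate_path: str,
                  reported_hashes: list[imagehash.ImageHash]) -> bool:
    """Return True if the candidate matches any previously reported image."""
    h = fingerprint(candidate_path)
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(h - known <= MATCH_THRESHOLD for known in reported_hashes)
```

A linear scan like this only works for small hash sets; at the scale of millions of daily uploads, platforms typically index fingerprints in structures built for nearest-neighbor search so each new upload can be checked quickly.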

Platforms have one year from the bill’s signing to comply with the Take It Down Act’s requirements.

Deepfakes in corporate fraud

There have already been numerous cases of criminals generating fake images, audio clips, or videos of celebrities, public figures, and executives to carry out scams or damage a company’s reputation. In some cases, these AI-generated fakes have even been promoted as ads on major platforms.

Although these incidents are becoming more common, deepfakes that don’t depict identifiable individuals in intimate contexts are not covered by the Take It Down Act.

But that doesn’t mean there’s no recourse. Much fraudulent AI-generated content still violates platform policies and terms of service, which means companies can request its removal through standard takedown processes.

In doing so, organizations can take an active role in preventing their brands, executives, or representatives from appearing in misleading content meant to deceive consumers.

Check out our ebook on Deepfakes and Corporate Fraud to learn more about how these threats impact brand reputation and user trust—and the concrete steps you can take to remove harmful content and protect your image.