2025 Take It Down Act Seeks To Rein in Both Real and Computer-Generated Intimate Images

The act imposes new takedown obligations on services that host user-generated content
By James Rosenfeld and Celyra I. Myers
07.02.25

On May 19, 2025, President Trump signed the "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act" (the "Take It Down Act"), making a significant change to the regulatory regime for social networking platforms and other online hosts of user-generated content. The Take It Down Act takes aim at the publication of both authentic non-consensual intimate images and "digital forgeries," which include AI-generated materials or "deepfakes" depicting identifiable individuals in intimate contexts without their consent. The statute is designed to protect individuals from unauthorized use of their images and likenesses at a time when generative AI's ability to create such content improves every day. It imposes harsh criminal penalties, including fines and imprisonment, on users who post such content. It also imposes new takedown obligations on online services that may require changes to their content moderation protocols.

New Requirements for Covered Platforms

The Take It Down Act mandates that "covered platforms," including online services or apps that "primarily provide[] a forum for user-generated content," establish notice and takedown procedures to permit the removal of reported non-consensual intimate images from their websites. These procedures are reminiscent of those mandated by the Digital Millennium Copyright Act (DMCA) for removing copyright-infringing content. Under the Act, platforms are required to respond to requests received through these nascent notice and takedown processes within a strict 48-hour timeframe, a standard that differs from, and is potentially shorter than, the "expeditious" removal required by the DMCA. Also, unlike the DMCA, the Take It Down Act has no process for counter-notifications or restoration of removed content.

Importantly, these takedown procedures do not require platforms to verify the veracity of claims, or even the identities of claimants. Instead, a valid notice need only (1) bear the physical or electronic signature of the claimant, (2) identify the non-consensual intimate image or provide information sufficient for the platform to locate it, (3) contain a brief statement that the claimant has a good faith belief that the image was shared without their consent, and (4) provide the claimant's contact information. And unlike the DMCA, the Act does not impose penalties for knowingly false takedown notices.
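For platforms translating these elements into an intake form or reporting tool, the statutory list maps naturally onto a simple structured record. The sketch below is purely illustrative and is not drawn from the statute's text or any platform's actual system; the names TakedownNotice, is_facially_valid, and RESPONSE_WINDOW are hypothetical, and the 48-hour figure reflects the response window discussed above.

```python
# Illustrative only: a hypothetical intake model for Take It Down Act notices.
# Field names and validation logic are assumptions for discussion, not
# statutory language or any platform's actual implementation.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

RESPONSE_WINDOW = timedelta(hours=48)  # 48-hour response window discussed above


@dataclass
class TakedownNotice:
    claimant_signature: str    # (1) physical or electronic signature
    image_identifier: str      # (2) URL or other information locating the image
    good_faith_statement: str  # (3) brief statement that sharing was non-consensual
    claimant_contact: str      # (4) claimant's contact information
    received_at: Optional[datetime] = None

    def is_facially_valid(self) -> bool:
        """Check that all four statutory elements are present.

        Note: the Act does not require verifying the claim's veracity or the
        claimant's identity, only that these elements appear in the notice.
        """
        return all([
            self.claimant_signature.strip(),
            self.image_identifier.strip(),
            self.good_faith_statement.strip(),
            self.claimant_contact.strip(),
        ])

    def response_deadline(self) -> Optional[datetime]:
        """Latest time to act on the notice under the 48-hour window."""
        if self.received_at is None:
            return None
        return self.received_at + RESPONSE_WINDOW
```

A notice containing all four elements would be treated as facially valid even though, as noted above, nothing in the Act requires the platform to confirm that the claim, or the claimant, is genuine.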

Covered platforms have one year from the enactment of the statute to develop and implement the prescribed procedures. Platforms that fail to comply will be subject to the penalties available under the Federal Trade Commission Act, which include civil fines, injunctive relief, and consumer redress.

Practical Concerns for Platforms

While the statute is well intentioned and addresses critical issues, its implementation presents practical concerns for platforms. Although it exempts publications on matters of "public concern" from the removal process, it may nonetheless affect such publications. The strict turnaround times and lack of mandated verification mechanisms may result in less thorough investigations into claims, causing platforms to remove reported images without adequately investigating whether those images have been reshared by other users or appear elsewhere on their platforms. Even more likely, the necessarily rushed investigation of these claims will result in unjustified removals of content, harming users whose content does not violate the statute as well as others who wish to view it.

The latter point raises significant First Amendment issues. First, the risk of unjustified removals may chill the speech of creators whose works involve nudity or other covered subject matter, who may feel uncomfortable sharing their content on covered platforms under the threat of erroneous takedowns or prosecution. Second, the statute may also have a chilling effect on platforms themselves, which may, in an effort to comply, adopt overinclusive policies or inadequate claim verification procedures that have the cumulative effect of removing virtually all content that could be implicated by the statute, whether or not that content actually violates it. And while platforms are shielded from legal liability arising from "good faith disabling of access to, or removal of, material claimed to be a nonconsensual intimate visual depiction," wrongful removals still represent meaningful violations of creators' right to distribute such content and the public's right to access it, ultimately harming user satisfaction and public trust.

Balancing Compliance With Maintaining Speech-Friendly Online Environments

To navigate these challenges, platforms can take proactive steps to comply with the statute while fostering free speech environments:

  1. Consult With Legal Experts: In-house or outside counsel well-versed in notice and takedown procedures and legislative compliance for content moderation can provide invaluable guidance.

  2. Revise Terms of Service: Platforms should ensure their terms of service explicitly address and ban the categories of content implicated by the statute, clearly describe how users can initiate the takedown process, and inform users of the parameters within which they must operate.

  3. Enhance Takedown Procedures: In addition to meeting the statutory requirements for notice and takedown procedures, platforms should consider going beyond those requirements to build claim verification systems that are both thorough and fast, protecting victims of non-consensual image sharing as well as creators, and perhaps restoration procedures to address wrongful removals (a simplified sketch of such a workflow appears below). Training compliance officers to assess claims carefully can help prevent wrongful takedowns and protect legitimate content.
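As a purely illustrative complement to the third recommendation, the sketch below outlines one hypothetical "remove first, review after" workflow with a voluntary restoration step. Nothing in it is required by the statute; the PlatformActions interface, the NoticeStatus labels, and the TakedownNotice object it consumes (from the earlier sketch) are all assumptions made for illustration, and whether and when restoration is appropriate in a given case is a legal and policy question for counsel.

```python
# Illustrative workflow skeleton only. The statute mandates removal procedures
# but not this structure; the review and restoration steps are voluntary
# enhancements, and every name below is hypothetical.
from enum import Enum, auto
from typing import Protocol


class PlatformActions(Protocol):
    """Placeholder interface for a platform's existing moderation tooling."""
    def disable_content(self, identifier: str) -> None: ...
    def restore_content(self, identifier: str) -> None: ...
    def review_finds_violation(self, notice) -> bool: ...


class NoticeStatus(Enum):
    REJECTED = auto()  # facially invalid notice
    REMOVED = auto()   # content taken down; violation confirmed on review
    RESTORED = auto()  # content reinstated after review found no violation


def process_notice(notice, platform: PlatformActions) -> NoticeStatus:
    """Hypothetical pipeline for one notice: remove promptly, then review."""
    if not notice.is_facially_valid():
        return NoticeStatus.REJECTED

    # Disable access promptly to stay within the 48-hour statutory window.
    platform.disable_content(notice.image_identifier)

    # Voluntary post-removal review: the Act provides no counter-notification
    # process, so any restoration path is purely a platform policy choice.
    if platform.review_finds_violation(notice):
        return NoticeStatus.REMOVED
    platform.restore_content(notice.image_identifier)
    return NoticeStatus.RESTORED
```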

By taking these steps, platforms can effectively comply with the 2025 Take It Down Act while minimizing the risk of infringing on the rights of their users. Balancing compliance with the protection of creative expression is essential in our evolving digital landscape. For more information or advice regarding compliance with the Take It Down Act, please contact the authors or another member of Davis Wright Tremaine's media and entertainment practice.
