Take It Down Act Addresses Nonconsensual Images
By Joseph Storch, Senior Director of Compliance and Innovation
On May 19, 2025, President Trump signed into law the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, or "TAKE IT DOWN Act." The law, which amends the Communications Act, developed over several years and received broad bipartisan support in the House and Senate. While the criminal and civil aspects of the law are not directed at or limited to the higher education environment, college students experience these types of harms at elevated rates, so it is important for college and university professionals to be aware of the law, its protections, and its requirements.
Technology-facilitated sexual violence (TFSV) has been increasing in both raw numbers and reach.* Creating deepfakes and sharing intimate images are just a few of the ways individuals can harm others through technology. Most occurrences of TFSV are committed by someone the victim knows, including significant others (present or former), family members, and friends. In many cases, images are originally taken or shared consensually but are then reshared and published online in ways meant to embarrass or shame. Both consensual and nonconsensual taking and sharing of nude and other intimate images have increased markedly. Many children receive a nude image through an electronic message before they turn 18. Images published without consent can lead to significant mental anguish, and some perpetrators attempt to extort victims for money or additional images in exchange for keeping the images unpublished.
The new law has two main areas. The first establishes civil and criminal penalties for sharing, or threatening to share, intimate images of children and nonconsensual intimate images of adults, whether real or fabricated. The second requires certain internet companies to develop a process to remove such images upon request.
1. Criminal Prohibition on Disclosing Actual and Fabricated Intimate Images
The law prohibits intentionally sharing, and threatening to share, intimate depictions of adults and of children under 18 years of age.
The law applies both to actual images and videos taken of adults without consent or of children, and to digital forgeries depicting adults and children. Sometimes called "deepfakes," these edited or wholly generated depictions, often created with the assistance of artificial intelligence, look more and more convincing as technology advances. At the same time, the technology that produces these depictions is becoming less expensive and more accessible to actors looking to cause harm.
Images of Adults
Under this law, no person may knowingly publish an intimate visual depiction (photo or video) of a person 18 years of age or older if the image was obtained or created when:
- the person sharing the image knew or should have known that the identifiable person in the image had a reasonable expectation of privacy, and
- the image was not created voluntarily in a public or commercial setting, and
- publishing the image is intended to cause harm or causes psychological, financial, reputational, or other harm.
Note that per the statute, consent to having the image initially captured and even affirmatively sharing that image with another person is not, in and of itself, consent to having that image further shared or published.
Fines and Imprisonment: Committing the violation against an adult can result in fines and imprisonment for up to two years. Threatening commission can result in fines and imprisonment for up to 18 months. A court may also order restitution and forfeiture of property and financial gains from the violation.
Images of Children
The law prohibits knowingly publishing an intimate visual depiction (photo or video) of a child under 18 years of age with intent to:
- abuse, humiliate, harass, or degrade the minor, or
- arouse or gratify the sexual desire of any person.
Fines and Imprisonment: Committing the violation against a child can result in fines and imprisonment for up to three years. Threatening commission can result in fines and imprisonment for up to 30 months. A court may also order restitution and forfeiture of property and financial gains from the violation.
Next Steps: Colleges and universities have no technical obligations under this law but may want to provide constructive educational outreach to students and employees. Areas to consider:
- Providing education to campus law enforcement/public safety, Title IX staff, Student Affairs personnel (including Housing and Conduct), academic advisors, and faculty, so they can effectively support students who report being victims of such crimes by offering information and directing them to appropriate and available resources.
- Sharing information with state and community agencies and not-for-profits who may work with and support students, employees, and community members, so that they can be of best assistance when they learn that someone has experienced such harm.
- Considering including the violations referenced in this law in your policies or reviewing your policies to make sure that these violations are covered by your current definitions. For example, consider whether your policy addressing sexual misconduct includes sexual exploitation as a violation, and whether the definition contemplates that images or other content created or shared without consent might be enhanced or generated using technology.
- Considering including related violations or cross-referencing the sexual exploitation prohibition in your policy governing the acceptable use of technology.
Exemptions
The law appropriately makes reasonable exemptions, including:
- investigative, protective, or intelligence activities of federal, state, and local law enforcement agencies and intelligence agencies;
- reasonable and good faith disclosures within legal and law enforcement proceedings;
- when possessed or shared for medical, scientific, and educational purposes;
- when reporting such a depiction for legal or takedown purposes;
- when seeking support or help for receiving such an image; and
- when assisting the identifiable individual.
It is not a violation for a person to possess or publish a depiction of themselves.
Institutions should take note of these exemptions and consider whether their campus public safety units are sworn law enforcement, so they know which specific exemptions apply.
2. A Notice and Takedown System for Intimate Images
The new law requires that certain internet providers develop a process for people harmed by the sharing of intimate images to request that such images be removed from the service.
In the Digital Millennium Copyright Act of 1998, Congress amended the Copyright Act to provide, among other things, that to reduce or avoid liability for copyright infringement, certain online service providers (OSPs) must offer a notice and takedown process through which alleged copyright violations can be brought to the OSP's attention and the OSP can respond by removing the infringing content. Higher education institutions became accustomed to receiving and addressing these notices in the 2000s, when peer-to-peer file sharing became popular.
Many years ago, this author testified and called for a similar approach to bullying and harassment within the Communications Decency Act, and the Take It Down Act takes an important step by applying these principles towards actual and fabricated intimate images.
The Take It Down Act requires that on or before May 19, 2026, all covered providers establish processes for an individual identifiable in an illicit image (or an authorized person acting on their behalf) to notify the provider that an image of them was published without consent and request that the image be taken down. Covered providers are generally websites, online services, online applications, or mobile applications that provide, in their regular course of business, forums for user-generated content. The notice must include a physical or electronic signature, information sufficient for the provider to locate the image, a brief statement that the image was not shared consensually, and contact information for the individual identified in the image.
The notice on each platform must be conspicuous, clear, easy to read, and in plain language, and it must explain how the notice and takedown system works.
Upon receiving a valid notice, the provider must remove the flagged intimate image and make reasonable efforts to remove known identical copies "as soon as possible," and no later than 48 hours after receiving the request.
Next Steps: Colleges and universities have no technical obligations under this section of the law. But once covered platforms stand up these notice and takedown systems, institutions may wish to link to those notice forms from relevant portions of their websites and to update training and guidance so staff can direct those impacted by these crimes to resources for having the images removed.
To learn more about how to support your school or institution in regard to the Take It Down Act, contact our Client-Services Team.
This Grand River Solutions news post does not constitute legal advice, and no attorney-client relationship is formed. Readers are encouraged to seek the assistance of counsel for legal advice.
*The author is grateful to Dr. Randi Spiker of Florida Atlantic University and Christian Murphy of Catharsis Productions for generously sharing their research and scholarship on TFSV.
Additional Resources:
- National Center for Missing and Exploited Children (NCMEC): Is Your Explicit Content Out There?
- New York State Office for the Prevention of Domestic Violence: Technology-Facilitated Gender-Based Violence
- VAWNet: Technology-Facilitated Abuse
- American Bar Association: Five Myths About Technology-Facilitated Intimate Partner Violence (“Tech Abuse”)