Clinics File Suit Against Website that Generates Nonconsensual Nude Images
On Oct. 16, the Media Freedom & Information Access Clinic (MFIA) and the Lowenstein International Human Rights Clinic, working with co-counsel Shane Vogt, filed a lawsuit in federal court in New Jersey against ClothOff, a website that uses artificial intelligence to generate hyperrealistic, nonconsensual nude images of real children and adults.
The complaint alleges that ClothOff purposefully markets itself to teenagers and encourages users to create nonconsensual images of children and adults. The suit builds on a prior lawsuit filed by the plaintiff, in which she obtained injunctive relief. It also complements earlier litigation filed by the San Francisco City Attorney’s Office, working with the Law School’s San Francisco Affirmative Litigation Project, to develop new legal pathways to hold producers of nonconsensual deepfake nude images legally accountable.
The clinics represent a New Jersey teenager, who is proceeding pseudonymously as Jane Doe. One of Doe’s high school classmates used ClothOff to generate a hyperrealistic nude image of Doe from a photo on her Instagram account. This image was shared on Snapchat along with deepfake nude images of other girls in Doe’s class. According to the complaint, the experience caused Doe enormous distress and disrupted her high school education. She lives in fear that the image ClothOff created of her remains available online and will resurface, the complaint says.
“What happened to our client is deeply troubling and disturbing,” said MFIA clinic student Brina Harden ’27. “ClothOff is specifically designed and marketed to allow and encourage users to create nonconsensual sexual images of real individuals, including young children. The images are indistinguishable from photos, and they can live on the internet forever, allowing bad actors to exploit, manipulate, and abuse others, particularly women and children.”
The complaint notes that victims of deepfake pornography have no way of knowing whether their images continue to be circulated online.
The complaint alleges that those behind the operation of ClothOff have long tried to shield their own identities, actively using pseudonyms, fake names and addresses, and third-party payment options to avoid detection and legal accountability. Investigative reporting by The Guardian helped inform the complaint, which alleges that Alaiksandr Babichau and Dasha Babicheva, both apparently located in Belarus, are profiting from the website.
“We are committed to serving these defendants abroad to ensure nobody involved in this operation escapes accountability,” said Dara Gold ’27, another MFIA clinic student.
Several social media and technology companies have responded by taking a firm stance against ClothOff’s proliferation of nonconsensual, sexual images, according to the complaint. At the end of the summer, the counsel team sent letters to X Corporation, Discord Inc., Google LLC, and YouTube, LLC requesting that the companies take action to put a stop to ClothOff’s ongoing criminal conduct. The companies responded quickly, removing ClothOff’s accounts and bots, as well as the website’s ability to utilize their credentialing services. Many of the companies said that ClothOff’s activities clearly violate their child safety policies.
While the psychological harm inflicted on Doe cannot be undone, the counsel team is working to stop ClothOff’s activities, as well as to demonstrate the viability of legal claims against websites like ClothOff.
“The law tends to lag behind technological developments,” said Jeanica Geneus ’27, a student in the MFIA clinic. “The law around deepfakes is just developing, and we’re hoping to provide victims with more tools to hold perpetrators like ClothOff accountable. We aim to demonstrate that existing legal prohibitions can protect those most vulnerable to exploitation by websites like ClothOff. We hope to see justice served, both for the plaintiff in this case and for the countless children who have been and will be exploited by these AI websites and platforms if they are not shut down.”
The complaint was covered in a Wall Street Journal article that emphasized the growing concern over the misuse of deepfake technology to generate nonconsensual, sexually explicit images of minors and adults. Vogt subsequently discussed the case in an interview on Fox News.
The plaintiffs in the suit are represented by Yale Law School’s Media Freedom & Information Access Clinic, Shane Vogt of Vogt Law, and local New Jersey counsel John Gulygas of Barr & Gulyas, LLC.
Students from Yale Law School’s Lowenstein International Human Rights Clinic supported the development of the litigation. The MFIA and Lowenstein clinic teams were led by Yale Law School students Jeanica Geneus ’27, Dara Gold ’27, Brina Harden ’27, Atia Ahmed ’26, Andrea DenHoed ’26, Raymond Perez ’26, Shannon Sommers ’26, and Victoria Maras ’25, along with Visiting Clinical Lecturer in Law Tobin Raju and Visiting Clinical Associate Professor of Law John Langford ’14.