Undressing the Law: Chicago-Kent Professor Breaks Down Grok Photo Scandal


By Kayla Molander

“From a criminal law standpoint, AI-generated nude images might run afoul of the recently enacted Take It Down Act, which criminalizes the publication of certain digitally created ‘intimate visual depictions’ of nonconsenting individuals,” says Chicago-Kent College of Law Professor Alexandra Yelderman.

Artificial intelligence-generated sexual imagery began circulating on the social media platform X (formerly known as Twitter) in December 2025. The images were created by X owner Elon Musk’s AI chatbot Grok. The chatbot was responding to user requests to alter photos of real people to remove their clothing, pose them in suggestive ways, or dress them in revealing garb.

While the European Union and other international entities are taking action, Yelderman says the legal situation in the United States consists of a lot of gray area.

“The Take It Down Act forbids the nonconsensual publication of digitally altered nudes of identifiable people,” she says. “It would seem that merely suggestive material—or bikini pictures—wouldn’t count. While the act criminalizes the knowing publication of certain images, it doesn’t appear to affect people who use AI to ‘undress’ adults for their own personal enjoyment.”

However, Yelderman says that images of minors are a completely different story.

“Any pornographic depiction of an actual, identifiable minor, whether real or morphed, is illegal,” says Yelderman, noting that the images don’t have to be published to run afoul of the law. “However, depictions of minors in bikinis—as Grok was alleged to create—might not qualify as child pornography under current case law.”

Even if images created by Grok are ruled illegal in a court of law, the question of who bears liability for the violation would still have to be decided.

“Grok’s own legal liability is an open question. The platform on which the images were published would be immune from civil liability under Section 230 of the Communications Decency Act,” says Yelderman. “But creators are another story, and people whose likenesses are digitally altered to appear nude might be able to sue them in state court under defamation/false light or ‘appropriation of likeness’ statutes.”

As for AI-generated images that are completely fabricated or that do not resemble a real person, a different legal framework applies.

“When AI generates realistic sexual images from scratch, the law’s power is much more limited,” Yelderman says. “It’s obscenity doctrine that defines the contours of what’s legally protected in those cases. Even the most offensive digitally created sexual depictions—including those of generated children—cannot be outlawed if they have serious literary, artistic, political, or scientific value. Once actual people are no longer in the equation, it becomes much harder to outlaw prurient material.”