The ‘Take It Down Act’ takes effect on May 19, 2026 in the US, requiring platforms to comply with takedown requests for sexually explicit images within 48 hours
Since the end of December 2025, X’s artificial intelligence chatbot, Grok, has responded to many users’ requests to undress real people by turning images of the people into sexually explicit material. After people began using the feature, the social platform company faced global scrutiny for enabling users to generate nonconsensual sexually explicit depictions of real people.
The Grok account has posted hundreds of “nudified” and sexually suggestive images per hour. Even more disturbing, Grok has generated sexualized images and sexually explicit material of minors.
X’s response: Blame the platform’s users, not us. The company issued a statement on Jan. 3, 2026, saying that “Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.” It’s not clear what action, if any, X has taken against any users.
As a legal scholar who studies the intersection of law and emerging technologies, I see this flurry of nonconsensual imagery as a predictable outcome of the combination of X’s lax content moderation policies and the accessibility of powerful generative AI tools.
Targeting users
The rapid rise in generative AI has led to numerous websites, apps and chatbots that allow users to produce sexually explicit material, including “nudification” of images of real children. But these apps and websites are not as widely known or used as any of the major social media platforms, like X.
State legislatures and Congress were somewhat quick to respond. In May 2025, Congress enacted the Take It Down Act, which makes it a criminal offense to publish nonconsensual sexually explicit material of real people. The Take It Down Act criminalizes both the nonconsensual publication of “intimate visual depictions” of identifiable people and AI- or otherwise computer-generated depictions of identifiable people.
These criminal provisions apply only to the individuals who post the sexually explicit content, not to the platforms that distribute the content, such as social media websites.
Other provisions of the Take It Down Act, however, require platforms to establish a process for the people depicted to request the removal of the imagery. Once a “Take It Down Request” is submitted, a platform must remove the sexually explicit depiction within 48 hours. But these requirements don’t take effect until May 19, 2026.
Problems with platforms
Meanwhile, user requests to take down the sexually explicit imagery produced by Grok have apparently gone unanswered. Even the mother of one of Elon Musk’s children, Ashley St. Clair, has not been able to get X to remove the fake sexualized images of her that Musk’s followers produced using Grok. The Guardian reports that St. Clair said her “complaints to X staff went nowhere.”
This doesn’t surprise me, because Musk gutted then-Twitter’s Trust and Safety advisory group shortly after he acquired the platform and fired 80% of the company’s engineers dedicated to trust and safety. Trust and safety teams are typically responsible for content moderation and initiatives to prevent abuse at tech companies.
Publicly, it appears that Musk has dismissed the seriousness of the situation. Musk has reportedly posted laugh-cry emojis in response to some of the images, and X responded to a Reuters reporter’s inquiry with the auto-reply “Legacy Media Lies.”
Limits of lawsuits
Civil lawsuits, like the one filed by the parents of Adam Raine, a teen who died by suicide in April 2025 after interacting with OpenAI’s ChatGPT, are one way to hold platforms accountable. But lawsuits face an uphill battle in the United States given Section 230 of the Communications Decency Act, which generally immunizes social media platforms from legal liability for the content that users post on their platforms.
Supreme Court Justice Clarence Thomas and many legal scholars, however, have argued that courts have applied Section 230 too broadly. I generally agree that Section 230 immunity should be narrowed, because immunizing tech companies and their platforms for their deliberate design choices, including how their software is built, how the software operates and what the software produces, falls outside the scope of Section 230’s protections.
In this case, X has either knowingly or negligently failed to deploy safeguards and controls in Grok to prevent users from producing sexually explicit imagery of identifiable people. Even if Musk and X believe that users should have the ability to generate sexually explicit images of adults using Grok, I believe that in no world should X escape accountability for building a product that generates sexually explicit material of real-life children.
Regulatory guardrails
If people can’t hold platforms like X accountable via civil lawsuits, then it falls to the federal government to investigate and regulate them. The Federal Trade Commission, the Department of Justice or Congress, for example, could investigate X over Grok’s generation of nonconsensual sexually explicit material. But with Musk’s renewed political ties to President Donald Trump, I don’t expect any serious investigation or accountability anytime soon.
For now, international regulators have launched investigations against X and Grok. French authorities have commenced investigations into “the proliferation of sexually explicit deepfakes” from Grok, and the Irish Council for Civil Liberties and Digital Rights Ireland have strongly urged Ireland’s national police to investigate the “mass undressing spree.” The U.K. regulatory agency, the Office of Communications (Ofcom), said it is investigating the matter, and regulators in the European Commission, India and Malaysia are reportedly investigating X as well.
In the United States, perhaps the best course of action until the Take It Down Act takes effect in May is for people to demand action from elected officials. – Rappler.com
The article originally appeared on The Conversation.
Wayne Unger, Associate Professor of Law, Quinnipiac University

