3 in 5 Images Generated by Grok Are Sexualized
Musk Exposed · Jan 22 · 2 min read
Updated: Jan 23

Elon Musk’s AI chatbot Grok generated millions of sexualized images on X in late December and early January.
Over a nine-day period, Grok posted more than 4.4 million images to its public X account. A statistical analysis by the Center for Countering Digital Hate estimated that roughly 65 percent of the images, or more than three million, were sexualized depictions of women, men, or children. The Times conservatively estimated that at least 41 percent—about 1.8 million images—contained sexualized imagery of women.
The surge began in late December, after users discovered that Grok would comply with requests to alter real photographs by removing clothing or placing people—often women and children—into sexualized poses. Because Grok’s X account publicly posted the resulting images, the content spread rapidly and visibly across the platform.
The scale and speed of the image generation prompted investigations by regulators in several countries, including the United States, Britain, India, and Malaysia, over potential violations of laws governing nonconsensual sexual imagery and child exploitation.
“This is industrial-scale abuse of women and girls,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate. He said that while similar “nudifying” tools exist elsewhere, none had previously been integrated so directly into a major social media platform.
The spike in activity followed posts by Mr. Musk on Dec. 31 showcasing Grok-generated images, including one that depicted him in a bikini. In the nine days after those posts, Grok generated more than 4.4 million images, compared with about 312,000 in the preceding nine days, according to data from the analytics firm Tweet Binder.
On Jan. 8, X restricted Grok’s image-generation features to paying users and later added limits preventing prompts for images of real people in revealing clothing. X said it has “zero tolerance” for child sexual exploitation and nonconsensual nudity. The restrictions reduced public image generation on X, though Grok’s app and website continue to allow users to generate sexual content privately.
Analyses found that many of the altered images depicted influencers, musicians, and actresses, including Ashley St. Clair, the mother of Mr. Musk’s 13th child, and the 14-year-old Stranger Things actress Nell Fisher. Some images portrayed women covered in fluids or holding sexually suggestive objects. The Center for Countering Digital Hate identified sexualized images of children within its sample and estimated that tens of thousands may have been generated overall.
Experts said the episode was unprecedented in its visibility and volume. Even at its peak, one of the largest deepfake forums hosted tens of thousands of sexualized videos—far fewer than the millions of images produced by Grok in just days.

