Grok AI continues to digitally undress women and children despite platform’s pledge to act

Degrading images of children and women, digitally undressed by Elon Musk’s AI chatbot Grok, continue to circulate on X, despite the platform’s promise to suspend users who generate them.
The use of Grok to alter photographs and create sexualized images of real women and children in underwear without consent has prompted urgent scrutiny. On Monday, the UK communications watchdog, Ofcom, said it had made “urgent contact with X and xAI to understand what steps they have taken to comply with their legal duties to protect users in the UK.” The regulator added that it would consider whether a formal investigation is necessary based on the company’s response.
The surge in content began after a December update to Grok made it easier for users to post photographs and ask for clothing to be digitally removed, allowing them to generate images of individuals in minimal clothing or sexually suggestive poses.
On Sunday and Monday, users continued creating sexually suggestive images of minors, including children as young as 10. Ashley St Clair, the mother of one of Musk’s children, reported that Grok had generated an image of her at 14 years old in a bikini. Similarly, a picture of Stranger Things actor Nell Fisher, 14, was digitally altered on Sunday to show her in a banana-print bikini. Many women have expressed outrage on X after discovering that their images had been manipulated without consent. Some AI-generated images depicted women and children with substances resembling semen on their faces and chests.
Researchers at AI Forensics, a Paris-based nonprofit, analyzed 50,000 mentions of @Grok on X and 20,000 AI-generated images posted between 25 December and 1 January. They found that over a quarter of the mentions were requests for image creation, and that the most common prompts included terms such as “her,” “put,” “remove,” “bikini,” and “clothing.”
More than half of the images depicted people in “minimal attire” such as underwear or bikinis, mostly women under 30. Approximately 2% of the images appeared to show people under 18, including some representing children under five. Most of the content remained available online. The researchers also found requests for extremist content, such as Nazi and Islamic State propaganda.
Initially, Musk appeared to find the trend amusing, posting a crying-with-laughter emoji in response to a digitally manipulated toaster wearing a bikini. He wrote: “Not sure why, but I couldn’t stop laughing at this one.” Following global backlash over the harmful nature of the content, he later stated that “anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.”
An X spokesperson said: “We take action against illegal content on X, including child sexual abuse material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary.”
Earlier statements from Grok claiming that it had “identified lapses in safeguards” and was “urgently fixing them” were themselves AI-generated responses rather than official company communications. It remains unclear whether the company is taking any real action to address the failures in its safeguards.
Regulators have been working for years to outlaw nudification apps on phones, with mixed success. Last month’s Grok updates have mainstreamed the problem, making it simple, on one of the internet’s most popular platforms, to create and share intimate images of people with most of their clothing removed.
Jess Asato, a politician for the British Labour Party who has been campaigning for better regulation of pornography, said: “It is up to Elon Musk to realise this is a form of sexual assault and take appropriate action. It is taking an image of women without their consent and stripping it to degrade her – there is no other reason to do it except to humiliate.”

