Elon Musk's X has moved to restrict its Grok AI tool from editing photos of real people into sexualized images, following weeks of public outcry and government scrutiny. The platform announced it would prevent the feature in jurisdictions where such deepfakes are illegal, and limit image editing to paid users only.
The shift came after numerous women—including journalists, academics, and everyday users—discovered their photos had been altered without consent to show them in revealing clothing. The scale of the problem became impossible to ignore: women across the UK, US, and elsewhere reported being targeted by the same tool that Musk had promoted as a major feature of X's AI capabilities.
Jess Davies, a journalist who was among the early targets, called the change a "positive step" but underscored the damage already done. "It's a sobering thought to think of how many women including myself have been targeted by this," she told the BBC. Dr Daisy Dixon, a philosophy lecturer at Cardiff University, described feeling "shocked" and "humiliated" after discovering her image had been manipulated on the platform. For her, the policy shift felt like a "battle-win," but the harm, she said, should never have occurred in the first place.
How pressure created change
The reversal didn't happen in a vacuum. UK regulators flagged concerns, California's attorney general launched an investigation into the spread of sexualized deepfakes (including of children), and campaigners at organizations like the End Violence Against Women Coalition made clear this wasn't a minor feature—it was a platform failure. Andrea Simon, the coalition's director, noted that "victims of abuse, campaigners and a show of strength from governments" had forced the company's hand.
X's response includes geoblocking the feature in jurisdictions where it's illegal and restricting image editing to paying users. But significant questions remain. How will the platform actually enforce location-based blocks? Will determined users find workarounds? And what happens to images that have already been created and shared?
The UK regulator Ofcom called the move "welcome" but emphasized its investigation into whether X broke existing laws continues. That investigation could carry real consequences if the platform is found non-compliant.
What started as individual women speaking up about their violated images has become a test case for whether tech platforms will act before regulators force them to—or only after public pressure becomes too loud to ignore.