Recent incidents on the social media platform X have raised alarm as users exploit the AI model Grok to generate explicit images of individuals, including minors. The abuse has persisted for over a week with minimal response from the platform, which is not yet required to take significant action under existing law.
The Take It Down Act, enacted by Congress last year, targets nonconsensual sexually explicit material by requiring platforms to remove such content within 48 hours of a valid request from the person depicted. However, the provisions requiring platforms to establish notice-and-takedown systems will not be enforced until May 19, 2026. At present, neither X nor xAI, the company that developed Grok, offers a formal process for individual removal requests.
Senator Amy Klobuchar, a co-sponsor of the legislation, emphasized the need for change on the platform, particularly to protect children. In a public statement, she cautioned that if X does not act voluntarily, the law will soon mandate compliance. Meanwhile, users such as Ashley St. Clair, who reported a nonconsensual image of herself, have struggled to get content removed, illustrating the limitations of the platform's current reporting system.