Rb. Amsterdam - C/13/783613 / KG ZA 26-120
Dutch court bans X's Grok from generating non-consensual intimate imagery and CSAM in the Netherlands.
Summary
The Amsterdam District Court issued a preliminary injunction against the X entities (X.AI LLC, X Corp. and X Internet Unlimited Company), prohibiting Grok from generating, distributing, or displaying non-consensual sexual imagery and child sexual abuse material (CSAM) within the Netherlands. The court found that X had failed to implement sufficient safeguards despite evidence that Grok users can bypass content boundaries, violating the GDPR and the Dutch Civil Code. The injunction also applies to images of persons residing in the Netherlands even if distributed outside the country, though the court acknowledged the limits of its international jurisdiction.
Full text
Rb. Amsterdam - C/13/783613 / KG ZA 26-120

Court: Rb. Amsterdam (Netherlands)
Jurisdiction: Netherlands
Relevant Law: GDPR (no specific Articles mentioned); Article 6:162(2) BW
Decided: 26.03.2026
Published: 26.03.2026
Parties: X.AI LLC; X Corp.; X Internet Unlimited Company
National Case Number/Name: C/13/783613 / KG ZA 26-120
European Case Law Identifier: ECLI:NL:RBAMS:2026:3106
Appeal to: Unknown
Original Language(s): Dutch
Original Source: de Rechtspraak (in Dutch)
Initial Contributor: ap

A court banned X from generating and distributing non-consensual intimate imagery and child sexual abuse material (CSAM) through Grok in the Netherlands, and prohibited X from offering Grok's functionalities for as long as the violations occurred. This was part of preliminary injunction proceedings.

English Summary

Facts

X (composed of the US entities X.AI LLC and X Corp. and the Irish entity X Internet Unlimited Company ("XIUC"), together the controller) is a social media platform that provides the generative AI chatbot Grok, available as a standalone app or as a feature on the social media platform. Among other things, Grok generates visual content: users can ask the LLM to generate and/or edit images.

Stichting Offlimits (a non-profit organisation) estimated that Grok had generated approximately 3 million sexualised images between 29 December 2025 and 9 January 2026, of which approximately 23,000 depicted children. The controller later restricted the feature to paid members and announced measures to prevent the Grok account on X from editing images of real people. In January 2026, the EU Commission announced that it was launching an investigation into whether the controller met its obligations under the Digital Services Act (DSA) relating to the dissemination of illegal content in the EU.

In February 2026, Stichting Offlimits filed for a preliminary injunction before the court, requesting that the court prohibit the controller from generating and/or distributing sexual imagery through the image-generation functionality. This covered both non-consensual imagery and imagery classified as child pornography under national law. In addition, the organisation requested that the court prohibit the controller from offering Grok's functionality as part of its platform for as long as it allowed users to generate these images. The organisation based its claims on tort and on violations of the GDPR; specifically, it argued that creating and distributing non-consensual nude images constituted unlawful processing of personal data under the GDPR.

The controller argued that it was impossible to generate these images and that it had implemented technical safeguards to prevent users from circumventing the restrictions on image generation. Finally, the controller argued that it was the user who generated the images and that Grok was merely a tool.

Holding

The court first clarified that it was competent to hear the case. Under the GDPR, XIUC is the controller for the processing of personal data of data subjects in the Netherlands. Article 79(2) GDPR allows proceedings to (also) be brought before the courts of the Member State where the data subject has their habitual residence.

The legal status of the evidence of child sexual abuse material (CSAM) on the platform was, in the court's view, more complicated to determine. The court stated that the specific image submitted by the organisation did not show explicit nudity, and it was not evident that the person was a minor, since the image depicted a fictional character. Nonetheless, the court concluded that it was not necessary to determine the legal status of the image, as the organisation had sufficiently demonstrated that Grok users can push boundaries when generating images. Whether CSAM is involved therefore depends on the context, and the controller had similarly not implemented sufficient measures to prevent this.

The court granted the preliminary injunction claims, as the non-consensual images violated the GDPR and the (facilitation of) generating CSAM violated the national Civil Code. The court found it sufficiently plausible that the controller unlawfully and culpably contributes to a climate of online inappropriate behaviour, and that the organisation has a sufficiently urgent interest. The court dismissed the controller's argument that the user was responsible, as the controller was the intermediary that controlled the functionalities of Grok.

Therefore, the court prohibited both XIUC and X.AI from generating and/or distributing, through the Grok function, sexual imagery without the explicit consent of the individuals involved residing in the Netherlands. The same applied to producing, distributing, offering and publicly displaying CSAM. The US entity X.AI was included on the grounds that harm could still occur in the Netherlands even if X facilitates the distribution of images of Dutch nationals outside the Netherlands. The court acknowledged that it did not have international jurisdiction to rule on the legality of distributing generated images of fictional persons outside the Netherlands. The injunction against X.AI was therefore limited to unlawful images of persons residing in the Netherlands. This limitation, however, did not apply to XIUC.