Does Grok train on data from unsigned users in EU?
Executive summary
Grok’s training on EU users’ data has been the subject of regulatory action: X/Twitter agreed with Ireland’s Data Protection Commission (DPC) to suspend its processing of certain EU/EEA users’ personal data for Grok training and to delete that data, a suspension the DPC publicly welcomed [1] [2]. xAI’s public FAQ says content from unauthenticated (unsigned) Grok sessions may be collected “where permissible,” but it explicitly excludes the EU/UK from that treatment, meaning the company asserts it does not collect unauthenticated user content for training in those jurisdictions [3].
1. What critics and privacy groups alleged about Grok’s training data
European privacy advocates, notably NOYB, filed complaints alleging that X used millions of EU users’ personal data to train Grok without proper notice or consent, arguing that the platform neither adequately informed users nor obtained a lawful basis for repurposing public posts for AI training [4] [5]. Reporting and watchdog commentary emphasise that the controversy centred on whether public posts collected for social interaction could lawfully be repurposed for model training under GDPR principles such as purpose limitation and transparency [6] [7].
2. What regulators did and what X agreed to do
Faced with Irish DPC enforcement action, X/Twitter agreed to suspend processing of the personal data contained in EU/EEA users’ public posts for the purpose of training Grok, an undertaking the DPC publicly welcomed and one that required deletion and cessation of use of certain EU user data for Grok training [1] [2] [8]. The undertaking covered processing carried out between defined dates and marked the first time the DPC had invoked its urgent court powers [1].
3. Company policy on unsigned/unauthenticated users and regional carve‑outs
xAI’s consumer FAQ discloses that when someone uses Grok without logging into an account (i.e., is unauthenticated), the company “may collect and retain your content on an anonymous basis” in regions where such collection is permissible, and it explicitly notes that this behaviour excludes the EU/UK, signalling the company’s public position that unauthenticated EU users are not subject to the anonymous collection applied elsewhere [3]. Other industry and analyst write-ups add that X introduced opt-out settings for EU users, and that some jurisdictions treated the setting as inadequate or found its defaults problematic [9] [10].
4. Residual uncertainties, timelines and competing claims
Reporting shows temporal and jurisdictional complexity: regulators say X processed EU/EEA posts for Grok between identified dates before the suspension, and subsequent investigations (including a formal DPC probe opened later) continued to examine both the lawfulness of that processing and the separation of EU data from non-EU datasets [1] [11] [12]. National authorities also differ in approach: Switzerland’s FDPIC concluded the opt-out met that country’s rules, while Irish enforcement forced a different outcome for EU/EEA users. Whether Grok ever trained on unsigned EU users’ content therefore depends on the time window and which legal regime applied [13] [8].
5. Bottom line — direct answer
On the record gathered here, Grok does not collect and use unsigned (unauthenticated) users’ content for training in the EU/UK: xAI’s FAQ states as much, and X’s enforced undertaking with the Irish DPC required it to suspend and delete certain EU user data used for Grok training [3] [1]. However, processing that occurred before that suspension, and the precise handling of data during contested windows, remain the subject of regulatory inquiry and complaints [11]. The public record therefore supports a present-tense “no” for ongoing unauthenticated collection in the EU/UK, as claimed by the company and enforced by the DPC, with the critical caveat that historical processing and unresolved investigatory questions are documented by regulators and advocates [2] [4].