AI systems processing EU personal data in 2026 must comply with the GDPR alongside the EU AI Act. GDPR fines reach EUR 20 million or 4% of total worldwide annual turnover, whichever is higher, and EU regulators (Italy's Garante, France's CNIL, Ireland's DPC, Germany's BfDI) have opened investigations or taken enforcement action against every major LLM provider.
The EU General Data Protection Regulation (Regulation (EU) 2016/679) applies whenever personal data is processed by AI. The European Data Protection Board (EDPB) issued Opinion 28/2024 on 17 December 2024 specifically addressing AI models trained on personal data. It confirmed that model anonymity must be demonstrated case by case rather than assumed, and that legitimate interests can serve as a lawful basis for training only after a documented assessment. The GDPR's six lawful bases apply to AI as follows:
| Lawful Basis | Applicability to AI | Typical Use |
|---|---|---|
| Consent (Art. 6(1)(a)) | Possible but impractical at scale | User-initiated features |
| Contract (Art. 6(1)(b)) | Narrow | Personalisation of a contracted service |
| Legal obligation (Art. 6(1)(c)) | Rare | Screening mandated by law |
| Vital interests (Art. 6(1)(d)) | Exceptional | Emergency medical AI |
| Public task (Art. 6(1)(e)) | Public bodies only | Government AI services |
| Legitimate interests (Art. 6(1)(f)) | Most common for training; requires an LIA | Model training, analytics |
Key enforcement actions against AI providers to date:

| Case | Authority | Year | Outcome |
|---|---|---|---|
| ChatGPT temporary ban | Garante (Italy) | 2023 | Service restored after compliance changes |
| Clearview AI | CNIL (France) | 2022 | EUR 20M fine |
| ChatGPT training data | Garante | 2024 | EUR 15M fine |
| DeepSeek | Garante | 2025 | Service restricted |
| Replika | Garante | 2023 | Temporary ban on processing Italian user data |
- **OpenAI ChatGPT**: the Garante banned the service in March 2023 after a data breach exposed conversation titles; it was restored in April 2023 once OpenAI added an opt-out, an age gate, and an updated privacy policy. A EUR 15 million fine followed in December 2024.
- **Replika**: the Garante imposed a temporary processing ban in February 2023, citing risks to minors and a lack of transparency about data processing.
- **X (Twitter) Grok**: the Irish DPC secured a voluntary undertaking in August 2024 under which X suspended training Grok on EU users' public posts while the DPC's proceedings were resolved.
Every AI product processing EU personal data must:

- establish a lawful basis for each processing purpose, with a documented LIA where legitimate interests is relied on;
- run a DPIA for high-risk processing;
- honour data-subject rights, including Article 22 safeguards for automated decisions;
- cover any non-EU transfers with Chapter V mechanisms such as SCCs.
Q: Can I train an LLM on publicly scraped data? Only with a lawful basis and after a robust LIA. Public availability does not make data free for training.
Q: Does Article 22 prohibit all automated decisions? No — it restricts decisions with "legal or similarly significant effects" unless consent, contract necessity, or explicit legal authorisation applies.
Q: What is EDPB Opinion 28/2024? A formal EDPB opinion clarifying GDPR application to AI training, deployment, and anonymisation claims.
Q: Is anonymisation possible for LLMs? Only if personal data cannot be inferred from outputs — a high bar that must be empirically demonstrated.
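As a concrete illustration of why the anonymisation bar is high, here is a minimal Python sketch of a naive pre-training scrubbing pass. The regex patterns and placeholder tokens are illustrative assumptions, not a standard; redacting obvious direct identifiers this way does not, on its own, satisfy the GDPR anonymisation test, since names, quasi-identifiers, and memorised strings can still be inferred from model outputs.

```python
import re

# Illustrative patterns for two obvious direct identifiers.
# Real corpora need far broader coverage (names, addresses, IDs...),
# and regex scrubbing alone does NOT meet the GDPR anonymisation bar.
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]*\w\b")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def scrub(text: str) -> str:
    """Replace obvious direct identifiers with placeholder tokens."""
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

sample = "Contact Anna at anna.k@example.com or +39 06 1234 5678."
print(scrub(sample))  # the name "Anna" survives redaction
```

Note that the personal name in the sample passes through untouched: exactly the kind of residual identifiability that EDPB Opinion 28/2024 requires controllers to assess empirically before claiming a model is anonymous.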
Q: Must I do a DPIA for every AI system? Yes for high-risk processing (profiling, biometrics, innovative tech); good practice for all AI.
Q: Do Chapter V transfers still work post-Schrems II? Yes — SCCs with transfer impact assessments; the EU-US Data Privacy Framework (July 2023) restored adequacy for certified US organisations.
Q: What are typical GDPR AI fines? EUR 15-20M for severe violations; the EDPB has emphasised that AI-specific context can aggravate penalties.
GDPR and the EU AI Act are now co-enforced. Building AI that respects data-subject rights is faster than defending against regulators.
Audit your AI against GDPR with Misar AI's privacy compliance toolkit.