In a significant move, the Italian Data Protection Authority (GPDP) has fined OpenAI €15 million after concluding an investigation into how the company handled personal data when training ChatGPT. The investigation, which found breaches of the General Data Protection Regulation (GDPR), places OpenAI under increased scrutiny as global regulators tighten their grip on tech companies.
Key Findings of the Investigation
The GPDP’s investigation revealed several violations:
- Unlawful Data Processing: OpenAI processed personal data without a valid legal basis, in breach of the GDPR.
- Transparency Gaps: The company failed to provide adequate information to users about how their data was being collected and used.
- Inadequate Age Verification: ChatGPT lacked mechanisms to prevent users under 13 from accessing the platform, potentially exposing minors to inappropriate content.
Mandatory Awareness Campaign
To address these issues, OpenAI has been ordered to conduct a six-month communication campaign aimed at increasing public awareness of its data practices. The campaign, spanning radio, television, print, and online platforms, will inform users about:
- How data is collected and processed.
- Their rights under the GDPR, including the rights to object to processing and to rectify or delete their personal data.
A Broader Context of Crackdowns
OpenAI’s fine comes amid a wave of regulatory actions against major tech companies in Europe:
- Netflix: The Dutch Data Protection Authority fined the streaming service €4.75 million for failing to clearly disclose its data handling practices between 2018 and 2020.
- Meta: The Irish Data Protection Commission issued a €251 million fine over a 2018 data breach impacting 29 million Facebook users.
OpenAI’s Past Challenges in Italy
This isn’t the first time OpenAI has faced issues in Italy. In 2023, ChatGPT was temporarily banned in the country over privacy concerns, and the ban was lifted only after OpenAI implemented changes, including user consent mechanisms.
Implications for Tech Companies
The €15 million fine underscores the growing vigilance of European regulators in ensuring compliance with data protection laws. OpenAI’s case serves as a stark reminder for tech companies to prioritize transparency and user consent, particularly when handling sensitive personal data.
As data privacy continues to dominate global conversations, companies like OpenAI are under increasing pressure to align their practices with regulatory standards.


