The perils of using AI without a Data Protection Impact Assessment (DPIA): Lessons from Snapchat’s My AI



In the era of rapid technological advancements, artificial intelligence (AI) has become an integral part of our lives, revolutionising industries and offering innovative solutions. However, with great power comes great responsibility, especially concerning data protection. The recent preliminary enforcement notice issued by the UK Information Commissioner’s Office (ICO) to Snapchat serves as a stark reminder of the perils of deploying AI without first conducting a Data Protection Impact Assessment (DPIA).

Snapchat’s My AI and the ICO’s Concerns

In October 2023, the ICO issued Snapchat with a preliminary enforcement notice, raising concerns about its generative AI chatbot, ‘My AI’. This AI-powered feature, launched by Snapchat in February 2023 and rolled out to its UK user base in April 2023, utilised OpenAI’s GPT technology. While the introduction of My AI promised exciting interactions, it also raised significant privacy concerns.

The ICO’s investigation revealed that Snapchat’s risk assessment conducted before launching My AI did not adequately evaluate the data protection risks associated with this generative AI technology. In simpler terms, Snapchat failed to perform a comprehensive DPIA.

The Importance of a DPIA

A DPIA is a crucial step in the development and deployment of AI systems, especially those that handle personal data. It is a systematic assessment that identifies and evaluates the potential risks and impacts of data processing activities. DPIAs help organisations proactively mitigate risks, ensure compliance with data protection regulations, and protect individuals’ privacy rights.

Snapchat’s failure to conduct a DPIA led to several privacy concerns:

  1. Data Privacy: Users interacting with My AI may have shared personal information unknowingly, and this data might not have been adequately protected.
  2. Data Security: Inadequate risk assessment could result in vulnerabilities that malicious actors could exploit, jeopardising user data security.
  3. User Consent: Users might not have been adequately informed about how their data was being used by My AI, raising questions about the validity of user consent.

The ICO’s Stance on Data Protection

The ICO’s preliminary enforcement notice to Snapchat is a clear indication that regulatory authorities are taking data protection seriously in the AI landscape. The regulator stresses that organisations must weigh both the benefits and the risks associated with AI technologies before deployment.

The Future of AI and Data Protection

Snapchat’s case highlights a broader issue within the AI industry. As AI technologies become commonplace, companies must prioritise data protection from the outset. Failing to conduct a DPIA not only exposes users to privacy risks but also leaves organisations vulnerable to regulatory action and potential reputational damage.

To navigate this evolving landscape, companies developing or using generative AI must:

  1. Conduct Comprehensive DPIAs: Organisations should invest time and resources into thorough DPIAs before deploying AI systems, ensuring they identify and mitigate potential risks.
  2. Enhance Transparency: Clear and concise communication with users regarding data usage, consent, and privacy policies is essential to build trust and maintain compliance.
  3. Regularly Review and Update: AI systems should be continuously monitored and updated to adapt to changing data protection regulations and emerging risks.
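The gating role a DPIA plays — no deployment until identified risks have been assessed and mitigated — can be sketched in code. The following Python is purely illustrative: the `Risk` and `DPIA` classes and the `ready_to_deploy` check are hypothetical constructs for this article, not any regulator’s schema or required format.

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    """A single data protection risk identified during the assessment."""
    description: str
    likelihood: str   # e.g. "low", "medium", "high"
    severity: str     # e.g. "low", "medium", "high"
    mitigation: str = ""   # empty string means no mitigation recorded yet

@dataclass
class DPIA:
    """Minimal, hypothetical record of a Data Protection Impact Assessment."""
    system_name: str
    risks: list[Risk] = field(default_factory=list)

    def unmitigated_high_risks(self) -> list[Risk]:
        """High-severity risks that still lack a recorded mitigation."""
        return [r for r in self.risks
                if r.severity == "high" and not r.mitigation]

    def ready_to_deploy(self) -> bool:
        """Gate deployment: at least one risk must have been considered,
        and every high-severity risk must carry a mitigation."""
        return bool(self.risks) and not self.unmitigated_high_risks()

# Illustrative walk-through for a generative AI chat feature.
dpia = DPIA("generative-ai-chatbot")
dpia.risks.append(
    Risk("Children's personal data processed in chats", "high", "high"))
print(dpia.ready_to_deploy())  # False: a high risk has no mitigation

dpia.risks[0].mitigation = "Age assurance plus filtered responses for minors"
print(dpia.ready_to_deploy())  # True: all high risks mitigated
```

The point of the sketch is the ordering: the assessment and mitigation happen before the deployment flag can flip to true — the step the ICO found missing in Snapchat’s pre-launch risk assessment.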

Conclusion

The ICO’s preliminary enforcement notice to Snapchat over the My AI feature serves as a cautionary tale for organisations venturing into the world of AI. Neglecting data protection and failing to conduct a DPIA can have dire consequences. As AI continues to evolve and integrate into various aspects of our lives, ensuring data privacy and security must remain paramount. Organisations must view data protection as an integral part of AI development, aligning innovation with ethical and legal responsibilities to protect user privacy rights.