GDPR Obligations · Updated: 7 April 2026 · 7 min read

AI Tools and Privacy: What Should You Watch Out For?

ChatGPT, Copilot, Midjourney - more and more businesses use AI tools. But what personal data goes in? Who is the processor? And do you need a processing agreement? This article explains the GDPR implications.

Key Takeaways
  • Everything you enter into an AI tool may be used to train the model, unless you explicitly disable that
  • If you enter personal data into an AI tool, that is processing under the GDPR
  • Italy temporarily banned ChatGPT in 2023 over privacy violations - a warning for all of Europe
  • You need a processing agreement with your AI provider if you process personal data through the tool

AI is everywhere, including your business

More and more businesses use AI tools in their daily work. ChatGPT for writing emails, Copilot for generating code, Midjourney for images, or AI features in their CRM to analyse customer data. Convenient, but from a GDPR perspective there are serious implications.

The core question is simple: what data goes into the AI tool, and what happens to it?

What makes AI tools different?

With traditional software (your accounting system, your CRM), you know fairly precisely where your data is and what happens to it. With AI tools, that is different:

  • Training data. Many AI models use user input to improve the model. What you enter may be processed in ways you don’t expect.
  • Opacity. You don’t know exactly how the model handles your data. Where is it stored? For how long? Who has access?
  • Servers outside the EU. Most major AI providers (OpenAI, Google, Microsoft) process data on US servers. That constitutes a transfer of personal data to a third country.

Case study: Italy bans ChatGPT

In March 2023, the Italian supervisory authority (Garante) temporarily banned ChatGPT. The reasons:

  • No valid legal basis for collecting and processing personal data to train the model
  • No age verification, allowing minors access without protection
  • No transparency towards users about what happened to their data
  • No mechanism for data subjects to exercise their rights (access, deletion)

OpenAI made adjustments (clarified privacy policy, option to disable training data, added age verification) and ChatGPT was reinstated. But the signal was clear: AI tools must comply with the same GDPR rules as any other software.

European supervisory authorities have since established a joint taskforce specifically for ChatGPT and similar AI services. This is not a one-off incident - it is the start of structural enforcement.

The three questions you must ask

1. What personal data goes in?

Be honest: do you sometimes paste a customer complaint email into ChatGPT to draft a response? Do you paste CVs into an AI tool for a summary? Do you enter customer names and email addresses?

As soon as you enter identifiable personal data, that is processing under the GDPR. It doesn’t matter that you “just quickly” wanted something rewritten.

2. Who is the processor?

The role allocation under the GDPR is important:

  • Controller (you): you determine the purpose and means of processing
  • Processor (the AI provider): processes data on your behalf

With the free version of ChatGPT, OpenAI also acts as a controller in its own right, because it may use your input for model training. With ChatGPT Enterprise or the API version, OpenAI acts as a processor and offers a Data Processing Agreement (DPA).

This distinction is crucial. With a pure processor, you retain control via a processing agreement. When the provider is (also) a controller itself, the situation is more complex and you have less control over what happens to the data.

3. Is there a processing agreement?

If you use an AI tool commercially and send personal data through it, you need a processing agreement (DPA). Check:

  • Does the provider offer a DPA? (Enterprise versions of ChatGPT, Copilot, and Claude do)
  • Where is the data processed? (EU or US?)
  • Can the provider use the data for training? (If so, they are not a pure processor)
  • What security measures are in place?

Practical do’s and don’ts

What you CAN do

  • Use AI for generic tasks that don’t require personal data: text suggestions, translating standard texts, brainstorming marketing ideas
  • Anonymise data before entering it: replace names with “Customer A”, remove email addresses and phone numbers
  • Choose a business subscription with a DPA if you use AI structurally (ChatGPT Enterprise, Microsoft Copilot for Business, Claude for Work)
  • Disable training data where possible; ChatGPT's settings let you opt out of your conversations being used for training
  • Document your AI usage in your processing register
  • Create an internal AI policy that tells employees which tools they may use and what data they may or may not enter. Read our practical guide on creating an AI acceptable use policy

What you should NOT do

  • Paste customer data into the free version of ChatGPT or similar tools
  • Have CVs summarised by an AI tool without a DPA
  • Enter medical or financial data into any AI tool without strict safeguards
  • Make automated decisions about individuals (e.g. screening applicants) without human intervention and without a DPIA
  • Assume it’s safe because “everyone uses it” - popularity is not a legal basis

What should you do now?

  1. Inventory which AI tools you and your employees use
  2. Assess per tool whether personal data goes in
  3. Check whether a DPA is available and whether training data can be disabled
  4. Document AI usage in your processing register
  5. Create an internal AI policy with clear guidelines for employees
  6. Consider a DPIA if you use AI for profiling, automated decision-making, or large-scale data processing
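Steps 1 to 3 above boil down to a simple checklist per tool. The sketch below shows one possible way to structure that inventory; the tool entries and field names are hypothetical examples, not an official register format.

```python
# Hypothetical inventory entries: per tool, record whether personal data
# goes in, whether a DPA is in place, and whether training can be disabled.
tools = [
    {"name": "ChatGPT (free)", "personal_data": True, "dpa": False, "training_opt_out": True},
    {"name": "Copilot for Business", "personal_data": True, "dpa": True, "training_opt_out": True},
    {"name": "Midjourney", "personal_data": False, "dpa": False, "training_opt_out": False},
]

def needs_action(tool: dict) -> bool:
    """A tool needs follow-up if personal data goes in without a DPA."""
    return tool["personal_data"] and not tool["dpa"]

for tool in tools:
    if needs_action(tool):
        print(f"Action needed: {tool['name']} processes personal data without a DPA")
# → Action needed: ChatGPT (free) processes personal data without a DPA
```

Even a spreadsheet with these three columns gives you the overview you need for your processing register - the point is that the assessment is done per tool, not once for "AI" as a whole.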
Automate your GDPR file?

GDPRWise scans your website, detects processing activities and third parties, and helps you build your complete GDPR file - including processing register and processing agreements for all your tools.

GDPRWise Editorial

This article was written by the GDPRWise team and reviewed by our privacy experts. We regularly review our content for accuracy and legal correctness.