VeratoAI

GDPR and AI Tools

Regulation
5 min read·Feb 18, 2026·VeratoAI

The problem nobody's talking about

Every time a member of your team pastes a client's name, email address, or personal details into ChatGPT, Claude, or any other AI tool, they're initiating a data processing activity. Under GDPR, that processing needs to be documented, have a lawful basis, and comply with data protection principles.

In most businesses, none of those things are happening.

What GDPR requires

The General Data Protection Regulation sets out clear obligations for any organisation that processes personal data. When it comes to AI tools, the key requirements are:

Lawful basis for processing

You need a lawful basis to process personal data. If staff are entering client details into AI tools, what's your lawful basis? In most cases, businesses haven't even considered the question — let alone documented an answer.

Data processing records

Article 30 of GDPR requires you to maintain records of all processing activities. If you don't know which AI tools your staff are using, you can't maintain accurate records. Every undiscovered AI tool is a gap in your compliance documentation.

Data transfer safeguards

Many AI tools process data on servers located outside the UK and EU. Under GDPR, international data transfers require specific safeguards — such as Standard Contractual Clauses or adequacy decisions. Free AI tools rarely provide these protections.

Data minimisation

GDPR requires you to process only the minimum amount of personal data necessary. Staff pasting entire client files into AI tools for a quick summary are almost certainly processing more data than necessary.
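One practical way to apply data minimisation is to strip obvious identifiers before any text leaves your systems. The sketch below is illustrative only: the patterns are simple examples, and real PII detection needs a dedicated tool and review. The sample text and field names are invented for illustration.

```python
import re

# Illustrative patterns only -- not exhaustive, and a regex will miss
# names, addresses, and context-dependent identifiers.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    # UK National Insurance number, e.g. AB123456C
    "NI_NUMBER": re.compile(r"\b[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]\b", re.IGNORECASE),
    # UK phone number, e.g. 01234 567890
    "UK_PHONE": re.compile(r"\b(?:\+44\s?|0)\d{4}\s?\d{6}\b"),
}

def redact(text: str) -> str:
    """Replace matching identifiers with placeholder tokens before the
    text is pasted into, or sent to, an external AI tool."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

note = "Client Jane Doe (jane.doe@example.com, NI AB123456C) called about her claim."
print(redact(note))
# -> Client Jane Doe ([EMAIL], NI [NI_NUMBER]) called about her claim.
```

Even a simple filter like this reduces what a third party receives, but it is a floor, not a ceiling: the safest minimisation is not sending the personal data at all.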

Real-world scenarios

A solicitor pastes case notes into ChatGPT to help draft a client letter. Those notes contain the client's name, address, and details of their legal matter. The data is now processed by OpenAI's servers in the US.

An HR manager uses an AI tool to help write a performance review. They include the employee's name, role, salary, and performance concerns. That personal data is now held by a third party.

A financial adviser uploads a client's portfolio statement to an AI tool for analysis. The document contains the client's name, National Insurance number, and full financial history.

In each case, the employee was trying to work more efficiently. In each case, they may have breached GDPR without realising it.

What to do about it

1. Audit your AI tool usage

You can't fix what you can't see. Discover every AI tool in use across your business — approved and unapproved.

2. Update your data processing records

Add every AI tool to your Article 30 records. Document what data is processed, where it's stored, and what safeguards are in place.
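As a sketch of what one such record entry might capture, here is a minimal example. The field names and values are illustrative, not prescribed by GDPR; Article 30 sets out what the record must cover, not its format.

```python
# A minimal, illustrative Article 30 record entry for one AI tool.
# Field names and values are examples, not a prescribed schema.
record = {
    "tool": "ChatGPT (OpenAI)",
    "purpose": "Drafting client correspondence",
    "categories_of_data": ["names", "contact details", "case notes"],
    "data_subjects": ["clients"],
    "storage_location": "United States",
    "transfer_safeguard": "Standard Contractual Clauses",  # verify with the vendor
    "retention": "Per vendor policy",  # check the tool's data retention settings
}

for field, value in record.items():
    print(f"{field}: {value}")
```

Whatever format you use, one entry per tool, kept current as usage changes, is what turns shadow AI into documented processing.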

3. Write clear data handling rules

Tell staff exactly what types of data can and cannot be entered into AI tools. Make the rules specific and easy to follow.

4. Review your contracts

Check whether your client agreements or NDAs prohibit sharing information with third parties. If they do, AI tools are third parties.

5. Train your team

Most staff have no idea that pasting client data into ChatGPT has GDPR implications. Training is the fastest way to reduce risk.

The cost of getting it wrong

ICO enforcement actions for GDPR breaches can result in fines of up to £17.5 million or 4% of annual global turnover, whichever is higher. But the reputational damage of a data breach involving client information can be far more costly — especially for professional services firms where trust is everything.

Ready to get started?

Book a free 30-minute AI Governance Discovery Call. No jargon, no pressure.
