
Why you need an AI Governance Policy
If your organisation is using AI without clear internal rules, now is a sensible time to put them in place. Unmanaged use of AI tools such as ChatGPT or Copilot creates data protection and confidentiality risks.
AI Governance Policies for UK Organisations
We help organisations put clear AI governance policies in place to manage these risks and ensure AI is used responsibly across the business.
Without an AI governance policy, an organisation is exposed to the risk that employees are entering personal or commercially sensitive information into AI systems such as ChatGPT and Copilot without oversight. Unmanaged use also increases the volume of data being created and stored on the organisation's servers.
We provide practical legal and compliance support, helping organisations put clear AI governance in place so their approach is more robust, accountable and easier to defend.

Clara Westbrook
25+ Years PQE
Founder | Qualified Solicitor | Data Protection Specialist
079 7693 9016
Benefits of having an AI governance policy
An AI governance policy sets clear rules for staff, supports better decision-making and reduces the risk of personal data, confidential information or commercially sensitive material being misused.
- Sets clear rules for staff: Makes it easier for employees to understand which AI tools can be used, for what purposes, and what information must never be entered into them.
- Supports accountability: Helps demonstrate that AI use is being managed responsibly and supports accountability under the UK GDPR.
- Protects confidential and personal data: Reduces the risk of sensitive business or personal information being exposed.
- Reduces data sprawl: AI-generated drafts, summaries and meeting notes can accumulate rapidly, consuming storage and increasing retention risk.
Did you know: Information generated via AI can still be disclosed under a Subject Access Request.
When does an organisation need an AI governance policy?
You are likely to need an AI governance policy if one or more of the following apply:
- Staff are using AI tools such as ChatGPT, Copilot or other AI systems in day-to-day work.
- Confidential or commercially sensitive information is being used in prompts, drafts, summaries or internal analysis.
- AI is being used to make decisions about employees, customers or other individuals (which may require a Data Protection Impact Assessment under ICO guidance).
- There are no clear internal rules on approved tools, acceptable use, restricted inputs or oversight.
- Teams are using AI inconsistently and without oversight, for example to generate images.
AI Governance policy services
We support organisations in designing and implementing clear AI governance policies tailored to how AI is used within their business. This includes:
- Defining acceptable AI use
- Setting boundaries on data input
- Establishing oversight and accountability
- Aligning AI use with UK GDPR requirements
Timelines for putting one in place
An AI governance policy can usually be put in place within 1 to 4 weeks, depending on how your organisation uses AI, the level of risk involved, and what policies or governance documents you already have. It should be reviewed and updated as your use of AI develops over time.
Assess > Identify Risks > Draft & Advise
Our 3-step AI approach
We follow a clear, structured approach to ensure your AI governance policy is practical, proportionate, and aligned with regulatory expectations.
1. Assess
Tell us about your organisation and how AI is being used. We review current use, identify key risk areas and provide a clear fixed-price quote.
2. Identify Risks
We will assess where AI use may create data protection, confidentiality, accountability or operational risk, and where safeguards are needed.
3. Draft & Advise
We prepare a clear and practical AI governance policy and advise on the steps needed to support consistent use across the organisation.
Costing structure
We offer both fixed-fee and variable pricing. If you have any specific requirements, please get in touch.

Fixed Fee | From £750 – £2,500 + VAT | Ongoing
Hourly Rate | £375 + VAT | Ideal for one-off projects
Daily Rate | From £1,000 + VAT | Ideal for long-term pieces of work taking a few days to a few weeks
Retainer | Ongoing | Ideal for ongoing legal support
The EU AI Act and UK organisations
The EU AI Act is not part of UK law, but it is still relevant to UK organisations that offer AI systems or AI-enabled services into the EU, whose AI use affects individuals in the EU, or that have an entity within the EU; in those cases, you will need to have a policy in place. The UK's current approach is intended, in part, to encourage data-driven organisations to establish and maintain their operations within the United Kingdom. That said, a robust AI policy gives organisations greater control over the personal, confidential, or commercially sensitive data being entered into these systems.
Why choose us
We are a solicitor-led organisation with over 25 years’ experience in data protection and privacy law. We help organisations put clear, practical AI governance in place where AI use creates data protection, confidentiality and accountability risk.
We’ve worked with organisations across multiple sectors including WarnerMedia, Yum! Brands, Burberry, Expedia and Société Générale on data protection and governance matters, including projects involving AI, monitoring, new systems and higher-risk processing.
Need an AI Governance Policy?
Speak directly with a data protection solicitor +44 (0)79769 39016 (9:00 am – 6:00 pm UK time). If you would like us to call or email you, please leave your details, and we will be in touch.
Westbrook Data Protection Services Limited,
2nd Floor, Midas House, 62 Goldsworth Road Woking, Surrey, GU21 6LQ
View our Privacy Policy here
Explore more data protection and privacy services
Our team has a deep understanding of the following areas of law and continues to add value to our clients' businesses.
Latest Insights
- Changes to employment law and the rise in Subject Access Requests: Employment Rights Act From […]
- Court of Appeal's Ruling on strengthened data privacy rights: Farley v Paymaster – Court of Appeal Boosts Data Subjects' Rights to Compensation for Non-Material […]



