
AI on the horizon - how to avert contractual disputes in a developing AI landscape

Explore how to protect your business from unnecessary claims in the UK when navigating the use of AI in your commercial practices.


The current regulatory framework in the UK

On 13 January 2025, the Prime Minister announced the UK AI Opportunities Action Plan. The plan outlines the UK’s intention to become a world leader in artificial intelligence technology for the benefit of private businesses and their customers. It sets out a number of key actions to make the UK a global AI hub, including creating data centres and technology hubs, investing in a National Data Library to provide large data sets to AI developers to expedite AI creation and accuracy, and funding AI training for UK individuals. If the plan succeeds, it would encourage private companies to continue leveraging AI to improve their products and services for their own benefit and that of their customers. This is said to be supported by AI regulation and the arrival of the UK AI Safety Bill at some point later this year. At present there is no AI-specific regulation in the UK; we currently rely on various existing regulations and regulators, such as the Information Commissioner's Office (ICO) and the Competition and Markets Authority (CMA).

With the impetus for AI being driven at the highest political level, businesses need to be ready to embrace the benefits it may bring to the creation and delivery of products and services in terms of efficiency and growth. Whether you already use a form of AI software within your business, contract with those that do, or are seeking to purchase AI tools, there are a number of considerations to address to ensure that your contracts are fit for purpose.

Commercial Contracting

Supplier contracts for AI systems

A trend set to continue evolving in 2025 is the inclusion of AI-specific clauses in commercial contracts, with suppliers likely to insist on more precise restrictions on the use of the AI software they supply. For example, in simple terms, if AI software is trained to perform specific tasks and the purchaser in fact uses it for other tasks, the AI supplier will want to ensure it is not liable. We expect AI supplier businesses to incorporate wording addressing this very issue in supplier codes of conduct.

Purchaser contracts for AI systems

Similarly, rather than focusing on contract terms in particular, we expect business purchasers of AI systems will want to ensure effective procurement due diligence, including requesting evidence to support the claims made by the supplier about, for example, the performance, efficiency, fairness and capabilities of the system they are selling.

Businesses that use AI in the provision of services to third parties

Contractual provisions in this regard remained quite unsophisticated in 2024, and it is thought that this trend may continue as we progress through 2025. Rather than bespoke AI drafting, businesses are largely content to rely on familiar clauses, such as indemnities in respect of claims for IP infringement, confidentiality or data protection provisions, to cover AI-related scenarios, perhaps with some adjustment of the drafting to refer specifically to AI.

However, the pace of change with AI is such that any market-standard provisions emerging this year would inevitably evolve within 12 months. Something we may see more of is businesses seeking to prevent suppliers from using AI, in whole or in part, to perform the service, particularly in the more creative industries. Suppliers may resist this on the basis that AI is fast becoming so prevalent that such a restriction may be unduly onerous. As we move through 2025, businesses that have so far been reluctant to engage with AI may begin to consider that they will lose competitive advantage if they do not start, at the very least, to evaluate how AI may maximise business opportunities.

Our pointers in this regard relate to the pre-contract steps and considerations you can take if you use AI in the provision of services or if you are purchasing those services:

  • As with the negotiation of other contract terms, each party should take responsibility for the risks it is best placed to manage, and this applies equally to AI. Therefore:
  1. ensure you have a sufficient understanding of the AI your business uses within the scope of your arrangement with your contracting party;
  2. address each AI issue in the contract, making clear which party is responsible for each risk and when a party does not bear a particular risk;
  3. set out the contractual consequences of failing to manage each risk.
  • Where different categories of AI are to be used in the delivery of a service, the parties should specify, first, which types of AI can be used on a project; second, at which point in the project AI can be used; and third, who can use the AI (only those with sufficient training), bearing in mind any warranties contained in the proposed contract.
  • When agreeing to use AI as part of the services to be provided, and given that this is still an unregulated and developing area, the parties should be aware of the technical and ethical concerns relating to its use, as well as the performance and liability risks:
  1. performance: AI-generated materials must meet relevant regulatory requirements; for example, a design for a new roof by an architect or structural engineer must meet general and specific building safety requirements;
  2. liability risks: AI-generated materials may lead to copyright or trademark infringement, both as regards the information used to train the AI model and the ownership of the results. Confidentiality and data protection are also likely to be real risks where contracts contain confidentiality clauses that place restrictions on project data. If AI models have been trained on project data supplied by a contracting party, this can lead to a breach of confidentiality or information security.
  • These risks can be mitigated by seeking safeguards such as warranties, limitations of liability and insurance. However, our experience to date, and the anticipated slow evolution of market-standard drafting for AI provisions in these types of scenarios, leads us to conclude that the parties are best served by working together on the deployment of AI in any given contract. Collaboratively identifying the AI software to be used, and the level of risk it gives rise to for each party, is the best approach and can help avoid costly and time-consuming disputes later.

For information about the issues discussed here and how we can help you prepare for these changes, or if you have a dispute you would like to discuss, please get in touch with our Commercial solicitors.


Reviewed by:

Karen Elder

Legal Director

Karen has over 30 years’ experience in commercial dispute resolution in the areas of construction, corporate/commercial and property matters in the Business and Property Courts and specialist divisions.
