Whilst none of us can predict how the future will unfold (perhaps not without the use of AI to tell us!), one thing is clear: the uptake of AI tools – already widely used in dispute resolution – is expected to accelerate, emerging as one of the key trends of 2026.
Undoubtedly, this development will be accompanied by the further development of policies and procedures to govern the use of AI and to help balance efficiency with accuracy.
In March 2025, the Chartered Institute of Arbitrators (CIArb) introduced its Guideline on the Use of AI in Arbitration (2025) (the "CIArb Guideline"), which it updated in September 2025. The CIArb Guideline seeks to place a framework around the use of AI in arbitration by parties and Tribunals and to protect the integrity of AI-driven results, and it imposes obligations on parties and Tribunals to independently verify results yielded by AI.
In recent articles, we have explored the introduction of the AAA-ICDR AI Arbitrator in construction arbitration, as well as how AI is transforming international construction projects and what legal disputes could follow.
This article focuses on how the CIArb Guideline may change, govern and/or shape the use of AI in arbitration, specifically relating to procedural aspects, including:
- how the use of AI may impact the role of arbitrators;
- procedures the parties should consider to govern the use of AI; and
- changes that may be seen in arbitration agreements and Procedural Orders/Terms of Reference as parties seek to formalise the use of AI in arbitration.
What is the CIArb Guideline?
The CIArb Guideline is a non-mandatory "soft law" instrument which parties or tribunals may adopt in arbitration proceedings. It is intended to supplement the applicable laws, regulations and policies around the use of AI in arbitration, and to mitigate some of the risks that AI may pose to the integrity of the arbitral process, the parties' procedural rights and the enforceability of any arbitral award or settlement (for example, what happens where the use of AI tools is banned in specific jurisdictions).
The CIArb Guideline provides, amongst other things, recommendations on the use of AI in arbitration and addresses the power of the Tribunal to give directions and parameters as to the use of AI by the parties. The CIArb Guideline also provides a template agreement for the use of AI in arbitration and wording for inclusion in procedural orders.
Whilst setting out a guideline framework for the use of AI in arbitration, the CIArb recognises the many benefits of the use of AI in arbitration, not least the time and cost savings that may be made in an otherwise often lengthy and costly process.
Those benefits do, however, have to be weighed against the inherent risks that AI poses to the arbitral process, namely risks to:
- the enforceability of arbitral awards;
- the impartiality and independence of arbitral awards where AI has had a significant influence over document production and case analysis; and
- the confidential nature of arbitration, as well as data integrity and cybersecurity, particularly where "open" tools are used, which may raise concerns about how confidential data inputted into such tools is stored, used for machine-learning purposes and potentially presented to third parties as output from their use of the same AI tool.
What recommendations does the CIArb Guideline make?
The CIArb Guideline makes four clear recommendations:
- Parties and arbitrators are encouraged to make "reasonable enquiry" about any prospective AI tool to be used in an arbitration and should satisfy themselves that they understand the technology, functionality and data to the best of their ability.
- Parties and arbitrators should try to understand the potential risks associated with the use of AI and weigh the perceived benefits against the arbitration-related risks, including risks to the rule of law, the administration of justice, the credibility and legitimacy of arbitration, and the environment.
- Parties and arbitrators should make reasonable enquiry about any AI-related law, regulation or rule of court (if relevant) applicable in the relevant jurisdictions.
- The use of an AI Tool by any participant in the arbitration shall not diminish the responsibility and accountability that would otherwise apply to them without the use of an AI Tool.
How will AI impact the role of the Tribunal?
Tribunal members have discretion as to whether to use AI; however, they should be transparent with other Tribunal members and the parties about what AI tools are being used and, where necessary, seek agreement to their use. Whilst some aspects of producing an Award (for example, research, summarising the facts and/or drafting the Award itself) may be delegated to AI tools, responsibility for the Award ultimately lies with the Tribunal. So, whilst the Tribunal may use AI tools to enhance the efficiency of its decision-making process, it should review the output of any such tool through a critical lens to ensure its accuracy.
Template agreement for the use of AI in arbitration
The CIArb Guideline includes a template agreement on the use of AI in arbitration, which parties may choose to adopt either as a standalone agreement, or by incorporation into any arbitration agreement/clause within a wider contract (the "AI Agreement"). The AI Agreement seeks to regulate the use of AI in arbitration proceedings.
Some key provisions of the AI Agreement include:
- The extent to which parties and representatives can use AI in preparation for and during the arbitration proceedings and the specific tools parties are permitted to use.
- The tasks that AI tools may be used for (for example, research, document review, formatting).
- The obligations imposed upon parties seeking to use AI tools (for example, disclosing the intention to use specific tools, using reasonable efforts to verify the sources and accuracy of information provided by AI tools, and refraining from using any AI tools that may produce results that are misleading or otherwise inaccurate).
The AI Agreement also imposes obligations on the Tribunal to disclose any AI tools used and to protect the accuracy of AI results. This includes, for example, an obligation to independently analyse the facts, the law and the evidence and the verification of relevant sources.
Procedural Order No. 1
The CIArb Guideline also provides a template Procedural Order which broadly states that the parties and the Tribunal will be guided by the CIArb Guideline. There are two forms of this template: a short form, which simply records the parties' agreement to the terms of the CIArb Guideline, and a long form, which sets out in more detail the parties' and the Tribunal's substantive duties in respect of the use of AI in the proceedings.
Key takeaways: what will 2026 bring?
The CIArb Guideline is not the first document aimed at governing the use of AI in arbitration: back in 2024, the Silicon Valley Arbitration and Mediation Center (SVAMC) published its Guideline on the Use of Artificial Intelligence in International Arbitration (2024), and the Stockholm Chamber of Commerce (SCC) published its Guide to the Use of AI in Cases Administered Under the SCC Rules (2024).
2025 also saw the publication of a "Note on the use of artificial intelligence in arbitration proceedings" by the Vienna International Arbitration Centre (VIAC).
However, the CIArb Guideline remains one of the most comprehensive soft-law documents to date.
With the trajectory of AI in dispute resolution continuing to rise, 2026 is expected to yield yet further guidance and regulatory control to govern the use of AI in international arbitration.
Read the original article on GowlingWLG.com
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.