
Information & Communications Technology Law

Journal homepage: www.tandfonline.com/journals/cict20

The role of Artificial Intelligence in Online Dispute Resolution: A brief and critical overview

Hibah Alessa

To cite this article: Hibah Alessa (2022) The role of Artificial Intelligence in Online Dispute
Resolution: A brief and critical overview, Information & Communications Technology Law, 31:3,
319-342, DOI: 10.1080/13600834.2022.2088060

To link to this article: https://doi.org/10.1080/13600834.2022.2088060

© 2022 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group

Published online: 16 Jun 2022.


INFORMATION & COMMUNICATIONS TECHNOLOGY LAW
2022, VOL. 31, NO. 3, 319–342
https://doi.org/10.1080/13600834.2022.2088060

The role of Artificial Intelligence in Online Dispute Resolution: A brief and critical overview
Hibah Alessa*
School of Law, University of Leeds

ABSTRACT
The growth of online dispute resolution can be seen both in real terms, via the development of systems to deal with low-complexity disputes, and theoretically, with many commentators arguing that such systems represent the future of dispute resolution and the law as a whole. At the same time, substantial developments have been made in the use of artificial intelligence to aid online dispute resolution. Artificial intelligence agents can be identified as existing in a number of different areas of dispute resolution, as both an aid and a replacement for traditional resolution techniques. It can be asserted that unless carefully developed, a number of issues will emerge around the way that users interact with these systems. AI-based ODR systems can fail to take into account information which usually affects dispute resolution, like emotional responses or abstract qualities of negotiating parties. Additionally, such systems are not yet equipped to handle certain overarching principles of justice which affect the resolution process and can therefore have a subversively normative effect. Furthermore, it is easy to conceive of a future in which access to justice is hampered rather than helped by the development of AI processes. Therefore, this paper aims to investigate the role that AI currently plays in shaping ODR, and how that role might develop in the future. First, this paper briefly analyses the nature of the development of AI in ODR and the technology that is utilised. This lends itself to a discussion regarding what direction the industry is going in. Second, the issues surrounding the interface between humans and machines are investigated and the potential effects of these issues upon ODR are evaluated. Finally, the effect of AI in ODR, with regard to an individual’s access to justice, is analysed. It is concluded that a significant level of oversight and regulation will be necessary to obviate issues with AI in ODR. This paper also provides a series of recommendations regarding how best to proceed with AI in ODR.

KEYWORDS
Artificial Intelligence; ODR; Technology

CONTACT Hibah Alessa [email protected]


*Present address: PhD Student, School of Law, University of Leeds, UK; Lecturer, School of Law and Political Sciences, King
Saud University, Saudi Arabia.
This article has been republished with minor changes. These changes do not impact the academic content of the article.
© 2022 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/
licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly
cited.

1. Introduction
Law is, at its heart, a matter of dispute resolution.1 It is therefore of little surprise that a
significant amount of discussion revolves not only around the nature of the law itself,2
but also around the manner in which disputes are resolved.3 Nonetheless, there is con-
sensus on the fact that one major weakness of resolving disputes using the judicial
system is that it is time-consuming. This leads to extensive costs associated with litiga-
tion. Consequently, many parties are opting for Alternative Dispute Resolution (ADR)4
mechanisms. These are not only cheaper but they also provide parties with the
means of finding creative solutions which benefit both sides. Within the realm of
ADR, Online Dispute Resolution (ODR)5 is emerging as a leading dispute resolution
tool. ODR represents the combination of modern technology with current ADR tech-
niques.6 As a result, ODR has become prominent over the course of the last few
decades.7 Arguably, the lines between ODR and traditional legal systems are likely to
blur in the future. For example, Lord Justice Briggs recently called for the creation of
an online civil court system due to the increased sophistication of online services and
large proportion of court users who are able to communicate online.8 Given the changing landscape of dispute resolution due to globalisation, I believe ODR ought to be examined in detail.
Concurrently with the rise of ODR, the use of artificial intelligence (AI) systems in dispute
resolution has grown significantly. AI can be utilised in a plethora of ways.9 This growth is
itself contextualised by wider conversations about the growing role of AI at a societal
level.10 For example, AI can act as a mediator or provide a quantitative breakdown of assets. Therefore, it is important to investigate the role that AI currently plays in
shaping ODR, and how that role might develop in the future to enhance the efficiency
of ODR. This paper examines the development of AI in ODR and the technology that is
utilised. This lends itself to the discussion regarding what direction the industry should

1
Ronald Dworkin, Law’s Empire (HUP, 1986) 13.
2
Debates on the nature and purpose of the law in Anglo-American legal philosophy have largely centred around the Hart-
Dworkin debate – whether the law is a model of rules and what are the limits of judicial discretion. An incredible
amount of literature focuses on justifying Dworkin’s objections or defending Hart’s reply. For a brief overview, see
Scott J Shapiro, ‘On Hart’s Way Out’ in J Coleman (ed), Hart’s Postscript: Essays on the Postscript to the Concept of
Law (Oxford University Press 2001); Kenneth Einar Himma, ‘HLA Hart and the Practical Difference Thesis’ (2000) 6
Legal Theory 1; WJ Waluchow, ‘Authority and the Practical Difference Thesis’ (2000) 6 Legal Theory 45; Matthew
Kramer, ‘How Moral Principles Can Enter the Law’ (2000) 6 Legal Theory 83.
3
The debate has been centred around the merits of the adversarial and communitarian bargaining theories. See Robert J
Condlin, ‘Cases on Both Sides: Patterns of Argument in Legal Dispute Negotiation’ (1985) 44 Maryland Law Review 65;
Robert J Condlin, ‘Bargaining in the Dark: The Normative Incoherence of Lawyer Dispute Bargaining Role’ (1992) 51
Maryland Law Review 1; Rebecca Hollander-Blumoff, ‘Just Negotiation’ (2010) 88 Washington University Law Review
381; Carrie Menkel-Meadow, ‘From Legal Disputes to Conflict Resolution and Human Problem-Solving: Legal Dispute
Resolution in a Multidisciplinary Context’ (2004) 54 Journal of Legal Education 7.
4
Nadja Marie Alexander, Global Trends in Mediation (Kluwer, 2006) 6.
5
Electronic ADR (eADR), online ADR (oADR) and Internet dispute resolution (iDR) are synonymous.
6
Karolina Mania, ‘Online dispute resolution: The future of justice’ (2015) 1 International Comparative Jurisprudence 76, 77
<http://dx.doi.org/10.1016/j.icj.2015.10.006> accessed 26 September 2021.
7
Ibid, 78.
8
Lord Justice Briggs, Civil Courts Structure Review: Interim Report (Judiciary of England and Wales, 2015) 6.7 <https://www.
judiciary.uk/wp-content/uploads/2016/01/ccsr-interim-report-dec-15-final1.pdf> accessed 25 September 2021.
9
Carneiro et al. ‘Online dispute resolution: An artificial intelligence perspective’ (2014) 41 Artificial Intelligence Review
211, 212.
10
Olivia Solon, ‘Man 1, machine 1: landmark debate between AI and humans ends in draw’ (The Guardian, 19 June 2018)
<https://www.theguardian.com/technology/2018/jun/18/artificial-intelligence-ibm-debate-project-debater> accessed
26 September 2021.

take. This paper also critically examines the issues surrounding the interface between
humans and machines and assesses the potential effects of these issues upon ODR. It
then attempts to determine the extent to which AI in ODR enhances individuals’ access
to justice. Finally, a series of recommendations are made regarding how best to
proceed with AI in ODR. It is concluded that a significant level of oversight and regulation
will be necessary to obviate issues with AI in ODR.

1.1. Definitions
Due to the novel nature of AI and ODR, the definition of key terms is not straightforward.11
Therefore, it is vital to shed light on the definitions of both ODR and AI for the purposes of
circumscribing the analysis in this paper.

1.1.1. Online dispute resolution


In the era of innovation, e-commerce and rapid development, dispute resolution has found a new and innovative avenue in the use of technology to solve disputes.12 In the digital
age, one cannot deny the fact that “conflict is a growth industry.”13 Therefore, demand
for dispute resolution tools will likely grow simultaneously.14 Consequently, it is crucial
to lay out a clear definition of ODR in order to delineate its role. ODR may be defined
as the use of information and communication technology to help people prevent and
resolve disputes.15 ODR roundly describes an easier, faster and more efficient mode of
pursuing ADR.16 A thorough examination of diverse definitions of ADR is beyond the
purview of this paper. However, it is generally regarded as those avenues for resolving
disputes that do not require the use of litigation or traditional legal systems.17 Such
alternative avenues might be negotiation, mediation and arbitration.18 Nonetheless, it
should also be noted that there is nothing that precludes ODR from being a component
of traditional legal proceedings, especially in the future when the relevant technology and
techniques are more fully developed.
However, a simple definition of ODR as ‘ADR’ plus ‘online’ tools would be too broad,
given the high prevalence of networked communication. It is difficult to imagine any
modern conflict not involving the sending of at least one email or message between
the parties involved. Thus, it is more useful to use a nuanced definition that encompasses
dispute resolution processes and the creation of a ‘virtual environment’ to solve
disputes.19 Although the strict boundaries of this definition are blurry, they exclude extre-
mely simple uses of online processes from ODR. For example, the mere scanning and
sending of an invoice as evidence involves an online process, but it cannot be regarded
11
Mania (n 6) at 77.
12
Ethan Katsh and Orna Rabinovich-Einy, Digital Justice Technology and the Internet of Disputes (OUP, 2017) 1.
13
Roger Fisher and William Ury, Getting to YES Negotiating an Agreement without Giving In <https://www.fd.unl.pt/
docentes_docs/ma/AGON_MA_25849.pdf> accessed 25 September 2021.
14
Katsch (n 12).
15
Dave Orr, Colin Rule, ‘Artificial Intelligence and the Future of Online Dispute Resolution’ (2017) Santa Clara High Tech-
nology Law Journal 10.
16
Mania (n 6) at 77.
17
Carneiro (n 9) at 213.
18
Ibid; John Olin and Robert Mnookin, ‘Alternative Dispute Resolution’ [1998] Harvard Law School Center for Law, Econ-
omics and Business Discussion Paper Series 232 <http://lsr.nellco.org/harvard_olin/232> accessed 25 September 2021.
19
Ethan Katsh, Janet Rifkin, Resolving Conflicts in Cyberspace (Jossey-Bass, 2001) 17.

as creating a virtual environment for the settlement of disputes.20 Rather, such a virtual
environment may include21 communication exclusively via a chatroom, with documents
uploaded for all parties to see.22 Alternatively, desires might be translated into numerical
values by a computer.23 This can be greatly distanced from more traditional methods of
ADR involving face-to-face meetings in which tone, body language and environment
might all have a tangible effect upon proceedings.24
ODR has grown in popularity since the early 1990s.25 By 1999, ODR was readily embraced by many companies. ODR was first used in the United States of America.26 Thereafter, it became widespread, and has been used by eBay, the SquareTrade portal, CyberSettle and many other commercial service providers.27 However, it
was only in 2001 that governments began to see the value in ODR. Many noted that
by moving disputes online, the burden on the courts was reduced and the efficiency of
resolving disputes was enhanced.28 For example, the “Money Claim Online” platform is
a judicial ODR platform in England and Wales. The platform resolves legal issues for
fixed sum claims up to £100,000.29

1.1.2. Artificial intelligence


AI is unique. This is because AI can develop itself by using intelligent techniques or by
working intelligently.30 Therefore, it is perhaps not surprising that there have been
many definitions of AI over the years. Definitions of AI vary according to each academic’s perspective. Some academics define AI based on how it works, its main features and what it can do. Others define it based on what it
cannot do.31 Another definition of AI compares human ability with AI which is “ … trying
to solve by computer any problem that a human can solve faster.”32 This definition is
viewed as not being clear on the point of whether the human or the computer is
faster. However, Lodder and Thiessen argue that while it may have originally meant
that the human is faster, in the current context, it should be interpreted as implying
the opposite meaning, leading to their formulation that AI is “ … trying to solve by com-
puter any problem, that a human can solve, better, faster, more consistently, without
getting tired, etc.”33 In the same vein, Nilsson has defined AI as “The activity devoted
20
Civil Justice Council, Online Dispute Resolution (Civil Justice Council, 2015) <https://www.judiciary.uk/wp-content/
uploads/2015/02/Online-Dispute-Resolution-Final-Web-Version1.pdf> accessed 16 September 2021.
21
Hairong Li, Terry Daugherty, Frank Biocca, ‘Characteristics of Virtual Experience in Electronic Commerce’ (2001) 15
Journal of Interactive Marketing 13.
22
Ibid.
23
Mania (n 6) at 71.
24
Orr (n 15) at 10.
25
Ibid.
26
Faye Wang, Internet Jurisdiction and Choice of Law: Legal Practises in the EU, US and China (CUP, 2010).
27
Ethan Katsh, Janet Rifkin and Alan Gaitenby, ‘E-Commerce, E-Disputes, and E-Dispute Resolution: In the Shadow of eBay
Law’ Vol15(3) Ohio State Journal on Dispute Resolution 705 <http://pages.ebay.> accessed 16 September 2021.
28
Ibid.
29
Money Claim, HM Courts and Tribunal Services, <https://www.moneyclaim.gov.uk> accessed 15 September 2021.
30
Arnold Lodder and Ernest Thiessen, ’The Role of Artificial Intelligence in Online Dispute Resolution’ [2003] <http://
citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.97.9137&rep=rep1&type=pdf> accessed 10 September 2021.
31
Arno R Lodder and John Zeleznikow, ‘Developing an Online Dispute Resolution Environment: Dialogue Tools and Nego-
tiation Support Systems in a Three-Step Model.’ <https://research.vu.nl/ws/portalfiles/portal/2115691> accessed 20
September 2021.
32
‘The Free On-line Dictionary of Computing’ <http://foldoc.org> accessed 12 January 2018.
33
Lodder and Thiessen (n 28).

to making machines intelligent, and intelligence is that quality that enables an entity to
function appropriately and with foresight in its environment.”34
The difficulty of finding consensus on the definition of AI is related to the fact that
the term ‘intelligence’ itself is nebulous.35 For the sake of brevity, a working definition
can be lifted from computer science academia: “any device that perceives its environ-
ment and takes actions to maximise its chance of successfully achieving its goals.”36
Again, such a definition is too broad. A refrigerator which monitors temperature and adjusts its cooling element accordingly can be regarded as acting intelligently. In
the same regard, a hyper-advanced futuristic computer also acts intelligently if it pro-
vides life-counselling advice to stressed office workers. Therefore, one can see that pre-
cision is the key;37 trying to define how a device will perform in an intelligent way
helps to clarify how the device may enhance human abilities.38 However, the very
nature of AI is that it is constantly evolving.39 Therefore, a precise definition is only ever likely to be suitable for a short period of time. This paper regards AI to
be a developing and largely theoretically unbridled technology. Whilst the future is
uncertain, it is still arguably possible to examine current trends in AI and ODR in
order to discuss what might occur in the future.

2. The Role of AI in ODR


2.1. Current role of AI in ODR
Before one can consider the future of AI in ODR, it is important to consider the past and
current relationship between AI and ODR. Historically, AI has played a role in the appli-
cation of justice, preservation of rights, and the promotion of social values. For
example, by facilitating legal work and increasing its efficiency by resolving disputes.
One way this is achieved is by helping people to understand the human reasoning
process in constructing and sustaining legal arguments.40 Law is created and enforced
by a chain of processes which include information processing, reasoning, and decision
making. Similarly, AI works by information retrieval, knowledge representation and
reasoning, natural language processing, machine learning and data mining.41
Although AI and law publications can be traced back to the early 1950s,42 the focus
began in the 1970s.43 However, since the early 1980s, Law and AI has been an interesting
34
Nils Nilsson, The Quest for Artificial Intelligence: A History of Ideas and Achievements (CUP, 2009).
35
Ibid.
36
David Poole, Randy Goebel, Alan Mackworth, Computational Intelligence: A Logical Approach (OUP, 1998) 1.
37
Sundar Pichai, ‘AI at Google: our principles’ <https://www.blog.google/technology/ai/ai-principles/> accessed 10 Sep-
tember 2021.
38
Matthew U Scherer, ‘Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies, and Strategies’ (2016)
28 Harvard Journal of Law & Technology 171.
39
Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach (3rd edn, Pearson 2014) 1039.
40
Solicitors Regulation Authority, ’Improving access - tackling unmet legal needs’ (Solicitors Regulation Authority, June
2017) <https://www.sra.org.uk/risk/resources/legal-needs.page> accessed 26 September 2021.
41
Mireille Hildebrandt, ‘Law As Computation in the Era of Artificial Legal Intelligence. Speaking Law to the Power of Stat-
istics’ [2017] SSRN Electronic Journal 458 <https://www.ssrn.com/abstract=2983045> accessed 26 September 2021.
42
Layman E Allen, ‘Symbolic Logic: A Razor-Edged Tool for Drafting and Interpreting Legal Documents’ (1957) 66: 833 Yale
Law School Yale Law School Legal Scholarship Repository Faculty Scholarship Series Yale Law School Faculty Scholar-
ship 834 <http://digitalcommons.law.yale.edu/fss_papers/4519> accessed 25 September 2021.
43
Bruce G. Buchanan and Thomas E. Headrick, ‘Some Speculation About Artificial Intelligence and Legal Reasoning’ (1970)
23 Stanford Law Review 40 <https://www-jstor-org.libproxy.ucl.ac.uk/stable/pdf/1227753.pdf?refreqid=excelsior%
3A4a38c2a70c73298a0260daf592954fd4> accessed 25 September 2021.

research field which is about to experience a revolution.44 As noted above, substantial


bodies of literature exist regarding AI and the law as a whole. Therefore, it is pertinent
to examine the literature that deals primarily and exclusively with the use of AI in ODR.
The relevant literature regarding the current role of AI in ODR can be split into the descrip-
tive: what is happening/what will happen; and the evaluative: the extent to which AI and
ODR can be seen as a positive thing.

2.1.1. The descriptive approach


The descriptive approach to AI in ODR can be further divided into two sub-categories:
those which seek to chart the actual AI tools which are utilised in ODR, and those
which evaluate the effects that AI in ODR might have. In other words, those that chart
specific AI-based ODR mechanisms, and those that chart AI-based ODR as a phenomenon.
In the former category, commentators like Carneiro et al45 note the substantial
growth not of advanced AI-based ODR systems as predicted, but of supportive AI-systems.
For example, systems which allow the more effective management of ODR cases or
support users by dealing with complex calculations and algorithms on their behalf. None-
theless, there are some commentators like Orr46 willing to categorise certain advanced
algorithmic systems as being precursors to fully-blown AI-led ODR, such as those used
to organise and store the data relating to the multitude of traffic code violations that
occur each year. In the latter group are authors like Rifkin47 and Katsh48 who seek to
describe the role of AI in ODR not in technological terms, but as a phenomenon that
shapes the nature of the discourse – as a so-called ‘fourth party’ negotiator. This way
of considering the issue describes AI in ODR not as a mere tool to aid resolution, as a telephone or a calculator might be considered, but rather as a means of resolution itself.
A particularly common theme across all commentaries is the implementation gap
between those technologies which are proposed and predicted within the field, and
those which have been realised. Carneiro et al49 note that AI in ODR has been a victim of the same obstacle that has beset other applications of AI: slower than expected development. Syme50 also argued that it is foolhardy to associate the development of a technology with the uptake of that technology. Thus, the slow development of new ODR techniques as a whole might be attributable to several factors that are political, economic, psychological and cultural in nature, rather than merely technological advancement.
This pattern in the literature cannot be regarded as surprising given the emphasis on
AI. A critical analysis here posits an explosion in optimistic predictions, followed by a recalibration of expectations once the realities of the technology become apparent, a pattern which holds true for practically all fields of AI research.51
44
Kevin Ashley, Artificial Intelligence and Legal Analytics: New Tools for Law Practice in the Digital Age (CUP 2017).
45
Carneiro (n 9) at 238.
46
Orr and Rule (n 13) at 10.
47
Rifkin J, ‘Online dispute resolution: Theory and practice of the fourth party’ (2010) 19 Conflict Resolution Quarterly 117
48
Katsh and Rifkin (n 19) at 84.
49
Carneiro (n 9) at 238.
50
David Syme, ‘New Worlds of Dispute Resolution’ in Tania Sourdin (Ed.) Alternative Dispute Resolution and the Courts
(Federation Press, 2004) 134.
51
David Welham, ‘AI in training (1980–2000): Foundation for the future or misplaced optimism?’ (2008) 39 British Journal
of Educational Technology 287.

2.1.2. The evaluative approach


The evaluative approach to AI in ODR ranges between optimistic and pessimistic out-
looks. Thompson, for example, asserts52 that the ultimate benefit of AI in ODR will be
the creation of highly supportive systems which will uncork bottlenecks in the judicial
system and replace red tape with an efficient process. This implies that such systems
will provide the average individual with expertise, guiding the individual to a satisfactory
outcome based on the circumstances of their case or dispute.53 In other words, AI in
ODR will augment access to the law. It has also been noted that this is not too far
away from the current realities of using automated processes in dispute resolution.54
Other commentators argue that access to the law might also be increased via the use
of AI to decide low value civil cases via ODR.55 Additionally, it has been argued that
AI might be used to create solutions to highly complex disputes in order to overcome
the traditional problems associated with such negotiations: emotive discussion and
impassioned viewpoints.56 Most, if not all, optimistic commentators approach the tech-
nology in a way which assumes that one day it will be highly developed, beyond usual
human capacities.
The proposed technologies are not without their detractors however, and the current
literature illustrates this. In contrast to statements positing a beneficial effect on access to
justice, it has also been argued that as with other technologies, AI-assisted ODR will first
and foremost be accessible by wealthier clients and firms, and will therefore fail to
increase the access to justice for those who cannot afford them.57 Furthermore, there
exists a band of commentators (such as Zeleznikow58 and Abrahams et al.59) who are con-
cerned with the negative effects that automated ODR processes might render, including a
lack of proper oversight or an overreliance on automated ruling procedures leading to
detrimental adherence to any decisions rendered. Such assertions are arguably made
by the aforementioned AI-as-fourth-party advocates, who hold that the creation of
another party to the dispute resolution process beyond the original two is likely to be det-
rimental to the outcome,60 especially if that party exhibits non-human characteristics.61
There is, therefore, a lack of consensus in the relevant literature regarding the precise
effects that AI-based ODR might have. Nonetheless, the literature highlights certain key
issues that might arise as a result of AI implementation and provides a fertile ground
for further analysis.

52
Darin Thompson, ‘Creating New Pathways to Justice Using Simple Artificial Intelligence and Online Dispute Resolution’
(2015) 1 International Journal of Online Dispute Resolution 1, 53.
53
Ibid.
54
Amber Jenner, ‘The future of dispute resolution: AI’ (Kennedys Law, 2017) <https://www.kennedyslaw.com/thought-
leadership/article/the-future-of-dispute-resolution-ai> Accessed 21 September 2021.
55
Ibid.
56
Arno Lodder, John Zeleznikow, Enhanced Dispute Resolution Through the Use of Information Technology (CUP, 2012) 92.
57
Jenner (n 54).
58
John Zeleznikow, ‘Can Artificial Intelligence and Online Dispute Resolution Enhance Efficiency and Effectiveness in
Courts’ (2017) 8 International Journal for Court Administration 30, 43.
59
Brooke Abrahams, Emilia Bellucci, John Zeleznikow, ‘Incorporating Fairness into Development of an Integrated Multi-
agent Online Dispute Resolution Environment’ (2012) 21 Group Decision and Negotiation 3.
60
Ayelet Sela, ‘Can Computers Be Fair? How Automated and Human-Powered Online Dispute Resolution Affect Procedural
Justice in Mediation and Arbitration’ (2018) 33 Ohio State Journal on Dispute Resolution 91.
61
Daniel Rainey, ‘Third Party Ethics in the Age of the Fourth Party’ (2014) 1 International Journal of Online Dispute Res-
olution 37, 55.

Arguably, there exist two primary ways in which AI technology is currently being
employed in the field: either in a support capacity, or a substitutive one. The difference
between the two applications is perhaps best understood via reference to traditional
models of dispute resolution. The traditional model calls for two or more parties negotiating concurrently with a third-party mediator, arbitrator or similar.62 When AI is used in a
support role, the third party uses it as a tool to accomplish their goal. In other words, their
work is simply enabled or supplemented by the use of AI. When used in a substitutive
capacity, the AI begins to take on the essential functions traditionally associated with
the third party altogether, for example by coming to decisions or making inquiries of
the first and second parties.63 This is not to say that it is necessary for a system to entirely
replace a third-party negotiator. When AI is used in a substitutive capacity, there is no
requirement for an android mediator, but merely that the system deals with a portion
of the third-party negotiator’s work, as further described below. It should be noted that
there is a significant crossover between the two categories, largely predicated on the
level of advancement in the technology. For example, a system which calculates the proper level of compensation to be paid in an industrial arbitration might be supportive if it takes the form of a spreadsheet used by an arbitrator, but can be considered substitutive if that same system could ascertain that a far better fiscal outcome would be to create
trade deals which favour the party that prevails. Nonetheless, the distinction between
supportive/substitutive systems can be used to demonstrate the highest potential
areas of AI in ODR. The next section discusses how these may shape the role of AI in
ODR in the future.

2.2. The future role of AI in ODR


2.2.1. Supportive AI systems
Supportive AI systems are the most prevalent technologies in the contemporary ODR
environment.64 This is arguably an unsurprising phenomenon; it is far easier to develop a tool for a human to use than a replacement for the human altogether. Supportive AI systems can be further categorised according to the nature of the support that
they provide.

2.2.1.1. Decision support systems. This is arguably one of the most promising current
uses of AI in ODR. It is the area which has witnessed the greatest efforts at development
since the early 90s.65 This is perhaps not surprising – the ability of a system to weigh up
different factors and compute the optimal outcome or course of action is a clear indi-
cation of its efficiency and effectiveness. For example, systems which plot shipping lanes, self-driving cars and actuarial software all act on this principle. Therefore, the appli-
cation of these AI processes to dispute resolution is not a particularly novel
62
Carole Silver, ‘Models of Quality for Third Parties in Alternative Dispute Resolution’ (1996) 12 Ohio State Journal on
Dispute Resolution 37, 39.
63
Carneiro (n 9) at 215.
64
Zeleznikow (n 58) at 32.
65
Emilia Bellucci, Arno Lodder, John Zeleznikow, ‘Integrating Artificial Intelligence, Argumentation and Game Theory to
Develop an Online Dispute Resolution Environment’ (2004) Proceedings of the 16th IEEE International Conference on
Tools with Artificial Intelligence 749.

development.66 This explains why supportive AI-based systems are sound points for
thorough consideration when it comes to ODR.
Although their exact form and function vary, such systems are best characterised by
their ability to provide information on the level of agreement or disagreement between
two parties.67 For instance, relatively simple situations like an amicable divorce settlement
might be examined. At first glance, it might appear simple for a financial calculation to be
made – a group of objects are given a monetary value and then split down the middle.
However, this fails to take into account the true nature of divorce negotiations – the
value of a given object is not dictated objectively, but subjectively, often in accordance
with certain emotional factors.68 Such factors are by definition abstract and therefore
the ‘true’ value of a given object is also largely abstract. This means that there is a
huge potential for disagreements to drag on as the two parties, even if helped by a tra-
ditional mediator, find themselves having to compare a large number of items against
each other, each with a different subjective value to each party (e.g. a record collection
against a prized mantelpiece ornament.) This is the point at which a decision-making
support system might be used. One such example is the Family Winner system, which
asks each party to list their disputed items, and then give each item a subjective priority
value.69 The system will then employ algorithms to come up with a nominally optimal sol-
ution for distribution, which can then be rejected or accepted. In case of rejection, the
system allows the parties to rank those items whose allocation is contested, so that the
highest priorities of each party might be met.70
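By way of illustration, the following is a minimal sketch, in Python, of the priority-point logic described above. It is not the Family Winner algorithm itself, which employs considerably more sophisticated trade-off calculations; the function, the items and the point values are invented purely for demonstration.

```python
# A minimal, hypothetical sketch of the point-allocation idea behind systems such
# as Family Winner. Each contested item is awarded to the party that assigned it
# the higher subjective priority, with ties broken in favour of the party whose
# satisfied total is currently lower.

def allocate(items, priorities_a, priorities_b):
    """items: list of item names.
    priorities_a / priorities_b: dicts mapping item -> subjective priority value."""
    allocation = {"A": [], "B": []}
    satisfied = {"A": 0, "B": 0}
    # Consider the most fiercely contested items first.
    for item in sorted(items, key=lambda i: -(priorities_a[i] + priorities_b[i])):
        a, b = priorities_a[item], priorities_b[item]
        if a > b or (a == b and satisfied["A"] <= satisfied["B"]):
            winner, value = "A", a
        else:
            winner, value = "B", b
        allocation[winner].append(item)
        satisfied[winner] += value
    return allocation, satisfied


if __name__ == "__main__":
    items = ["house", "record collection", "car", "ornament"]
    # Each party spreads 100 priority points across the disputed items.
    party_a = {"house": 50, "record collection": 10, "car": 25, "ornament": 15}
    party_b = {"house": 40, "record collection": 30, "car": 10, "ornament": 20}
    print(allocate(items, party_a, party_b))
```

Even this toy version makes the underlying design choice visible: the 'optimal' allocation is only optimal relative to the numbers the parties choose to declare.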
Similar systems have also been crafted to guide third parties on the likely satisfaction
that negotiating parties will have with a given outcome, as seen in the SmartSettle system.71 Similarly, parties provide preferences regarding certain outcomes, e.g. a high
rating for receiving a valuable patent right or a low rating for ownership of a poorly main-
tained building, allowing mediators to establish outcomes that are likely to satisfy each
party. Most notably, parties are able to adjust their wishes as negotiations develop72 –
arguably mirroring the nature of traditional negotiation.
Due to the substantial AI-based ODR development which has occurred, there are a sig-
nificant number of other systems which have emerged utilising AI in ODR. These include
the Adjusted Winner, AssetDivider, ALIS and GetAid systems.73 Although they rely on auto-
mated processes, they still require substantial human input, with third party human
experts guiding the use of the output of each system, as well as often providing a
bridge between the system and the negotiating parties due to the complexity of the
values required for them to work.74 Nonetheless, the application of such systems is
wide. In order to test the geopolitical viability of certain systems, e.g. Adjusted Winner,
they have been applied to historical disputes to ascertain whether their outputs are

66
Efraim Turban, Decision Support and Expert Systems: Management Support Systems (Prentice Hall, 1993) 443.
67
Lodder and Zeleznikow (n 56) at 92.
68
Nial Muecke, Andrew Stranieri, Charlynn Miller, ‘Re-Consider: The Integration of Online Dispute Resolution and Decision
Support Systems’ (2008) 430 CEUR Workshop Proceedings 62, 63.
69
Lodder and Zeleznikow (n 56) at 78.
70
Ibid.
71
Carneiro (n 9) at 228.
72
Ibid.
73
Ibid, 227-229.
74
Michael Araszkiewicz et al., ‘The Role of New Information Technologies in Alternative Dispute Resolution of Divorce
Disputes’ (2014) 1 European Dispute Resolution Journal 549, 552.

similar to the eventual resolutions of the disputes.75 For example, when Adjusted Winner
was applied to the Israel-Palestine dispute around the time of the Camp David Accords in
the late 70s, it rendered an output which was substantially similar to that offered by the
real Accords.76 Such a pattern was also reported with regard to the Suez Canal Treaty.77
This adds weight to the argument that AI and ODR can be sewn together to better serve the parties’ needs in the future, just as they would have done in the past.

2.2.1.2. Knowledge support systems. Although decision support systems provide pro-
cedural support, AI can also be used to provide non-traditional means of accessing infor-
mation relevant to a given dispute. Although perhaps an oversimplification, such systems can be thought of merely as very advanced search engines. However, the complex nature of advanced ‘knowledge representation’ should not be underestimated.78 A truly ‘intelligent’ search engine would need to be able to take in the relevant details of a presented scenario, requiring a sense of understanding and meaning, and to ascertain the relevant information to present (or omit) in an understandable manner.79 For example,
a knowledge support system might search through rules by examining and applying
those statutory provisions which are relevant to the scenario at hand, and case studies
by taking into account those cases which are similar in nature and therefore give rise
to applicable precedent at the same time. Once both of these tasks are undertaken, it
will be possible to deliver the relevant information for a given scenario.80 It is perhaps
telling that this is a highly difficult manoeuvre81 even in the highly indexed world of
law, and would arguably be orders of magnitude more difficult given some of the ODR
contexts which arise (e.g. industrial disputes which rely on ‘rules’ created only by
course of dealing or which otherwise involve complex knowledge.) Disregarding the
difficulty of designing such systems momentarily, they are arguably highly promising in
that they remove a substantial barrier to the resolution of disputes – the need to ascertain
where exactly the relevant information lies, be it in the form of rules, evidence, or studies
of previous similar disputes.
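To make the retrieval task concrete, the following deliberately naive sketch (in Python) scores statutory provisions and case summaries by simple keyword overlap with a presented scenario. A genuine knowledge support system would require far richer knowledge representation than this; the provisions, the case and the function names are hypothetical.

```python
# A deliberately naive sketch of the retrieval step of a knowledge support system.
# It merely scores invented statutory provisions and prior cases by keyword
# overlap with a scenario; real systems would need genuine understanding.

def tokenize(text):
    return {w.strip(".,;()").lower() for w in text.split() if len(w) > 3}

def retrieve(scenario, knowledge_base, top_n=2):
    query = tokenize(scenario)
    scored = []
    for source, text in knowledge_base:
        overlap = len(query & tokenize(text))
        scored.append((overlap, source, text))
    scored.sort(reverse=True)          # highest overlap first
    return [(src, txt) for score, src, txt in scored[:top_n] if score > 0]

knowledge_base = [
    ("Statute s.1",  "A seller must deliver goods matching the description given"),
    ("Statute s.9",  "A tenant must be given notice before eviction proceedings"),
    ("Case 2019-44", "Buyer refunded where delivered goods did not match description"),
]

print(retrieve("The delivered goods did not match the online description", knowledge_base))
```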

2.2.1.3. Intelligent interface systems. Rather than a directly supportive system like the
above, intelligent interface systems are those which aim to bridge the significant com-
munication gap between human users and other AI systems, for example by enabling
the use of natural language as inputs and outputs of the system.82 By no means is the
desire to create such systems exclusive to ODR; natural language processing is somewhat
of a holy grail in many AI fields. Nonetheless, the ability to avoid the translation problems
discussed below will be an extremely significant leap within the field of ODR, although
75
Lodder and Zeleznikow (n 54) at 89.
76
Steven Brams and Jeffrey Togman, ‘Camp David: Was the Agreement Fair?’ in Frank Harvey, Ben Mor (Eds.) New Direc-
tions in the Study of Conflict, Crisis and War (Macmillan, 1998) 306.
77
Ibid.
78
John Sowa, Knowledge representation: logical, philosophical, and computational foundations (MITP, 2000) 51.
79
Cade Metz, ‘AI Is Transforming Google Search. The Rest of the Web Is Next’ (Wired, 2 April 2016) <https://www.wired.
com/2016/02/ai-is-changing-the-technology-behind-google-searches/> Accessed 10 September 2021.
80
James Popple, ‘Legal Expert Systems: the inadequacy of a rule–based approach’ (1991) 21 Australian Computer Journal
11.
81
Ronald Brachman, Knowledge Representation and Reasoning (Morgan Kaufmann, 2004) 336.
82
Carneiro (n 9) at 222.

until such leaps are made there is little to be said as regards the original contribution of
intelligent interface systems to the field of ODR.

2.2.2. Substitutive AI systems in ODR


Although substitutive AI systems are arguably those which come to mind when AI
is discussed (e.g. the ‘robot’ third party) they are still subject to slow development on
account of their relative complexity. It should be noted that although only two main
systems are described below, the true breadth of substitutive systems is wider than
those described. Any system which could be regarded as truly substitutive would be one that could combine a number of the systems discussed above and below to form a more complete virtual third party. Nonetheless, these ‘multi-agent systems’ remain some way off, since they naturally require not only that their sub-systems be competent, but also that their own internal workings be developed in order to combine those systems.83

2.2.2.1. Case reasoning systems. Case reasoning systems take knowledge of past out-
comes and apply it to the situation at hand. Thus, an AI which is aware from past experi-
ence or data input that a particular course of action leads to negative outcomes can avoid
that course of action.84 For instance, an AI might avoid suggesting that two divorcing
parties simply burn all of their jointly held possessions as a means of ending a protracted
settlement process, since it is aware that in the past this has led to low satisfaction levels.
There is a clear potential for such systems in the area of ODR, especially where disputes
are subject to legal and quasi-legal systems due to the tendency for clear documentation
of the facts of cases and statements regarding the exact reasoning behind a certain
decision.85 Thus, a dispute before an AI third party which is similar can be decided in a
manner which takes account of the success or validity of previous cases. It should be
noted that there is no need for such AIs to learn directly – they can be programmed to recognise important variables from the start, as is the case with the Split-Up AI.86 In
order to provide suggested divorce settlements, the system looks for the presence of
94 different factors - child care arrangements, income etc. – and then provides sugges-
tions based on the outcome of previous cases which exhibit similar factors.87
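The general case-based reasoning idea can be sketched as follows. The three factors, the toy 'case base' and the suggested outcomes are invented for illustration only; Split-Up itself reportedly weighs 94 factors and uses far more sophisticated inference.

```python
# A minimal sketch of case-based reasoning: represent past disputes as feature
# vectors, find the most similar precedent, and suggest its outcome. All data
# below is invented for illustration.

def similarity(case_a, case_b):
    # Count how many factors the new dispute shares with a past case.
    return sum(1 for k in case_a if case_a[k] == case_b.get(k))

def suggest_outcome(new_case, case_base):
    best = max(case_base, key=lambda past: similarity(new_case, past["facts"]))
    return best["outcome"]

case_base = [
    {"facts": {"children": True,  "long_marriage": True,  "equal_income": False},
     "outcome": "60/40 split of assets in favour of the primary carer"},
    {"facts": {"children": False, "long_marriage": False, "equal_income": True},
     "outcome": "50/50 split of assets"},
]

new_dispute = {"children": True, "long_marriage": False, "equal_income": False}
print(suggest_outcome(new_dispute, case_base))
```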
Although seemingly simple (after all, traditional third parties rely on their previous experience), there remain significant difficulties in ‘teaching’ AIs to differentiate
between relevant and irrelevant details.88 Nonetheless, cases of low complexity may
well benefit in the near future from such technology: for example an AI working for an
online sales platform like Amazon or eBay might realise, via previous feedback on custo-
mer satisfaction, that low value disputes regarding items not being sent are far more easily settled by quick compensation from its own coffers rather than via protracted investi-
gations into what has actually occurred.89 The PERSUADER system, which deals with
83
Carneiro (n 9) at 224.
84
A Aamodt, E Plaza, ‘Case-based reasoning: foundational issues, methodological variations, and system approaches’
(1994) 7 AI Communication 39.
85
Carneiro (n 9) at 223.
86
Lodder and Zeleznikow (n 56) at 82.
87
Ibid.
88
Carneiro (n 9) at 223.
89
Anna Tims, ‘If eBay’s customers are always right, who will protect its sellers?’ (The Guardian, 2014) <https://www.
theguardian.com/money/2014/jul/11/ebay-buyer-complained-decide-against-seller> Accessed 15 September 2021.

labour disputes, retains the results of each case it has dealt with so that it might be
more efficient in the future, both in removing the need to evaluate each novel case
in its entirety and by employing solutions which have previously been shown to be
effective.90

2.2.2.2. Rule-based systems. Operating in a similar manner as case-based systems, rule-


based systems apply set principles and rules to a given case. This might be a simple
matter: e.g. ‘IF a party broke a contract term, THEN apply the relevant penalty’,91 ‘IF
both parties have a predicted satisfaction value >90% THEN provide a conclusive settle-
ment agreement.’ Such simple examples, however, fail to illustrate the complexities which can occur: for example, what if the issue were a string of sales encompassing an array of rules? Systems become exponentially more complex as the number of applicable rules, inputs and outputs increases, and creating a system which imitates even a codified set of rules (e.g. statute law, terms of dealing) is no mean feat.92 Indeed, by
definition, dispute resolution can be regarded as a matter of conflicting rules – that is
to say, each party believes that the best course forward would be what they prefer,
based on the circumstances at hand. Nonetheless, decision tree systems have been suc-
cessful in automating simple financial decisions which previously took quite some time to
process manually, such as whether particular parties to a dispute should have access to
legal aid93 or the resolution of low value consumer-supplier disputes.94 Such systems
can also be regarded as extremely useful when the issue is merely presenting the
correct template for dispute resolution, as in the case of the DoNotPay system, which
appeals parking tickets via an automated online process.95 Such systems can also be valu-
able to the resolution of commercial disputes. Within eBay and PayPal, the SquareTrade system has been deployed for some twenty years, aiming to filter those complex disputes
which require additional third party attention from those which might be more simply
solved via automated processes.96 Thus a case which merely requires a seller to resend an item because they forgot to do so can be solved via an automatic but appeal-
able ruling which will in turn free up the resources to deal with convoluted cases involving
abstract thought (e.g. a complaint that a purchased sweater is maroon in colour when it
was advertised as being burgundy.)
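The triage logic described above can be illustrated with a short, hypothetical sketch. The rule threshold, dispute categories and wording of the outcomes are invented and are not drawn from any actual platform's policy.

```python
# A hypothetical sketch of the IF-THEN triage described above: simple, low-value
# complaints are resolved automatically (but remain appealable), while anything
# involving abstract judgement is escalated to a human third party.

AUTO_REFUND_LIMIT = 25.00  # assumed policy threshold, not a real platform figure

def triage(dispute):
    if dispute["type"] == "item_not_received" and dispute["value"] <= AUTO_REFUND_LIMIT:
        return "Automatic refund issued (appealable)"
    if dispute["type"] == "item_not_as_described":
        return "Escalate to human mediator: requires qualitative judgement"
    return "Escalate to human mediator: no rule matched"

disputes = [
    {"type": "item_not_received", "value": 12.99},
    {"type": "item_not_as_described", "value": 60.00},   # e.g. maroon vs burgundy
]
for d in disputes:
    print(triage(d))
```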
From what has been illustrated, it can be said that AI plays and will continue to play a
fundamental role in ODR. However, it must overcome some important obstacles such as
the translation issues between parties to a dispute and the machine systems. An attempt
is made in the next section to further identify and discuss these issues in order to clearly
delineate the role of AI in ODR.
90
Lodder and Zeleznikow (n 56) at 82.
91
Carneiro (n 9) at 226.
92
Kevin Ashley, Edwina Rissland, ‘Law, Learning and Representation’ (2003) 150 Artificial Intelligence 17-58, 19
93
Zeleznikow (n 58) at 37–38
94
Steve Abernathy, ‘Building Large-Scale Online Dispute Resolution & Trustmark Systems’ (UNECE Forum, 2003) <https://
www.mediate.com/Integrating/docs/Abernethy.pdf> Accessed 10 September 2021.
95
Samuel Gibbs, ‘Chatbot lawyer overturns 160,000 parking tickets in London and New York’ (The Guardian, 28 June 2016)
<https://www.theguardian.com/technology/2016/jun/28/chatbot-ai-lawyer-donotpay-parking-tickets-london-new-
york> Accessed 26 September 2021.
96
Colin Rule, ‘Making Peace on Ebay’ (2008) ACResolution (Online) Available at: http://colinrule.com/writing/acr2008.pdf
[Accessed 21 September 2021].

3. AI-ODR and issues of machine translation


A key element of ADR is communication between the parties and commonly a third party
facilitating the dispute.97 Historically, scholars have attributed the success of ADR to face-
to-face meetings and open communication between the negotiating parties.98 The ques-
tion which then emerges is whether the impersonality of AI in ODR will have a substantial
negative effect on the efficacy of the process. This is because although AI can emulate a
human, it is at the end of the day only a machine. Therefore, a primary issue facing the use
of AI in ODR is human-machine translation.99 In this regard, this paper places emphasis on
AI-ODR which incorporates the AI as a third or fourth party; as opposed to merely supple-
menting a human third party. Arguably, in the absence of natural language processing
and strong machine intelligence,100 there will remain a problem of translating the
inputs necessary for dispute resolution (e.g. desires, statements of facts etc.) into a
form that an AI might understand. Further, translating the outputs of an AI third party back into a form that might be understood and accepted by human parties can be problematic. As noted above, this is referred to as the issue of the ‘fourth party’,101 which demonstrates that the use of AI may have a palpable effect on dispute resolution proceedings.102
In particular, there are three main negative effects that this translation process might
have. First, a reliance on AI in ODR will arguably place a limit on the communication that might occur between the negotiating parties; where the parties intend to continue business relations, this may be disadvantageous. Second, a fourth-party AI system will struggle to deal with certain overarching principles which affect the negotiation process.
Third, AI will have a normative effect on ODR. These negative effects are discussed and
analysed in the next subsections.

3.1. Quantifying qualitative information


It should be noted that questions regarding the widespread viability of automated ODR processes are hypothetical. This is due to a tendency to use only semi-automated processes,
where abstract questions and disputes have been settled and only financial or numerical
issues remain. In other words, AI-ODR tends to be used not to solve disputes, but to deal
with distributive problems.103 For example, the Family_Winner system104 deals with the
financial side of divorce settlements, indicating that systems do not solve problems but merely help decide how the “winner” will be treated. Arguably, this demonstrates that
AI-ODR systems are currently not sufficiently sophisticated to deal holistically with
dispute resolution processes; and that those who employ them are aware of this fact.
97
Feliksas Petrauskas and Kybartiene E, ‘Online Dispute Resolution in Consumer Disputes’ (2011) 18 Jurisprudence 921,
935.
98
William Zurilla, ‘Alternative Dispute Resolution’ (1997) 45 LA Business Journal 352.
99
James E Cabral and others, ‘Using Technology To Enhance Access To Justice’ (2012) 26 Harvard Journal of Law & Tech-
nology 241 <http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=90231665&site=ehost-live>.accessed
26 September 2021.
100
Soufiane El Jelali, Elisabetta Fersini and Enza Messina, ‘Legal Retrieval as Support to EMediation: Matching Disputant’s
Case and Court Decisions’ (2015) 23 Artificial Intelligence and Law 1.
101
Janet Rifkin, ‘Online dispute resolution: Theory and practice of the fourth party’ (2010) 19 Conflict Resolution Quarterly
117.
102
Sela (n 60) at 91.
103
Petrauskas and Kybartiene (n 97) at 936.
104
Abrahams, Bellucci and Zeleznikow (n 59) at 3.

Nonetheless, it can be argued that there have been attempts to automate the more
abstract or human aspects of dispute resolution. For example, the SmartSettle system
allows the use of ‘satisfaction values.’105 Consequently, something which would tradition-
ally be regarded as a personal and internal experience can be translated into something
which can be grappled with by a machine.106 Although it might appear that this is a more
direct connection between the user and the system, it can be countered that this process
does not represent the machine ‘understanding’ the desires of a human party; rather it is a
codification of the avowed human experience that enables the machine to match certain
preferences with certain outcomes. There is a vast difference between a mediator under-
standing a party’s desire to, for example, repair a relationship with a previously antagon-
istic business partner and a system’s rating of that desire as a ‘9/10 priority;’ AIs lack the
ability to see the qualitative ‘worth’ of a solution. They cannot consistently use human
emotions in a positive way to enhance communication or defuse conflict.
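A short sketch may help to illustrate the limitation being described: once preferences are codified as satisfaction values, the system can only optimise against the numbers that have been avowed. The values, outcome labels and function name below are invented.

```python
# A hypothetical sketch of how declared 'satisfaction values' reduce preferences
# to numbers a machine can score. The point is the limit discussed above: the
# system can only optimise against what the parties have avowed, so a hidden
# value (e.g. a desire simply to be acknowledged as 'right') never enters the
# calculation.

def package_score(package, declared_values):
    # Sum the declared satisfaction value of every outcome in a proposed package.
    return sum(declared_values.get(outcome, 0) for outcome in package)

declared_a = {"keep_patent": 9, "repair_relationship": 6, "cash_payment": 3}
declared_b = {"keep_patent": 4, "repair_relationship": 2, "cash_payment": 8}

proposal = ["repair_relationship", "cash_payment"]
print("Party A:", package_score(proposal, declared_a))   # 9
print("Party B:", package_score(proposal, declared_b))   # 10
# A party's undeclared wish to be vindicated scores zero, however important it is.
```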
It can be further argued that distributive benefits are only half the story. In a significant
number of disputes the parties seek an acknowledgement that they are “right.”107 This
highlights one of the major issues facing the use of AI in ODR; namely, that certain
desires might not be achievable via the application of AI. Instead, they require direct
human interaction. It is notable that regardless of outcome, there is a higher satisfaction
level associated with traditional face-to-face ADR than ODR. This is arguably due to the
openness and officiousness of the process.108 It is exacerbated where an individual has
hidden values, which an AI machine cannot access or even suspect. As described
above, the Israel-Palestine dispute has been subjected to the Adjusted Winner system.
The solution provided chimes with that provided by some of the foremost experts of
the time. Nevertheless, it can be seen that even the most “optimal” solutions were redun-
dant due to subversive human values; for example, an inherent distrust of other parties.109
This is because rational systems often struggle with irrational values, which could easily be
understood by a human third party.
Furthermore, it can be argued that in seeking to translate human desires certain desires
may be obfuscated by negotiating parties. For example, people are unlikely to openly state their desire to be “right”, fearing that they will be seen as egotistical or prejudicial. This is not to
say that the in-building of such inputs is impossible. To the contrary, it has been suggested
that via the proper inputting of background information on parties, innate desires such as
this might be catered for.110 For example, in order to maintain an even-keeled negotiating
environment, a party who is regarded as prone to nervousness might be offered a minor
concession on the basis that this will relax their attitude towards future points of
dispute.111 However, the lack of willingness of individuals to be privy to such profiling pro-
cesses and their concurrent desire to hide information that might aid these processes is

105
Carneiro (n 9) at 228.
106
Ibid.
107
Thomson Reuters, The Impact of ODR Technology on Dispute Resolution in the UK (Thomson Reuters, 2016) 19
Available at: https://blogs.thomsonreuters.com/legal-uk/wp-content/uploads/sites/14/2016/10/BLC_ODRwhitepaper.pdf
[Accessed 20 September 2021].
108
Ibid.
109
John Zeleznikow, Emilia Bellucci, ‘Using Asset Divider to investigate the Israel - Palestinian dispute’ (6th International
Workshop on Online Dispute Resolution, December 2010) <http://ceur-ws.org/Vol-684/ODR2010proceedings.pdf>
accessed 26 September 2021.
110
Carneiro (n 9) at 237.
111
Ibid.

arguably a large obstacle to overcome. This is of course not an issue which is inherent to the
use of AIs in ODR, but rather a limitation which may be dealt with by human third parties
without their explicit knowledge that this is what they are doing.112
This issue is further aggravated by the fact that a human needs to enter (and sometimes interpret) the relevant information regarding the dispute into the AI system.113 This means that any AI-driven system operates at the whim of the third-party operator, who will be subject to their own prejudices and may fail to notice relevant factors in the parties.
Consequently, this will curtail the ability of AIs to render solutions which might be
more effective than those noticed by traditional third parties. However, if this can be over-
come, AI can introduce ‘out of the box’ thinking.114 Nevertheless, the prospect of devel-
oping this technology is somewhat of a chicken-and-egg situation. Without sufficient
advancement, there is a lack of willingness to use AI-systems on qualitative, rather than
quantitative, matters. At the same time, without a will to use AI-systems to address quali-
tative issues, the avenues for advancement are slim due to a reliance on older models of
human-machine interface, which reduces the role of AI in ODR.

3.2. The overarching principle problem


A further translation issue is how over-arching goals might be achieved; for example,
“welfare” or “success.” Occasionally, disputes require the programming of a stated prin-
ciple into an AI system. To do this, parameters need to be drafted, above and beyond those which appear in legislation. For example, if Section 1 of the Children Act 1989
(“child’s welfare shall be the court’s paramount consideration”) were to be entered into
an AI system, the programmer would need to make value judgments on how to interpret
the terms ‘welfare’ and ‘paramount’. This is not to say that the judiciary or a traditional
third party do not engage in similar value judgments. However, these value judgements
can be considered soft and abstract; whereas to provide hard edges is another matter
altogether. Another example is under Section 172 of the Companies Act 2006, which
states that “a director of a company must act in the way he considers, in good faith,
would be most likely to promote the success of the company for the benefit of its
members as a whole.” However, success is abstract. It might mean profitability, size, long-
evity, reputation; all these notions are legitimate, but reasonable people will reasonably
disagree on their specific contents. Therefore, asking an AI to act with a quasi-infinite
goal in mind is an easy way to render results which can be considered a failure, either
because the system can never consider its work to be done, or because the solutions it
provides become extreme and absurd.115 Thus, there may be scenarios which are not suit-
able for AI-ODR in the near future.
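To make the point concrete, the following is a minimal, purely hypothetical sketch (in Python, with invented criteria, weights and thresholds; it does not describe any existing ODR product) of what giving the welfare principle 'hard edges' would involve. Every number and category below embodies a value judgment made by a programmer rather than by Parliament or a judge.

```python
# Hypothetical sketch only: 'hardening' the open-textured welfare principle of
# s 1 of the Children Act 1989 into explicit parameters. The criteria, weights
# and threshold are invented; each one is a programmer's value judgment.

WELFARE_WEIGHTS = {               # how much each assessor-scored factor counts
    "emotional_stability": 0.40,
    "continuity_of_schooling": 0.25,
    "contact_with_both_parents": 0.25,
    "financial_security": 0.10,
}
PARAMOUNTCY_MARGIN = 0.05         # how large a welfare gap overrides other factors


def welfare_score(arrangement: dict) -> float:
    """Collapse 'welfare' into a single number between 0 and 1."""
    return sum(w * arrangement[k] for k, w in WELFARE_WEIGHTS.items())


def prefer(a: dict, b: dict) -> str:
    """Treat welfare as 'paramount': choose the clearly higher-scoring option."""
    gap = welfare_score(a) - welfare_score(b)
    if abs(gap) < PARAMOUNTCY_MARGIN:
        return "refer to a human mediator"    # the system cannot tell them apart
    return "arrangement A" if gap > 0 else "arrangement B"


# The 0-1 scores would themselves have to be supplied by a human assessor.
a = {"emotional_stability": 0.8, "continuity_of_schooling": 0.6,
     "contact_with_both_parents": 0.5, "financial_security": 0.7}
b = {"emotional_stability": 0.6, "continuity_of_schooling": 0.9,
     "contact_with_both_parents": 0.7, "financial_security": 0.4}
print(prefer(a, b))               # -> 'refer to a human mediator'
```

Even in this toy form, the choice of criteria, the weights attached to them and the margin at which the system defers to a human are contestable policy decisions of precisely the kind that the statute leaves to judicial discretion.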
An additional problem is that unwritten rules also govern dispute resolution.116 For
instance, a commercial dispute might revolve around the meta-principle that any
112 Peter Guy, 'The biggest limitation of artificial intelligence is it's only as smart as the data sets served' (South China Morning Post, 4 February 2018) <http://www.scmp.com/business/china-business/article/2131903/biggest-limitation-artificial-intelligence-its-only-smart> accessed 12 September 2021.
113 Petrauskas and Kybartiene (n 97) at 936.
114 Ibid.
115 Paul Ford, 'Our Fear of Artificial Intelligence' (MIT Technology Review, 11 February 2015) <https://www.technologyreview.com/s/534871/our-fear-of-artificial-intelligence/> accessed 22 September 2021.
116 Carneiro (n 9) at 226.

resolution forum should maintain an environment of commercial freedom,117 or the


uncodified rule that customer satisfaction is to be maintained regardless of the validity of the customer's claim. It can be argued that, in the absence of efficient machine learning, incorpor-
ating these principles into an AI-ODR system would be difficult. Further, it would remove
the inherent flexibility currently existing within ADR.
It can also be argued that to regard any advanced AI as having a complete picture of a
given dispute would be foolhardy, regardless of how advanced its inputs are. To return to
the Israel-Palestine case study above, it was noted by those who used ODR-systems on the
dispute that for all the positive impact of the agreement in real life, the surprise assassina-
tion of the Egyptian President following the Accords upended their stability. This indicates
that even a comprehensive AI-ODR system will be prone to the effects of the ‘principle of
chaos', just as any traditional third-party resolution is.

3.3. The potentially normative effect of AI in ODR


In addition to issues of input, it can be argued that there is significant potential for AIs to
have a substantive normative effect on the dispute resolution process. Although a com-
prehensive description of the normative effects of technology is beyond the scope of this
paper, it can be said that the way any given technology operates impacts upon society.118
For instance, televised media can be said to increase society’s awareness of international
issues.119 Arguably, the reduction of traditional negotiation to a numbers game may have negative effects. For example, the Family_Winner system attempts to take an
emotional and personal process and render it mathematical.120 This can be said to
have an effect on the divorce process via the emphasis on the impersonal and the de-
emphasis of the personal.121 That is to say there has been a normative effect caused by
the Family_Winner system. This is because it has changed the way the parties would ordi-
narily negotiate. Whether such a normative effect would be positive or negative is subjec-
tive. On the plus side, there is a clear benefit to a system being able to de-escalate
situations which might be worsened via protracted traditional forms of dispute resolution.
For example, it may help to reduce the workload of the courts. By contrast, however, there is a correlation between personal, face-to-face negotiation and long-term satisfaction and conciliation.122 In short, AIs do not speak 'human'; rather,
humans are asked to speak ‘AI’.
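The point can be illustrated with a deliberately crude example (in Python; this is not the actual Family_Winner algorithm, and the items and ratings are invented): once each party's preferences are expressed as points, the 'negotiation' collapses into arithmetic.

```python
# Toy illustration only - not the Family_Winner algorithm itself. Each party
# spreads 100 points across the disputed items; a purely numerical rule then
# awards each item to whoever rated it more highly (ties go to party A).

def allocate(items, ratings_a, ratings_b):
    """Greedy, points-based allocation of disputed items."""
    allocation = {"A": [], "B": []}
    for item in items:
        winner = "A" if ratings_a[item] >= ratings_b[item] else "B"
        allocation[winner].append(item)
    return allocation

items = ["family home", "pension", "car", "family pet"]
ratings_a = {"family home": 50, "pension": 20, "car": 10, "family pet": 20}
ratings_b = {"family home": 40, "pension": 35, "car": 5, "family pet": 20}

print(allocate(items, ratings_a, ratings_b))
# {'A': ['family home', 'car', 'family pet'], 'B': ['pension']}
# The lopsided outcome shows what the numbers omit: the rule has no sense of
# overall fairness, nor of what the family pet means to either party.
```

Real systems are considerably more sophisticated, for instance by compensating the losing party on other issues, but the underlying move from feelings to figures is the same.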
Additionally, it can be argued that the increased use of machine learning in ODR (e.g.
the PERSUADER labour dispute resolution system)123 is a clear indication of how these nor-
mative processes might be compounded, via the creation of a unique AI ‘culture.’ If an AI
117 Katsh, Rifkin and Gaitenby (n 27).
118 Tsjalling Swierstra, 'Identifying the normative challenges posed by technology's "soft" impacts' (2015) 9 Nordic Journal of Applied Ethics 5, 14.
119 Ibid.
120 Davide Carneiro and others, 'Enriching Conflict Resolution Environments with the Provision of Context Information' (2017) 34 Expert Systems 1.
121 Thomson Reuters (n 107) at 19.
122 Jennifer Parlamis, 'Face-to-Face and Email Negotiations: A Comparison of Emotions, Perceptions and Outcomes' (2010, USF: Organization, Leadership and Communication) <https://pdfs.semanticscholar.org/1cb8/b183a5b85c0dbb7cb6b0a90462d8a3a2ab41.pdf> accessed 20 September 2021.
123 Arno R Lodder and John Zeleznikow, 'Developing an Online Dispute Resolution Environment: Dialogue Tools and Negotiation' (2005) 10 Harvard Negotiation Law Review 287.

system is learning from past successes, it is foreseeable that it will develop efficient tem-
plates to deal with future disputes. Thus, the AI system will have an incentive to ensure
that the disputes that it deals with fit those templates. For example, a scenario can be con-
sidered in which a number of workers for a mining company call for a strike, on the basis
that the company’s healthcare policy does not cover certain industry-related diseases. A
well-trained labour dispute AI system might have deduced from previous experience that
the most efficient way to deal with labour disputes is simply to offer pay rises, whereas a traditional third party is likely to recognise the benefit of an expanded healthcare policy. Such an example demonstrates the downside of employing systems which perpetuate a norm of maximised efficiency: unless the norms they create are explicitly resisted, they can have unintended effects.124
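A stylised sketch of this dynamic is set out below (in Python, with invented data and remedy names): an agent that ranks remedies solely by how quickly they settled past disputes will keep recommending its historically fastest template, whatever the present grievance actually concerns.

```python
# Hypothetical sketch of the 'template' problem: an agent that ranks remedies
# purely by how quickly they settled past disputes will keep recommending its
# historically fastest fix. The data and remedy names are invented.

from collections import defaultdict

# (remedy, days_to_settlement) pairs harvested from past labour disputes
history = [
    ("pay rise", 12), ("pay rise", 9), ("pay rise", 15),
    ("expanded healthcare", 40), ("extra leave", 30),
]

def learn_template(past):
    """Average settlement time per remedy; quicker is treated as 'better'."""
    totals, counts = defaultdict(float), defaultdict(int)
    for remedy, days in past:
        totals[remedy] += days
        counts[remedy] += 1
    return {remedy: totals[remedy] / counts[remedy] for remedy in totals}

def recommend(dispute_topic, template):
    # Note: the dispute's actual subject matter never enters the ranking.
    return min(template, key=template.get)

template = learn_template(history)
print(recommend("uncovered industry-related disease", template))  # -> 'pay rise'
```

The subject matter of the dispute never enters the ranking, which is precisely how a norm of maximised efficiency can crowd out better-fitting solutions.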
This is not to say that the creation of certain norms is entirely negative.125 For example,
eBay’s dispute resolution process solves more than 60 million conflicts a year.126 Arguably,
the quasi-objective nature of its automated ODR processes could be used as an opportu-
nity to defuse previously stressful situations: parties who might be described as “deadbeat
buyers” were instead described by their ODR tool as “non-paying bidders.”127 This can be
regarded as an example of the static, uncompromising nature of computerised systems
conferring an additional benefit to ODR.

3.4. Recommendations regarding issues of machine translation


In the absence of forthcoming intelligent interface systems, there needs to be a significant
bridging exercise undertaken by traditional third parties. This will aid the communication
process between automated ODR systems and negotiating parties. In the long term, intel-
ligent interface systems will be key to effective ODR processes. This is because this will be
the only way that an ODR AI system might become aware of all of the options available to
it in solving the dispute at hand.
Furthermore, the issue of AI systems applying overarching principles will require those
dealing with a given dispute to reflect upon their own overarching principles, in order to
state them in a form which might reasonably be applied by an AI system in an ODR
setting. Additionally, many principles will not exist explicitly in legislation and so will
require identification and validation before they can be allowed for. This might be a rela-
tively simple process, if the principles are ones which deal with the more concrete aspects
of the world. For instance, an AI system settling commercial disputes might need to have
in the back of its mind the need to ensure that the UK maintains a competitive business
environment.128 This will require identification and replication within ODR systems. It is,
for example, easy to declare justice as being a driving factor of dispute resolution, but
it is a different matter altogether to describe the concept in terms that an AI system
might understand. In the future, Parliament’s input regarding the interpretation of legis-
lation may further facilitate the role of AI in ODR.
124 Swierstra (n 118) at 16.
125 Herm Jooston, Josee Bloemer and Bas Hillebrand, 'The Effects of Third-Party Arbitration: A Field Experiment' (2016) 50 Journal of Consumer Affairs 585.
126 Katsh, Rifkin and Gaitenby (n 27).
127 Rule (n 96).
128 Jordan Daci, 'Legal Principles, Legal Values and Legal Norms: are they the same or different?' (2010) International Scientific Journal <http://www.academicus.edu.al/nr2/Academicus-MMX-2-109-115.pdf> accessed 21 September 2021.

There is also substantial value in doubting the abilities of AI systems to operate entirely
holistically in the case of ODR. As noted above, AI systems are just as prone to the effects
of randomness as any traditional third party. Therefore, whilst AI systems might become
exceedingly efficient in their processes, they should not be thought of as acting perfectly.
In short, the value of proper oversight should not be disregarded, particularly in the field
of high-stakes negotiation processes.129
Finally, there is much to be said for an awareness of the norms that might inadver-
tently be soft-coded into an AI-ODR system. Rather than blindly accepting the per-
ceived wisdom of AI systems, questions should be asked regularly of the ODR
culture that is being created by those AI systems and how that culture is affected
by the dispute resolution process. It will also be necessary for such norms to be re-
examined at regular intervals. This will ensure that AI systems do not find themselves
making faulty assumptions, as a result of their past experiences.130 Although this “hill-
climbing problem” is well known,131 no substantially developed solution exists. None-
theless, this is not to say that all norm-setting will be detrimental. As noted above,
ODR systems have the potential to deescalate and sanitise those disputes which
would proceed negatively in a traditional ADR environment. Further, as long as there is a keen awareness of these effects, such norm-setting can be put to positive ends rather than negative ones.
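For readers unfamiliar with the term, the sketch below (in Python, using an invented one-dimensional objective) illustrates the hill-climbing problem in its simplest form: a search that only ever accepts improvements settles on an early, inferior answer and will not revisit it unless it is deliberately re-examined.

```python
# Minimal illustration (invented objective) of the 'hill-climbing problem':
# a greedy search that only ever accepts improvements can settle on a local
# optimum and keep treating that early result as the best available answer.

import random

def quality(x: float) -> float:
    # Two peaks: an inferior local optimum at x = 1 and a better one at x = 4.
    return -(x - 1) ** 2 + 2 if x < 2.5 else -(x - 4) ** 2 + 5

def hill_climb(start: float, step: float = 0.1, iters: int = 200) -> float:
    x = start
    for _ in range(iters):
        candidate = x + random.choice([-step, step])
        if quality(candidate) > quality(x):   # only ever accept improvements
            x = candidate
    return x

random.seed(0)
print(round(hill_climb(start=0.0), 2))        # -> 1.0: stuck at the inferior peak

# Periodic re-examination - crudely modelled here as random restarts - is one
# common mitigation in search problems:
best = max((hill_climb(random.uniform(0, 5)) for _ in range(10)), key=quality)
print(round(best, 1))                         # almost always near the better peak at x = 4
```

Random restarts are, of course, only a crude proxy for the kind of periodic institutional re-examination argued for above, but they capture why 'set and forget' learning is dangerous.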

4. AI and access to justice


As with the development of many new technologies, the development of AI in ODR is
ostensibly a process of liberalisation; representing the growth of easier, cheaper and
more accessible and efficient resolution processes.132 However, this assertion can be critiqued as an oversimplification of the long-term effects that AI might have within ODR. Indeed, it can be asserted that the opposite is true: the implementation of AI may prove detrimental to enhancing access to justice.133 Arguments regarding this process can be split into two. First, it can be argued that the removal of traditional third parties will have certain positive effects. Second, it can be posited that the nature of AI as a technology means that it will create a two-tiered system of dispute resolution. Both perspectives are discussed in more detail below.

4.1. The benefits of the dehumanisation of the dispute resolution process


Most new technology is expensive in the early stages of implementation. Its true potential is therefore not realised until it is refined and available in cheaper forms. As a result, only the wealthiest organisations benefit from the technology at first. For example, when mobile phones were first introduced, they were reserved for high-end businesses,
129 Christopher Mims, 'Without Humans, Artificial Intelligence is Still Pretty Stupid' (Wall Street Journal, 12 November 2017) <https://www.wsj.com/articles/without-humans-artificial-intelligence-is-still-pretty-stupid-1510488000> accessed 26 September 2021.
130 Engr. Chijindu, 'Search in Artificial Intelligence Problem Solving' (2012) 5 African Journal of Computing and ICT 37, 38.
131 Ibid, 38.
132 Thompson (n 52) at 5.
133 Suzanne Van Arsdale, 'User Protections in Online Dispute Resolution' (2015) 21 Harvard Negotiation Law Review 109, 109.

long before they became commonplace. It can be argued that this will also occur with AI-ODR
techniques.134 Thus, just as state-appointed mediators have been made part of the tra-
ditional judicial sphere, it can be predicted that a similar phenomenon will occur with
AI-based ODR.135 This appears more likely when one considers the advantages that the
technology might one day have over traditional third parties, such as more effective
and swifter communication, and reduced costs.
As such, the argument that AI will one day fulfil a demand within ODR is not beyond
the imagination. Many individuals find themselves involved in civil disputes, without the
ability to pay for the services of a professional advisor (e.g. lawyers, dispute resolution
experts).136 Instead, these disputants are left to either self-represent or refuse to pursue
what are often rightful claims.137 This is aggravated by several factors, including court clo-
sures and increased fees.138 Furthermore, the lack of legal aid is problematic. Unless indi-
gent individuals find themselves lucky enough to receive the services of a pro bono lawyer
or legal aid (an unlikelihood in the wake of the Legal Aid, Sentencing and Punishment of
Offenders Act 2012),139 their ability to access justice is reduced significantly. This indicates
that there exists a gap in the market. Where the traditional human actors involved in the dispute resolution process are expensive, AI systems, once sufficiently developed, are likely to be relatively cheap to run, subject only to maintenance costs.140
Moreover, AI in ODR may help individuals to find professional advisors. For example, the GetAid system automates the process of applying for legal aid, freeing up significant resources which could be deployed elsewhere.141 There is little to suggest that this might not occur in all areas of dispute resolution: labour-intensive processes could be undertaken by an AI system, releasing resources to be employed elsewhere. Therefore, the resolution process may become
cheaper and easier for everyone involved. This is what systems like Family_Winner have
achieved to a certain extent; lowering the cost of professional divorce settlement to
make it available to a wider market. Also, in the case of the ODR systems used by eBay
and PayPal, AI is employed as a means of promoting access to justice for consumers, or
traders who would ordinarily not have access to dispute resolution processes.142 In con-
trast, if these disputes were solved using traditional methods of dispute resolution, settle-
ment would require more time and effort, wasting important resources. This often leads
to litigants having no choice but to lose their claims or opportunities to enforce their
rights.143 It is therefore unsurprising that more than 80% of the disputes that have

134 Carrie Menkel-Meadow, 'Ethics in Alternative Dispute Resolution: New Issues, No Answers from the Adversary Conception of Lawyers' Responsibilities' (1997) 38 South Texas Law Review 408, 413.
135 Jenner (n 54).
136 Zeleznikow (n 58) at 30-31.
137 Ibid.
138 'Access to Justice' (The Law Society, 2018) <http://www.lawsociety.org.uk/policy-campaigns/campaigns/access-to-justice/> accessed 22 September 2021.
139 Catherine McKinnell, 'Lawyers can't be expected to plug the gap in legal aid provision' (The Guardian, 2015) <https://www.theguardian.com/commentisfree/2015/nov/06/lawyers-legal-aid-lawyers-justice> accessed 20 September 2021.
140 Zeleznikow (n 58) at 32-33.
141 Ibid at 37.
142 Rule (n 96).
143 Lilian Edwards and Ashley Theunissen, 'Creating Trust and Satisfaction Online: How Important Is ADR? The UK eBay Experience' [2006] Connecticut Law Review 1.

been settled by eBay involved the use of AI software.144 This evidences the clear poten-
tial of AI in the future of dispute resolution. One may then wonder how many disputes can
be resolved on a daily basis if this technology is developed and rendered available around
the globe.
Arguably, if such technology can resolve the simplest claims, complex cases, which
may normally struggle to receive sufficient court time, will get the diligence they
require. However, this is not without drawbacks. For example, using AI to resolve disputes
relating to minor criminal offences (such as speeding) may certainly reduce the burden on
the courts, but does not enhance the fairness of outcomes. It is likely that many individ-
uals would lack the awareness that by pleading guilty, they would be accepting a criminal
offence on their record. This would be particularly problematic for individuals who are
subjected to DBS checks. As such, the implementation of AI in ODR must be done with
caution. A similar scenario can be said to have emerged in the public sphere in the
case of the DoNotPay service, which subjects simple citizen-government disputes (such as parking tickets) to an automated ODR service.145 Nonetheless, what these examples
show is that AI in ODR should not be regarded as a complex technology which will even-
tually be simplified for majority use. It is sometimes a ‘trickle-up’ technology, one that is
relatively simple when applied widely, but might be tailored and developed into more
complex forms for minority use.146

4.2. How AI can inhibit access to justice


Whilst it is certainly tempting to regard any new technology with an air of optimism, it is
important to critically analyse the technology’s potential and identify any negative impact
upon equality. This allows for calculated measures and safeguards to be implemented to
mitigate the risk of the negative impact.
As such, before AI in ODR is implemented in any jurisdiction, such an analysis must be
undertaken. First, it must be determined whether the minimum level of familiarity and
knowledge has been achieved across the board. As discussed above, there are substantial
barriers to human-AI interfacing, due to the innate translation processes. Thus, in order for
individuals to easily access the services offered by AI in ODR, they would require a sus-
tained ability to interact confidently with modern technology,147 which is not a given
yet. This may be particularly problematic for the older generation, who may be less com-
puter-literate.148 As such, for them to truly benefit from AI in ODR, education would be
required. If this education is not provided, an access gap is likely to develop.149 Therefore,
one might argue that providing access to AI in ODR alone is not enough. For example, the UK government's attempts to move towards a digital world may leave

144 Sanjana Hattotuwa, 'Conversation with Colin Rule, Director of Online Dispute Resolution for eBay and PayPal' (ICT for Peacebuilding, 2006) <https://ict4peace.wordpress.com/2006/09/21/conversation-with-colin-rule-director-of-online-dispute-resolution-for-ebay-and-paypal/> accessed 10 September 2021.
145 Gibbs (n 95).
146 Jason Kornwitz, 'Why "trickle-up" innovation may shape the global economy' (Phys.org, 2012) <https://phys.org/news/2012-07-trickle-up-global-economy.html> accessed 15 September 2021.
147 Jenner (n 54).
148 Ibid.
149 Runddy Ramilo, 'Critical analysis of key determinants and barriers to digital innovation adoption among architectural organizations' (2014) 3 Frontiers of Architectural Research 431.

behind the 5 million people in the UK who have never used the Internet.150 Consequently,
education must precede deployment to support individuals to engage in the digital
world.151 Thus, if AI-based ODR becomes the primary means of solving disputes, and
without any analogue alternative, it may well prevent a certain stratum of society from
accessing justice.
It can also be asserted that once AI has been sufficiently developed in the private
sphere, a two-tier dispute resolution system may be created. This will be composed of
expensive service providers with access to highly efficient AI systems, and cheaper
service providers without such access. This is arguably not hard to imagine. For instance, once pre-
liminary supportive AI services are developed in the area of knowledge support, a gap
develops between those parties. On the one hand, some individuals have access to AI
systems and are able to deal with large amounts of disclosed documents very quickly.
In contrast, those who have to deal with that same volume manually are seriously disad-
vantaged. Jenner describes this as a ‘knife to a gun-fight’ scenario.152 If this occurs,
particularly in complicated commercial disputes involving many corporations, there
may be an inequality of arms, thereby leading to a scenario where only some parties
have access to justice.153 Arguably, this is likely to be a common feature since the
current concentration of AI research related to complex commercial cases suggests an
association between the development of AI systems and wealthy commercial firms.154
For example, Susskind indicates that the 'magic circle' law firms have already signed up with, and invested in, AI providers.155 In this regard, only those individuals that can afford such law firms will benefit.
Notwithstanding this potential disadvantage, one might argue that it is also a problem
with litigation and traditional ADR mechanisms. For example, one party is able to afford a
multi-person legal team made up of Queen's Counsel and magic circle lawyers, whilst the
other may only be able to afford high-street lawyers. Nevertheless, AI does not make any
original significant contribution to dispute resolution if it is unable to overcome common
obstacles confronted by litigation and traditional ADR mechanisms. As it stands, an indi-
vidual can opt for free dispute resolution services (if available, depending upon the law in
issue) and thereupon can decide to pay more for better and more comprehensive
services – for example by hiring a magic circle law firm to assist in arbitration. The
market is on a gradient, with a correlation between quality of outcome and cost of
lawyer.156 As such, as long as an individual or company is of moderate means, they
may well be able to afford representation which is commensurate with the value or com-
plexity of the case. In contrast, if a two-tier system develops, there will be a marked differ-
ence between those companies that are able to deploy advanced dispute resolution
technologies like AI systems and those that cannot. A situation might therefore

150 Emma Bailey, 'Digital by Default?' (We are Citizens Advice, 2018) <https://wearecitizensadvice.org.uk/digital-by-default-e91f6711927> accessed 22 September 2021.
151 Ibid.
152 Jenner (n 54).
153 Richard Chernick, 'ADR Comes of Age' (2004) 4 Pepperdine Dispute Resolution Law Journal 187.
154 Carneiro (n 9) at 219.
155 Richard E. Susskind, Tomorrow's Lawyers: An Introduction to Your Future (OUP 2017).
156 Chris Hanretty, 'Lawyer rankings either do not matter for litigation outcomes or are redundant' (2016) 23 International Journal of the Legal Profession 185.

develop where the determinative factor, in any given dispute, is not which party is in the
right, but rather which party has access to AI technology.
The disparity of a two-tier system might further evolve, via the erosion of those
dispute resolution services which are used by those who cannot afford premium pro-
viders. Precursor AI systems tend to be employed to deal with those tasks which
require less experience: low-level cases, document and case management. These
also tend to be those tasks which are performed by junior lawyers or legal assistants,
due to the low experience threshold.157 At the same time there is a tendency for junior
lawyers to be the providers of pro bono services, due to the correlation between
simpler cases and clients who lack means.158 Complex and high value commercial
cases on the other hand tend to involve companies with the ability to pay for the
required level of advice. It can therefore be argued that the development of AI in
ODR may reduce the availability of pro bono advice.159 This is because there will be
a lower demand for junior lawyers, since their functions will be replaced by AI
systems.160 Obviously, there will still be junior lawyers (else how might senior
lawyers exist?) but they will, in the future, find themselves operating outside of
major firms that can afford AI replacements for their labour. They are likely to
instead be part of far smaller satellite firms that are less capable of offering pro
bono or low cost services.161 Thus, with the absence of junior lawyers in large firms,
the availability of a pro bono workforce may be lowered.162
It can further be contended that a market dumping phenomenon may occur. AI will be
either purposively or inadvertently used to reduce the proportion of the market that pro
bono or low-cost lawyers hold. At the same time, AI systems can be offered as a replace-
ment for those lawyers; thus, securing the hold of AI upon the market as the primary
alternative for those who can only access lower cost or free legal services. Whether
such a phenomenon is a concern for access to justice on the whole is dependent on
how accessible those replacement AIs are to the average disputant. If access is guaran-
teed through either charitable intent on the part of the developers or government regu-
lation, then the proposed phenomenon will be of little danger to access to justice.
Conversely, if mismanaged, the phenomenon has the potential to negatively affect how the traditionally disempowered access justice.
Even if such a phenomenon does not take place, it can further be contended that by
design, it is likely that there will be an emphasis on AI-led ODR dealing with lower value
cases. Indeed, this is the future predicted by Lord Justice Briggs in his call for the creation
of online court systems, with a financial threshold that must be crossed before access to
traditional courts is permitted.163 This will be of little import if fully competent AI systems
are developed which can match those services that might be provided by traditional third

157 Art Cockfield et al, 'How will artificial intelligence affect the legal profession in the next decade?' (Queen's University, 2018) <https://law.queensu.ca/how-will-artificial-intelligence-affect-legal-profession-next-decade> accessed 20 September 2021.
158 The Law Gazette, 'Record number of junior lawyers working pro bono' (Law Gazette, 7 November 2013) <https://www.lawgazette.co.uk/practice/record-number-of-junior-lawyers-working-pro-bono/5038629.article> accessed 26 September 2021.
159 Ibid.
160 Cockfield (n 157).
161 Ibid.
162 Jenner (n 54).
163 Lord Justice Briggs (n 8).

parties. However, there is good reason to believe that there will be a marked difference in
the nature of the justice meted out in a two-tier court system such that the message sent
is essentially that the progression of a case is dependent not on the notion of justice but
rather on the notion of financial value. In other words, although a two-tire court system
will facilitate litigation of the larger and complex cases (as courts will essentially deal with
such cases), access to justice might become the preserve of the rich, with only an
AI facsimile available to the poor.

4.3. Maximising access to justice


It will be important to ensure that access to AI systems is not dependent upon access to technology as a whole; otherwise, certain users would be disadvantaged. This will require one of two things: either the creation of an intuitive interface which can be
utilised by the vast majority, or the deployment of intermediaries to aid communication
between clients and AI systems. Obviously, this second solution is not ideal in that it par-
tially counters the major benefit of AI-use: the removal of the time/resource constraints
associated with human agents.
Of further interest are the gaps which might open up between parties with access to AI systems and those without. This is a difficult problem to solve. On the one hand, the
creation of a two-tier system will be detrimental to justice, but on the other hand, it would
be an unpalatable state of affairs to curtail the development of important technologies
with great promise. Perhaps the most beneficial outcome will be to rely upon the
natural processes of technological development to deal with this balance issue. Certainly,
those parties which are already advantaged (e.g. wealthier organisations and their clients)
will have access to advanced AI technologies before others, but over time it is likely that
the technologies will spread and become more prolific at all levels, just as any other com-
puter technology has in the past. Therefore, the most likely outcome is the unsettling
observation of disparity at the onset, which then levels off over time. Of greater
concern is the fact that changes in legal/ADR/ODR culture might occur as a result of
this temporary disparity which may not be rectified in the short term, especially if it is
accompanied by the erosion of the role of junior lawyers. In order to ensure that low-
cost dispute resolution services are maintained, there will either have to be a renewed
commitment of organisations using AI to provide pro bono services or a commitment
of additional government resources to the legal aid system.
Finally, it is important that AI-based resolution systems are not seen as a cheap and
easy way to deal with lower value cases if those AI systems are offering a substantially
worse service than their human counterparts might. To do so would be to create a
system in which the application of justice is dependent upon one’s bank balance.

5. Conclusion
This paper has critically examined the role that AI currently plays in shaping ODR, and how
that role might develop in the future to enhance the efficiency of ODR as regards the
settlement of legal disputes. It is shown that advances in technology and the continuous
increase in transnational transactions interconnecting developed and developing
countries will inevitably augment the demand for AI systems to settle disputes.

Historically, AI has played a role in the application of justice, preservation of rights, and
the promotion of social values. AI has been described as a phenomenon that shapes the
nature of legal discourse in ODR – as a so-called ‘fourth party’ negotiator. Hence, it is not a
mere tool to aid resolution, like a telephone or a calculator, but rather a means of resol-
ution or forum in itself. However, many commentators have observed that AI in ODR has
faced the same problem that has beset other applications of AI: slower than expected
development. The analysis in this paper therefore posits a pattern which holds true for AI research and implementation: an explosion of optimistic predictions, followed by a
recalibration of expectations once the realities of the technology become apparent.
However, there is good reason to believe that the ultimate benefit of AI in ODR will be
the creation of highly supportive systems which will uncork bottlenecks in the judicial
system and replace red tape and litigation with a process that is even more efficient
than traditional ADR models. Nonetheless, this will likely result in a two-tiered system
of dispute resolution. The consoling prospect is that although wealthier organisations
and their clients will have access to advanced AI technologies before others, over time
it is likely that the technologies will spread and become more prolific at all levels.
Nonetheless, blind reliance on AI systems could be detrimental if they are not re-examined periodically. Leaving AI to develop its own parameters for dispute resolution may cause it to drift away from what it was created for in the first place. Accordingly, critical observation of how the AI-based ODR culture evolves is of paramount importance. Another issue, which weighs at least as heavily as the machine translation problem, is whether AI-based ODR adversely affects access to justice. Thus, it is imperative that AI-
based ODR does not develop into an obstacle that exacerbates the problem of access to
justice for the less affluent. Ensuring that AI-based ODR complies with access to justice
standards will further enhance its effectiveness and efficiency as a dispute resolution
tool or forum.

Disclosure statement
No potential conflict of interest was reported by the author(s).
