The Role of AI Robotics in Arbitration

Emma Okonji examines the influence of artificial intelligence robotics on arbitration, drawing on the views of experts on whether emerging technologies will take over the roles humans play in dispute resolution

Until now, discussions have focused on how technologies like artificial intelligence (AI) robotics will take over the roles of workers in organisations and cause job losses for millions, given the rate at which organisations are adopting the technology.

Although the prospect of job losses has raised fears among workers across global organisations, technology experts are still battling to convince the world that the evolution of AI robotics will instead create additional jobs, contrary to the general belief that it will destroy them.

Just as the impact of AI robotics technology on company workers is being debated, some judicial experts who are arbitration specialists have also raised concerns that the same technology could influence arbitration cases.

Only recently, members of the International Court of Arbitration and Hogan Lovells partners Mr. Winston Maxwell and Mr. Laurent Gouiffès, along with Senior Associate Gauthier Vannieuwenhuyse, met at the firm’s Paris office to evaluate how AI, blockchain, and other technologies are changing the process of arbitration.
The experts discussed what new technologies meant for the future of arbitration, and whether humans or robots will play the primary roles.

Impact of AI technology on legal matters
At the recent Paris office debate, which was monitored by THISDAY, Gouiffès was of the view that AI builds greater efficiency and accuracy into the legal system with capabilities that include natural language processing (NLP). He said blockchain’s highly secure distributed-ledger feature could transfer information or property without third parties, and this has had an impact on contract law with the development of smart contracts. Capabilities like these spawned the LegalTech trend, which started in the United States and is now a fixture in Europe, supporting services such as automated contracts and online case management. But new technologies also create new challenges, such as compromised confidentiality, issues of due process, and more, Gouiffès said.

Technology and arbitration
Looking at what types of technologies can be applied to arbitration and in what ways they can be useful, Maxwell pointed to natural language processing, saying AI could analyse and extract meaning from thousands, tens of thousands, or hundreds of thousands of documents that may be relevant to litigation. This capability, he said, has been around for a long time in the form of e-discovery.

“But previously, AI was limited to looking for keywords, whereas now, it can actually extract meaning from written materials, e-mails, and voice conversations. So the most basic use of AI in arbitration or litigation is to help manage massive amounts of documentation that previously had to be reviewed and checked by junior lawyers,” Maxwell said.
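
To give a sense of the kind of document triage Maxwell describes, the sketch below is a deliberately simplified illustration, not something drawn from the Hogan Lovells discussion. It ranks a handful of hypothetical case documents against a disputed issue using plain TF-IDF similarity as a stand-in for the far richer meaning-extraction models used in real e-discovery platforms.

```python
# Illustrative sketch only: ranking case documents by relevance to a disputed
# issue, so human reviewers see the likely "hot" documents first. Real
# e-discovery tools use far richer language models than TF-IDF.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Email confirming delivery of goods under the supply agreement.",
    "Internal memo on unrelated staffing matters.",
    "Letter alleging breach of the delivery deadline in clause 4.2.",
]
issue = "late delivery and breach of the supply contract"

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(documents + [issue])
# Similarity of each document to the issue statement (last row of the matrix).
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

for score, doc in sorted(zip(scores, documents), reverse=True):
    print(f"{score:.2f}  {doc}")
```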

Vannieuwenhuyse described the AI tools as “predictive justice,” where arbitrators could use AI to analyse arbitration or court decisions in order to statistically derive probabilities about how an individual case is going to be decided.
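
A minimal sketch of what such a “predictive justice” estimate might look like is shown below. The case features, past outcomes, and model are all invented for illustration; real tools analyse far more than a handful of numeric features.

```python
# Illustrative sketch only: estimating how a case might be decided from
# hypothetical past awards. All data and features here are invented.
from sklearn.linear_model import LogisticRegression

# Each past case: [claim amount (EUR m), written contract (1/0), prior breaches]
past_cases = [
    [1.2, 1, 0],
    [5.0, 0, 2],
    [0.8, 1, 1],
    [3.5, 1, 3],
    [2.0, 0, 0],
]
# 1 = claimant prevailed, 0 = respondent prevailed (hypothetical outcomes)
outcomes = [1, 0, 1, 1, 0]

model = LogisticRegression().fit(past_cases, outcomes)

new_case = [[2.5, 1, 1]]
probability = model.predict_proba(new_case)[0][1]
print(f"Estimated probability claimant prevails: {probability:.0%}")
```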

According to Vannieuwenhuyse, “All these technologies may impact the key actors in arbitration proceedings.
“Take the example of counsel, or even the arbitrators themselves. When they use digital tools such as document management tools or NLP, this can save a lot of time and money. It’s especially relevant in the discovery phase. Sometimes we, as counsel, receive thousands of pages of documents, which would take a whole team a number of hours to review. But now we can have a tool or robot that can analyse the relevant data that is of crucial importance to our case,” he said.

“Another example is the digitalisation of the arbitration process, where arbitrators can use electronic submissions instead of sending hard copies. In arbitration cases, it is not unusual to have 300 exhibits and a brief of 200 pages, multiplied by five copies, because they need to be sent to the whole tribunal, whose members may be located in New Zealand, Switzerland, and the United States, and also to the other counsel — that’s a lot of documents to print,” Vannieuwenhuyse said.

He added: “You could also have hearings take place via a video platform. So instead of having a hearing located in Paris, the arbitrators will stay in, to take the same example, New Zealand, Switzerland, and the United States. No one is traveling, everybody stays in his or her office and uses the online platform to conduct the hearing. Of course, that saves on costs.”

This, he said, is extremely interesting for arbitration institutions, because it also expands the arbitration market to lower-value disputes, which historically have not really been the subject of arbitration, because arbitration might sometimes be costly.

Faster arbitration with AI robotics

In addressing how arbitrators could use AI robotics tools to make the discovery phase faster and more efficient, Maxwell said: “I think the most fascinating aspect of all this is whether arbitrators themselves can be robots. That gets into a philosophical question that’s not as absurd as it first sounds.”

He said: “With the development of blockchain, you have what they call ‘smart contracts,’ which automatically perform themselves. It is quite possible that you could agree, in a smart contract between Vannieuwenhuyse and myself, that if we have a disagreement, it will be referred to an outside artificial intelligence robot to resolve.”

AI and the fear of job loss
While looking at the fundamentals of AI robotics and whether arbitration is necessarily a human activity, Vannieuwenhuyse explained that the general view had been that it is extremely problematic from a legal standpoint.
“We can wonder whether it’s even lawful to have robots as arbitrators, first because there is no legislation that expressly addresses this possibility. It is not dealt with in the existing legislation because, of course, this issue was not envisaged as a possibility at the time of their drafting. And this raises a problem with the composition of the arbitral tribunal: in some legislations, the arbitrators are defined as persons, so by definition they cannot be robots. But in others, there’s a gray area, and as such the question remains unanswered,” he said.

During the discussion, the issue was raised of whether a decision rendered by a robot arbitrator in the form of code can be considered an arbitral award, but it was argued that in France, for instance, it would not be seen as a decision, because a decision needs to include legal reasons expressed in words to justify it.
Maxwell, however, explained that the overall limit, of course, is our constitutions and conventions on fundamental rights.

“The U.S. Constitution provides for due process and we have similar rights in Europe. Due process currently means that you have a right to a fair trial, and a fair trial currently means that humans are considering your situation, because humans combine strict applications of the law with more subtle considerations of equity. And I don’t think anyone would accept the legitimacy of robots as judges or arbitrators because they are not human, they don’t have a heart, and they don’t apply equity. So as soon as your arbitration needs to be enforced outside of the blockchain, an arbitral award by a robot currently will be considered null and void, and therefore unenforceable,” Maxwell said.
According to him, the more interesting question right now is what happens if there is no need to seek enforcement of these smart contracts at all.
“Because if the robot awards you 150 bitcoins, my account is automatically debited 150 bitcoins. It’s just done. There’s no court involved to enforce the award — it’s completely disconnected from the judicial system and the constitution,” Maxwell said.

But according to Vannieuwenhuyse, “In that case, you don’t need to enforce anything before any court because it will have been directly enforced. So it’s a completely closed circuit.”
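
The self-enforcing logic Maxwell and Vannieuwenhuyse describe can be sketched in ordinary code. The example below is purely illustrative: it is written in plain Python rather than on an actual blockchain, and the “AI arbitrator” is a placeholder function, not a real decision engine; the names and amounts are taken from the discussion only as an example.

```python
# Illustrative sketch only: a self-enforcing "smart contract" in plain Python.
# The AI arbitrator is a placeholder; a real contract would run on-chain.
from dataclasses import dataclass

@dataclass
class Party:
    name: str
    balance: float  # bitcoins held in escrow by the contract

def ai_arbitrator(dispute: str) -> float:
    """Placeholder for an external AI deciding how much is owed (in bitcoins)."""
    return 150.0

def smart_contract_settle(payer: Party, payee: Party, dispute: str) -> None:
    award = ai_arbitrator(dispute)
    # Enforcement is automatic: the contract moves the funds itself,
    # with no court or enforcement proceedings involved.
    payer.balance -= award
    payee.balance += award
    print(f"{payee.name} awarded {award} BTC; {payer.name} debited automatically.")

maxwell = Party("Maxwell", balance=500.0)
vannieuwenhuyse = Party("Vannieuwenhuyse", balance=200.0)
smart_contract_settle(maxwell, vannieuwenhuyse, "disagreement over delivery")
```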
Still on the debate over whether arbitration necessarily has to be human, Maxwell said the chairman of the ICC International Court of Arbitration, who attended Vannieuwenhuyse’s and Gouiffès’ event in January, was fascinated by that very question. Maxwell added that while we are all sitting comfortably in this room today, we have to consider that, in 10 years’ time, people will be thinking very differently and the idea of having robot arbitrators may be considered acceptable.
“We are all conditioned by our own cultural and historic context, and those could evolve over time.”

Challenges of technology to arbitration process

Looking at the challenges of new technology to the entire arbitration process, Vannieuwenhuyse said: “The challenges are not only related to blockchain, but to all new technologies. First, there is a challenge with confidentiality. It is a generally accepted principle that arbitration is confidential. However, recourse to digital technologies or AI will involve, to some extent, human input at the end.”

He explained that humans who are completely external to the arbitral proceedings will programme and handle these technologies, adding that this is an issue that needs to be addressed by an arbitral tribunal. He maintained, however, that a simple confidentiality agreement would be enough to protect the confidentiality of the proceedings.
Then there was the issue of due process, especially with the predictive justice tools. At present, these tools are not perfect, because they usually place the facts of the case and the reasoning of past courts at essentially the same level, and it was argued that this might prejudice the fundamental right to be heard. The experts agreed that if arbitrators blindly follow the results of predictive justice tools, the right of the parties to be heard will be prejudiced, because there is a risk that the arbitrators give too much weight to precedents compared with the actual facts of the case.

In addressing this issue, Maxwell gave the example of a criminal case in the U.S., Loomis, in which the sentencing judge used AI algorithms to compare his own sentencing decision with a computer-generated probability score of whether a given person would be a repeat offender.

That decision was challenged before the Supreme Court of Wisconsin, which held that the judge’s use of an AI tool was permitted, because he used the tool only for information and did not rely on it for his decision. So if it is just a tool to help a judge gather information and guide his decision, then it is acceptable, Maxwell said, but he added that such a tool obviously cannot replace the judge’s own decision, noting that the Loomis decision remains highly controversial.

According to Gouiffès, one of the big challenges is the question of control of the proceedings. Some authors have described AI and these new technology tools as the “extra arbitrator.” He said the question is whether these tools merely help arbitrators make a decision, or whether they make the decision themselves, with arbitrators simply following them because they trust them more than humans.

In his response, Maxwell said that as AI tools get better and better, the problems vis-à-vis humans will increase, because we as humans will give more and more weight to what the robot says, as if the robot cannot be wrong.
