AI arbitrator selection tools and diversity on arbitral panels


Allyson Reynolds
White & Case, New York
allyson.reynolds@whitecase.com

Paula Melendez
White & Case, Geneva
paula.melendez@whitecase.com

Advances in artificial intelligence (AI) have led to the rise of products aimed at facilitating the work of international arbitration practitioners. This article discusses the question of diversity in international arbitration and whether AI technology can help to bridge the diversity gap apparent in arbitral tribunals.

Arbitrator diversity is a key area of interest in international arbitration given the global spread of parties and their disputes, and yet, arbitral panels are distinctly homogenous, predominantly consisting of ‘older white males’.1 According to statistics released by the Stockholm Chamber of Commerce in 2019, only 23 per cent of appointed arbitrators were women, revealing a significant gender imbalance in arbitral appointments.2 Of course, gender is only one aspect of diversity.3 Diversity encompasses a range of characteristics including race, age, geography, language, and ethnicity. However, the data in these categories is no more favourable. For instance, the neutrals roster at the American Arbitration Association is approximately 23 per cent diverse for gender and race.4 A study of closed ICSID cases found that tribunals were composed entirely of Anglo-European arbitrators nearly half of the time (45 per cent), and only 11 cases (four per cent) were arbitrated by entirely non-Anglo-European panels.5

Numerous studies suggest that there is a positive correlation between increased diversity and improved team performance. Research in the corporate sector has shown that diverse perspectives reduce ‘groupthink’ while increasing creative problem-solving in boardrooms.6 Another study suggests that diverse groups focus more on facts, observing that racially heterogeneous panels in mock juries made fewer factual errors in discussing evidence than their homogenous counterparts.7 Extending these psychological findings to an arbitration context suggests that appointing a diverse tribunal of arbitrators may improve the efficiency and effectiveness of the tribunal as a whole.8 Institutional stakeholders also recognise the value of diversity in tribunals. In the 2018 International Arbitration Survey conducted by Queen Mary University of London and White & Case, 40 per cent of respondents expressed the opinion that diversity across an arbitral panel would improve the quality of the tribunal’s decision-making, while 19 per cent responded that the inquiry is irrelevant because diversity is inherently valuable.9

However, there are significant barriers to increasing diversity on arbitral panels. For one, data suggests that experienced arbitrators are favoured over new faces. In 2018, only 13 per cent of arbitrator appointments in LCIA cases were first-time appointments, down from 17 per cent in 2017.10 Furthermore, the confidential nature of arbitration and the opaque appointment process limit the information from which parties and arbitral institutions can identify and screen potential arbitrators, reducing the visibility of diverse candidates.11 Too often, the selection of arbitral candidates relies on arbitral institutions’ lists, business cards, word of mouth, basic web searches and frequently incomplete disclosures, a process which perpetuates appointments of the ‘same faces’.12

At the same time, AI has made its entry into the world of international arbitration, offering tools that may help to correct the diversity deficit.13 From applications that are ready for counsel to use, such as those assisting with document review (so-called Technology Assisted Review (TAR)), to more futuristic-seeming predictive AI software, which is useful for third-party funders as well as counsel, AI in international arbitration is here to stay.14

Tools applying AI to help users with the appointment of arbitrators have not yet come to market. Online databases using data analytics do exist, however, and some are already attempting to bridge the transparency gap in the arbitrator selection process. Catherine Rogers’ Arbitrator Intelligence Questionnaire (AIQ), for instance, aggregates biographical information about arbitrators as well as statistics on past decisions, such as claimants’ rates of recovery, with the aim of providing reports about arbitrators that can be purchased by counsel and third-party funders.15 AIQ aims to promote diversity by increasing information and reducing subjectivity in the arbitrator selection process. According to Rogers, more information allows newer arbitrators to be compared more fairly with experienced arbitrators on objective criteria and would help overcome the opacity described above.16 Similarly, GAR’s Arbitrator Research Tool aims to deliver ‘insight and raw data’ on arbitrators and lets users search by filters including gender.17 Applying AI to enhance the data analytics underlying such databases, trawling through vast amounts of data to suggest the ‘best’ candidate for a particular case based on the parties’ stated preferences, therefore appears to be the logical next step in the development of this technology.

Thus, now is the time to ask what impact, if any, AI can have on diversity in arbitral panels. AI arbitrator selection tools offer a potential avenue for improving diversity. Diversity is an objective that data scientists can model like any other. Should a client prioritise diversity in their selection, factors such as race, age and gender can all be built into a model to suggest new candidates. Alternatively, a chosen algorithm could turn a blind eye to race, geography, age and gender, focusing instead on suggesting candidates based on their expertise in particular areas of law, languages spoken, average time taken to render a final award, any potential conflicts of interest, and even availability.18
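
By way of illustration only, the following sketch shows how such a choice could be expressed in code. The candidate attributes, weights and figures are entirely hypothetical and do not reflect any existing tool; the point is simply that a ‘diversity-aware’ mode and a ‘blind’ mode differ only in which features the model is permitted to consider.

```python
# A minimal, hypothetical sketch (not any existing vendor's tool) of how an
# arbitrator-ranking model could either factor diversity in or blind itself to
# protected attributes. All field names, weights and figures are invented.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    expertise_match: float      # 0-1 overlap with the dispute's subject matter
    languages_match: float      # 0-1 overlap with the languages of the arbitration
    avg_months_to_award: float  # average time taken to render a final award
    has_conflict: bool
    gender: str = ""            # protected attributes, used only if opted in
    region: str = ""

def score(c: Candidate, prioritise_diversity: bool, panel_profile: set) -> float:
    """Rank a candidate; optionally reward attributes under-represented on the panel."""
    if c.has_conflict:
        return float("-inf")    # conflicts of interest are disqualifying
    base = (0.5 * c.expertise_match
            + 0.3 * c.languages_match
            - 0.2 * min(c.avg_months_to_award / 24, 1.0))
    if prioritise_diversity:
        # reward candidates whose gender or region is not already represented
        novelty = sum(attr not in panel_profile for attr in (c.gender, c.region))
        base += 0.15 * novelty
    return base

candidates = [
    Candidate("Candidate A", 0.9, 1.0, 10, False, "male", "Western Europe"),
    Candidate("Candidate B", 0.8, 0.8, 14, False, "female", "Latin America"),
]
ranked = sorted(candidates,
                key=lambda c: score(c, prioritise_diversity=True,
                                    panel_profile={"male", "Western Europe"}),
                reverse=True)
print([c.name for c in ranked])    # diversity-aware mode ranks Candidate B first
```

Switching `prioritise_diversity` to `False` simply removes the protected attributes from the calculation, which is the ‘blind’ alternative described above.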

A conflict with diversity may arise, however, where AI is expected to deliver greater predictive accuracy on outcomes. If users seek AI arbitrator selection tools that predict how a particular candidate would decide the specific facts of a dispute, the need for a long track record presents a significant barrier to entry for more diverse arbitrator candidates. A longer track record gives AI tools more data from which to predict future outcomes and therefore yields greater predictive accuracy.19 In the case of arbitrators, a track record is built on an individual’s experience in case management and decision-making, including the number of pending and concluded arbitrations. It is easy to see how such a model would favour experienced arbitrators and keep newcomers and those with less experience out of the race.
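
A back-of-the-envelope calculation, using invented figures, makes the point concrete: the uncertainty around an arbitrator’s observed ‘claimant success rate’ shrinks as the number of past awards grows, which is precisely why outcome-prediction models reward long track records.

```python
# Purely illustrative, with invented figures: the margin of error around an
# arbitrator's observed 'claimant success rate' narrows as the number of past
# awards grows, so outcome-prediction models naturally favour long track records.
import math

def estimate_with_uncertainty(claimant_wins: int, total_awards: int):
    """Observed rate and an approximate 95% margin of error (normal approximation)."""
    rate = claimant_wins / total_awards
    margin = 1.96 * math.sqrt(rate * (1 - rate) / total_awards)
    return rate, margin

for wins, total in [(3, 5), (30, 50), (120, 200)]:   # newcomer versus veteran records
    rate, margin = estimate_with_uncertainty(wins, total)
    print(f"{total:>3} past awards: estimated rate {rate:.2f} +/- {margin:.2f}")
```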

AI can replicate human biases, a well-known weakness of the technology,20 but advocates suggest that the technology itself is, at its core, agnostic to human preferences and can therefore be shaped to reflect them. Users must ultimately decide whether to prioritise diversity over predictability. In theory, if diversity becomes the stated objective (for instance, through insistence by arbitral institutions), AI arbitrator selection tools could mine candidates’ articles and their decisions in domestic courts to predict their positions, rather than looking solely at their arbitral track record, offering one potential solution to the predictability versus diversity conundrum.
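
A rough sketch of that alternative approach, assuming invented texts and standard open-source text-analysis tooling rather than any existing arbitration product, might compare a candidate’s published writings with the issues in the dispute:

```python
# A toy sketch of the 'mine their writings' idea: compare a candidate's published
# articles and domestic court decisions with the issues in the dispute, so that a
# newcomer without an arbitral track record can still be assessed. Uses scikit-learn;
# the texts are invented and far shorter than anything a real tool would ingest.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

dispute_issues = "force majeure and price revision under a long-term gas supply agreement"
candidate_writings = {
    "Newcomer X": "article on price revision clauses in long-term gas supply contracts",
    "Veteran Y": "commentary on liquidated damages and delay in construction disputes",
}

vectoriser = TfidfVectorizer(stop_words="english")
matrix = vectoriser.fit_transform([dispute_issues] + list(candidate_writings.values()))

# cosine similarity of each candidate's writings to the dispute description
scores = cosine_similarity(matrix[0:1], matrix[1:]).flatten()
for name, relevance in zip(candidate_writings, scores):
    print(f"{name}: textual relevance {relevance:.2f}")
```

On this approach, a newcomer with relevant scholarship could rank ahead of a veteran whose record concerns unrelated subject matter, without relying on the length of either candidate’s arbitral track record.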

In practice, if AI tools are to be used to foster more diverse arbitral panels, practitioners will need to engage with some of the thorny issues surrounding both the use of AI and arbitrator diversity. Limited technological expertise should not keep lawyers away from this important dialogue with technology developers. In fact, many of the questions driving the outcome require broader thinking that should engage the whole of the international arbitration community. For instance, what is meant by ‘diversity’ in the context of arbitral tribunals? Diversity is multi-dimensional and nuanced, and practitioners must ask themselves how committed they are to rectifying imbalances not only in gender, but also in the geographical, racial and socio-economic backgrounds of arbitral panels.

As one arbitration practitioner noted in 2018, ‘An observer from planet Mars may well observe that the international arbitral establishment on earth is white, male and English-speaking’.21 If the arbitration community is to move away from such a description, it needs all the help it can get, including from AI. The development of AI arbitrator selection tools need not turn into yet another instance of ‘machine versus human’; instead, it presents an opportunity to use technology to improve the offering of international arbitrators.

Notes

  1. F Peter Phillips, ‘Diversity in ADR: More Difficult to Accomplish Than First Thought’, 15 No 3 Dispute Resolution Magazine 14 (2009), available at https://digitalcommons.nyls.edu/cgi/viewcontent.cgi?article=2334&context=fac_articles_chapters.

  2. SCC Statistics 2019, Arbitration Institute of the Stockholm Chamber of Commerce, available at https://sccinstitute.com/statistics.

  3. International Council for Commercial Arbitration, Report of the Cross-Institutional Task Force on Gender Diversity in Arbitral Appointments and Proceedings, ICCA Reports No 8, vii (2020), available at www.arbitration-icca.org/media/15/04754826794972/icca_report_8_v4.pdf.

  4. Sasha A Carbone and Jeffrey T Zaino, ‘Increasing Diversity Among Arbitrators: A Guideline to What the New Arbitrator and ADR Community Should Be Doing to Achieve this Goal’ (NYSBA Journal 2012), available at https://adr.org/sites/default/files/document_repository/Increasing%20Diversity%20Among%20Arbitrators_0.pdf.

  5. C Dolinar-Hikawa, ‘Beyond the Pale: A Proposal to Promote Ethnic Diversity Among International Arbitrators’, (Transnational Dispute Management July 2015), available at www.sidley.com/-/media/publications/tv124article17.pdf.

  6. McKinsey & Company, ‘Women Matter: Ten Years of Insights on Gender Diversity’, 19 (2017), available at www.mckinsey.com/~/media/McKinsey/Featured%20Insights/Women%20matter/Women%20Matter%20Ten%20years%20of%20insights%20on%20the%20importance%20of%20gender%20diversity/Women-Matter-Time-to-accelerate-Ten-years-of-insights-into-gender-diversity.pdf.

  7. Samuel R Sommers, ‘On Racial Diversity and Group Decision Making: Identifying Multiple Effects of Racial Composition on Jury Deliberations’, 90 Journal of Personality and Social Psychology, 4, 597 (2006), available at www.apa.org/pubs/journals/releases/psp-904597.pdf.

  8. Samaa A Haridi, ‘Towards Greater Gender and Ethnic Diversity in International Arbitration’, International Arbitration Review of the Bahrain Chamber for Dispute Resolution, 311 (21 January 2016), available at www.hoganlovells.com/en/publications/towards-greater-gender-and-ethnic-diversity-in-international-arbitration.

  9. White & Case, 2018 International Arbitration Survey: The Evolution of International Arbitration, available at www.whitecase.com/sites/whitecase/files/files/download/publications/qmul-international-arbitration-survey-2018-19.pdf.

  10. 2018 Annual Casework Report, (LCIA 2018), available at www.lcia.org/News/2018-annual-casework-report.

  11. Gemma Anderson, Richard Jerman, and Sampaguita Tarrant, ‘Diversity in International Arbitration’, available at https://uk.practicallaw.thomsonreuters.com/w-019-5028?transitionType=Default&contextData=(sc.Default)&firstPage=true&bhcp=1.

  12. Daniel Becker and Ricardo Dalmaso Marques, ‘Why the use of technology in arbitrators’ selection process – although fostered – must still be handled carefully’ (CBAr 23 July 2019), available at www.cbar.org.br/blog/artigos/why-the-use-of-technology-in-arbitrators-selection-process-although-fostered-must-still-be-handled-carefully.

  13. AI refers to a family of computational algorithms capable of automated statistical learning from data sets. These machine-learning methods can be further divided into two families: (1) deep learning; and (2) classical machine learning models.

  14. James Kwan, James Ng & Brigitte Kiu, ‘The Use of Artificial Intelligence in International Arbitration: Where Are We Right Now?’, International Arbitration Law Review, Issue 1 (2019).

  15. See https://arbitratorintelligence.com/faqs.

  16. Catherine A Rogers, ‘The Key to Unlocking the Arbitrator Diversity Paradox?: Arbitrator Intelligence’, (Kluwer Arbitration Blog, 27 December 2017), available at http://arbitrationblog.kluwerarbitration.com/2017/12/27/on-arbitrators/?doing_wp_cron=1597589994.3238279819488525390625.

  17. Available at https://globalarbitrationreview.com/arbitrator-research-tool.

  18. Kwan, Ng, & Kiu at 21 (see n 14 above).

  19. Maxi Scherer, ‘Artificial Intelligence and Legal Decision-Making: The Wide Open?’, Journal of International Arbitration, 539, 556 (2019), available at https://sifocc.org/app/uploads/2020/04/Artificial-Intelligence-and-Legal-Decision-Making-The-Wide-Open-Maxi-Scherer-041119.pdf.

  20. Matthew Hutson, ‘Even artificial intelligence can acquire biases against race and gender’ (Science, 13 April 2017), available at www.sciencemag.org/news/2017/04/even-artificial-intelligence-can-acquire-biases-against-race-and-gender.

  21. Caroline dos Santos, ‘Diversity in international arbitration: A no-woman’s land?’ In Leo Staub (Ed), Beiträge zu aktuellen Themen an der Schnittstelle zwischen Recht und Betriebswirtschaft III, Schultess, Zurich (2017), pp. 207-231, available at www.lalive.law/wp-content/uploads/2018/12/Diversity-in-international-arbitration_dos-Santos.pdf.
