A right to call my lawyer’s chatbot
Hin Han Shum
Squire Patton Boggs, Hong Kong
Communication has always been an indispensable part of the practice in the age-old profession of law. Be it taking instructions, preparing advice or offering submissions in court, communication has been at the forefront of achieving results. With recent technological developments, particularly in the field of artificial intelligence, can chatbots providing legal advice replace human advice?
This article will examine the concept of chatbots providing legal advice and a selection of legal issues that may arise for law firms using chatbots from a Hong Kong legal perspective.
In a nutshell, chatbots are programs designed to make communication between humans and machines more efficient and accessible. Chatbots allow for a more interactive, human way of communicating than conventional search engines, which display pages of results. For example, instead of searching within a Frequently Asked Questions webpage, a person can simply type their question and the chatbot will (in theory) generate a directed answer within seconds, based on the data that has been input into its system.
Chatbots are being used by companies as customer service representatives due to their ability to interact with people via written message or on the phone. The service is offered around the clock and streamlines the manpower needed to carry out those functions. This is especially useful in the time of a pandemic, when face-to-face interaction is discouraged.
Many chatbots use artificial intelligence components such as expert systems (where a set of pre-programmed rules will lead the machine to generate a decision) and machine learning (where data is input into the machine, so that it can break it down into small pieces to process and learn from it). They also use natural language processing (NLP) which enables the machine to process human language and syntax, and be able to understand questions, to provide the appropriate replies.
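As an illustration of the expert-system component described above, such a system can be reduced to a small set of pre-programmed rules mapping keywords to canned responses. The rules and topics below are hypothetical, not drawn from any real legal chatbot:

```python
# Minimal sketch of an expert-system-style rule engine for a legal chatbot.
# The rules, keywords and response text are all illustrative assumptions.

RULES = [
    ("data access request",
     "Under the PDPO, a data access request should generally be "
     "answered within 40 days."),
    ("limitation period",
     "Limitation periods vary by the type of claim; please consult "
     "a solicitor about your specific matter."),
]

DEFAULT = "I cannot answer that; please contact the firm directly."

def answer(question: str) -> str:
    """Return the first canned answer whose keyword appears in the question."""
    q = question.lower()
    for keyword, response in RULES:
        if keyword in q:
            return response
    return DEFAULT
```

Real systems layer natural language processing on top of this keyword matching, but the underlying limitation is the same: the machine can only answer from the rules or data it has been given.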
Issues for law firms
Use of chatbots may help to lower the legal cost for preliminary consultations and enable better access to justice, but there are various items to consider before a law firm implements chatbots as part of its services.
Principle 1.07 of the Hong Kong Solicitors' Guide To Professional Conduct Vol. 1 allows Hong Kong lawyers to make use of new information communication technologies in their practice, as long as they ensure that the technology and their use of it complies with all the relevant laws, and practice directions and guides.
It is likely that the reference to information communication technology could cover chatbots, which in essence are a means of communication between the law firm and the clients. If lawyers do use chatbots, they have an obligation to ensure the use is properly supervised and managed. This would include the implied duty to have proper training for the operation of the chatbots and proper maintenance of the program.
Big data and confidentiality
A key component of a usable chatbot in the legal industry is ensuring that enough data is in the system to allow accurate restatements of the law to be generated. Chatbots therefore need to be programmed so that they can receive updates to statute and case law. As the machine learning aspect of the chatbot allows it to learn from the data provided, it can only process users’ questions to the extent of the data it is given. If the data is inaccurate, the answers generated will also be inaccurate. To ensure chatbots have up-to-date information, various risk management policies should be in place within the firm to scrutinise the chatbot’s performance.
There is also an issue of confidentiality. For the chatbot to operate and utilise its machine learning abilities, it will usually store and process information so that it can improve its records and expand its association of words and information. If the data was provided by the users to further their own legal case, collection of data by the chatbot for self-improvement (and possibly transfer to the open source programs that the chatbot is based on), might lead to a breach of the principle of confidentiality under Principle 8.01 of the Hong Kong Solicitors' Guide To Professional Conduct Vol. 1, depending on the programming of the chatbot. It sets out that:
‘a solicitor has a legal and professional duty to his client to hold in strict confidence all information concerning the business and affairs of his client acquired in the course of the professional relationship, and must not divulge such information unless disclosure is expressly or impliedly authorized by the client or required by law or unless the client has expressly or impliedly waived the duty.’
This duty extends to the solicitor’s staff, whether admitted or unadmitted, and it is the responsibility of the solicitor to ensure compliance. Even if the solicitor is not operating the chatbot personally, but has other colleagues – for example, information technology colleagues – operating the chatbot, it is the responsibility of the solicitor to ensure compliance with these duties.
A user disclosing personal information to a chatbot may only expect that the chatbot will use that information to generate a response. The user may not be aware that the chatbot may be storing this information to improve its systems. Therefore, there may be an issue in relation to the purpose of the collection of data and the consent for its use and further transfers.
If personal data is imparted during the use of a chatbot, the firm controlling the chatbot should consider what type of data access services they have in place. Various data protection regimes mandate that data subjects may make data access requests and receive copies of their data within a certain period of time. Pursuant to the Personal Data (Privacy) Ordinance of Hong Kong, for example, a person can request to access their personal data, and it should be provided to the person within 40 days.
The General Data Protection Regulation (GDPR), which took effect on 25 May 2018, has added further obligations for parties who collect and use personal data. The GDPR has extraterritorial effect, protecting persons who provide personal data (data subjects) where those persons are in the European Union and their personal data is processed in connection with goods or services offered to them, where their behaviour in the EU is monitored, or where the company collecting or processing the personal data is established in the EU.
The GDPR creates a new obligation relating to automated profiling, which is the ‘automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements’.
Where a decision is made based solely on automated processing, including profiling, and it produces legal effects concerning, or similarly significantly affects, a data subject, that would be in breach of the GDPR. Therefore, if chatbots are programmed to make decisions based on automated profiling of a user, under certain circumstances the operator of the chatbot may be breaching the GDPR.
To avoid falling foul of data protection laws, law firms should:
- be mindful of the definition of personal data in each jurisdiction in which they operate;
- consider what type of information is collected and how (ie, whether proper consent is obtained); and
- consider technological ways to protect and secure the personal data (eg, to anonymise personal information if collected for chatbot improvement purposes).
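As a sketch of the anonymisation suggestion above, personal identifiers could be replaced with salted hashes before a message is stored for chatbot-improvement purposes. The patterns and salt below are assumptions for illustration only:

```python
import hashlib
import re

def anonymise(message: str, salt: str = "firm-secret") -> str:
    """Replace email addresses and HKID-like numbers with salted hashes
    before the message is retained for chatbot training."""
    def token(match: re.Match) -> str:
        digest = hashlib.sha256((salt + match.group(0)).encode()).hexdigest()
        return f"<redacted:{digest[:8]}>"
    # Hypothetical patterns: email addresses and HKID numbers, e.g. A123456(7).
    patterns = [
        r"[\w.+-]+@[\w-]+\.[\w.]+",
        r"\b[A-Z]{1,2}\d{6}\(?[0-9A]\)?",
    ]
    for pattern in patterns:
        message = re.sub(pattern, token, message)
    return message
```

Hashing rather than deleting the identifier preserves the ability to link messages from the same user without retaining the raw personal data, though whether a salted hash is sufficiently anonymised is itself a question for the applicable data protection regime.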
Know your client obligations and conflicts of interest
As users may be able to access chatbots discreetly and form a solicitor-client relationship, this may be non-compliant with anti-money laundering and counter-terrorism regulations, which require know your client (KYC) checks to be performed.
Furthermore, if such a solicitor-client relationship does arise from chatbot communication with a user, this could create a conflict of interest, as proper conflict checks may not be conducted or envisaged at that stage.
Though these two concerns are serious in nature, they may possibly be addressed by the programming of the chatbot and requiring information to be input before the chatbot offers any legal advice.
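A minimal sketch of such gating, with hypothetical field names and a stand-in conflicts list, could check both KYC completeness and conflicts before any substantive answer is generated:

```python
# Sketch of gating chatbot advice behind identity and conflict checks.
# Field names, the conflicts list and check logic are illustrative assumptions.

REQUIRED_FIELDS = ("full_name", "id_number", "contact")
EXISTING_CLIENTS = {"Acme Ltd"}  # stand-in for the firm's conflicts database

def may_advise(user: dict, counterparty: str) -> tuple[bool, str]:
    """Return (allowed, reason) before any substantive answer is generated."""
    missing = [f for f in REQUIRED_FIELDS if not user.get(f)]
    if missing:
        return False, f"KYC incomplete: missing {', '.join(missing)}"
    if counterparty in EXISTING_CLIENTS:
        return False, "Potential conflict of interest; referring to a solicitor"
    return True, "ok"
```

In practice the conflicts check would query the firm's actual conflicts database, and a human would review any flagged matter before advice is given.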
As chatbots are usually programmed to run at all hours of the day, it may not be possible (nor cost-effective) to constantly monitor their communications with users. As they may refine their responses through machine learning from the big data they can access, a chatbot’s responses may not be as reliable or accurate as those provided by lawyers, and there may be situations where it gives incorrect answers. Depending on what the chatbot’s communication relates to (and whether any legal advice is involved), the law firm may be at risk of potential negligence claims.
Although disclaimers may assist in this regard, there is a limit to what can be disclaimed. Care should also be taken as to whether the chatbot’s communication establishes a solicitor-client relationship. If it does not, this should be made clear to all users prior to their use of the chatbot.
Having robots replace lawyers is still a long way off in Hong Kong. However, using chatbots and other enhanced technological tools will become more commonplace. It will be a commercial decision as to whether a firm will want to invest money to enhance their toolkit or continue to grow its practice with a human touch.
In any event, it is important for lawyers to keep abreast of technological developments as they change the world around us, including the forms of communication we use, the potential areas of dispute in litigation, the prevention of claims against a firm and the means of handling cases more efficiently.
Commentary 3 of Principle 8.01 of the Hong Kong Solicitors' Guide To Professional Conduct Vol. 1.
Article 4(4) of the GDPR.
Article 22(1) of the GDPR.