More and more businesses are using virtual assistants to make their websites easier to use and to increase customer engagement. While this brings many benefits, the legal aspects of such solutions must not be overlooked.
The use of chatbots in online contracting, combined with the well-known phenomenon of artificial intelligence (AI) hallucinations, i.e. incorrect or misleading output generated by AI, has created new challenges, especially in contract law. These include, among other things, determining liability when a customer concludes a contract based on incorrect information provided by a chatbot on a company’s website.
A breakthrough decision
The recent decision in the case against Air Canada has been described as a milestone in the growing field of H2SI digital interactions and the responsibilities of the various actors involved in them. The case concerned whether a company can be held liable for misleading information provided by an automated chatbot on its website.
In November 2022, Mr. Moffatt, a Vancouver resident, was looking for travel options to attend a relative’s funeral in Toronto. He asked about Air Canada’s bereavement policy, which allows passengers travelling because of the death of an immediate family member to benefit from reduced fares. The Air Canada chatbot informed him that he could book a flight and then submit a refund request within 90 days of purchase. Relying on the chatbot’s directions, Mr. Moffatt booked the flights and applied for a partial refund after returning to Vancouver. Air Canada refused to process the request because its bereavement policy did not allow claims to be submitted after travel had taken place.
Air Canada argued that it could not be held liable for the chatbot’s misleading information because the chatbot was, in its view, a separate legal entity. This explanation is, frankly, a bit surprising.
The Court held that Air Canada was liable for the misleading information provided by the chatbot on its publicly accessible website, regardless of whether that information came from a static page or from a chatbot. According to the Court, in general, “the applicable standard of diligence requires a firm to exercise due diligence to ensure that the information is correct”.
And what about your company?
Does your contract with the supplier address liability for errors in the input and output data of genAI systems?
From a legal point of view, it is crucial to have appropriate safeguards in place, such as clear disclaimers and limitations of liability, and to comply with any other relevant legal requirements, including those relating to data protection.
For further information, contact:
Renata Warchoł-Lewicka, Partner
Gorazda, Świstuń, Wątroba i Partnerzy adwokaci i radcowie prawni, Kraków
e: renata.lewicka@gsw.com.pl
t: +48 12 4224459
#WLNadvocate #Poland #Krakow #law #legal #lawfirm #corporatelaw #ITlaw #technologylaw #contracts #business #chatbot #liability #dataprotection #dataprivacy #AirCanada #virtualassistant