AI Gone Wild: Airline Has to Honor a Refund Policy Its Chatbot Fabricated

Advertising Law

In a recent case brought before the Civil Resolution Tribunal (“CRT”) of British Columbia, Moffatt v. Air Canada, 2024 BCCRT 149, the CRT found in favor of an airline customer who relied on inaccurate information about the airline’s bereavement refund policy provided by its chatbot when booking a flight to attend his grandmother’s funeral.

In November 2022, Mr. Moffatt, who was grieving the loss of his grandmother, used the airline’s online chatbot to inquire about bereavement fares. Instead of directing him to the correct information, the chatbot suggested he book a regular ticket and request a partial refund under the airline’s bereavement policy within 90 days. Mr. Moffatt followed the chatbot’s advice and purchased a full-price ticket from Vancouver to Toronto. When he later applied for the refund, however, Air Canada employees told him that the airline did not permit retroactive applications.

As evidence in the case, Mr. Moffatt provided screenshots of the interaction with the chatbot to the CRT, which read, “Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family. If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”

Air Canada acknowledged that the chatbot responded with “misleading words” but argued that Moffatt “did not follow proper procedure to request bereavement fares and cannot claim them retroactively” and that the airline “cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot.”

Air Canada attempted to distance itself from the chatbot’s misleading information, arguing that it should not be held liable for the actions of the chatbot, which it characterized as a separate legal entity. However, CRT member Christopher Rivers dismissed the airline’s defense, questioning why customers should be expected to double-check information found in one part of the airline’s website against another. In its ruling, the CRT ordered Air Canada to provide Mr. Moffatt a partial refund, along with additional damages to cover interest and CRT fees.

Why it matters

Ultimately, the CRT found that Air Canada owed a duty of care to users of its chatbot, that the airline “did not take reasonable care to ensure its chatbot was accurate,” and that these inaccuracies amounted to negligent misrepresentation. On a broader level, the case highlights the increasing accountability companies will face for the actions of their AI systems, especially those that are consumer facing. The fact that a chatbot was able to entirely fabricate a company’s refund policy should be a big red flag for companies that rely on chatbots to provide customers with accurate information about their policies.

While chatbots are often licensed technology provided by a third-party vendor, it is the company that will bear the brunt of the fallout when an automated system goes rogue. Brands should therefore regularly review and update the information provided to their automated systems to ensure accuracy and consistency with the company’s posted policies. Knowing that the company will most likely be bound by the chatbot’s advice, regardless of whether it is accurate, companies should also work with their vendors to clearly define the limitations of chatbots and other automated systems in providing information and forming contractual obligations with customers through their websites and other online platforms. Finally, customer service representatives (i.e., the real folks behind those chatbots) should have formal procedures for handling discrepancies between information provided by automated systems and official policies.

In the end, the CRT ruled that Mr. Moffatt was entitled to a partial refund of CAD $650.88 (around USD $482) off the original fare, and, looking back, that refund would have been a (very) small price to pay compared to all the negative attention Air Canada has garnered from its AI gone wild.

ATTORNEY ADVERTISING pursuant to New York DR 2-101(f)

© 2024 Manatt, Phelps & Phillips, LLP. All rights reserved.