Air Canada AI chatbot spreads misinformation, fetching the airline a hefty legal penalty

When booking a flight on an airline's website, it's common to encounter a chatbot designed to help complete the transaction. But what happens when that chatbot provides misinformation with costly consequences?

This scenario unfolded for Jake Moffatt, a Canadian resident who urgently needed to book a flight to attend his grandmother's funeral. Relying on the guidance of Air Canada's AI-based chatbot, Jake believed he could book at the regular fare and later claim a partial refund under the airline's bereavement policy, which offers discounted fares for emergency travel, provided the claim was submitted within 90 days of the ticket purchase.

To Jake's dismay, when he contacted Air Canada he was told that 20% of the ticket amount would not be reimbursed, contrary to what the chatbot had indicated. Air Canada acknowledged that its chatbot had provided misinformation and apologized for the confusion.

After repeated attempts to resolve the issue with Air Canada's customer support went nowhere, Jake sought recourse through British Columbia's Civil Resolution Tribunal. The tribunal not only ordered the partial refund the chatbot had promised but also held Air Canada liable for misleading a customer with a false claim.

Air Canada tried to distance itself from the chatbot's answer, arguing in effect that the bot was a separate entity responsible for its own actions and could not be expected to capture the nuances of the bereavement policy. The tribunal rejected this argument, emphasizing that Air Canada is responsible for the information provided through its own website, whether it comes from a static page or a chatbot, a finding that should prompt the company to reconsider its approach.

Whatever the arguments about what led the chatbot astray or how the case might otherwise have gone, tribunal member Christopher Rivers carefully weighed the evidence presented by both parties. As a result, Air Canada was ordered to refund Mr. Moffatt $483, along with $23 in interest and $89 in legal fees and miscellaneous charges.

Furthermore, the tribunal urged Air Canada to clarify how its AI-based customer support reflects and presents company policy. Failure to do so could expose the airline to further penalties for misleading customers about policy details.

Naveen Goud is a writer at Cybersecurity Insiders covering topics such as Mergers & Acquisitions, Startups, Cyber Attacks, Cloud Security and Mobile Security.
