After months of resistance, Air Canada was forced to partially reimburse a grieving passenger who was misled by an airline chatbot that inaccurately explained the airline's bereavement travel policy.
The day Jake Moffatt's grandmother died, Moffatt immediately visited the Air Canada website to book a flight from Vancouver to Toronto. Moffatt asked Air Canada's chatbot to explain how the airline's bereavement fares worked.
The chatbot provided inaccurate information, encouraging Moffatt to book a flight immediately and then request a refund within 90 days. In fact, Air Canada's policy explicitly states that the airline will not refund bereavement travel after the flight is booked. Moffatt dutifully attempted to follow the chatbot's advice and request a refund, only to be shocked when the request was rejected.
Moffatt tried for months to convince Air Canada that a refund was owed, sharing a screenshot of the chatbot's misleading response.
Air Canada argued that because the chatbot's response linked to a page containing the actual bereavement travel policy, Moffatt should have known that bereavement fares could not be requested retroactively. Instead of a refund, the best Air Canada would do was promise to update the chatbot and offer Moffatt a $200 coupon to use on a future flight.
Unhappy with this resolution, Moffatt refused the coupon and filed a small claims complaint with Canada's Civil Resolution Tribunal.
According to Air Canada, Moffatt should never have trusted the chatbot, and the airline should not be held liable for the chatbot's misleading information because, in Air Canada's view, "the chatbot is a separate legal entity that is responsible for its own actions," the tribunal's decision noted.
Experts told the Vancouver Sun that Moffatt's case appeared to be the first time a Canadian company had tried to argue that it was not responsible for information provided by its chatbot.
Tribunal member Christopher Rivers, who ruled in favor of Moffatt, called Air Canada’s defense “remarkable.”
"Air Canada asserts that it cannot be held responsible for information provided by any of its agents, servants or representatives, including a chatbot," Rivers wrote. Air Canada "does not explain why it believes that is the case" or "why the webpage titled 'Bereavement travel' was inherently more trustworthy than its chatbot."
Additionally, Rivers concluded that Moffatt had “no reason” to believe that one part of Air Canada’s website would be accurate and another would not.
Air Canada "does not explain why customers should have to double-check information found in one part of its website on another part of its website," Rivers wrote.
Ultimately, Rivers ruled that Moffatt was entitled to a partial refund of CA$650.88 (approximately US$482) off the original fare of CA$1,640.36 (approximately US$1,216), as well as additional damages to cover interest on the airfare and Moffatt's tribunal fees.
Air Canada told Ars that it would comply with the ruling and considered the matter closed.
Air Canada's chatbot appears to be disabled
When Ars visited Air Canada's website on Friday, no chatbot support appeared to be available, suggesting that the airline has disabled the chatbot.
Air Canada did not respond to Ars’ request to confirm whether the chatbot is still part of the airline’s online support offerings.