Airline held liable for its chatbot giving passenger bad advice

When Air Canada's chatbot gave incorrect information to a traveller, the airline argued the chatbot was "responsible for its own actions"

Feb 26, 2024

Artificial intelligence is having a growing impact on the way we travel, and a remarkable new case shows what AI-powered chatbots can get wrong – and who should pay.

Key takeaways

  • Air Canada's chatbot mistakenly told a passenger booking a full-fare flight for his grandmother's funeral that he could pay up front and apply for a discounted bereavement fare afterwards;
  • However, when the passenger later claimed the discount, the airline denied it, stating that bereavement fare requests must be made before travel. Air Canada argued the chatbot was a separate legal entity responsible for its own actions;
  • The British Columbia Civil Resolution Tribunal disagreed, stating Air Canada is ultimately responsible for all information on its website, whether from a static page or a chatbot.

Get the full story at BBC
