Have you ever had a conversation with a chatbot that left you feeling like you’d rather speak to a brick wall? Well, Air Canada recently learned that the wrong response can come back to bite you, in the form of a court-ordered refund!
Air Canada’s chatbot, apparently developed in a fit of AI enthusiasm, provided a customer with inaccurate information about a bereavement travel policy. This led to a refund request being denied, forcing the passenger to file a complaint.
The airline initially tried to pass the buck, arguing that the AI powering its customer service was a separate legal entity. But the tribunal shot down that argument, ruling that Air Canada is responsible for all the information on its website, regardless of where it comes from.
Now, Air Canada is 650.88 Canadian dollars and a lot of embarrassment poorer. I wonder what the point of resisting the refund was, especially when you weigh a few hundred dollars against the loss of the customer’s trust and the embarrassing public relations spectacle.
What does this situation teach us about the dangers of automated chat?
1. Chatbots aren’t always right.
Remember, chatbots are only as smart as the data they’re trained on. If that data is inaccurate, so are the responses.
2. Don’t take chatbots at face value.
Always verify any information you get from any AI, especially if it sounds too good to be true.
3. Chatbots can be frustrating.
Let’s face it, chatbots can be annoying. They often give canned responses, don’t understand what you’re asking, and can be downright infuriating when they don’t help you resolve your issue. If you’re going to go this route, make sure you invest in the best solution possible that will provide the most accurate answers.
4. Chatbots can hurt your reputation.
If your AI gives customers inaccurate information, it can damage your company’s credibility. In Air Canada’s case, the chatbot’s mistake led to a negative news story and a court case. Be sure to audit your AI’s knowledge regularly and compare its output to the real-world policies and processes it’s supposed to help with (see the sketch after this list for one way such an audit could look).
5. Chatbots can be a waste of money.
If your AI isn’t providing accurate or helpful information, it’s not worth the investment. Air Canada’s initial investment in its AI may have cost more than paying a human to handle simple queries, and the company will likely reconsider now that it has paid a hefty price, not so much in the refund itself as in reputation.
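On the auditing point from item 4: one practical approach is to keep a small list of policy questions whose correct answers come straight from your official documentation, run them through the chatbot on a schedule, and flag any answer that drifts from that source of truth. Below is a minimal sketch in Python. To be clear, the get_chatbot_reply interface, the sample questions, and the required phrases are all hypothetical placeholders for illustration, not Air Canada’s systems or anyone’s actual policy.

```python
# Minimal sketch of a recurring "policy audit" for a customer-service chatbot.
# Everything here is hypothetical: get_chatbot_reply() stands in for whatever
# interface your bot exposes, and the questions and required phrases are
# illustrative examples, not any airline's real policy.

from typing import Callable

# Canonical checks derived from the official policy pages (your source of truth).
POLICY_CHECKS = [
    {
        "question": "Can I apply for a bereavement fare refund after my trip?",
        "must_mention": ["before travel"],  # phrases the bot's answer must contain
    },
    {
        "question": "Is a cancelled flight eligible for a cash refund?",
        "must_mention": ["refund"],
    },
]


def audit_chatbot(get_chatbot_reply: Callable[[str], str]) -> list[str]:
    """Ask the bot each policy question and report answers missing required facts."""
    failures = []
    for check in POLICY_CHECKS:
        reply = get_chatbot_reply(check["question"]).lower()
        missing = [p for p in check["must_mention"] if p.lower() not in reply]
        if missing:
            failures.append(f"Q: {check['question']} -> answer missing: {missing}")
    return failures


if __name__ == "__main__":
    # Stand-in bot for demonstration; replace with a call to your real chatbot.
    def fake_bot(question: str) -> str:
        return "You may request a bereavement refund at any time after your trip."

    for problem in audit_chatbot(fake_bot):
        print("AUDIT FAILURE:", problem)
```

Even a lightweight check like this, run whenever the policy pages change, would have caught an answer that contradicts the published bereavement rules before a customer ever saw it.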
Admit fault gracefully and accommodate the customer
I have a personal story about Air Canada. In 2019 we had planned a cruise to Europe for 2020. We bought travel insurance, our tickets were paid for, and we were excited to finally see some things we’d dreamed of seeing.
Then, COVID happened and the travel restrictions hit. First, they changed our flight from Canada to Europe by a day, which would have made us miss our cruise.
Then they canceled our flight from our departure point in Canada to Europe, but without canceling our flight from our home city to Canada.
When we tried to get answers to our questions (such as what good it would do us to fly to Canada if they weren’t going to get us to our cruise), we kept getting no answer.
Then when we tried to get a refund, they refused to give it to us. I no longer remember if they were trying to offer us a credit or other compensation, but they certainly weren’t offering us anything close to the value of what we paid.
Finally, we had to claim it through our travel insurance. But because Air Canada somehow showed as having partially refunded us (which it hadn’t), we didn’t get a full refund through the insurance either.
We tell this story every time air travel to or from Canada comes up in conversation. I’m not sure that Air Canada would like that, but our perception is now their reality. Was it worth it to Air Canada to do this to us? We’ll never know.
However, in the case of the passenger mentioned in the Ars Technica article, it objectively wasn’t worth Air Canada keeping $650.88 of airfare, especially from a bereavement-fare customer who was misled by its poorly trained chatbot, and then suffering the embarrassment of an Ars Technica article rubbing salt into the wound of a poor customer experience. The reputation cleanup alone will probably cost many thousands more, if not millions in lost bookings, as people find the article while planning their travel and quickly lose trust in the brand.
Conclusion
While chatbots can be a convenient tool for providing customer service, it’s important to use them wisely. Make sure they’re accurate and helpful, and don’t give customers the runaround. (And, as we found, that also goes for general customer service unrelated to chatbots.) Otherwise, you may end up like Air Canada—paying the price for a chatbot that couldn’t tell its flight prices from its funeral rates.