Travel tackles AI hallucinations


Shortly after Matador Network launched its GuideGeek generative AI travel assistant early last year, an employee testing the tool queried it about her hometown of Pittsburgh.

After a while, she shifted gears to ask about Crete. But instead of typing “Crete, Greece,” she accidentally wrote “Crete Freeze.”

Flummoxed by the typo, GuideGeek thought the query was about an ice cream parlor in Pittsburgh. But when it couldn’t find a real parlor called Crete Freeze, it made one up, complete with a background story about its founders and a claim that the shop makes its own ice cream, said Matador Network CEO Ross Borden, who relayed the story.

“She realized that it was completely invented and this ice cream parlor doesn’t exist,” Borden said, referencing the employee.

Similarly, Brian Shultz, the chief information officer for Cruise Planners, which introduced its Maxx Intelligence generative AI tool for travel advisors in December, once queried the original ChatGPT about himself.

“It actually crafted a whole career I never had,” he said. “It told me I was the chief technology officer at Royal Caribbean.”

Such fictional responses are known in the parlance of generative AI as hallucinations. And they are a pitfall that travel companies and travel advisors need to be mindful of and manage as they begin deploying AI-powered tools.

“Hallucinations are a big deal,” said Michael Coletta, senior manager of research and innovation for Phocuswright, who expects to deliver a paper on generative AI in the next few weeks. “You can have 10 answers that are correct. And the 11th one, it just makes it up, because it wants to give you an answer.”

In the travel world, for example, Coletta said he’s seen generative AI make up ferry and train routes that don’t exist.

Hallucinations in the context of travel queries can appear as incorrect flight information, misleading hotel descriptions or flawed travel recommendations, said Megan Hastings, head of customer insight strategy at the digital analytics platform Quantum Metric.

“The frequency of these hallucinations can vary based on the AI’s understanding of the domain, its ability to process and interpret data accurately and the level of human supervision and quality control,” said Hastings, who works with digital teams at major travel industry brands.

Keeping hallucinations in check

One way that companies can reduce hallucinations as they roll out generative AI tools is to properly train the interface on their own data.

Travel companies will typically layer their proprietary generative AI solution over OpenAI's ChatGPT, Google's Gemini or Microsoft's Copilot. But they must teach their system when and how to retrieve information from the company's own data set, rather than from the much larger data sets underlying those base models, Coletta said.

Intensive human intervention is also a key to minimizing hallucinations, said Borden, who added that Matador has reduced the rate of hallucinations and other confusion by GuideGeek from 14% last April to just over 2% currently.

Speaking at the Mountain Travel Symposium ski industry conference at the Palisades Tahoe ski resort in California this month, Borden said that Matador hired six employees to root out and resolve hallucinations.

Collectively they’ve read more than 900,000 GuideGeek conversations in the past nine months.

“They have been key for us figuring out the conditions where hallucinations happen,” he said in a later interview. “There are only certain things you can do without a human looking at it.”

One situation that tended to cause GuideGeek to hallucinate, Borden said, is exemplified by the problem the Matador employee encountered with her Pittsburgh query. The tool was more likely to get confused when it was asked a series of questions about the same topic and the user then abruptly switched gears, especially if there was a typo.

Queries about highly specific topics, such as a coffee shop in a very small town, also caused hallucinations.

Matador has addressed these and other GuideGeek issues with software solutions, Borden said.

He noted that in addition to layering on top of ChatGPT-4, GuideGeek uses direct connects to other platforms, including Skyscanner and Expedia, to produce real-time responses about flight schedules, hotel availability, weather and currency conversion. Queries can be made via WhatsApp, Instagram, Messenger or within the GuideGeek website.

Matador will also customize GuideGeek for travel agencies, airlines and other travel companies. Twelve destination marketing organizations are currently using a customized version of the tool.

Cruise Planners has instituted less labor-intensive means to address the risk of hallucinations faced by travel advisors using Maxx Intelligence. When using the platform, Cruise Planners franchisees must first click a box accepting responsibility for the information they choose to disseminate, Shultz said. With each query response, they also must acknowledge that they’ve reviewed the material.

The company also trains agents on best practices for using Maxx Intelligence, which include fact-checking any specific response, especially those related to restaurants, hotels and other places of commerce. ChatGPT-4, which Maxx Intelligence is also layered over, has training data that is current only through this past September, Shultz noted, a limitation advisors need to keep in mind.

Courtesy of Travel Weekly