Some time ago, together with my friend Daniel Doppler, I wrote about AI hallucinations in the travel industry. Back then, we suggested that sooner or later we would reach a rather peculiar stage: a tourism ecosystem populated by what I call “hallucinated bookings.”
The mechanism is simple. Generative models pull information from scattered, incomplete, and often contradictory sources. Then they do what LLMs do best: fill in the gaps using linguistic probability.
If that sounds like science fiction, consider what happened recently in Tasmania. An AI-generated article described thermal springs in Weldborough, a tiny village of just 33 residents in the northeast of the island. The result? Tourists calling hotels asking about the spas. Then tourists actually showing up (spoiler: the spas don’t exist).
Now imagine this dynamic applied at scale to hospitality. The average traveler—let’s say our beloved “boomer” who still prints their boarding pass in 2026—arrives at a hotel holding a ChatGPT conversation printed on an A4 sheet, Comic Sans, size 30.
The only problem? The room doesn’t exist. Or the panoramic restaurant has long been taken over by the competitor next door.
This is where things get interesting: a true epistemological short circuit. Reality versus model-generated reality.
To understand what’s happening, we need to remember how the web worked before AI. Yes, incorrect information could exist online—but it was usually buried on page six of Google, which is arguably the best place to hide a body.
Information had a hierarchy: authority, ranking, links, reputation.
Today, that hierarchy has collapsed. Generative models don’t prioritize truth or authority—they optimize for statistical likelihood.
So when a marginal, outdated, or simply incorrect source enters the dataset, the model can elevate it to canonical truth. What was once a forgotten piece of nonsense can now become a perfectly written, highly convincing—but entirely false—answer.
That’s why we are entering the era of hallucinated tourism, built on layers of plausible truths.
Today it’s imaginary thermal baths in Tasmania. Tomorrow it will be rooms with views that don’t exist, services never offered, and reviews of stays that never happened.
And when a system built on probability meets an industry driven by expectations, the risk is not just error—it’s the creation of parallel tourism realities.
Long before LLMs, Philip K. Dick wrote that reality is that which continues to exist even when you stop believing in it. The problem is that in generative tourism, we are beginning to see the opposite: realities that exist simply because someone believed in them.
See you next week,
Simone