
AI is revolutionising hospitality, offering insights once considered science fiction. But while tools like ChatGPT are being embraced by hoteliers eager to personalise offers or automate marketing, they come with serious – and often overlooked – risks. Guest data is being fed into public large language models (LLMs) without proper safeguards. If the industry doesn’t act now, we’re staring down the barrel of a major data privacy scandal.
What we’re seeing is that many hospitality businesses are racing to experiment with generative AI without the necessary checks and balances in place. There’s excitement, yes – but also confusion. And that’s a dangerous combination when it comes to sensitive guest data.
ChatGPT and similar models are trained on vast swathes of internet data. They weren’t built for securely managing hotel guest information. When hoteliers upload personal data – names, preferences, stay history – they may be exposing it to external servers, often in jurisdictions with weaker privacy laws. In worst-case scenarios, this data could be inadvertently retained or used to train future models, creating privacy breaches and triggering hefty fines under GDPR and similar regulations.
Real risks, not just theoretical ones
If you’ll allow me a bit of friendly advice – imagine this for a moment. You’re trying to analyse repeat guest behaviour, and someone on your team decides to use ChatGPT. They copy and paste booking histories, maybe even personal emails and stay preferences, into the tool. It feels like a harmless shortcut. But just like that, the data is out of your hands. It’s now sitting on external servers you don’t control. If that information gets leaked or misused, you’re not just looking at legal trouble – you’re risking the trust your brand has spent years building.
And here’s the frustrating part: the results often aren’t even worth it. In our experience, the outputs from public LLMs in this context tend to be vague and disconnected from your actual systems. They might give you a few ideas, but they won’t deliver the kind of tailored insights or automation that truly move the needle. So really, it’s all risk and very little reward.
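If someone on your team is determined to experiment anyway, the bare minimum is stripping identifiers out before anything leaves your systems. Here’s a deliberately naive sketch of the idea in Python – the patterns, including the booking-reference format, are made up for illustration, and a real deployment would lean on a dedicated PII-detection tool rather than a handful of regexes:

```python
import re

# Illustrative patterns only -- real PII detection needs a dedicated
# tool, not a handful of regexes. The booking-reference format is
# invented for this example.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "BOOKING_REF": re.compile(r"\bBK-\d{6,}\b"),  # hypothetical format
}

def redact(text: str) -> str:
    """Replace anything matching a known PII pattern with a placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Guest jane.doe@example.com, booking BK-204881, phone +44 20 7946 0958"))
# -> Guest [EMAIL], booking [BOOKING_REF], phone [PHONE]
```

Even then, redaction only shrinks the exposure – it doesn’t make a public tool compliant.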
If hotels want to leverage AI effectively and responsibly, the solution lies in private or internal LLMs. These can be fine-tuned on, or grounded in, a hotel’s own data, within a secure environment that keeps guest information private. While they may require more setup and investment, they offer far greater control and customisation – without compromising compliance.
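What this looks like in practice varies, but the defining property is simple: prompts, and the guest data inside them, never leave infrastructure you control. Here’s a minimal sketch assuming a self-hosted model served inside your own network through an OpenAI-compatible API – the hostname, port, and model name are placeholders, not a recommendation:

```python
import requests

# Placeholder internal endpoint: a self-hosted model (served, for
# example, via vLLM or llama.cpp) exposing an OpenAI-compatible API
# inside your own network, so prompts stay on infrastructure you control.
INTERNAL_LLM_URL = "http://llm.hotel.internal:8000/v1/chat/completions"

def summarise_guest_segment(notes: str) -> str:
    """Ask the in-house model for a summary; data stays on-premises."""
    response = requests.post(
        INTERNAL_LLM_URL,
        json={
            "model": "in-house-model",  # placeholder model name
            "messages": [
                {"role": "system", "content": "You are an analyst for our hotel group."},
                {"role": "user", "content": f"Summarise the behaviour patterns in:\n{notes}"},
            ],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```

Because the endpoint lives on your own network, the access controls, logging, and retention policies that already govern your guest-data estate apply to the model as well.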
Know the limits, but embrace the control
No AI solution is perfect. Internal models still require thoughtful governance and monitoring. But when guest data stays inside your own ecosystem, you retain ownership and oversight. That’s a game-changer for both compliance and customer trust.
Technology alone isn’t enough. Hotels need a solid data foundation and staff who understand both the capabilities and boundaries of AI. That means investing in training, partnering with data protection experts, and establishing clear internal policies. Tools like Microsoft Copilot, integrated securely into your own tech stack, offer a much safer route than pasting data into public web interfaces.
Don’t panic – but do act
If your team has already used public LLMs with guest data, don’t sweep it under the rug. Conduct a privacy audit, talk to a law firm specialising in data compliance, and implement safeguards immediately. You may not need to self-report, but you do need to act, fast. It’s never too late to bring structure and protection back into the process – for example, by centralising guest data in secure, compliant environments and establishing strong governance frameworks to manage access and use.
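Governance sounds abstract, but it can start small. As a toy illustration of what managing access and use might mean in practice – the roles and data categories below are entirely hypothetical – every AI-bound request can pass through a gate that checks authorisation and leaves an audit trail:

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("ai_audit")

# Hypothetical role-based allow-list: which roles may send which data
# categories to the internal LLM. Adjust to your own policy.
ALLOWED = {
    "marketing_analyst": {"aggregated_stats"},
    "data_team": {"aggregated_stats", "pseudonymised_history"},
}

def authorise_ai_request(user: str, role: str, data_category: str) -> bool:
    """Gate and audit every AI-bound request before data leaves the app."""
    permitted = data_category in ALLOWED.get(role, set())
    audit_log.info(
        "%s | user=%s role=%s category=%s permitted=%s",
        datetime.now(timezone.utc).isoformat(), user, role, data_category, permitted,
    )
    return permitted

# Example: a marketing analyst asking to send raw guest records is refused.
if authorise_ai_request("a.smith", "marketing_analyst", "raw_guest_records"):
    print("Request allowed - proceed to the internal LLM.")
else:
    print("Blocked: this data category may not leave the guest-data platform.")
```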
The truth is, there’s currently no global standard for how consumer data should be used with AI tools. That’s unacceptable. We’re calling on industry associations, leading hotel groups, and regulators to come together to create clear, enforceable guidelines. Let’s not wait for a scandal to force change. The hospitality industry must lead by example and demand accountability – from tech providers and from ourselves.
It’s time for urgent, coordinated action. We call on bodies like the World Travel & Tourism Council, HOTREC, and AHLA to establish a cross-industry task force to define best practices and push for global regulation. Until those standards arrive, hoteliers must take the lead in building compliant, future-ready infrastructures – because if we don’t protect guest data today, we’ll all be paying the price tomorrow.