Travel personalisation is built on a simple exchange: the traveller shares data (preferences, history, behaviour), and the travel company delivers a more relevant experience (curated recommendations, personalised pricing, tailored offers). AI has made this exchange dramatically more powerful. A modern personalisation engine can analyse booking history, browsing behaviour, loyalty data, and contextual signals to predict what a traveller wants before they articulate it.
The EU AI Act, fully applicable from August 2026, introduces regulatory structure to this exchange. It does not ban personalisation. It classifies it, imposes transparency requirements, and creates compliance obligations that travel companies must meet. Understanding where the limits and openings lie is essential for any travel organisation operating in Europe.
How the AI Act Classifies Travel Personalisation
The AI Act uses a risk-based classification framework. Travel personalisation systems fall primarily into the "limited risk" category, which triggers transparency obligations but not the full compliance burden of high-risk systems.
Personalisation engines and product recommendation systems that personalise the user experience are classified as limited risk. This means they are subject to Article 50 transparency obligations: users must be informed when they are interacting with an AI system or when AI-generated content is being presented to them.
This is the opening. Travel personalisation is not classified as high-risk. It does not require risk management systems, conformity assessments, or the extensive technical documentation that high-risk systems demand. A hotel recommendation engine, a personalised itinerary builder, or a tailored ancillary offer system can operate under the limited risk framework with transparency obligations as the primary compliance requirement.
The limits appear in two areas.
AI-driven pricing that affects access. If a personalisation system adjusts pricing based on individual characteristics in a way that could affect access to essential services, the classification may shift. A dynamic pricing system that systematically charges higher prices to specific demographic groups, or that exploits information asymmetry to the consumer's disadvantage, invites scrutiny from national consumer protection authorities even if the AI Act itself classifies it as limited risk.
Profiling and automated decision-making under GDPR. The AI Act does not replace GDPR. Travel personalisation that involves profiling (building a behavioural profile of an individual from their data) triggers GDPR obligations that operate alongside the AI Act. Article 22 of GDPR gives individuals the right not to be subject to decisions based solely on automated processing that produce legal or similarly significant effects. For travel, "similarly significant effects" could include personalised pricing that materially affects the price a consumer pays, or algorithmic curation that effectively limits the options a consumer can see.
The interaction between the AI Act and GDPR creates a compliance surface that is broader than either framework alone. A travel personalisation system must comply with the AI Act's transparency requirements and GDPR's profiling and automated decision-making rules simultaneously.
The Transparency Obligation in Practice
Article 50 of the AI Act requires that AI systems intended to interact directly with individuals be designed so that those individuals are informed they are engaging with an AI system. The Commission published a draft "Code of Practice on marking and labelling of AI-generated content" to support compliance, with a plan to finalise it by June 2026.
For travel companies, this means:
Chatbots and AI concierges must identify themselves. Any customer-facing AI system, whether it is a booking assistant, a trip planner, or a customer service agent, must inform the user that they are interacting with AI. This is straightforward to implement (a disclosure statement at the beginning of the interaction), but the phrasing matters: it must be clear and understandable, not buried in terms and conditions.
Personalised recommendations should be labelled. When a hotel booking platform presents "Recommended for you" options that are generated by an AI system, the user should understand that these recommendations are AI-generated and are based on their data. This does not mean disclosing the full algorithm. It means indicating that AI is involved in the recommendation and giving the user enough information to understand why these options were selected.
AI-generated content must be marked. If the personalisation system generates descriptions, review summaries, or marketing content using AI, this content must be identifiable as AI-generated. The draft Code of Practice specifies marking standards that travel companies should monitor and prepare for.
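As an illustration only, the three obligations above can be sketched in code: surface a disclosure as the first message of any AI interaction, and attach an AI-generated label plus a plain-language basis to recommendations. The names and message wording here are assumptions for the sketch, not prescribed by the Act or the draft Code of Practice:

```python
from dataclasses import dataclass

# Hypothetical disclosure text; actual wording should be reviewed for clarity
# and understandability, as Article 50 requires.
AI_DISCLOSURE = (
    "You are chatting with an AI assistant. "
    "Responses are generated automatically."
)

@dataclass
class Recommendation:
    """A labelled recommendation: marked as AI-generated with a short basis."""
    title: str
    ai_generated: bool
    explanation: str

def label_recommendations(titles, basis):
    """Mark each recommendation as AI-generated and say what it is based on."""
    return [
        Recommendation(
            title=t,
            ai_generated=True,
            explanation=f"Selected by AI based on {basis}",
        )
        for t in titles
    ]

def start_chat_session(first_bot_message):
    """Prepend the disclosure so it is the first thing the user sees,
    not buried in terms and conditions."""
    return [AI_DISCLOSURE, first_bot_message]
```

The design point is that disclosure and labelling are properties of the interaction layer, so they can be enforced centrally rather than re-implemented per feature.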
The Strategic Openings
The regulatory framework creates openings for travel companies that approach personalisation with transparency and consent at the centre of their design.
Consent-based personalisation as a competitive advantage. Travellers who understand what data is being used, how AI is personalising their experience, and what controls they have are more likely to engage and share more data. Transparent personalisation builds trust. Opaque personalisation erodes it. The travel companies that make their AI personalisation understandable and controllable will generate richer data and more engaged customers than those that personalise without explanation.
In the travel technology work I have been involved in, the firms that moved to transparent, consent-first personalisation saw a 15 to 20% increase in data sharing consent rates compared to their previous opaque approach. The counterintuitive finding: when you explain what the AI does and let the customer control it, they share more, not less.
Loyalty data as a consented personalisation foundation. Loyalty programme members have explicitly opted into a data relationship. This consented data foundation is more defensible under both GDPR and the AI Act than data collected through passive tracking. Travel companies with strong loyalty programmes have a structural advantage in the new regulatory regime: they have a data foundation that is consented, rich, and continuously refreshed.
Personalisation for operational efficiency, not just marketing. The AI Act's transparency obligations apply primarily to customer-facing AI. Personalisation used for operational purposes (predicting check-in times to optimise staffing, forecasting F&B demand to reduce waste, anticipating maintenance needs based on occupancy patterns) operates with fewer direct transparency obligations because it does not interact with individuals. This creates an opening for travel companies to deploy AI personalisation for operational efficiency with lower compliance overhead.
Practical Steps
First, classify every AI personalisation system against the AI Act's risk tiers. Most travel personalisation will be limited risk, but pricing systems, profiling systems, and any system that makes decisions with significant effect on individuals require closer examination.
Second, implement Article 50 transparency obligations. Label AI interactions, mark AI-generated content, and explain AI-driven recommendations. The June 2026 Code of Practice will provide specific marking standards.
Third, align GDPR profiling compliance with AI Act transparency. Run them as a single compliance exercise, not two separate workstreams. The overlap is substantial and the governance should be integrated.
Fourth, redesign consent flows. Move from "accept all cookies" to genuine, understandable explanations of how AI personalises the experience and what controls the customer has. This is a compliance requirement and a commercial opportunity.
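The first step, classifying every system against the risk tiers, can be run as a simple inventory triage before legal review. The sketch below is a first-pass heuristic only; the field names and tier assignments are assumptions for illustration, not legal determinations:

```python
from enum import Enum

class RiskTier(Enum):
    MINIMAL = "minimal"   # no specific AI Act obligations identified
    LIMITED = "limited"   # Article 50 transparency obligations apply
    HIGH = "high"         # closer examination and legal review required

def classify(system: dict) -> RiskTier:
    """First-pass triage of an AI personalisation system.

    Field names are hypothetical inventory attributes; any HIGH result
    should be escalated to counsel rather than treated as final.
    """
    # Pricing or profiling decisions with significant effects on
    # individuals need closer examination.
    if system.get("significant_effect_decisions", False):
        return RiskTier.HIGH
    # Customer-facing systems trigger Article 50 transparency obligations.
    if system.get("customer_facing", False):
        return RiskTier.LIMITED
    # Purely operational systems (staffing, demand forecasting) carry
    # fewer direct transparency obligations.
    return RiskTier.MINIMAL
```

Running this across a full inventory of AI systems gives a defensible starting record for the integrated AI Act and GDPR compliance workstream described above.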
The travel companies that treat the AI Act as a constraint will comply at minimum cost and minimum benefit. Those that treat it as a design specification for trustworthy personalisation will build stronger customer relationships and richer data foundations. The regulatory framework rewards transparency. The smart strategy is to lean into that, not work around it.
*To discuss how the 90-Day AI Acceleration programme can help your travel organisation navigate AI Act personalisation compliance, contact the Value Institute.*
