The most honest way to describe where enterprise AI stands today is this: it is no longer experimental, but it is not yet simple. What is emerging is not a clean transition, but a messy, deeply operational shift that touches everything from budgeting to org structure to infrastructure.
In a recent series of conversations with IT and AI leaders across banking, media, retail, healthcare, consulting, tech, and sports, Aaron Levie, CEO of Box.com, captured this transition with great clarity. His observations read less like hype and more like field notes from companies actually trying to make AI work inside complex organizations.
He starts with a structural shift that is easy to underestimate. Enterprises are moving beyond what he describes as the “chat era” of AI. The focus is no longer on interfaces that answer questions, but on systems that act. As he puts it, companies are now deploying “agents that use tools, process data, and start to execute real work in the enterprise.” This is not just an incremental step. It changes what software is expected to do. Instead of supporting workflows, it begins to perform them.
That shift is forcing companies to rethink how they adopt technology. The early phase of AI adoption often followed what Levie calls a “let a thousand flowers bloom” approach. Teams experimented freely, trying different tools and use cases without much coordination. That phase is ending. Enterprises are now narrowing their focus toward “targeted automation efforts applied to specific areas of work and workflow.” In other words, AI is moving from curiosity to infrastructure.
But infrastructure introduces friction. One of the most consistent themes Levie highlights is that change management is becoming the central challenge. Most enterprise workflows “aren’t setup to just drop agents directly in,” which means organizations need both internal coordination and external support. In some cases, this is leading to entirely new structures. One company he mentions has “a head of AI in every business unit that rolls up to a central team,” just to keep efforts aligned. That alone signals how seriously AI is being operationalized.
At the same time, a more unexpected constraint is emerging: compute economics. Levie describes companies going through “very real trade-off discussions right now on how to budget for tokens.” In traditional enterprise IT, budgets were allocated annually and relatively predictably. AI breaks that model. Usage can scale quickly, and costs are tied directly to activity. One company even experimented with a “shark tank style” internal process for allocating compute budget. This is a new kind of resource planning, where tokens become as strategic as headcount or capital expenditure.
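The shift from fixed annual budgets to usage-based compute spending can be sketched with a toy estimate. The function and all prices and volumes below are hypothetical illustrations, not figures from the conversations Levie describes:

```python
# Toy model of usage-based AI budgeting: cost scales with activity,
# unlike a fixed annual license. All prices and volumes are hypothetical.

def monthly_token_cost(requests_per_day: int,
                       tokens_per_request: int,
                       price_per_million_tokens: float,
                       days: int = 30) -> float:
    """Estimate monthly spend for one workflow, in dollars."""
    total_tokens = requests_per_day * tokens_per_request * days
    return total_tokens / 1_000_000 * price_per_million_tokens

# A back-office document workflow: 5,000 requests/day at ~4,000 tokens
# each, priced at an assumed blended $5 per million tokens.
cost = monthly_token_cost(5_000, 4_000, 5.0)
print(f"${cost:,.0f}/month")  # doubles if usage doubles
```

The point of the sketch is the last comment: because spend is linear in activity, a successful rollout raises costs in a way annual budgeting never had to model, which is what makes internal allocation processes like the "shark tank" experiment necessary.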
Underneath all of this sits a more familiar problem: legacy systems. Levie notes that “fixing fragmented and legacy systems remain a huge priority,” with many enterprises still operating on decades-old architectures, whether on-premises or poorly modernized cloud systems. The implication is simple but critical. Agents are only as powerful as the systems they can access. If data is fragmented, agents are limited. So before AI can scale, infrastructure has to catch up.
Interestingly, despite all the noise around job displacement, Levie observes that “most companies are not talking about replacing jobs due to agents.” The dominant use cases are different. Companies are focusing on work they could not previously do, or could not prioritize. That includes automating back-office processes, upgrading software, and extracting insights from large volumes of documents. The emphasis is on growth and capability expansion, not just cost reduction.
This is also reshaping how software itself is evaluated. Levie points to the rise of “headless software” as a dominant requirement. Enterprises want systems that can operate across multiple agents, not tools locked into a single interface or ecosystem. Vendors that fail to support this flexibility risk being replaced. In a multi-agent world, interoperability becomes a core feature, not a nice-to-have.
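One way to picture the "headless" requirement is a capability exposed as a plain, describable function rather than a UI screen, so that any agent framework can discover and call it. The sketch below is a minimal illustration; every name in it is hypothetical, not a real vendor API:

```python
# Minimal sketch of "headless" software: a capability published as a
# described, callable tool so multiple agent frameworks can use it,
# instead of being locked behind one interface. All names are invented.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str          # what an agent reads to decide when to call it
    fn: Callable[..., dict]   # the capability itself, no UI attached

def summarize_contract(document_id: str) -> dict:
    # Stand-in for a real vendor back end.
    return {"document_id": document_id, "summary": "…"}

# The same registry can be handed to several different agents, which is
# the interoperability enterprises are now demanding from vendors.
registry = {
    t.name: t
    for t in [
        Tool(
            "summarize_contract",
            "Summarize a stored contract by its document ID.",
            summarize_contract,
        )
    ]
}

result = registry["summarize_contract"].fn("DOC-123")
```

The design choice worth noticing is that the interface is data (a name and a description) plus a function, with no assumption about which agent, or how many, will call it.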
And yet, for all this progress, there is a strong sense of uncertainty. Levie notes that companies are hesitant to standardize too early because “no one wants to get stuck in a paradigm that locks them into the wrong architecture.” This creates a paradox. The pace of innovation is forcing decisions, but the risk of choosing incorrectly is higher than ever. As a result, most enterprises are operating in a deliberately flexible, somewhat fragmented state.
One of his more subtle observations may be the most telling. Despite the promise of automation, “everyone is working more than ever before.” AI is not reducing workload yet. It is increasing it. Teams are experimenting, building, integrating, and learning all at once. The productivity gains may come later, but the investment phase is intense.
Finally, Levie highlights something that is often overlooked in the broader narrative. While AI is frequently framed as making complex tasks easier, the most powerful implementations are becoming more technical, not less. Concepts like MCP, CLIs, and system orchestration are not intuitive for most business users. This means that technical talent remains essential. Engineers may not be writing software in the traditional sense, but they are becoming the architects and operators of these new systems.
When you contrast this with the hotel industry, the gap becomes clear.
Hospitality is still heavily constrained by legacy infrastructure. Property management systems (PMS), central reservation systems (CRS), point-of-sale (POS) systems, and other core platforms are often fragmented, poorly integrated, and slow to evolve. The idea of agents seamlessly operating across workflows is, in most cases, still theoretical. Where AI is appearing, it is frequently driven by vendors rather than by internal demand from operators. That creates a different dynamic. Adoption is less about solving deeply understood operational problems and more about reacting to external innovation cycles.
There is also a structural difference in incentives. Many of the enterprise use cases Levie describes are tied to scale, data volume, and process complexity. Hotels operate differently. The operational layer is highly human, highly variable, and often constrained by physical realities. This does not mean AI will not transform hospitality. It will. But the timeline is likely longer, and the path more uneven.
What is happening in broader enterprise environments offers a preview rather than a blueprint. The direction is clear. Agents will move from assisting to executing. Systems will need to become interoperable. Compute will become a managed resource. And technical expertise will remain central.
But getting there is not a simple upgrade. It is a full rearchitecture of how work gets done. And for industries like hospitality, that rearchitecture has barely begun.