Three years after the launch of ChatGPT, artificial intelligence (AI) has reached adoption levels that are exceptional by historical standards. Nearly 800 million people have interacted with generative AI, a scale of engagement that took the internet more than a decade to achieve.
At the same time, hyperscalers are committing substantial capital to support this expansion across Graphics Processing Units (GPUs), data centres, power infrastructure and networking.
Beneath this spending, the technology itself continues to advance: task complexity is doubling roughly every seven months; multimodal use is increasing token intensity; and unit token costs are falling at close to a 10-fold annual pace, creating favourable conditions for application development.
But cracks are appearing. Monetisation continues to trail adoption, and many enterprise users remain in an experimental phase as they seek clearer productivity gains and return on investment.
Physical constraints are also emerging. Power availability and networking bottlenecks are limiting utilisation, leaving expensive computing capacity idle for significant portions of time.
In addition, the growing use of leases, vendor financing and private credit introduces layers of opacity into the funding structure that warrant careful monitoring.
While capital intensity is rising, the largest platforms continue to generate substantial cashflow, allowing them to fund investment internally without straining their balance sheets.
History provides useful context. Previous infrastructure cycles, from telegraphy to telecommunications to the internet, were marked by periods of overbuilding that ultimately proved constructive. Excess capacity reduced costs, accelerated adoption and eventually led to productivity gains.
Just as the internet has split into separate US and Chinese versions, two distinct AI ecosystems are emerging today. The US model is a high-cost innovation engine, fuelled by massive capital expenditure, premium GPUs and broad commercial experimentation.
Constrained by export controls, the Chinese model has evolved into a low-cost efficiency system, innovating around scarcity and achieving competitive benchmark performance at just 18% of US hyperscaler expenditure.
Its open-source ecosystem and deep partnerships across emerging market foundries, servers, memory and networking underpin a parallel AI supply chain.
China’s highly digitised economy is enabling even faster adoption and applications. The result is not one global AI future, but two architectures with different cost curves, capabilities and geopolitical implications.
The US leads in developing the most advanced technologies, while China leads in global installations, particularly outside America and its closest allies. Winning the AI race will depend not only on model capabilities but also on usage.
Just as electricity and the internet became ubiquitous, artificial intelligence is likely to do the same. The first phase of this cycle has been defined by the scale of enablers, compute and capital.
The next phase will be defined by use. As AI moves from models to agents, through agentic AI in software and embodied AI in hardware, it is no longer just analysing the world, but beginning to operate it.
This shift is already visible in autonomous systems on factory floors, in the low-altitude economy across airspace and increasingly in space, while also reshaping human interaction through digital companionship.
As this transition unfolds, value will migrate away from those who build the most advanced systems toward those who deploy them most effectively, both in the virtual and physical economy.
The outcome will be uneven across sectors, geographies and companies, particularly in a world where two distinct AI ecosystems are emerging and where adoption, not only invention, increasingly determines impact.
AI remains a long-duration theme, but treating it as a single trade or index exposure risks missing the point. Broad benchmarks will increasingly mask dispersion. The more durable opportunities lie in vertical applications, horizontal enterprise platforms and AI-native challengers that embed intelligence directly into workflows, products and physical systems.
Incumbents will benefit where they combine data, trust and distribution, but challengers, unburdened by legacy systems and organisational inertia, will often move faster.
Outside developed markets, technology diffusion will not follow a linear path; in many cases emerging economies will leapfrog rather than lag.
In that sense, the AI opportunity is not only large, it is selective, uneven, and still early. The first act rewarded scale. The second will reward judgment.
Jitania Kandhari is deputy CIO of the solutions and multi-asset group and co-lead manager for the Passport Equities strategy at Morgan Stanley Investment Management. The views expressed above should not be taken as investment advice.