
By Stantin Siebritz
In early 2026, the global technology sector is showing unmistakable signs of a memory-buying frenzy.
AI data centres are acquiring DRAM and high-bandwidth memory with the urgency of Desert Dash riders scrambling for the last bottle of water at a checkpoint.
This aggressive stockpiling is driving up component prices, tightening availability, and creating immediate consequences for everyone else in the value chain, from smartphone manufacturers to enterprise IT planners.
Yet a larger question looms. Are these massive investments grounded in long-term demand, or are we witnessing the creation of tomorrow's stranded infrastructure?
The growing sense of unease begins with competitive and financial reality catching up to the AI gold rush. The recent wobble in confidence surrounding OpenAI is not a story about one company’s missteps.
It is a signal that the economics of leadership in AI are far more fragile than the market narrative suggests. Advantages compress quickly, costs remain stubbornly high, and the moment enterprise clients begin asking where the return on investment sits, the hype-driven premium falls away.
When frontrunners stumble, the tremors spread instantly to chipmakers, equipment manufacturers, cloud providers and the investors underwriting these multibillion-dollar buildouts. The entire sector is discovering that the margin for error is far narrower than expected.
At the same time, the assumptions driving aggressive infrastructure expansion are beginning to drift out of alignment with how AI is actually evolving. For years, the industry relied on a simple scaling philosophy: add more compute, buy more memory, train larger models.
But efficiency is emerging as the real frontier. Researchers are shifting toward smarter inference strategies, more compact architectures and innovation paths that no longer default to building ever larger, memory-intensive data centres.
As this shift accelerates, the risk grows that facilities designed around yesterday’s scaling curve will no longer serve tomorrow’s computational needs. Infrastructure built too quickly on the wrong assumptions becomes a liability rather than a competitive asset.
Compounding this is the uncomfortable gap between benchmark performance and real-world capability. On paper, AI continues to post impressive scores. In practice, systems still struggle to execute messy, multi-step, end-to-end tasks without human oversight.
The Remote Labour Index, which evaluates AI against human performance on paid freelance work, reports failure rates above 96 percent for deliverable-grade outputs.
This is the difference between a model that dazzles in a controlled environment and one that can reliably handle the complexity of real workloads.
Unless that gap closes, the market will not generate the level of pervasive AI demand that current infrastructure planning assumes. It is one thing for presentations to imagine a future of AI woven into every workflow. It is another for clients to trust systems that cannot consistently finish the job.
This confidence mismatch has global consequences, particularly for regions like Namibia and the broader African market. As international buyers hoard memory and manufacturers raise prices, the ripple effects arrive quickly.
Devices become more expensive, SME upgrades slow down, and digitisation projects face new financial hurdles.
A global overbuild in AI infrastructure, driven by speculative expectations rather than validated demand, creates a drag on technology access where it is needed most. The result is not merely a correction in the sector but a macroeconomic ripple that deepens the digital divide.
The risk here is not that AI is an illusion or that its potential has been overstated. The real risk is subtler and more systemic. Too many players are acting as though the hardest challenges, such as reliability, autonomy, generalisation and cost-efficient deployment, have already been solved.
They have not. The future is still under construction, yet the world is investing as though the finish line has already been crossed. If the assumptions underpinning these trillion-dollar, memory-hungry data centres do not mature into durable, high-value AI use cases, the industry may find itself holding an astonishing number of very expensive white elephants.