Artificial Intelligence (AI) stands out as a powerful force with the potential to revolutionize industries from healthcare to finance. Amid the excitement, however, it is worth asking whether we are witnessing a rerun of the dotcom bubble of the late 1990s, an era of exaggerated expectations and speculative investment that often outpaced actual utility. Stewart Laing, CEO of Asanti Data Centre, asks what lessons we can draw from that period and how the data centre industry can thrive through this transformation.
AI has emerged as a catalyst for innovation, with applications such as generative AI and real-time language translation leading the way. Yet a significant portion of investment in AI-focused data centres appears to be driven more by hype than by genuine demand. The trend mirrors the dotcom era, when venture capital surged into technology start-ups that never delivered the profits they promised.
A critical distinction when assessing AI's infrastructure needs is between its two main phases: training and inference. Training an AI model requires substantial computational power, but it is a relatively short-term, one-off effort; once a model is trained, serving it (inference) demands far less infrastructure. Failing to acknowledge this distinction risks overbuilding infrastructure and wasting resources, particularly power and connectivity, which are already pressing constraints for the data centre sector.
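To make the distinction concrete, the sketch below is a minimal, purely illustrative example (a toy PyTorch model, chosen only for demonstration and not drawn from any particular deployment). Training loops repeatedly over an entire dataset, computing gradients and updating weights, while inference is a single forward pass per request.

```python
# Illustrative sketch only: a toy model showing why training and inference
# place very different demands on compute. PyTorch is assumed purely for
# demonstration; the article does not prescribe any framework or workload.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
data = torch.randn(10_000, 128)            # stand-in training set
labels = torch.randint(0, 10, (10_000,))
loss_fn = nn.CrossEntropyLoss()
optimiser = torch.optim.SGD(model.parameters(), lr=0.01)

# Training: repeated passes over the whole dataset, with gradient
# computation and weight updates -- sustained, compute-heavy work.
for epoch in range(5):
    for i in range(0, len(data), 256):
        batch, target = data[i:i + 256], labels[i:i + 256]
        optimiser.zero_grad()
        loss = loss_fn(model(batch), target)
        loss.backward()
        optimiser.step()

# Inference: one forward pass per request, with no gradients kept --
# far lighter per query, though total load scales with user demand.
with torch.no_grad():
    prediction = model(torch.randn(1, 128)).argmax(dim=1)
```

Even in this toy example the training loop dominates the compute time; the same asymmetry holds at production scale, although aggregate inference load still grows with the number of users served.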
Compounding this situation is the UK Government's AI Action Plan, which, while well-intentioned, appears to amplify the hype surrounding AI without addressing these essential nuances.
One key issue is the government's vague definition of AI and of its infrastructure requirements. As noted above, AI operates in two distinct phases with very different resource needs. The current proposal for one or two large AI data centres overlooks that distinction, producing a centralized strategy that may not match real-world demand.
Instead of concentrating resources in a handful of massive facilities, a more effective approach would be a distributed model built on regional and edge data centres. Bringing AI infrastructure closer to end-users would support essential services such as hospitals and schools while avoiding the inefficiencies of centralization. It could also stimulate local economies by working with existing data centres across the country, spreading the benefits rather than concentrating opportunity in a few large-scale sites.
By focusing on the practical needs of AI infrastructure rather than succumbing to hype, we can avoid the pitfalls of the past and build a future that aligns more closely with reality.
Although AI may seem new and revolutionary, it has been evolving for decades, with particularly rapid progress since 2012. Applications such as Alexa, facial recognition, and predictive analytics have been part of everyday life for years. What is new is the scale: the current surge in generative AI and large language models is driving deployment at an unprecedented level.
Speculative investment in AI infrastructure that is not aligned with actual requirements risks wasted resources and unmet expectations. The issue is particularly acute for the data centre sector, where ambitious AI projects must contend with two fundamental infrastructure challenges: power availability and network connectivity.
Power is the first concern. Data centres already consume around 2% of global electricity, a figure expected to rise as AI grows. In the UK, high energy costs compound the problem, making its data centres less competitive internationally.
Tapping into green energy sources, however, is hindered by inefficiencies in the UK's power generation and distribution systems. Unlike operators in countries with interconnected power grids, the UK data centre industry relies heavily on the National Grid, which raises costs and limits access to more efficient private-wire arrangements. Without scalable, direct power solutions, the cost of expanding AI applications will remain unsustainable.
The absence of a cohesive national energy strategy further complicates the picture. Scotland produces surplus green energy, yet regions in the South struggle to meet rising demand for power, and renewable generators are paid to curtail production, wasting taxpayer money. Addressing these inefficiencies matters not only for AI but for the growth of the data centre industry and the wider UK economy.
Alongside power, robust fibre optic connectivity is equally vital. Despite recent progress, the UK still lags in full-fibre deployment. New data centres can only be built where both power and fibre are available, and such sites are becoming scarce because infrastructure development is not coordinated. Combined with planning opposition to data centres in some regions, this holds back the advance of AI and data-driven industries in the UK.
Rather than competing with the existing data centre industry in the UK, the Government should collaborate with it to develop a sustainable and distributed model for AI infrastructure. By focusing on supporting smaller, regional facilities, the UK can bolster businesses across the country, not just those located in designated ‘AI Growth Zones.’
The AI revolution is undoubtedly real, but so are the risks of hype and overinvestment. Both the industry and the Government must proceed cautiously, drawing lessons from past experiences to ensure that investments in AI infrastructure are grounded in reality rather than speculation.
Addressing foundational issues such as power, connectivity, and genuine demand is key to building an AI-driven future that delivers tangible benefits.