As artificial intelligence (AI) workloads become more demanding, legacy data centres are quickly showing their age. Built for smaller workloads and simpler computing tasks, these facilities now struggle to handle the energy-hungry, high-performance needs of AI, and they require substantial updates to support the intensive requirements of modern technology.
The Challenge of AI Workloads on Legacy Systems
In the past, data centres were designed to handle racks of servers drawing relatively little power, around 5-10 kW per rack. Today, AI-driven applications demand far more from infrastructure, with some racks now pushing hundreds of kilowatts. This shift makes the traditional designs of legacy data centres obsolete. To stay relevant, these facilities must undergo significant retrofitting to accommodate the next wave of high-powered workloads.
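To make that shift concrete, the short sketch below compares the total heat a cooling plant would have to remove for a hall of legacy racks versus the same number of high-density AI racks. The rack count and per-rack power figures are illustrative assumptions, not measurements from any particular facility.

# Illustrative comparison of legacy vs. AI rack power in the same hall.
# All figures are assumptions chosen to match the ranges cited above.

def hall_load_kw(rack_count: int, kw_per_rack: float) -> float:
    """Total IT load in kW, which is roughly the heat the cooling plant must remove."""
    return rack_count * kw_per_rack

legacy_load = hall_load_kw(rack_count=200, kw_per_rack=7.5)    # mid-point of 5-10 kW racks
ai_load = hall_load_kw(rack_count=200, kw_per_rack=120.0)      # assumed high-density AI rack

print(f"Legacy hall IT load: {legacy_load:,.0f} kW")
print(f"AI hall IT load:     {ai_load:,.0f} kW")
print(f"Increase:            {ai_load / legacy_load:.0f}x")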
Cooling Upgrades: From Air to Liquid
One of the most notable areas where legacy data centres fall short is cooling. Traditional air-based cooling systems were sufficient for older, less power-hungry servers. However, with the introduction of advanced AI servers and GPUs, these systems are increasingly ineffective and inefficient. Liquid cooling solutions, such as direct-to-chip cooling, have become the new standard. While overhauling cooling systems entirely can be expensive, introducing liquid cooling incrementally can significantly improve efficiency and reduce energy consumption.
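The gap between air and liquid is easiest to see with a first-order heat balance (heat removed equals mass flow times specific heat times temperature rise): water carries roughly four times more heat per kilogram than air and is far denser, so it needs only a small fraction of the volumetric flow. The sketch below runs that comparison for a single assumed 100 kW rack; the heat load and temperature rise are illustrative assumptions, not figures from the article.

# First-order comparison of the coolant flow needed to remove one rack's heat
# with air versus water. Uses heat = mass_flow * specific_heat * temp_rise.
# The rack load and temperature rise are illustrative assumptions.

RACK_HEAT_KW = 100.0      # assumed heat load of one high-density rack
DELTA_T_K = 12.0          # assumed coolant temperature rise across the rack

AIR_CP = 1.005            # specific heat of air, kJ/(kg*K)
AIR_DENSITY = 1.2         # kg/m^3
WATER_CP = 4.18           # specific heat of water, kJ/(kg*K)
WATER_DENSITY = 998.0     # kg/m^3

def mass_flow_kg_s(heat_kw: float, cp: float, delta_t: float) -> float:
    """Mass flow (kg/s) required to carry heat_kw at the given temperature rise."""
    return heat_kw / (cp * delta_t)

air_flow = mass_flow_kg_s(RACK_HEAT_KW, AIR_CP, DELTA_T_K)
water_flow = mass_flow_kg_s(RACK_HEAT_KW, WATER_CP, DELTA_T_K)

print(f"Air:   {air_flow:.2f} kg/s  (about {air_flow / AIR_DENSITY:.1f} m^3/s of airflow)")
print(f"Water: {water_flow:.2f} kg/s  (about {water_flow / WATER_DENSITY * 1000:.1f} litres/s)")

On these assumed numbers, air needs several cubic metres per second of airflow per rack while water needs only a couple of litres per second, which is why direct-to-chip loops scale where air handlers cannot.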
Retrofitting for the AI Age
Rather than completely replacing legacy infrastructure—which would be prohibitively expensive—many data centre operators are choosing to retrofit their existing facilities. By upgrading rather than rebuilding, operators can extend the life of older plants and make them capable of supporting AI workloads. This approach offers a balance between cost and performance, avoiding the huge capital expenses that come with starting from scratch.
Retrofitting often involves reinforcing the physical infrastructure of a building. Older data centres may not have been built to support the weight of the heavy AI racks now required. To make these upgrades feasible, floors can be reinforced, and cabinets expanded to accommodate newer, larger servers. These adjustments allow older facilities to handle the advanced, energy-intensive equipment needed for AI computing.
Addressing New Technology Demands
According to Jordi Sinfreu, Head of Data Centres for Southern Europe at JLL, higher-density racks demand not only stronger physical structures but also different cooling solutions. As AI, edge computing, and hyperscale systems require more power and generate more heat, traditional air cooling systems can no longer keep up. Liquid cooling methods are better equipped to handle these needs, and data centre designs must evolve to integrate these newer technologies effectively.
Training staff is also crucial in this transition. Many legacy data centres were never designed with AI in mind, and operators may need additional education on how to manage and maintain upgraded systems. Legacy monitoring systems, for instance, may lack the sophistication required to track GPU utilization, latency, and other key performance indicators essential for AI operations. Updated systems with greater visibility can help identify potential bottlenecks and inefficiencies, making it easier to optimize performance.
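As an illustration of the kind of visibility involved, the sketch below polls per-GPU utilization, power draw, and temperature using pynvml, the Python bindings to NVIDIA's NVML library. It assumes the nvidia-ml-py package and NVIDIA drivers are available, and it leaves out how the readings would be exported to a dashboard or alerting system.

# Minimal GPU telemetry poll using pynvml, the nvidia-ml-py bindings to NVML.
# Assumes NVIDIA drivers and the nvidia-ml-py package are installed; exporting
# the readings to a monitoring backend is deliberately left out of this sketch.
import time

import pynvml

def poll_gpus(interval_s: float = 10.0) -> None:
    pynvml.nvmlInit()
    try:
        count = pynvml.nvmlDeviceGetCount()
        while True:
            for i in range(count):
                handle = pynvml.nvmlDeviceGetHandleByIndex(i)
                util = pynvml.nvmlDeviceGetUtilizationRates(handle)       # percent busy
                power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000   # milliwatts -> watts
                temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
                print(f"gpu{i}: util={util.gpu}% mem={util.memory}% "
                      f"power={power_w:.0f}W temp={temp_c}C")
            time.sleep(interval_s)
    finally:
        pynvml.nvmlShutdown()

if __name__ == "__main__":
    poll_gpus()

In practice these readings would feed the operator's existing monitoring platform rather than being printed to a console; the point is simply that AI-era metrics go beyond what many legacy monitoring tools were built to collect.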
Environmental Benefits of Retrofitting
While retrofitting presents technical challenges, it also offers significant environmental advantages. Instead of demolishing and rebuilding entire data centres—an extremely carbon-intensive process—retrofitting can reduce waste and energy consumption. With data centre energy use and carbon emissions coming under increasing regulatory scrutiny, updating existing facilities rather than constructing new ones can help companies reduce their overall carbon footprint.
Retrofitting also opens up opportunities to adopt newer, more sustainable technologies. For example, immersion cooling systems, which use liquid to cool servers rather than air, can dramatically improve energy efficiency. These systems are a prime example of how retrofitting can make legacy data centres more energy-efficient while reducing operational costs in the long run.
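One common way to quantify that gain is Power Usage Effectiveness (PUE), the ratio of total facility energy to the energy consumed by IT equipment. The sketch below estimates the annual energy a retrofit could save by lowering PUE; the IT load and the before-and-after PUE values are illustrative assumptions rather than figures from any particular site.

# Estimate the annual facility energy saved by a cooling retrofit, using
# PUE = total facility energy / IT equipment energy.
# The IT load and PUE figures are illustrative assumptions.

IT_LOAD_KW = 2000.0     # assumed average IT load
PUE_BEFORE = 1.7        # assumed legacy, air-cooled facility
PUE_AFTER = 1.2         # assumed after a liquid/immersion cooling retrofit
HOURS_PER_YEAR = 8760

def annual_energy_mwh(it_load_kw: float, pue: float) -> float:
    """Total facility energy per year in MWh for a given IT load and PUE."""
    return it_load_kw * pue * HOURS_PER_YEAR / 1000

before = annual_energy_mwh(IT_LOAD_KW, PUE_BEFORE)
after = annual_energy_mwh(IT_LOAD_KW, PUE_AFTER)

print(f"Before retrofit: {before:,.0f} MWh/year")
print(f"After retrofit:  {after:,.0f} MWh/year")
print(f"Saved:           {before - after:,.0f} MWh/year ({(before - after) / before:.0%})")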
Preparing for the Future of AI
While fully transforming legacy data centres may not always be feasible, targeted retrofitting can make a significant difference. By focusing on the areas that will have the greatest impact—such as improving cooling systems, modernizing physical storage, and upgrading energy management systems—older data centres can be made fit for the AI-driven future. These retrofits not only help companies stay competitive but also contribute to sustainability goals by reducing the carbon footprint of existing infrastructure.
In the end, retrofitting legacy data centres is about more than just keeping up with AI demands. It’s about adapting older systems to meet the needs of a rapidly evolving technological landscape while reducing environmental impact. For many organizations, this strategy offers the best way forward.