How to design and build a data center for the new AI era.
Published by Sustainability Magazine, 4 December 2024
Data centres are now being designed and developed in new and innovative ways to accommodate demanding AI workloads and greater networking capabilities.
When it comes to the data centre sector, artificial intelligence (AI) continues to dominate the headlines. From optimising workloads to improving customer satisfaction, the technology has quickly been touted as an integral solution for the next era of data centre operation.
However, AI is already putting data centres under strain, particularly where energy consumption is concerned. As a result, businesses within the sector are being forced to design and build facilities in new ways.
With insights from atNorth and other industry leaders, we look at how data centre companies can design and construct new facilities to accommodate new and disruptive technologies.
Confronting a new era
The vast majority of data centres in operation today were not designed to support the high power requirements of AI-led workloads. The infrastructure these workloads demand differs from that of traditional data centres: it generates levels of heat that current facilities cannot remove fast enough. AI workloads also require almost-instant processing of vast amounts of data, which in turn consumes a significant amount of energy. This means that data-intensive businesses will be looking for more modern sites that are designed specifically for AI.
“A data centre configured for typical enterprise applications might require 7–10 kilowatts (kW) of power per rack. But for AI, the power requirement increases to more than 30kW per rack,” says Anna Kristín Pálsdóttir, Chief Development Officer at atNorth. “As a result, legacy data centre campuses are having to be upgraded – not just to accommodate the digital infrastructure associated with AI workloads but also to allow for significant cooling systems, power distribution units (PDUs), generators, and uninterruptible power supplies (UPS).”
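To put those per-rack figures in context, here is a minimal back-of-the-envelope sketch in Python. The 10kW and 30kW values come from the quote above; the rack count is a hypothetical hall size used purely for illustration, not an atNorth figure.

```python
# Illustrative sketch only: the rack count is an assumption, not an atNorth figure.
# The 10 kW and 30 kW per-rack values are those quoted in the article.

ENTERPRISE_KW_PER_RACK = 10   # upper end of the quoted 7-10 kW enterprise range
AI_KW_PER_RACK = 30           # "more than 30 kW per rack" for AI
RACKS = 200                   # hypothetical hall size

enterprise_it_load_kw = ENTERPRISE_KW_PER_RACK * RACKS
ai_it_load_kw = AI_KW_PER_RACK * RACKS

print(f"Enterprise hall IT load: {enterprise_it_load_kw / 1000:.1f} MW")
print(f"AI hall IT load:         {ai_it_load_kw / 1000:.1f} MW")
print(f"Increase factor:         {ai_it_load_kw / enterprise_it_load_kw:.1f}x")
```

For these assumed numbers, the same 200-rack hall jumps from roughly 2MW to 6MW of IT load, which is why cooling, PDUs, generators and UPS capacity all have to be rethought alongside the racks themselves.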
Questions of place (and space)
AI is extremely computationally intensive, which drives up power and cooling demands. At the same time, training and inference workloads do not need the same levels of uptime that traditional cloud computing requires, which could open up opportunities around space, efficiency and cost.
This is why location is a strategic consideration in data centre construction. Already, operators are having to move AI workloads closer to the network edge in order to handle large data volumes.
“Traditionally, data centres were situated on-prem or close to their main business location,” Anna explains. “This is still sometimes necessary for sensitive data or regulatory compliance, yet it is no longer necessary or even advantageous in many cases. Most AI workloads do not require very low-latency networks, making them location agnostic, and as a result there will be a shift away from metro sites to areas that can better meet the needs of AI workloads.”
She cites the Nordics as an example: “Regions with a cool and consistent climate can utilise more energy-efficient cooling technologies such as natural air cooling or Direct Liquid Cooling (DLC). These techniques significantly lower the PUE of data centres, delivering a 33% improvement in energy efficiency and significant carbon reductions, particularly when combined with Iceland’s use of renewable energy.”
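The link between PUE and that efficiency figure can be sketched with simple arithmetic: total facility energy is roughly IT energy multiplied by PUE. The PUE values below (1.8 for a conventional air-cooled site, 1.2 with free or liquid cooling) are illustrative assumptions chosen to be consistent with the quoted 33% figure, not atNorth's published numbers.

```python
# A minimal sketch of how a PUE change maps to a ~33% energy saving.
# The specific PUE values (1.8 legacy, 1.2 with natural air cooling / DLC)
# are illustrative assumptions, not figures from the article.

def total_facility_energy(it_energy_kwh: float, pue: float) -> float:
    """Total site energy: IT load plus cooling and other overhead, via PUE."""
    return it_energy_kwh * pue

IT_ENERGY_KWH = 1_000_000          # hypothetical annual IT energy
LEGACY_PUE = 1.8                   # assumed conventional air-cooled site
NORDIC_PUE = 1.2                   # assumed with natural air cooling / DLC

legacy_total = total_facility_energy(IT_ENERGY_KWH, LEGACY_PUE)
nordic_total = total_facility_energy(IT_ENERGY_KWH, NORDIC_PUE)
saving = 1 - nordic_total / legacy_total

print(f"Legacy site total:  {legacy_total:,.0f} kWh")
print(f"Nordic site total:  {nordic_total:,.0f} kWh")
print(f"Energy saving:      {saving:.0%}")   # ~33% for these assumed PUEs
```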
“The Nordic countries also practise circular economy principles and so it is possible to employ heat reuse technology to recycle waste heat from data centre sites.”