Schneider: Transform Old Data Centres into AI Hubs

Schneider Electric's technologies and solutions have been deployed at Start Campus in Portugal (Credit: Schneider Electric)
Schneider's advocate highlights faster permitting, lower latency, liquid cooling upgrades and ESG gains as brownfields meet surging compute demands

As AI transitions from experimental technology to business-critical infrastructure, the data centre industry faces a fundamental challenge. How can operators meet surging demand for AI-ready facilities without the decade-long timelines and enormous capital requirements of greenfield construction?

For Steve Carlini, Chief Advocate for Data Centre and AI at Schneider Electric, the answer lies not in building from scratch but in transforming what already exists. Retrofitting existing data centres for AI workloads represents more than a pragmatic solution to capacity constraints – it is a democratising force that could reshape the competitive landscape.

The shift highlights opportunities for construction firms to upgrade existing facilities with new power systems, liquid cooling and other infrastructure modifications, while aligning with UK ESG goals by cutting embodied carbon.


While hyperscalers pursue massive greenfield developments, small and medium-sized operators are discovering that brownfield sites offer a faster, more strategic path to AI capability. With the right strategic approach to power distribution, liquid cooling and infrastructure upgrades, yesterday’s traditional data centres can become tomorrow’s AI factories.

The strategic advantage of brownfield sites

The appeal of retrofitting existing facilities extends far beyond simple economics. Steve identifies several compelling advantages that make brownfield sites particularly attractive for small and medium-sized operators entering the AI space.

“Sites that exist will already be permitted as a data centre and can circumvent any lengthy permitting cycle,” he explains. “Additionally, many existing data centres may be strategically located close to the data sources or applications, which can have significant advantages in faster speed, lower latency and lower data transfer costs.”

This proximity advantage becomes increasingly critical as AI applications evolve from centralised training models to distributed inference workloads. Businesses deploying agentic AI and real-time analysis tools need processing power located near their operations, not thousands of miles away in a hyperscale facility.
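
To put rough numbers on the proximity argument, the sketch below estimates best-case round-trip latency from fibre distance alone, using the commonly quoted figure of roughly 200 km per millisecond for light in optical fibre. The distances and speed figure are illustrative assumptions, not figures from Schneider Electric, and real-world latency will be higher once routing and processing overheads are added.

```python
# Rough round-trip network latency from fibre distance alone.
# Assumes light travels ~200,000 km/s in optical fibre (about 2/3 of c);
# real-world latency is higher once routing hops and queuing are added.

FIBRE_KM_PER_MS = 200.0  # approximate propagation speed in fibre

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time for a given one-way fibre distance."""
    return 2 * distance_km / FIBRE_KM_PER_MS

for km in (50, 500, 2000):
    print(f"{km:>5} km away -> ~{round_trip_ms(km):.1f} ms round trip (propagation only)")
```

At 50 km the propagation delay is around half a millisecond; at 2,000 km it is roughly 20 ms before any processing begins, which is why latency-sensitive inference favours nearby facilities.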

Steve Carlini, Chief Advocate for Data Centre and AI at Schneider Electric

Identifying the right sites for retrofitting

Not every existing data centre makes a suitable candidate for AI transformation. Steve outlines a clear hierarchy of priorities when evaluating potential retrofit projects, and the availability of power tops the list.

“Sites that have an abundance of utility power are pure gold as accelerated compute AI requires more power,” Steve notes. “Second are sites where more utility capacity can be added quickly.”

The soaring power demands of AI workloads mean that sites with generous utility connections, or the ability to secure additional capacity quickly, become invaluable assets. However, location considerations extend beyond power availability.

Steve also advises “picking sites that are isolated from neighbourhoods and highly populated areas”.

“These areas can be difficult for the new breed of data centres with more generators and chillers that can make quite a bit of noise and can trigger complaints.”


Making data centre retrofits financially viable

The business case for retrofitting hinges on the revenue potential of AI. Steve sees a clear path to return on investment as AI moves from experimental pilot projects to production applications.

“Many data centre operators would like to add accelerated compute AI and run applications for in-house automation, or offer pay-for-use AI models,” he explains. “The monetisation of AI working models or inference is the next big wave. It is just ramping up as AI pilot models are transitioning to working models.”

Overcoming technical obstacles

The technical challenges of retrofitting for AI are substantial, requiring comprehensive upgrades to both power and cooling infrastructure. Steve identifies the electrical distribution system as the first major hurdle.

“On the power side, the main issue is grid power,” he states. “The second issue will be the entire power distribution inside the data centre. Traditional data centres were designed for lower power densities or distributed workloads. AI workloads demand concentrated power delivery, which may require upgrades to PDUs, medium-voltage switchgear, low-voltage switchgear, transformers, circuit breakers and busways or cabling.”
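
To illustrate why concentrated power delivery stresses legacy distribution, the back-of-envelope calculation below compares the line current drawn by racks of different power levels on a 415 V three-phase supply. The rack powers, voltage and power factor are illustrative assumptions rather than Schneider Electric specifications.

```python
import math

# Approximate line current drawn by a rack on a 415 V three-phase supply.
# Rack power figures are illustrative: legacy racks are often under 10 kW,
# while densely packed AI racks can run an order of magnitude higher.

VOLTAGE_LL = 415.0      # line-to-line voltage, volts (assumed)
POWER_FACTOR = 0.95     # assumed power factor

def line_current_amps(rack_kw: float) -> float:
    """Three-phase line current: I = P / (sqrt(3) * V_LL * PF)."""
    return rack_kw * 1000 / (math.sqrt(3) * VOLTAGE_LL * POWER_FACTOR)

for rack_kw in (8, 40, 100):
    print(f"{rack_kw:>3} kW rack -> ~{line_current_amps(rack_kw):.0f} A per phase")
```

An 8 kW rack draws on the order of 12 A per phase, while a 100 kW rack draws closer to 150 A, which is why breakers, busways and cabling sized for the former typically need replacing.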

The cooling challenge proves equally complex. Modern AI servers have fundamentally different requirements than traditional IT equipment.

“On the cooling side, most next-generation AI servers are natively liquid-cooled and come with integrated cold water inlet and hot water outlet connections,” Steve explains. “These are not optional – they are required for operation.”

Schneider Electric solutions seen in the Start Campus data centre (Credit: Schneider Electric)

Environmental benefits of retrofitting

Beyond operational advantages, retrofitting offers meaningful sustainability gains. Steve highlights two key areas where AI-ready retrofits can enhance environmental performance.

“Closed loop liquid-cooling systems use much less water than traditional cooling,” he points out. “Additionally, a new generation of 800VDC electrical power distribution to the AI servers will use less current and produce less heat in the future.”
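
The current and heat claim follows from basic electrical relationships: for a fixed power, current falls in proportion to voltage, and resistive (I²R) losses fall with the square of the current. The sketch below uses an arbitrary conductor resistance purely for comparison; it is a simplified illustration, not a model of any specific 800VDC product.

```python
# Why higher-voltage DC distribution reduces current and resistive heat.
# For a fixed delivered power P: I = P / V, and conductor heat ~ I^2 * R.
# The resistance and power values are arbitrary, chosen only for comparison.

CONDUCTOR_R_OHMS = 0.002  # illustrative busway/cable resistance

def current_and_loss(power_kw: float, volts: float) -> tuple[float, float]:
    """Return (current in amps, resistive loss in watts) for a DC feed."""
    current = power_kw * 1000 / volts
    loss_w = current ** 2 * CONDUCTOR_R_OHMS
    return current, loss_w

for volts in (400, 800):
    i, loss = current_and_loss(power_kw=500, volts=volts)
    print(f"{volts} VDC: ~{i:.0f} A, ~{loss/1000:.1f} kW lost as heat in this conductor")
```

Doubling the distribution voltage halves the current and roughly quarters the resistive heat in the same conductor, which is the effect Steve describes.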

Why 2026 marks a turning point

According to Steve, the convergence of several trends makes retrofitting particularly critical in 2026. He sees AI applications reaching a maturity threshold that will drive unprecedented demand for distributed computing capacity.

“As production-ready AI inference applications start gaining momentum, companies improve their business process efficiency, begin to automate their business processes with agentic AI and eventually move towards Artificial General Intelligence (AGI). Each progression will require significantly more computing horsepower, enabled by data centres,” he explains.

“These applications will benefit from being in data centres located closest to the application, data generation and processing, which will be in smaller, retrofitted data centres.”
