Aging Data Centers Are a Goldmine for HVAC Contractors

Posted on June 2, 2025 by rehan.rafique

Virtually every aspect of modern life, whether it’s online shopping, streaming movies, or national security, relies on data centers, making these behemoths a critically important piece of American infrastructure.  

As demand for digital services grows exponentially, especially with the rise of AI, these facilities face immense pressure to operate with a high degree of efficiency and reliability, while also keeping operating costs affordable.  

HVAC systems play a pivotal role in these facilities, combating the excessive heat produced by data center equipment and maintaining the temperatures needed to prevent downtime and data loss.

Technology in this sector is becoming increasingly advanced, and with the rising adoption of liquid cooling and smart controls, new data centers are drastically reducing their energy footprint. 

But what about the data centers with older HVAC systems? It turns out they are perfect candidates for retrofits, and companies may be clamoring for this work once they learn about the benefits.

HVAC contractors with the right skill sets can significantly reduce power usage, lowering strain on the grid while supporting the American digital economy. They can also seize the opportunity created by aging data center infrastructure and position themselves advantageously in a booming industry.

 

Retrofitting Strategies for Efficiency  

When it comes to retrofitting a data center cooling system to be more efficient, Danielle Rossi, global director – mission critical cooling for Trane, says there are two approaches: replacing existing products with newer, more efficient equipment, or redesigning the entire system to allow for more efficient, higher-density compute power at the rack.

“While the more efficient method is to optimize the entire system, identifying the cooling design at the rack will dictate water requirements and, in many cases, will require additional and/or larger water feeds into the data center white space,” Rossi said. “These changes will typically require downtime of the facility and additional costs when compared with greenfield design. Most retrofits in the data center space have been focused on product and component replacement to help limit costs and downtime, while still being able to achieve better efficiency.” 

Mihir Kalyani, data center business unit manager, Americas, Evapco, said he’s seeing a broad combination of technologies used in retrofits, including the use of heat exchangers, dry coolers, and cooling towers. 

NEXT FRONTIER: With data center cooling needs evolving fast, outdated HVAC systems are becoming big opportunities. (Courtesy of Trane)

“The increase in server rack densities for AI workloads has resulted in many large projects updating their cooling infrastructure from previous air cooling designs to using water-cooled or air-cooled chillers,” Kalyani said. “If deployed with water-cooled chillers, then cooling towers, evaporative or hybrid fluid coolers, or adiabatic or dry coolers are the heat rejection options, the application of which depends on project-specific constraints (water, power, layout, etc.).”

Traditional heat rejection technologies, such as evaporative cooling towers, continue to be applied, but projects that are constrained by water have no choice but to evaluate alternatives, such as closed-loop hybrid, adiabatic, or dry cooling systems. 

“Most of Evapco’s evaluations on projects with unique constraints involve comparisons between all of these technologies to establish baselines for all parameters such as cost, water consumption, energy consumption, total connected power, total installed footprint, sound, maintainability, and overall operational efficiency,” Kalyani said.  
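The evaluation process Kalyani describes is proprietary to each vendor, but the shape of a multi-parameter baseline comparison can be sketched as a weighted scoring pass over the candidate technologies. Every option, score, and weight below is an illustrative assumption, not Evapco data:

```python
# Hypothetical multi-parameter comparison of heat rejection options.
# All figures are illustrative placeholders, not manufacturer data.

# Normalized scores per parameter (1.0 = best in class, lower = worse).
options = {
    "evaporative cooling tower": {"energy": 1.0, "water": 0.2, "footprint": 0.9, "cost": 1.0},
    "hybrid fluid cooler":       {"energy": 0.8, "water": 0.6, "footprint": 0.7, "cost": 0.7},
    "adiabatic cooler":          {"energy": 0.7, "water": 0.8, "footprint": 0.6, "cost": 0.6},
    "dry cooler":                {"energy": 0.5, "water": 1.0, "footprint": 0.4, "cost": 0.5},
}

# Site-specific weights: a water-constrained site weights water heavily.
weights = {"energy": 0.3, "water": 0.4, "footprint": 0.1, "cost": 0.2}

for name, scores in options.items():
    total = sum(weights[p] * scores[p] for p in weights)
    print(f"{name:28s} weighted score: {total:.2f}")
```

A water-constrained site would raise the water weight and watch the dry cooler climb the ranking; a power-constrained site would do the opposite.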

 

The Rise of Liquid Cooling

Right now, “direct-to-chip” liquid cooling is a new and enormously influential trend, for myriad reasons.

“Server racks have grown exponentially in ‘density,’ with new racks now including capabilities of up to 600 kW. A few years ago, 5-10 kW racks were the norm,” Kalyani said. “This has resulted in giant alterations in data center designs. Existing data centers with air-cooled racks are being retrofitted with liquid-cooled server racks, connected to air-cooled or water-cooled chillers.”

Air-cooled chillers have larger footprints and high connected loads, and they start to lose viability as large data centers’ heat rejection requirements grow, which is certainly the trend right now.

“We’re seeing data centers over 1 GW in size, requiring more than 280,000 tons of heat rejection, a huge change, recently,” Kalyani said. “When water-cooled chillers are used, Evapco’s heat rejection technologies can be used to cool them — entailing the use of evaporative, hybrid, dry, or adiabatic heat rejection, depending on site priorities and resource availability.” 
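Kalyani’s numbers line up with a basic unit conversion: one ton of refrigeration is 3.517 kW of heat removal, and essentially all of a data center’s electrical input becomes heat that must be rejected. A quick sketch of the arithmetic:

```python
# Convert facility electrical load to required heat rejection capacity.
# Assumes essentially all electrical input becomes heat to be rejected.

KW_PER_TON = 3.517            # one ton of refrigeration = 3.517 kW
facility_load_kw = 1_000_000  # a 1 GW facility

tons = facility_load_kw / KW_PER_TON
print(f"Heat rejection required: {tons:,.0f} tons")  # ~284,000 tons

# Rack-level arithmetic behind the density shift Kalyani describes:
legacy_rack_kw, ai_rack_kw = 10, 600
print(f"One {ai_rack_kw} kW AI rack carries ~{ai_rack_kw // legacy_rack_kw} legacy racks' worth of heat")
```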

Rossi noted that transitioning to liquid cooling will be unique to each site.  

“Liquid cooling provides more direct heat transfer at the chip and is required to cool some of the highest density chips,” Rossi said. “The type of chip, and therefore the type of liquid cooling required, will determine what retrofits will be necessary. A direct-to-chip design will utilize vertical racks that are likely already in the white space. Comparatively, immersion cooling can provide higher densities but uses heavier, horizontal tanks. Rack space requirements, floor loading, inlet water temperatures and flow, serviceability, local code with respect to dielectric fluids, downtime availability, construction demolition requirements adjacent to the white space, etc., must all be considered when planning a retrofit to liquid cooling.” 

But even though liquid cooling is the hot trend right now, Rossi said it’s very hard to compare air and liquid retrofits directly.

“The traditional component replacement is prevalent in the air-cooled space with some minimal airflow management, such as hot and cold aisles, ducting, etc. Liquid cooling is often more efficient than air cooling because of the opportunity for greater density within a given footprint as well as a better heat transfer at the chip level,” Rossi said. “The necessary changes during retrofit may be significantly more than an air-cooled retrofit, but there is considerable potential efficiency upside. Site efficiency, PUE, and maintenance can be optimized with the right design. However, there is a recognition in the industry that PUE may not be the best indicator of efficiency with a liquid-cooled design.” 
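For reference, PUE (power usage effectiveness) is total facility power divided by IT equipment power, so 1.0 is ideal. One commonly cited reason it can mislead in liquid-cooled designs, sketched here with invented numbers: cooling work that moves inside the rack, such as in-loop pumps, can end up metered as IT load, improving PUE on paper without saving a watt.

```python
# PUE = total facility power / IT equipment power (1.0 is ideal).
# Toy numbers to show why PUE can flatter a liquid-cooled design.

def pue(total_kw: float, it_kw: float) -> float:
    return total_kw / it_kw

# Air-cooled: 1,000 kW IT load, 400 kW for CRAHs, chillers, fans.
print(f"Air-cooled PUE:    {pue(1400, 1000):.2f}")   # 1.40

# Liquid-cooled: same 1,400 kW total, but 50 kW of in-rack pumps
# are metered on the IT side, shrinking the "overhead" on paper.
print(f"Liquid-cooled PUE: {pue(1400, 1050):.2f}")   # 1.33
```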

For example, a chip might be able to use higher-temperature inlet water than an air-cooled design, which would reduce the power drawn by chillers, pumps, etc. However, the chip might function optimally at a lower water temperature.

“The supporting heat rejection components would use more power in that application, but the compute power out of the chip may be significantly more than at the higher inlet water temperature,” Rossi said. “The ultimate determination of efficiency is optimizing compute power with the heat rejection system.” 
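Rossi’s point is that the optimization target is compute per total watt, not cooling power in isolation. A toy model makes it concrete; the temperatures, power draws, and throughput figures below are invented purely for illustration:

```python
# Toy trade-off between inlet water temperature, cooling power, and compute.
# All values are invented to illustrate the trade-off, not measured data.

scenarios = [
    # (inlet temp degC, cooling power kW, compute throughput in TFLOPS)
    (32, 120, 900),   # warm water: cheap cooling, throttled chips
    (20, 200, 1100),  # cold water: pricier cooling, full-speed chips
]

it_power_kw = 1000  # fixed IT load for both cases

for temp, cooling_kw, tflops in scenarios:
    total_kw = it_power_kw + cooling_kw
    print(f"{temp} C inlet: {tflops / total_kw:.3f} TFLOPS per kW of site power")

# The colder loop wins here despite using more cooling power,
# because compute output rises faster than total power does.
```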

Liquid cooling technologies/equipment offer far greater efficiency than air-cooling, resulting in higher “density” of computation or storage in a smaller footprint, Kalyani noted. Air-cooled data centers will continue to exist, though most data center facility managers meeting the demands of AI workloads are much more likely to choose liquid cooling systems because of the efficiencies they offer.   

 

Challenges Facing Contractors

Each situation is unique, and like any other retrofit, the work can be cost-prohibitive and invasive.

“There is a large difference between the cost associated with retrofitting when compared with the smaller increased cost of planning during greenfield design,” Rossi said. “Avoiding retrofitting also helps customers protect the uptime of their operations. There are special challenges with retrofits to liquid cooling due to the water system requirements. While not all designs will require large changes to a site’s water plant or piping, some may require a significant amount of construction and demolition. The amount of disturbance to the site will determine the downtime needed for a retrofit.” 

Corey Flores, account executive, Johnson Controls, noted it’s also important to research local regulations, as some states have enacted stringent water conservation and energy efficiency rules, and some locations may require special consideration for sound.

By partnering with manufacturers that are experienced in mission-critical applications, contractors and engineers can ensure the equipment they specify is designed to meet the obstacles data center leaders face.  

RETROFIT RESCUE: New products on the market, like the York YVAM air-cooled magnetic bearing centrifugal chiller, are specifically designed for data center applications. (Courtesy of JCI)

“The York YVAM air-cooled magnetic bearing centrifugal chiller is specifically designed for hyperscale and colocation data center applications,” Flores said. “For example, the York YVAM chiller operates with zero water use on site, supporting the growing importance of water usage effectiveness and the conservation of water. The York YVAM chiller also produces notably less sound than screw chillers, operating at just 65 dBA at 10 meters — equivalent to the noise level of background music at a restaurant.” 

To deal with this specialized equipment, Kalyani said they’re seeing a substantial trend toward dedicated facilities and operational teams, much more comprehensive training programs, and more involvement with equipment manufacturers to understand how to maintain and service cooling equipment.   

“Some manufacturers, including Evapco, provide full-service solutions and maintenance contracts to data center owners and operators,” Kalyani said. 

 

Future Trends

While these retrofits target data centers that are only a decade or two old, new builds are increasingly forward-thinking, with the understanding that the technology even 10 years from now could be vastly superior to the latest and greatest of today.

Flores said “future-proofing” is a growing trend even within new data center developments, meaning data center retrofits and system upgrades will remain a source of work for contractors decades from now.

“To delay the potential need for retrofitting, forward-thinking data center leaders are instead designing cooling systems that enable effective use alongside rapidly advancing IT infrastructures,” Flores said. “As a result, there has been an increased adoption of modular, air-cooled chillers, such as the York YVAM air-cooled magnetic bearing centrifugal chiller. These systems offer a flexible chilled water (CHW) temperature that can accommodate higher cooling loads as IT operating ranges increase or change over time. The modular design means systems can be easily scaled to increase flexibility.”

Rossi also touched on the “future-proofing” trend for new builds.  

“There is a recognition that the rapid changes in chip design and associated heat rejection can result in site designs becoming obsolete much faster than previous designs,” Rossi said. “This forward-thinking, detailed planning is being done for thermal management, as well as high-density power distribution, controls, site selection for heat recovery, and other potential future design requirements as chip innovation evolves.” 

The average chiller is designed to operate for 20 years, Flores said, and data center IT will likely evolve three to four times within that lifespan. The integration of AI into building automation systems (BAS) will also help to further optimize performance and energy efficiency.

“In a recent study conducted by Forrester Research and Johnson Controls, only 7% of data center leaders surveyed said their systems were fully integrated with intelligent BAS solutions,” Flores said. “Intelligent controls can help to take equipment performance a step further by analyzing vast amounts of data and making autonomous adjustments in real-time based on operational goals. For example, features like predictive maintenance can identify performance drift or potential malfunctions before they happen to maximize uptime. If equipment does go out of service unexpectedly, the BAS can automatically re-optimize based on the remaining available equipment to help maintain desired outcomes.” 
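The specifics of JCI’s analytics aren’t public here, but the predictive maintenance idea Flores describes can be sketched simply: trend an efficiency metric such as chiller kW/ton and flag sustained drift from the commissioned baseline before it becomes a failure. A minimal sketch with simulated readings:

```python
# Minimal performance-drift check of the kind an intelligent BAS might run.
# Metric: chiller efficiency in kW per ton (lower is better).
# Readings are simulated; a real BAS would pull them from trend logs.

baseline_kw_per_ton = 0.60
drift_threshold = 0.10   # flag at 10% above commissioned baseline
window = 24              # average the most recent 24 hourly readings

# Simulated hourly readings with slow degradation (fouling, wear, etc.).
readings = [0.60 + 0.003 * hour for hour in range(48)]

recent_avg = sum(readings[-window:]) / window
drift = (recent_avg - baseline_kw_per_ton) / baseline_kw_per_ton

if drift > drift_threshold:
    print(f"ALERT: efficiency drifted {drift:.0%} above baseline")
else:
    print(f"OK: drift at {drift:.0%} of baseline")
```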

For Kalyani, the future will be defined by greater computational and data densities, the demand for renewable energy, perhaps even nuclear power (another booming trend, one that calls for small, dedicated nuclear reactors), and multi-use facilities that may combine crypto mining, AI data centers, and more.

“Field-erected solutions like the eco-Air Titan, which were once only applied to large power plant type projects, are being applied to data centers to reduce footprint, reduce connected power, and provide substantial capacity,” Kalyani said. 
