When you ask a data center contractor or engineer what keeps them up at night, don’t expect a simple answer. The world of data centers is complex – messy, even, if you get down to the specifics. On a recent Wednesday afternoon, I found myself moderating a panel with three folks who know that complexity intimately: Derek de Jesus, president of DJC X and chair of the AABC Commissioning Group’s Education Committee; Randy Ports, sales director at Copeland; and Philippa Hall, senior product manager for mechanical insulation at Johns Manville.
The topic? Regional considerations in data center design – specifically, how mechanical, electrical, and plumbing (MEP) systems are evolving to meet the brutal demands of everything from high deserts to muggy coastlines. If you think that sounds like an engineering headache, you’re right. But it’s also the future of the internet – and, honestly, a pretty fascinating glimpse at the collision of climate, technology, and good old-fashioned problem-solving.
The Liquid Revolution
We kicked off with what’s probably the buzziest shift in the industry: the move from air to liquid cooling. You might think it’s just a matter of swapping out fans for pipes, but de Jesus was quick to point out, “The commissioning process itself doesn’t really change, but the means and methods do.” For the uninitiated, commissioning is the process of making sure a data center actually works the way the owner expects – and the devil is very much in the details.
Ports, from Copeland, jumped in to sketch out the practicalities. “There are really two major types of liquid cooling. Direct-to-chip, where you’re pumping fluid right to the processors, and immersion, where the servers are basically swimming in coolant.” Both, he explained, are a far cry from the air-cooled rooms of the past. The risks and requirements change: leaks, contamination, water loops that weigh more than you’d think, and a whole new set of skills for maintenance crews. “Fluids and electrical systems don’t play nice,” Ports deadpanned, and nobody disagreed.
Climate Control: Not Just for Comfort
When we turned to regional challenges, de Jesus painted a vivid picture: wildfires in Oregon, salt air chewing away at coastal equipment, and Texas summers where it’s 110 degrees with 80% humidity. “Air quality becomes a real problem,” he said. “You have to model wind, control humidity, and make sure your infrastructure can live in that environment.”
Ports nodded, adding that extreme environments push cooling equipment to its limits. “High lift capacity, corrosion resistance, reliability under high ambient conditions – it all has to be baked in from the start.” And water? In the desert, it’s gold. “Water scarcity is a big concern for water-cooled systems,” he said.
The Insulation Blind Spot
When Hall finally called in (after a classic round of technical difficulties – some things never change), she got right to the point: “One of the most overlooked aspects is the selection and proper installation of your insulation systems.” As power densities rise with AI workloads, even a small gap in the insulation can lead to condensation, corrosion, and a gradual, expensive breakdown. “A fully sealed insulation system isn’t just about thermal performance – it’s about system longevity and reliability,” she said.
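Hall's condensation point is easy to check with basic psychrometrics: any surface sitting below the ambient dew point will sweat. Here's a minimal sketch using the Magnus approximation; the constants and the example temperatures are my own illustration, not figures from the panel.

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point via the Magnus formula (valid roughly 0-60 C)."""
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rh_percent / 100.0)
    return (b * gamma) / (a - gamma)

# A humid Gulf Coast mechanical room: 35 C (95 F) at 80% RH.
ambient_dp = dew_point_c(35.0, 80.0)   # roughly 31 C

# A chilled-water line runs around 7 C -- far below that dew point,
# so any gap in the insulation jacket becomes a condensation
# (and eventually corrosion) site.
pipe_surface_c = 7.0
will_sweat = pipe_surface_c < ambient_dp
print(f"dew point: {ambient_dp:.1f} C, exposed pipe sweats: {will_sweat}")
```

Which is exactly Hall's warning in numbers: insulation isn't just slowing heat transfer, it's keeping every cold surface above the local dew point.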
Ports riffed on that, too: “If you want future-proofing, build modular, scalable, and upgradable systems. Otherwise, every upgrade is a nightmare.” Turns out, “plug and play” is a lot harder when pipes are sweating and compressors are aging out.
Prefab, Modular, and the Race for Speed
Modular and prefabricated data centers are all the rage. Everyone wants their space faster, cheaper, and more reliable. De Jesus, who’s been in the trenches, was candid: “We’re seeing more products leave factories missing parts, or with insulation that doesn’t meet spec. So now we’re putting commissioning agents and quality inspectors in the factories. Catch issues before they hit the site.”
He described a sort of arms race between speed and quality. “If you try to deploy a one-size-fits-all model, you’re going to get caught by local climate. Texas isn’t Alaska. If you ignore that, the product degrades fast.”
Humidity: The Silent Killer
Humidity crept up again and again. De Jesus called it out: “Turn everything on at once and you’ll lose control of humidity. Equipment will rust before you even finish the build.” His prescription was simple but relentless: keep the doors closed, use spot coolers, and stage the work so you’re always fighting to keep the environment dry and clean – even during construction.
Compressor Tech: Quietly Revolutionary
When the conversation drifted towards hardware, Ports detailed Copeland’s new oil-free centrifugal compressor – a mouthful, but basically a leap forward in reliability and efficiency. “Airlift bearing technology, no oil management, fewer moving parts. Up to 10% energy savings at full load, 40% at partial load. It’s designed for high ambient conditions – exactly what you need in the most brutal climates.”
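That full-load versus part-load split matters because compressors spend most of their operating hours at part load. A back-of-the-envelope blend of Ports's quoted figures (the 30/70 hour split is my own illustrative assumption, and a real estimate would weight by energy consumed per load bin rather than hours):

```python
# Ports's quoted savings: up to 10% at full load, 40% at part load.
FULL_LOAD_SAVINGS = 0.10
PART_LOAD_SAVINGS = 0.40

# Hypothetical duty cycle: 30% of hours at full load, 70% at part load.
full_load_hours_frac = 0.30
part_load_hours_frac = 0.70

# Simple hours-weighted blend of the two savings figures.
blended_savings = (FULL_LOAD_SAVINGS * full_load_hours_frac
                   + PART_LOAD_SAVINGS * part_load_hours_frac)
print(f"blended energy savings: {blended_savings:.0%}")  # prints 31%
```

Crude as it is, the sketch shows why part-load efficiency dominates the annual energy picture.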
AI, Automation, and Power Headaches
AI is changing everything. De Jesus summed up the new challenge: “AI servers ramp up from zero to 100% in a blink, and that hammers your electrical system. Harmonics, power quality, generator response – all become way more critical.” He even mentioned the potential for micro-nuclear plants at data centers if utility companies can’t keep up. “Until then, we build in small increments and wait for someone to make the giant leap.”
Future-Proofing: Plan or Perish
What’s the best advice for facility managers? “Think modular, upgradable, and avoid over-customization,” Hall said. “Work with the OEMs, make sure your insulation matches both the application and the local climate.” Ports added a warning: “Regulations and refrigerants are changing fast. Stay proactive, or you’ll get left behind.”
The One Big Misconception
As the session wound down, I tossed out what I thought was a softball: What’s the biggest misconception about regional design?
Hall was blunt: “That one size fits all. It never does.” De Jesus agreed: “You can have a template, but you always need site customization. Standardization saves time – until it doesn’t.”
Q&A: The Details That Matter
The audience questions pushed the panel even deeper into the weeds. One engineer from Atlanta asked about keeping up with refrigerant regulations, and Ports didn’t sugarcoat it: “It’s a moving target. You have to design for flexibility, because the refrigerant you spec today might be phased out tomorrow. Always run your plans by local code officials early. Trust me, it’ll save headaches.”
Someone else wanted to know how to balance energy efficiency with redundancy – a classic data center dilemma. Hall jumped in: “There’s no shortcut. You need to model for your worst-case scenario, but don’t overbuild just to feel comfortable. Strategic insulation and modular design can let you scale up reliability without blowing your budget or power allotment.”
A facility manager from Phoenix was blunt: “We’ve had to replace entire coils because of mineral buildup in our cooling systems. Is there any way around this?” De Jesus answered: “If your water treatment isn’t dialed in, it doesn’t matter how advanced your cooling tech is. Build a maintenance plan that fits your local water quality, not just the equipment spec sheet.”
There was even a question about AI-driven predictive maintenance – are we there yet? Ports was cautiously optimistic: “It’s coming fast. Compressor sensors, vibration monitoring, even insulation condition can be tracked with IoT tags now. But you still need boots on the ground to check what the data’s telling you.”
And for those staring down new builds, de Jesus had a final, practical tip: “Bring your commissioning agent in early. Don’t let them be the last one in the room. That’s how you catch mistakes before they catch you.”
These weren’t just technical answers – they were reminders that every region, every site, every team brings its own set of challenges. The smartest operators are the ones asking the hard questions before the first shovel hits dirt.
The Takeaway
If there’s a theme to all this, it’s that data center design is about adaptation. The climate is a moving target. Technology is sprinting ahead. And the only constant is the need for vigilance, flexibility, and a willingness to sweat the details – literally and figuratively.
As the webinar wrapped, the chat filled with questions about refrigerants, compressor sizes, and the quirks of VRF systems. There’s no shortage of technical rabbit holes to go down, but the big picture is clear: Whether you’re building for the desert or the coast, there’s no substitute for thoughtful, climate-aware engineering.
And if you’re wondering about the right answer to any of these problems? Like so much in this industry, it depends.