New Cooling Technologies Tackle Data Center Heat

Tech vendors from startups to Hewlett-Packard and IBM are developing data center cooling products that squirt liquid onto a computer's hot spots. In some cases, specially treated, nonconductive, noncorrosive H2O is used. In others, it's something more esoteric, but the goal is the same--to whisk away heat as efficiently as possible.

The trend is a return to the past. Twenty years ago, mainframes were cooled by water jackets that worked like a car's radiator, pulling heat from the engine and releasing it elsewhere. But there's resistance this time because some IT professionals don't want to reintroduce fluids to modern data centers in which temperatures are controlled by large air conditioning units. The thinking is that all those expensive servers, jam-packed with sophisticated electronics and software, are best kept dry.

Time to rethink. As data centers embrace racks of servers and closely packed blade servers, which throw off more heat per square foot than individual servers, liquids may be the only way to keep up. Says IBM's Jim Gargan, "On a warm day, do you get more relief from diving into a pool of water or sitting under a fan?"

SprayCool's chip-cooling system lives up to its name

Gargan, the VP responsible for IBM's System X cooling technology, predicts that by next year most businesses will spend more money to power and cool their data centers than they spent on the computer systems inside them. He calls data center power consumption and cooling "the IT battlefield for the next decade."

Strategies include buying servers with cooler processors, reconfiguring data center layouts to improve airflow, and moving cooling systems closer to computers. Then there are the emerging liquid-cooling technologies. They range from water jackets that surround servers to the more radical approach of spraying liquids on a server's electronic components.

ISR, a company that specializes in thermal management technology, recently introduced a commercial version of its SprayCool M-Series direct chip-cooling technology. An M-Series module attaches to a microprocessor and sprays a liquid mist onto a cold plate that surrounds the processor, removing more than half the heat. The SprayCool G-Series, which has been available for several years and is used mainly in government data centers, sprays nonconductive fluids directly across a server motherboard.

Emerson's Liebert and HP are preparing direct chip-cooling technologies for the commercial market. A year ago, Liebert acquired Cooligy, which developed a cooling approach that sprays chemically treated water onto a plate placed on top of a processor. The plate has 100 or more microchannels that direct coolant onto a chip's hot spots. The cooling system already is used in tens of thousands of workstations, Cooligy says.

A group inside HP called Cool Team is working on close-proximity liquid-cooling systems, direct chip cooling, and dynamic sensor control of servers and cooling systems. In January, HP introduced its first "environmentally controlled" chilled-water platform, called the Modular Cooling System, which attaches to a server rack and uses chilled water to distribute cool air across the front of the rack. Effective cooling allows more servers to be inserted into racks, which often have empty slots to avoid heat buildup.

HP has been working on a direct chip-cooling technique for several years that would use ink-jet spray heads from its printer division to distribute drops of coolant onto microprocessors. Chandrakant Patel, the HP Fellow who heads the company's Cool Team, expects new and existing cooling technologies--conventional airflow, direct attached, and chip level--to be used in combination for years to come.

Hot Stuff

Data center managers are doing a poor job handling heat. A study by the Uptime Institute, a research group, of 19 computer rooms with more than 200,000 square feet of combined floor space found that they had 2.6 times the required cooling capacity but wasted more than 60% of it because of poorly designed layouts and airflow, among other deficiencies. As a result, more than 10% of the server racks ran too hot.

And get this: The average power consumption in those data centers was 2.1 kilowatts per rack. More densely packed server racks can consume as much as 20 to 40 kilowatts, generating roughly 10 to 20 times as much heat.
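
The arithmetic behind those findings is worth spelling out. Here's a back-of-the-envelope sketch in Python that uses only the figures reported above; anything derived from them is an estimate, not a measurement from the study:

    # Back-of-the-envelope math using the Uptime Institute figures cited above.
    installed_capacity_ratio = 2.6   # cooling capacity installed vs. what the load requires
    wasted_fraction = 0.60           # share of that capacity lost to poor layout and airflow

    # Effective cooling actually reaching the equipment, relative to what's required:
    effective_ratio = installed_capacity_ratio * (1 - wasted_fraction)
    print(f"Effective cooling vs. requirement: {effective_ratio:.2f}x")  # ~1.04x -- almost no margin

    # Heat density: average rack vs. densely packed rack. Nearly all the power a
    # server draws is dissipated as heat, so the power ratio approximates the heat ratio.
    avg_rack_kw = 2.1                # average draw per rack in the surveyed rooms
    dense_rack_kw = (20, 40)         # the article's range for densely packed racks
    low, high = (kw / avg_rack_kw for kw in dense_rack_kw)
    print(f"Dense racks give off roughly {low:.0f}x to {high:.0f}x the heat of an average rack")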

Georgia Institute of Technology is using new technology that moves cooling systems closer to the source of the heat to save about $160,000 annually in utility bills. Jeffrey Skolnick, director for the university's Center for the Study of Systems Biology, oversaw the installation of an $8.5 million supercomputer where space and power considerations were crucial. It includes a 1,000-node cluster of servers in 12 racks using IBM's BladeCenter system; IBM's rear-door heat eXchanger, which places chilled water directly behind servers, does the cooling.

IBM's eXchanger, introduced last year, solves several problems in the 1,300-square-foot center. It needed only half the air conditioning expected--80 tons instead of 160 tons--and reduced airflow lowered noise. With four more racks to fill in the coming months, Skolnick is looking at chip-level cooling to cut power costs further. "At the time, [the heat eXchanger] was the most viable technology," he says. "Every time you do an upgrade, the rules change, and you have to look and see what's available."
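
For readers who don't think in "tons" of cooling: a ton of refrigeration is 12,000 BTU per hour, or roughly 3.5 kilowatts of heat removal. A quick sketch of what halving the air conditioning at Georgia Tech means in those terms (the tonnage figures come from the story; the conversion constant is the standard one):

    # Convert the Georgia Tech air conditioning figures from tons of refrigeration
    # to kilowatts of heat removal. 1 ton = 12,000 BTU/hr, or about 3.517 kW.
    KW_PER_TON = 3.517

    expected_tons = 160   # capacity originally expected for the 1,300-square-foot room
    actual_tons = 80      # capacity needed with the rear-door heat exchangers in place

    print(f"Expected: {expected_tons * KW_PER_TON:,.0f} kW of heat removal")   # ~563 kW
    print(f"Actual:   {actual_tons * KW_PER_TON:,.0f} kW of heat removal")     # ~281 kW
    saved = (expected_tons - actual_tons) * KW_PER_TON
    print(f"Avoided:  {saved:,.0f} kW of air conditioning capacity")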

The idea of introducing water or other liquids into a data center scares some IT managers because water can damage computer components and cause short circuits. Imagine a burst water pipe or a liquid sprayer gone awry. But there may not be a good alternative, says Leonard Ruff, an associate principal at Callison, a data center design firm. Callison, which has been testing the SprayCool M-Series, thinks direct chip cooling is so effective that a business can double the amount of electricity used to power computers and pack more servers into each rack without overheating the data center--a change it says yields a 285% increase in processing capability. Callison now markets the system to its customers.

More companies will adopt water-cooled technology even though there is "an almost unreasonable" resistance to water among data center managers, predicts Gartner analyst Carl Claunch. He recommends that businesses include infrastructure for water cooling when building new data centers even if there are no immediate plans for implementing such equipment. They'll eventually need to pipe in water because it's so efficient at cooling, he reasons.

Other Cool Ideas

In the past two years, power management and cooling equipment vendors like Liebert and American Power Conversion, as well as HP and IBM, have introduced data center cooling products that range from air conditioning systems to liquid- and refrigerant-based systems that attach to server racks.

IBM this week will announce enhancements to two tools: PowerExecutive, which automates management of power consumption, shifting energy to systems that need more power and away from those that are about to overheat; and Thermal Diagnostics, which monitors for potential heat-related problems inside a data center and directs PowerExecutive to take action.
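
The article doesn't describe how these tools are implemented, but the control loop it sketches--watch temperatures, then shift or cap power before anything overheats--is simple to illustrate. The following Python sketch is purely hypothetical: the names, thresholds, and logic are stand-ins, not PowerExecutive's or Thermal Diagnostics' actual interfaces.

    # Purely illustrative: a generic monitor-and-cap loop of the kind described above.
    # None of these names or thresholds come from IBM's tools.
    from dataclasses import dataclass

    @dataclass
    class ServerReading:
        name: str
        inlet_temp_c: float   # air temperature at the server inlet
        power_draw_w: float   # current power consumption

    WARN_TEMP_C = 32.0        # hypothetical "about to overheat" threshold
    CAP_STEP_W = 50.0         # hypothetical amount of power to shift per adjustment

    def rebalance(readings):
        """Return a new power cap per server: trim power from hot servers,
        grant headroom to cool ones. A stand-in for firmware-level power capping."""
        caps = {}
        for r in readings:
            if r.inlet_temp_c >= WARN_TEMP_C:
                caps[r.name] = max(r.power_draw_w - CAP_STEP_W, 0.0)  # throttle the hot box
            else:
                caps[r.name] = r.power_draw_w + CAP_STEP_W            # give cooler boxes headroom
        return caps

    sample = [ServerReading("blade-01", 35.2, 310.0), ServerReading("blade-02", 24.8, 240.0)]
    for name, cap in rebalance(sample).items():
        print(f"{name}: set power cap to {cap:.0f} W")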

Cool-Down Tactics

Slow Down: Switch from fast, hot processors to cooler dual and multicore processors to reduce heat generation

Alternate: Organize your servers with hot and cool aisles to improve data center airflow

Concentrate: Move cooling systems closer to the servers to make sure the cool air is concentrated in the right place

Look Underfoot: Make sure perforated floor tiles are properly installed so they don't impede the flow of cool air from under the floor

Tape Up: Seal holes and other openings to prevent cool air from escaping and hot air from circulating

Get Wet: Make greater use of water to cool the air in the data center or directly cool servers

Think Small: Consider cooling at the chip level

Not every data center needs liquid cooling; there are ways around it. INetU Managed Hosting just opened its fourth data center, near Allentown, Pa. The 1,500-square-foot facility uses raised floors, vapor-barrier walls to reduce humidity, extra fans in front of and behind server racks for improved airflow, and redundant conventional air conditioning systems for cooling. President Dev Chanchani says he's able to avoid many of the heat problems confronting other data centers by simply expanding. If he needs more capacity, he'll build another data center.

If forced to pack more equipment into an existing data center--the challenge faced by many companies--Chanchani says he would look at some of the new cooling techniques. "If you're having to increase your server density within a small footprint," he says, "I think cooling modules would be exceptionally helpful."

Rensselaer Polytechnic Institute is building a research facility that will include a 70-teraflop supercomputer in a 5,000-square-foot data center. The facility required special design considerations to facilitate air cooling and keep water and other liquids out, says John Fisher, chief network architect. The school is building a four-foot raised floor in the data center to allow maximum airflow and plans to leave 25% of rack space empty. Fisher believes that even with the extra expense of the elevated floor and the wasted rack space, it will be cheaper than a more "exotic" approach involving liquid cooling.

Perhaps. But Rensselaer's data center will consume some 2.5 megawatts of electricity, enough to power thousands of homes. Of that, about 1 megawatt will be needed to power the servers. And what about the remaining 1.5 megawatts? Most of that will be dedicated to cooling.
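
The split is easy to sanity-check with the figures above; a short sketch (the megawatt numbers are Rensselaer's projections as reported here, the ratios are just division):

    # Rensselaer's projected power budget, from the figures above.
    total_mw = 2.5      # total data center draw
    servers_mw = 1.0    # power delivered to the servers themselves

    overhead_mw = total_mw - servers_mw   # cooling plus other facility overhead
    print(f"Overhead: {overhead_mw:.1f} MW ({overhead_mw / total_mw:.0%} of the total)")
    print(f"For every watt that reaches a server, roughly {overhead_mw / servers_mw:.1f} W "
          "goes to cooling and other overhead")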

Continue to the sidebar:
Battery Fires A Different Beast From Data Center Heat Issues