
Top 8 Data Center Cooling Systems Tools

  1. Liebert Cooling
  2. STULZ Cooling
  3. Schneider Electric-APC Data Center Cooling System
  4. Rittal Data Center Cooling System
  5. Green Revolution Cooling CarnotJet
  6. Coolcentric Data Center Cooling System
  7. IBM Cool Blue
  8. Eaton Heat Containment System
Why is cooling important in a data center?

Cooling is necessary in a data center because without it, your entire organization’s IT equipment (ITE) is at risk. Servers must be kept at safe temperatures. Because data centers are vital for storing information, a compromised or damaged data center would be a serious loss for a company. Without proper data center cooling, servers can overheat and fail. And because servers are in constant use, heat is always being generated.

How does a cooling system work in a data center?

A cooling system works by removing heat from the vicinity of the ITE’s electrical components to avoid overheating problems. When a server overheats, onboard logic usually shuts it down to prevent damage, and even before that point, running too hot can shorten the server’s lifespan. Data center cooling systems typically combine raised floors with computer room air conditioner (CRAC) or computer room air handler (CRAH) infrastructure.

Below the raised floor, the CRAC or CRAH units pressurize the space, pushing cold air through perforated tiles and into the server intakes. The cold air passes over the server components, picks up heat, and is vented out as hot exhaust, which is then directed back to the CRAC or CRAH unit for cooling, ideally without mixing with the cold supply air. Usually, the unit’s return temperature is set as the main control point for the data floor environment. To maintain ideal and efficient operating conditions, data centers use a variety of innovative and modern data center cooling technologies.
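To make the heat-removal relationship concrete, here is a minimal sketch of the sensible-heat formula that governs how much supply air a CRAC or CRAH unit must move for a given IT load. The load and temperature-rise figures are hypothetical examples for illustration, not recommendations.

```python
# Rough airflow estimate for a given IT heat load (sensible heat only).
# Assumes standard air: density ~1.2 kg/m^3, specific heat ~1005 J/(kg*K).

AIR_DENSITY = 1.2         # kg/m^3
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)

def required_airflow_m3s(it_load_kw: float, delta_t_c: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove it_load_kw at a
    supply-to-return temperature rise of delta_t_c degrees Celsius."""
    watts = it_load_kw * 1000
    return watts / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_c)

if __name__ == "__main__":
    # Hypothetical example: a 10 kW rack with a 12 C rise across the servers.
    flow = required_airflow_m3s(it_load_kw=10, delta_t_c=12)
    print(f"Approx. {flow:.2f} m^3/s ({flow * 2118.88:.0f} CFM) of supply air")
```

The same arithmetic explains why containment matters: if hot exhaust mixes back into the supply air, the effective temperature rise shrinks and the unit has to move far more air to remove the same load.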

What is the ideal temperature for a data center?

According to the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), the recommended temperature for server inlets (the air drawn in to cool interior components) is between 18 and 27 degrees Celsius, with relative humidity between 20 and 80 percent. Note, though, that this is not the suggested temperature for the entire server room; it is only the recommendation for the server inlets.
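As a simple illustration, the sketch below checks inlet sensor readings against the ASHRAE-recommended envelope cited above; the rack names and readings are made up for the example.

```python
# Minimal check of server inlet readings against the ASHRAE-recommended
# envelope cited above: 18-27 C inlet temperature, 20-80 % relative humidity.

RECOMMENDED_TEMP_C = (18.0, 27.0)
RECOMMENDED_RH_PCT = (20.0, 80.0)

def inlet_within_recommended(temp_c: float, rh_pct: float) -> bool:
    """Return True if a single inlet reading falls inside the recommended ranges."""
    temp_ok = RECOMMENDED_TEMP_C[0] <= temp_c <= RECOMMENDED_TEMP_C[1]
    rh_ok = RECOMMENDED_RH_PCT[0] <= rh_pct <= RECOMMENDED_RH_PCT[1]
    return temp_ok and rh_ok

if __name__ == "__main__":
    # Hypothetical sensor readings: (rack label, inlet temp C, relative humidity %).
    readings = [("rack-A01", 24.5, 45.0), ("rack-A02", 29.1, 38.0)]
    for rack, temp, rh in readings:
        status = "OK" if inlet_within_recommended(temp, rh) else "OUT OF RANGE"
        print(f"{rack}: {temp:.1f} C, {rh:.0f}% RH -> {status}")
```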

Some large enterprises with hyperscale data centers run on a model of “expected failure,” anticipating that servers will fail on a somewhat regular basis. They prepare ahead of time by putting software backups in place to route around equipment when it fails. In fact, replacing failed servers more often than normal can actually be less expensive than operating a hyperscale facility at lower temperatures. This is not usually the case for smaller companies, though.

What are the most popular cooling techniques in a data center?

Below are some of the most common cooling methods used in a data center:

  1. Cold/hot aisle: Although this is viewed as one of the least efficient cooling methods, it is the most common because it is easy to implement. It keeps cold air from mixing with hot air by facing the inlet sides of the racks (the cold aisles) toward each other and the outlet sides (the hot aisles) toward each other. Oftentimes, large amounts of cold air have to be used to eliminate the “hot spots” that develop across heavier-loaded racks.
  2. Cold or hot aisle air containment: This method encloses either the cold or the hot aisle so the two air streams cannot mix, and drives the air directly to and from the CRAC unit. While this method works well, you still end up heating or cooling larger areas than necessary.
  3. In-rack heat extraction: This technique extracts the heat generated inside the rack, preventing it from reaching the server room in the first place. Compressors and chillers inside the rack move the heat directly to the exterior of the data center.
  4. Water-cooled servers and racks: This type of cooling solution routes water safely around the server components and passes it through the hot side of the rack to lower the air temperature before it enters the server room.
Mistakes to avoid in data center cooling

When it comes to controlling data center cooling and keeping it as efficient as possible, it is important to avoid making these common mistakes:

  • Poor cabinet layout: It is best not to use an island configuration for your cabinet layout, since it is very inefficient. To avoid this problem, make sure your cabinet layout can accommodate a hot and cold aisle data center cooling design so that your computer room air handlers can sit at the end of each row.
  • Empty cabinets: Empty cabinets can interfere with airflow, causing hot exhaust air to end up back in your cold aisle. For this reason, if you have empty cabinets, be sure that you contain all cold air.
  • Space between equipment: If you have empty spaces between hardware, it can ruin your airflow management. Specifically, hot air is more likely to leak back into the cold aisle, especially if the spaces are not sealed properly.
  • Raised floor leaks: When cold air moves under your raised floor, it can leak into support columns or other adjacent spaces, which can result in a loss of pressure. When a leak causes a loss of pressure, it allows humidity, dust, and warm air into the cold aisle environment. To prevent this from occurring, you can have someone do an inspection and seal any leaks they discover.
  • Cable openings: If you have someone come do an inspection, have them look for unsealed cable openings under the raised floors. They should also inspect the holes under remote power panels as well as any power distribution units to make sure cold air cannot escape through these openings.
  • Humidity control: If multiple air handlers are fighting to control humidity, it can create problems. To clarify, if one air handler is trying to humidify the same air that another air handler is trying to dehumidify, a lot of energy is wasted. You can avoid this by planning your humidity control points accordingly, as sketched below.
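One practical way to catch that kind of “fighting” is to compare humidification and dehumidification setpoints across units and flag any pair whose bands can overlap. The sketch below uses made-up unit names and setpoints purely for illustration.

```python
# Sketch: flag CRAC/CRAH units whose humidity setpoints can fight each other.
# A unit humidifies below its low setpoint and dehumidifies above its high one;
# if one unit's dehumidify threshold sits below another's humidify threshold,
# they can work against each other on the same air. Setpoints are made-up examples.

units = {
    "CRAH-1": {"humidify_below_rh": 40.0, "dehumidify_above_rh": 60.0},
    "CRAH-2": {"humidify_below_rh": 55.0, "dehumidify_above_rh": 70.0},
    "CRAH-3": {"humidify_below_rh": 40.0, "dehumidify_above_rh": 50.0},
}

def find_conflicts(units: dict) -> list[tuple[str, str]]:
    """Return pairs of units where one may dehumidify while the other humidifies."""
    conflicts = []
    names = list(units)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if (units[a]["dehumidify_above_rh"] < units[b]["humidify_below_rh"]
                    or units[b]["dehumidify_above_rh"] < units[a]["humidify_below_rh"]):
                conflicts.append((a, b))
    return conflicts

if __name__ == "__main__":
    for a, b in find_conflicts(units):
        print(f"Potential humidity fight between {a} and {b}: align their setpoints.")
```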