Understanding Data Center Containment

In the modern data center landscape, aisle containment has emerged as a pivotal strategy for optimizing operations. By separating hot and cold airstreams, containment curbs the mixing that has long undermined cooling efficiency. Studies show that conventional data centers waste up to 50% of supplied airflow, which moves erratically, bypasses the equipment it is meant to cool, and returns directly to the Computer Room Air Conditioning (CRAC) units.

Containment becomes a catalyst for heightened energy efficiency within the facility, taming hotspots and harnessing the full cooling capacity of the AC units. It can even negate the need for extra CRAC units despite escalating server density. Rising energy costs, along with calls for data centers to be held more accountable for their environmental impact, have led many operators to implement hot-aisle or cold-aisle containment. Either strategy can deliver energy savings of 30% or more compared to an uncontained data center.

Hot Aisle Containment System: Unlocking Optimal Efficiency

The hot aisle containment approach revolves around enclosing the hot aisle, allowing cold air to flood the remaining data center space. The cold air can be delivered through raised access flooring or overhead ducting, or by simply flooding the room. It makes its way to the equipment and servers, while a physical barrier steers the hot exhaust airflow toward the AC return.

A unique advantage of hot aisle data center containment is thermal ride-through: the facility can remain stable during short cooling-system failures. The data center itself becomes a reservoir of cold air, capable of upholding supply temperatures for a limited duration and buying time until cooling is restored.
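The ride-through idea can be made concrete with a back-of-the-envelope calculation. The sketch below (with illustrative numbers, not figures from this article) treats the contained room's cold air as a sealed thermal reservoir and estimates how long the IT load takes to warm it past an allowable temperature rise. Air alone has little thermal mass, so the result is conservative; real rooms benefit from walls, floors, and equipment mass as well.

```python
# Rough thermal ride-through estimate for a contained data center.
# All inputs below are hypothetical; real designs must account for
# building thermal mass, stratification, and leakage.

AIR_DENSITY = 1.2  # kg/m^3, air at roughly 20 degrees C
AIR_CP = 1005.0    # J/(kg.K), specific heat of air

def ride_through_seconds(room_volume_m3: float,
                         it_load_watts: float,
                         allowable_rise_k: float) -> float:
    """Seconds until the cold-air reservoir warms past the allowable
    rise, considering only the heat capacity of the room air."""
    heat_capacity_j = room_volume_m3 * AIR_DENSITY * AIR_CP * allowable_rise_k
    return heat_capacity_j / it_load_watts

# Example: 500 m^3 room, 50 kW IT load, 5 K allowable rise
seconds = ride_through_seconds(500, 50_000, 5.0)
print(f"{seconds / 60:.1f} minutes of ride-through")  # about a minute
```

That the air-only figure is so short is exactly why monitoring and fast restoration of cooling matter during an outage.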

Cold Aisle Containment System: Balancing Efficiency and Comfort

In cold aisle data center containment, the cold air is enclosed in the aisle and the rest of the data center becomes a large hot-air return plenum. A physical barrier channels the supply air into the cold aisle, making a predictable and uniform temperature possible at the server air intake. Cold aisle containment systems are often used on slab floors and for in-row cooling projects.

Strategic Cooling Optimization: Beyond Airflow Management

Distinguishing between airflow management and cooling optimization is paramount. The former pertains to controlling airflow with elements like containment and blanking panels; the latter involves tuning the cooling system's controls themselves, such as raising temperature set points and adjusting fan speeds. Airflow management improves IT equipment intake temperatures, but genuine energy efficiency comes from cooling optimization. This strategic approach trims operating expenses, boosts cooling capacity, and scales down energy consumption.

Sculpting the Optimal Cooling Regime: Step-by-Step

Optimizing a data center’s cooling regimen demands a prudent approach that avoids excessive cooling. The notion that “cooler is better” has often led to overcooling, as data center managers seek to ensure service level agreements are met. With rising energy costs and government legislation holding data centers accountable for their energy use, however, brute-force cooling is becoming a thing of the past. Some indispensable steps for implementing containment in your data center include:

  1. Implement Containment Systems to Separate Incoming Air from Exit Air: Separating the airstreams allows the data center to maintain desired temperatures, which helps meet service level agreements. The hotter the return air, the more efficient heat rejection becomes.
  2. Determine the Supply Temperature That Will Run the IT Equipment: The ideal range is roughly 65°F to 80°F. If the data center runs hotter than this, the server fans will ramp up and consume more energy.
  3. Utilize a Variable Capacity System That Can Adjust to the IT Load: In legacy data centers, fans often produce more air than needed, consuming electricity unnecessarily. A variable capacity system eliminates this wasted energy and also addresses reliability problems caused by producing too little air.
  4. Make Sure the Proper Controls are in Place for Data Center Optimization: In setting up data center controls, it is important to measure temperatures in the aisles and in front of the servers. The cooling equipment should also be controlled in order to maintain a sufficient amount of heat rejection.
  5. Utilize Economization in Order to Lower Costs and Reduce Power Consumption: One example is bringing in outside air to reject heat without the use of mechanical refrigeration.
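Steps 3 and 5 above lend themselves to simple control logic. As a sketch (with hypothetical thresholds and function names, not a real controller API): scale fan output to track the IT load rather than running at full capacity, and switch to outside-air (airside) economization whenever the outdoor temperature is low enough to reject heat without mechanical refrigeration.

```python
# Illustrative control decisions for variable-capacity fans and
# airside economization. Thresholds are assumptions for the sketch.

ECONOMIZER_MAX_OUTSIDE_F = 60.0  # free cooling below this outside temp

def fan_speed_pct(it_load_kw: float, rated_capacity_kw: float) -> float:
    """Variable-capacity fans: track the IT load instead of running
    flat out, with a floor to guarantee minimum airflow."""
    return min(100.0, max(20.0, 100.0 * it_load_kw / rated_capacity_kw))

def use_economizer(outside_temp_f: float) -> bool:
    """Enable outside-air cooling when ambient conditions allow it."""
    return outside_temp_f < ECONOMIZER_MAX_OUTSIDE_F

print(fan_speed_pct(120, 200))  # 60.0 -> fans at 60% for a 60% load
print(use_economizer(55.0))     # True: free cooling available
```

A production controller would also gate economization on humidity and air quality, but the core decision is this simple comparison.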

AKCP Data Center Sensor Monitoring – Wired and Wireless

The room’s thermal parameters must be continually monitored and adjusted to guarantee optimal performance, both to enhance efficiency and to avoid downtime. A carefully monitored containment system, for example, allows cooling systems to be adjusted to a higher supply temperature, saving energy and increasing cooling capacity while remaining within acceptable operating temperatures. Uncontained cooling systems, on the other hand, are limited to a considerably lower set point than IT equipment actually requires, in order to avoid hot spots. This sort of overcooling works, but it is inefficient in terms of energy use.


AKCP is the world’s leader in SNMP-based Data Center Monitoring Solutions. With over 200,000 installations worldwide, we have the experience to deploy monitoring solutions from the smallest single-rack server room to a hyperscale data center with thousands of racks. Our sensorCFD™ technology assists you in analyzing your data center’s containment, finding areas of potential hot and cold air mixing and areas for improvement. Watch our sensorCFD™ seminar video below.