Case Study


DataBank and Georgia Tech Progress with Purpose-Built Data Center in Tech Square

Project at a Glance

The DataBank ATL1 data center is not your average data center.

When Georgia Tech set out to create a high-performance computing center for their institution, they turned to DataBank to build a data center environment capable of meeting the performance needs of the center, and overcoming the inherent challenges that often accompany HPC initiatives.

DataBank’s ATL1 facility will be located in the new CODA building in Georgia Tech’s Technology Square, smack dab in the middle of Atlanta’s tech hub, where academia meets innovation, research, and Fortune 500 enterprises. ATL1 will not only be different from any other DataBank facility, it will be one of the most advanced data centers in the country.

Midtown Atlanta Data Center: part of the CODA development, a 645,000 sq ft mixed-use office complex currently under construction in Georgia Tech’s Technology Square.

Right out of the gate, ATL1 will possess two things no other data center in the region can offer:

1. Southern Crossroads (SoX)

SoX is a private network fabric that interconnects universities and federal institutions, serving as the Southeast’s connector to National LambdaRail (NLR), Internet2, and other major U.S. and international research networks. It links schools across Mississippi, Alabama, Georgia, Florida, and the Carolinas.

2. The Georgia Tech Supercomputer

Georgia Tech was awarded $3.7 million from the National Science Foundation to cover 70% of the cost of a new, state-of-the-art high-performance computing resource for the CODA building’s data center, ATL1. The new HPC system will support data-driven research in astrophysics, computational biology, health sciences, computational chemistry, materials and manufacturing, and more. It will also be used to study the energy efficiency and performance of HPC systems themselves.

“ATL1 is an ecosystem. A social compute family where we’re bringing in any company that wants to be a part of it. Anyone who wants to witness a supercomputer and real innovation is welcome.”

– Brandon Peccoralo, General Manager at DataBank


The Green Initiative and a New Generation of Data Center Cooling

But there’s something else that makes ATL1 quite different: its cooling technique.

For the past thirty years, the data center industry has largely relied upon the same cooling techniques and architectures and, with minor exceptions, cooling has not been a subject of great innovation.
In ATL1, DataBank pushes the envelope with its new ColdLogik Rear Door cooling solution from USystems.

Data centers devote tremendous amounts of time to seeking the most efficient use of power. The biggest factor in power consumption is one that data centers have little control over: their customers’ hardware. The second biggest factor is cooling, captured by PUE (Power Usage Effectiveness), the industry-standard measure of how efficiently a facility deals with the waste heat coming off the computers. Here, data centers have plenty of room to innovate and improve.
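As a point of reference, PUE is defined as total facility energy divided by the energy delivered to IT equipment; the minimal sketch below uses invented figures purely for illustration.

```python
# PUE = total facility energy / IT equipment energy.
# The kWh figures below are invented, purely for illustration.
it_energy_kwh = 8_000_000       # energy drawn by customer hardware
total_energy_kwh = 11_200_000   # IT + cooling + power distribution losses

pue = total_energy_kwh / it_energy_kwh
print(f"PUE = {pue:.2f}")       # 1.40: 0.40 kWh of overhead per IT kWh
```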

Traditionally, this has meant taking energy from the utility and running AC units inside the data center as efficiently as possible to cool the overall environment. When you stop and think about it, this approach is comically inefficient: you take the heat generated by consuming energy (in the form of computers) and consume even more energy (in the form of AC) to remove it.

DataBank offers a full range of connectivity methods across the nation to achieve our customers’ goals, including a national IP transport service that provides inter-region and intra-metro connectivity. Our carrier-neutral facilities offer robust interconnectivity and high-bandwidth access to top-tier carriers, resulting in a resilient communication environment across multiple DataBank sites.

The ColdLogik solution for ATL1 takes a unique approach.

Waste heat coming off the servers in a cabinet is transferred to naturally cold water running through a closed-loop system in each cabinet’s rear door. The heated water is rejected out of the building while fresh, naturally cold water cycles back in to replace it, keeping the cabinet continuously cool.

That system alone would represent a major innovation in data center cooling. But in our mission to continuously evolve the data center experience, DataBank took the implementation one step further.

In a typical data center environment, heat removed by CRAC units is sent back to the central plant, exchanged into condenser water, and rejected off the roof. Instead of wasting that heat, the ATL1 facility sends it to the CODA building’s high-rise boilers, letting tenants reuse and repurpose it to heat their offices in colder weather and further offset energy use.

Using USystems’ ColdLogik rear door coolers, DataBank and Georgia Tech are cooling 50 kW per rack using 73°F water, and the same system can scale to cool up to 100 kW per rack with only minor changes in infrastructure.

  • Cooling 50 kW per rack with 73°F water, with the potential to double (see the flow-rate sketch after this list)
  • 90% less energy consumption
  • 80% more real estate than traditional cooling
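As a rough sanity check on the 50 kW figure, the water flow a rear door must carry follows from Q = ṁ·c·ΔT. The sketch below assumes a 10 K inlet-to-outlet temperature rise, which is our assumption rather than a published ColdLogik specification.

```python
# Water flow needed for a rear door rejecting Q watts: m_dot = Q / (c_p * dT).
Q_WATTS = 50_000        # 50 kW per rack, per the case study
CP_WATER = 4186         # specific heat of water, J/(kg*K)
DELTA_T_K = 10          # assumed inlet-to-outlet temperature rise

mass_flow_kg_s = Q_WATTS / (CP_WATER * DELTA_T_K)
litres_per_min = mass_flow_kg_s * 60      # 1 kg of water is ~1 litre
print(f"{mass_flow_kg_s:.2f} kg/s (~{litres_per_min:.0f} L/min per rack)")
```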

Cool, right?

Challenges and Solutions

Challenges

The project faced the following challenges:

  • Designing a solution for a colocation customer when the specific IT technology, density, or rack configuration is not known.
  • Catering for the current GPU trend of 40-50 kW+ per rack, while allowing future upgrades that could scale to even higher densities, including Direct Liquid Cooling (DLC).
  • Finding a flexible solution that lets the customer select and bring in their own racks of different sizes and manufacturers.
  • Maximising operational efficiency through optimised airflow and heat rejection.
  • Reducing white-space fit-out and customer deployment time by pre-installing and testing all cooling components.

Solutions

The ColdLogik solution delivers:

  • Energy Efficiency Ratio (EER) of over 100 at maximum capacity.
  • An average of 15% power reclaimed for compute compared to traditional cooling.
  • A potential cooling PUE of 1.035 with the Rear Door Cooler (RDC); see the sketch after this list.
  • Over 50,000 trees’ worth of carbon saved per 1 MW ColdLogik deployment.
  • Adaptive Intelligence that controls the whole room temperature.
  • Higher water temperatures that reduce the need for mechanical cooling while maintaining ASHRAE A1 temperatures.
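To put a cooling PUE of 1.035 in context, the sketch below compares annual cooling energy against an assumed traditional cooling PUE of 1.4; that baseline is our illustrative assumption, not a figure from the case study.

```python
# Sketch: annual cooling energy for a 1 MW IT load at two cooling PUEs.
# The 1.4 traditional baseline is an assumption for illustration only.
IT_LOAD_KW = 1_000
HOURS_PER_YEAR = 8_760

def cooling_kwh(cooling_pue: float) -> float:
    # Cooling overhead energy = (cooling PUE - 1) x IT energy.
    return (cooling_pue - 1.0) * IT_LOAD_KW * HOURS_PER_YEAR

traditional = cooling_kwh(1.4)    # assumed baseline
rdc = cooling_kwh(1.035)          # figure quoted above
print(f"traditional: {traditional:,.0f} kWh/yr, RDC: {rdc:,.0f} kWh/yr")
print(f"reduction: {1 - rdc / traditional:.0%}")   # ~91%
```

The roughly 91% reduction it prints is consistent with the “90% less energy consumption” figure quoted earlier.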

How elevated water temperatures can dramatically reduce water usage

WUE-USA

With the current situation the world finds itself in, one thing has become abundantly clear: data centers have given people a safe haven in their own homes while lockdowns have been enforced across the globe; at one point, half the human race was in lockdown in one form or another.

Both positives and negatives have arisen from this, from the ‘key worker’ status held by data center employees and their primary suppliers, which highlights how governments across the world perceive the industry and its necessity, through to sterner examination from the wider world of energy consumption and water usage.

Uptime and reliability have always driven data center design philosophy. Trade-offs have been made, understandably, so that operators and owners can be comfortable in the knowledge that consistent design across sites reduces the risk of misalignment or miscalculation.
Whilst data centers are more efficient than they have ever been on the whole, there is still vast room for improvement, in particular in both the energy consumed for cooling and the water consumed for adiabatic cooling.



In 2014, Lawrence Berkeley National Laboratory in California reported that 639 billion liters of water were used on data center cooling in the USA alone; the forecast figure for 2020 was a startling 674 billion liters.

What is adiabatic cooling?

Traditionally, water is used in these data center cooling solutions to obtain the lowest possible air temperature entering the external plant, extracting as much heat from the data center as possible using the natural environment before the mechanical chiller has to step in. Any body of air has two temperature points: Dry Bulb (DB) and Wet Bulb (WB). The dry bulb is what you would feel if you were dry; the best way to describe wet bulb is the moment you step out of the shower before you reach the towel. The water on your body lets the air cool you toward the wet-bulb temperature, which is always equal to or lower than the DB temperature.

For example, if the DB temperature in a room is 20°C/68°F and the WB temperature is 14°C/57°F, then a wet object, or air pushed through a wet membrane, can potentially be cooled to the WB temperature, at least until the object is heated or dries out.
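For the curious, the wet-bulb temperature can be estimated from dry-bulb temperature and relative humidity with Stull’s (2011) empirical approximation; this is a general meteorological formula, not anything specific to ColdLogik, and the sketch below simply evaluates it.

```python
import math

def wet_bulb_c(temp_c: float, rh_percent: float) -> float:
    """Stull (2011) empirical wet-bulb approximation.

    Valid roughly for RH 5-99% and T -20..50 C at sea-level pressure.
    """
    t, rh = temp_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

print(f"{wet_bulb_c(20, 50):.1f} C")  # ~13.7 C wet bulb at 20 C / 50% RH
```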

Why is this usage so high?

Water usage is inversely proportional to the temperature of the water supplied to the data centre’s internal cooling equipment: the lower the supply temperature, the more water the external plant consumes. Traditional plant runs a water flow temperature of 7°C/45°F, which means the warmest ambient air you could use to reach that flow temperature naturally is about 5°C/41°F.

How can you improve this usage?

The best way to reduce this usage is to elevate the water temperature the data centre requires to cool its equipment efficiently and effectively. The rear door cooler is a great example: unlike traditional CRAC systems, which mix colder air with warm air to reach an ambient temperature, it neutralises the heat at its source, so a higher water temperature achieves the same result. The graphs below show the average high temperatures for DB and WB over a thirty-year period.
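The free-cooling decision those graphs encode can be sketched in a few lines: compare the required water supply temperature against the ambient DB and WB plus a heat-exchanger approach temperature. The approach value, supply setpoints, and monthly climate figures below are all invented for illustration.

```python
# Sketch of the free-cooling decision the graphs illustrate, under
# assumed approach temperatures; monthly climate numbers are invented.
APPROACH_K = 3  # assumed heat-exchanger approach between air and water

def cooling_mode(db_c: float, wb_c: float, water_supply_c: float) -> str:
    if db_c + APPROACH_K <= water_supply_c:
        return "dry free cooling"        # above the blue line: no water
    if wb_c + APPROACH_K <= water_supply_c:
        return "adiabatic"               # between the lines: uses water
    return "mechanical"                  # below the orange line: chiller

# Traditional plant (~7 C supply) vs. an elevated rear-door loop (~22 C).
for month, db, wb in [("Jan", 8, 6), ("Apr", 15, 11), ("Jul", 24, 17)]:
    print(month, "traditional:", cooling_mode(db, wb, 7),
          "| elevated:", cooling_mode(db, wb, 22))
```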

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In the case of San Francisco and the Bay Area, it’s clear that the traditional approach would, on average, need the adiabatic system all year round, plus mechanical assistance all year round at varying load. And since most chillers have a minimum run of 25%, even less free cooling may be available. With the ColdLogik door, on average, you would need no additional water for adiabatic cooling for 7 months of the year, and no mechanical assistance through the remaining 5 months either. Chillers would normally remain on site to provide redundancy on the rare occasions a heat wave exceeds the averages, but they may never actually need to run, saving energy as well.

Conclusion

Without even considering the lower water usage across the remaining 5 months, which could be substantial, the ColdLogik door would likely save a minimum of 58% of the water that would otherwise be consumed by traditional cooling methods.
Translated into physical water usage over the year, this could drop the current projected figure of 674 billion liters of water to 283 billion liters, a 391 billion liter drop.

This is the equivalent of filling 156,400 Olympic swimming pools, which would take up an area 1.5 times that of the city of San Francisco.
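The arithmetic behind those headline figures is easy to reproduce; a quick check, assuming a nominal 2.5 million litre Olympic pool:

```python
# Reproducing the projection: 674 bn litres baseline vs 283 bn with RDC.
baseline_bn = 674
with_rdc_bn = 283
saved_bn = baseline_bn - with_rdc_bn               # 391 bn litres
olympic_pool_litres = 2_500_000                    # nominal 50 m pool

print(f"saved: {saved_bn} bn litres ({saved_bn / baseline_bn:.0%})")
print(f"pools: {saved_bn * 1e9 / olympic_pool_litres:,.0f}")  # 156,400
```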
If you are looking to improve your water usage with a product that is tried, tested, and deployed worldwide, get in touch with USystems today.

Conventional air cooling consumes significant energy when mechanical chillers run; one way to reduce and potentially eliminate that wasted energy is adiabatic cooling. But while it significantly improves efficiency, it sharply increases water usage for evaporative cooling, and the major downside is the growing scarcity of water in certain geographies. A typical large-scale data center consumes as much water as 2,500 people, which is putting pressure on local governments to drive water usage down.

By utilising liquid cooling you can raise the water temperature to the point where adiabatic cooling is no longer needed, giving the best of both worlds: no excess water wasted and better energy efficiency, with a simpler site set-up. It really is a WIN-WIN-WIN.

WUE-Northern Europe



One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW is 68,000 litres a day.

Whilst public information is scarce, a very conservative figure for cooling water usage in the Nordics is around 20 million litres a day. Importantly, a large proportion of data centre owners there have used the region’s climate to reduce mechanical power requirements, which increases water usage but gives greater overall efficiency for traditional systems.
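Scaling that per-MW figure, as this and the following sections do, is straightforward; the sketch below annualises 68,000 litres/MW/day and applies it to the roughly 500 MW estate cited for India later in this document.

```python
# Annualising the vendor's 68,000 litres/MW/day figure.
L_PER_MW_DAY = 68_000
annual_per_mw = L_PER_MW_DAY * 365        # ~24.8 million litres/MW/year
print(f"{annual_per_mw / 1e6:.1f} ML per MW per year")

# Applied to the ~500 MW Indian estate cited later in this document:
print(f"500 MW: {500 * annual_per_mw / 1e9:.2f} bn litres/year")  # ~12.4
```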



As the graphs show, the Nordic region has very low dry and wet bulb temperatures for a large proportion of the year, which helps efficiency on the whole.
The important factor here is that anything above the blue line can be handled with the DB alone and therefore requires no additional water. Anything between the blue line and the orange line can be cooled with an adiabatic system, and this is where the water usage comes in. Anything beneath the orange line requires additional mechanical cooling, such as a traditional chiller system, consuming maximum water plus additional power for the mechanical equipment.

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In the case of the Nordic region, it’s clear that the traditional approach would, on average, need the adiabatic system for two thirds of the year and mechanical assistance for just under half the year at varying load. And since most chillers have a minimum run of 25%, even less free cooling may be available.
With the ColdLogik door, on average, you would need no additional water for adiabatic cooling for 9 months of the year, and no mechanical assistance through the remaining 3 months either. Chillers would normally remain on site to provide redundancy on the rare occasions a heat wave exceeds the averages, but they may never need to run, an additional operational saving.

Conclusion

Without even considering the lower water usage across the remaining 3 months, which could be substantial, the ColdLogik door would likely save a minimum of 50% of the water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on publicly available information for the Nordic region, this could drop the current projected usage figure of 4.86 billion litres of water to 2.43 billion litres, a massive 50% drop. That is the equivalent of filling Iceland’s famous Blue Lagoon a whopping 270 times, which really does put it in perspective.


WUE-London-FLAP


Even taking only the 13 largest data centre operations in the UK, that equates to 58,412,000 litres of water wasted each day.



As someone who lives in the UK, I can safely say our weather isn’t always the best; however, that gives a wonderful opportunity for eliminating excess water use.

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In the case of the United Kingdom, and the London area in particular, it’s clear that the traditional approach would, on average, need the adiabatic system all year round and mechanical assistance for over half the year at varying load. And since most chillers have a minimum run of 25%, even less of the free cooling is available.
With the ColdLogik door, on average, you would need no additional water for adiabatic cooling for 8 months of the year, and no mechanical assistance through the remaining 4 months either. Chillers would normally remain on site to provide redundancy on the rare occasions a heat wave exceeds the averages, but they may never need to run, an additional operational saving.

Conclusion

Without even considering the lower water usage across the remaining 4 months, which could be substantial, the ColdLogik door would likely save a minimum of 66% of the water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the 13 largest publicly documented data centres in the UK, this could drop the current projected usage figure of 21.32 billion litres of water to 7.11 billion litres, a 14.21 billion litre drop. That is the equivalent of 5,550 Olympic swimming pools, which would take up an area more than 130 times that of Windsor Castle and its grounds.


WUE-India


Whilst public information is scarce, a very conservative figure for cooling water usage in the Indian market is around 34 million litres a day, based on 500 MW of cooling capacity across the country.



As the graphs show, India is a challenging environment for any cooling requirement, with high DB temperatures and relatively high WB temperatures to match.

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In India, it’s clear that the traditional approach would, on average, need the adiabatic system for the whole year and mechanical assistance for the whole year at varying load. And since most chillers have a minimum run of 25%, even less free cooling may be used.
With the ColdLogik door, on average, you would require no additional mechanical cooling on site for standard operation. Chillers with refrigeration circuits would normally remain on site to maintain redundancy for exceptional need, but they would not be required on a regular basis. Water usage would also be lower for 6 months of the year on the ColdLogik system, most likely a drop of around 20% across that period.

Conclusion

Considering the lower water usage across those 6 months, the ColdLogik door would likely save a minimum of 10% of the water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the publicly available information for India, this could drop the current projected usage figure of 12.37 billion litres of water to 11.13 billion litres, a 10% drop. In the future, as the ASHRAE guidelines push further into the allowable limits, the potential water savings grow substantially.


WUE-China


Unfortunately information is scarce, so a conservative figure of 1,000 MW can be used across the country; this would give a potential usage of around 68 million litres of water per day.



As the graphs show, China provides a challenging environment for any cooling requirement, particularly in summer, with high DB temperatures and relatively high WB temperatures to match.

What happens when you implement a ColdLogik rear door?

In the graphs above you can see the marked difference between a traditional cooling system, marked in yellow, and the ColdLogik cooling requirement, marked in grey. In China, it’s clear that the traditional approach would, on average, need the adiabatic system for almost the whole year and mechanical assistance for half the year at varying load. And since most chillers have a minimum run of 25%, even less of the free cooling may be available.
With the ColdLogik door, on average, you would need no additional water for adiabatic cooling for 6 months of the year and mechanical cooling assistance for only around 1-2 months. Chillers would normally remain on site to provide redundancy on the rare occasions a heat wave exceeds the averages, but they may not need to run for 10 months of the year, an additional operational saving.

Conclusion

Without even considering the lower water usage across the remaining 4 months, which could be substantial, the ColdLogik door would likely save a minimum of 25% of the water that would otherwise be consumed by traditional cooling methods.

Translated into physical water usage over the year, and based on the conservative 1,000 MW figure, this could drop the current projected usage figure of 24.82 billion litres of water to 18.6 billion litres, a 6.2 billion litre drop. That is the equivalent of filling Beijing’s Bird’s Nest stadium, the centrepiece of the 2008 Olympic Games, twice over.


OUR PRODUCTS

Data Center Products 

that Exceed Expectations

Discover our high-performing data center products, including ColdLogik Rear Door Coolers (RDHx), ColdLogik InRow Coolers, and ColdLogik & EDGE LX Plant.
ColdLogik
Rear Door Heat Exchanger

CL21 Passive Rear Door Heat Exchanger

The CL21 Passive RDHx offers high-performance cooling at zero operational cost.
EDGE
EDGE Solutions

EDGE-3 Soundproof MDC

Enables quiet computing in limited space with advanced intelligence and security features.
ColdLogik
CDU

ColdLogik CDU

Efficiency, reliability, and innovation define our CDU.
USpace
Server Cabinet

USpace - 4210 Cabinet

A cost-effective and versatile rack that adapts to diverse applications with ease and flexibility.
ColdLogik
InRow Coolers

CL80 600W InRow Cooler

Precision cooling solutions for aisle containment, ensuring optimal performance.
ColdLogik
InRow Coolers

CL80 300W InRow Cooler

The highest cooling capacity available in the footprint of a chilled-water InRow.
USpace
Server Cabinet

USpace - 5210 Cabinet

High-density design with flexible configuration for evolving IT environments.

USystems

Data Center Solutions
that Exceed Expectations

Get in touch with us at USystems Ltd to join the journey towards more efficient and sustainable data centers. Our leading and innovative technologies are designed to help you use less energy and reduce your carbon footprint on a global scale. Contact us now to explore how we can work together for a greener future.