London, 1st July 2021 – USystems, the leading provider of critical infrastructure solutions, is proud to announce the official opening of its brand-new office in downtown Rochester, NY. The New York office will serve as the USA Headquarters to support growing demand for the ColdLogik product range.
The new site is located in downtown Rochester, NY, in the historic Sibley Building, which in 1929 was the largest department store between Chicago and New York City. The city rose to prominence as the birthplace of technology companies such as Eastman Kodak Company, Xerox, and Bausch & Lomb.
USystems has also secured the talent and knowledge of Dr. David Brown, who will serve as Vice President of Engineering and brings years of experience in the data center infrastructure field with him. Please join us in welcoming him to our growing U.S.-based team.
‘In a critical point in USystems history, we are extremely proud to bring our knowledge direct to our customers in the United States of America with our brand-new USA HQ’
Scott Bailey, Director
About USystems
USystems is a leading provider of Data Center Cooling solutions that strives to deliver unparalleled results and true value to our customers with cooling solutions that improve operational and energy efficiency. Founded in 2004, USystems is headquartered in the UK, with additional offices in Europe, the Middle East, North America, and India. The company offers Air Cooling, DX, and LX Cooling solutions to Data Centres of all sizes, including standard, hyperscale and edge facilities, and those with High Performance and Super Compute capabilities.
These cooling solutions are achieved with the innovative range of ColdLogik Rear Door Coolers, which since their inception in 2007 have garnered much praise and industry attention, winning multiple awards globally for their outstanding performance, sustainability, energy saving and unrivalled efficiency.
For more information, visit www.usystems.com or follow USystems Ltd on LinkedIn and Instagram.
The market perception of a Rear Door Cooler (RDC) is that it can only be used on a one-to-one basis with the cabinet it is cooling. However, for more than a decade, USystems have consistently disproved this theory. This document will detail how this can be done.
The basic principle surrounding the ColdLogik RDC is that it controls the whole room environment with built-in adaptive intelligence.
During standard operation of the USystems RDC systems, air is discharged evenly in both horizontal planes via EC fans. Because the fans are equally positioned in the RDC, the air provided across the adjacent cabinets is uniform, enabling USystems RDC solutions to cool multiple cabinets using a single unit. This is Cooling by Air-Mixing.
Whilst it is true that an RDC per cabinet is the most effective method of achieving optimum operational efficiency, it can still be an effective proposition to deploy an RDC on every second or every third cabinet. The choice comes down to a multitude of factors, but it can be summarised simply as ‘optimal energy savings and OpEx vs CapEx’.
Capital Savings Potential
To illustrate the CapEx saving potential, this philosophy can be applied to a real-world application. Customer A required a cooling solution where the room temperature was maintained at ASHRAE A1, with a cooling requirement per cabinet of 10kW. The straightforward 1:1 deployment provides the largest opportunity to reduce the OpEx, as operational savings are made through the use of warmer water (reduced mechanical cooling investment required). However, it also has the largest CapEx, a greater initial outlay than Customer A had originally budgeted for.
By redesigning the deployment to a 3:1 method, each RDC instead provides 30kW of cooling duty. This significantly reduces the CapEx, delivering a 61% saving. While the 3:1 solution has increased mechanical cooling costs due to the use of colder water, the reduced CapEx aligns with Customer A’s budgetary considerations.
Formulae
Operating to ASHRAE standards for the active equipment within the cabinet means that both the volume of air required by the equipment and the temperature of the air flow from the door can be used to estimate the anticipated room temperature. The formula by which this can be shown is as follows:
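The formula itself is not reproduced in this copy of the document. A minimal reconstruction, consistent with the worked examples that follow (the symbol names are ours, not USystems’), is the standard volumetric air-mixing equation:

$$T_{room} = \frac{(V_{RDC} \times T_{RDC}) + (V_{hot} \times T_{hot})}{V_{RDC} + V_{hot}}$$

where \(V_{RDC}\) and \(T_{RDC}\) are the airflow (CFM) and air-off temperature of the RDC, and \(V_{hot}\) and \(T_{hot}\) are the airflow and air-off temperature attributable to the cabinets without an RDC.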
Operation Densities
At low duties, an RDC may be deployed on every second or third cabinet. While the low duty threshold for Cooling by Air-Mixing varies based on site-specific parameters (such as fluid temperatures, relative humidity etc.), USystems are comfortable applying a general threshold of 6kW and lower for RDCs deployed on every third cabinet, and 15kW or lower for every second cabinet. This is not a definitive ceiling, as the adaptive intelligence of the RDC product and the many variables in designing a data center allow for viable solutions that perform beyond these general models, such as the deployment illustrated by Customer A. USystems have illustrated the thresholds of 6kW and 15kW below:
Example A: 3 x Racks to 1 x RDC
In the 3:1 configuration shown here, where the cooling requirement in each cabinet is 6kW, each rear door has a total cooling requirement of 18kW. It directly cools the cabinet it is fitted to, and indirectly cools the air from the cabinets on the left and right. The pattern repeats for the full bank.
Where a room temperature of 27°C/80.6°F or below is desired (in keeping with ASHRAE A1 class guidelines), USystems would assume the air off from the cabinets without an RDC to be approximately 40°C/104°F and to produce circa 600CFM. Where this information can be provided at the application stage, USystems are able to produce more accurate room temperature projections.
The RDC in this application would be producing 18kW of cooling, generating 1800CFM, with an air off coil temperature of 20°C/68°F. The data in this example would populate the formula as follows:
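The populated formula is likewise missing from this copy. Using the reconstructed mixing equation above, and assuming (our assumption) that each door in the repeating 3:1 pattern effectively handles one uncooled cabinet’s worth of hot air (circa 600CFM at 40°C) alongside its own 1800CFM at 20°C, the calculation would run:

$$T_{room} = \frac{(1800 \times 20) + (600 \times 40)}{1800 + 600} = \frac{36{,}000 + 24{,}000}{2{,}400} = 25^{\circ}C$$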
The anticipated room temperature in this case would give a mixed air temperature of 25°C/77°F, which is under the 27°C/80.6°F ASHRAE recommendations.
Example B: 2 x Racks to 1 x RDC
As the required cooling density increases, the ratio of RDCs to racks generally increases proportionally. In a 2:1 configuration where the cabinets being cooled are producing 15kW each, the following would apply if a room temperature of 27°C/80.6°F is desired. USystems would expect the air off from the cabinets without an RDC to be approximately 40°C/104°F, at circa 1500CFM.
The RDC in this application would be producing 30kW of cooling, generating 2250CFM, with an air off coil temperature of 15°C/59°F. The data in this example would populate the formula as follows:
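Again, the populated formula is missing here; using the same reconstructed mixing equation with the RDC’s 2250CFM at 15°C and the uncooled cabinet’s 1500CFM at 40°C, the calculation would run:

$$T_{room} = \frac{(2250 \times 15) + (1500 \times 40)}{2250 + 1500} = \frac{33{,}750 + 60{,}000}{3{,}750} = 25^{\circ}C$$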
The anticipated room temperature in this case would give a mixed air temperature of 25°C/77°F, which is under the 27°C/80.6°F ASHRAE recommendations.
Closing Statement
These examples illustrate the higher densities that can be achieved using 3:1 and 2:1 deployment strategies. With increased airflow requirements at higher duties, it is generally advisable to deploy low duty cabinets between high duty cabinets with RDCs fitted, to make the most of the air-mixing strategy. There is also opportunity for growth in the data center, either to increase density or to reduce OpEx by adding more RDCs at a pace to suit CapEx. The reduced OpEx is achieved by increasing the number of RDCs, which allows an elevation in water temperatures (in most cases supplied by mechanical cooling), therefore reducing overall energy consumption.
Increasing the water temperatures typically means three things: first, the external plant can be physically smaller; second, the efficiency per kW cooled will increase; and third, the free cooling that can be utilised on site will increase significantly, dependent on geographic location.
To conclude, the ColdLogik RDC solution is truly unique, allowing data center stakeholders to utilise a system that is both sustainable in its capacity for future growth and flexible in its ability to improve efficiency as density is added. To find out more about how we can make a Cooling by Air-Mixing solution work for you, please contact sales@usystems.com or visit https://www.usystems.com/data-centre-products/cl20-proactive#prodeployment for more information.
Nautilus are a long-standing partner of USystems and have championed the ColdLogik brand for more than 5 years. Working together since the conception of the original ‘floating data centre’, the partnership has flourished and pushed both parties to a better technological standpoint in order to utilise the warm water temperatures available all year round in the San Francisco Bay Area.
“Data Centers are the newest and most essential piece of critical infrastructure for the world”
James Connaughton, CEO, Nautilus
Working in conjunction with our ColdLogik team, every scenario for water temperature variability was considered and the design was carefully altered where needed to provide the required solution.
Capable of up to 36kW per ColdLogik cabinet, the barge provides a floating data center for colocation services unlike anything previously deployed. USystems was recently featured in Technology Magazine in an article all about ‘Transforming In The Data Center Industry’. Read the full article here to find out more about their journey in transforming the data center cooling industry.
In this episode Freddy Briffitt and Erik Venmo from NTC discuss the benefits that cold regions like the Nordics offer Data Centers, and the increased ability to utilise free cooling, which improves your PUE levels. Erik also talks about the importance of educating the consumer in order to provide a turnkey solution that delivers not only CapEx but also OpEx savings.
With the current situation the world finds itself in, one thing has become abundantly clear: data centers have provided people a safe haven in their own homes whilst lockdowns have been enforced across the globe. At one point, half of the human race was in lockdown in one form or another.
Both positives and negatives have arisen from this, from the ‘key worker’ status held by data center employees and their primary suppliers, highlighting how governments across the world perceive the industry and its importance, through to a sterner examination from the wider world of energy consumption and water usage.
Uptime and reliability have always driven data center design philosophy. Trade-offs have been made, understandably, so that operators and owners can be comfortable in the knowledge that consistent design across sites reduces the risk of misalignment or miscalculation.
Whilst data centers are, on the whole, more efficient than they have ever been, there is still a vast amount of improvement to be made, in particular in both the energy consumed for cooling and the water consumed for adiabatic cooling.
One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW can be 68,000 litres of water a day. Unfortunately, public information is scarce, so a conservative figure of 1,000MW of cooling capacity can be assumed across the country, which would give a usage of around 68 million litres of water per day.
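As a rough check of these figures (assuming, as we do throughout, 365 operating days per year):

$$1{,}000\,\mathrm{MW} \times 68{,}000\,\mathrm{L/MW/day} = 68\,\mathrm{million\ L/day}$$
$$68\,\mathrm{million\ L/day} \times 365 \approx 24.82\,\mathrm{billion\ L/year}$$

which is the annual baseline figure used in the conclusion below.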
What is adiabatic cooling?
Water is traditionally used in these data center cooling solutions to obtain the lowest possible air temperature entering the external plant, thereby extracting as much heat as possible from the data center using the natural environment before the mechanical chiller needs to be engaged.
In any body of air there are two temperature points, Dry Bulb (DB) and Wet Bulb (WB). The dry bulb is what you would feel if you were dry; the best way to describe wet bulb is the chill you feel when you walk out of the shower before you manage to get to the towel! The water on your body evaporates into the air, cooling your skin towards the wet bulb temperature, which depends on the relative humidity and is always equal to or lower than the DB temperature.
For example, if the DB temperature in a room is 20°C/68°F and the WB temperature is 14°C/57°F, then a wet object, or air pushed through a wet area or membrane, could potentially reach the WB temperature, at least until the object is heated or dried.
Why is this usage so high?
Water usage is inversely related to the temperature of the water supplied to the data centre’s internal cooling equipment: the lower the flow temperature into the data centre, the higher the water usage by the external plant. Traditional plant has a typical water flow temperature of 7°C/45°F, which means the highest ambient temperature at which that flow temperature can be achieved naturally, without mechanical assistance, is around 5°C/41°F.
How can you improve this usage?
The best way to reduce this usage is to elevate the water temperature that the data centre requires in order to cool the equipment efficiently and effectively. The rear door cooler is a great example of this because, unlike traditional CRAC systems, instead of using colder air to mix with warm air to provide an ambient temperature, it neutralises the heated air itself, so a higher water temperature can be used to obtain the same result. The graphs below show the average high temperature for DB and WB over a thirty-year period.
As you can see above, China provides a challenging environment for any cooling requirement, particularly in summer, with high DB temperatures and relatively high WB temperatures to match.
The important factor here is that anything above the blue line can utilise the DB alone and therefore requires no additional water usage. Anything between the blue line and the orange line can be cooled using an adiabatic system, and this is where the water usage comes in. Anything beneath the orange line would require additional mechanical cooling, such as a traditional chiller system, which uses maximum water plus additional power for the mechanical equipment.
What happens when you implement a ColdLogik rear door?
In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in Grey.
In China it is clear to see that, by utilising the traditional approach, you would on average need the adiabatic system for almost the whole year and would also require mechanical cooling for half the year at varying load. However, as most chillers have a minimum run of 25%, less of the free cooling may be available.
By utilising the ColdLogik door, on average you would not need to use any additional water for 6 months of the year to provide adiabatic cooling, and you would only require mechanical cooling assistance for around 1-2 months. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside of the average occurs; however, the chillers may not need to be run for 10 months of the year, giving an additional operational saving.
Conclusion
In conclusion, even without considering the lower water usage across the remaining 4 months, which could be substantial, the ColdLogik door would likely save a minimum of 25% of the water that would otherwise be consumed by traditional cooling methods.
Translating this into physical water usage over the year, and based on the conservative 1,000MW figure, this could drop the current projected usage of 24.82 billion litres of water down to around 18.6 billion litres, a saving of roughly 6.2 billion litres. This is the equivalent of filling the Bird’s Nest Stadium in Beijing, the centrepiece of the 2008 Olympic Games, with water twice over.
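For reference, the saving quoted here is simply the 25% figure applied to that annual baseline:

$$24.82\,\mathrm{billion\ L} \times 0.25 \approx 6.2\,\mathrm{billion\ L}, \qquad 24.82 - 6.2 \approx 18.6\,\mathrm{billion\ L}$$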
If you are looking to improve your water usage with a product that is tried and tested and deployed into the market worldwide then get in touch with USystems today.
Conventional air cooling consumes significant energy when using mechanical chillers; one way to reduce, and potentially eliminate, this additional energy wastage is by utilising adiabatic cooling. Whilst this significantly improves efficiency on one hand, it greatly increases water usage in order to support evaporative cooling. The major downside, however, is the growing scarcity of water in certain geographical locations. A typical large-scale Data Center consumes the equivalent of 2,500 people’s water usage, which is putting pressure on local governments to reduce water consumption.
By utilising liquid cooling you can effectively increase the water temperature to the point where adiabatic cooling is no longer needed, giving the best of both worlds: no excess water wasted and better energy efficiency, with a simpler site set-up and requirement. It really is a WIN-WIN-WIN.
India
With the current situation the world finds itself in, one thing has become abundantly clear: data centers have provided people a safe haven in their own homes whilst lockdowns have been enforced across the globe. At one point, half of the human race was in lockdown in one form or another.
Both positives and negatives have arisen from this, from the ‘key worker’ status held by data center employees and their primary suppliers, highlighting how governments across the world perceive the industry and its importance, through to a sterner examination from the wider world of energy consumption and water usage.
Uptime and reliability have always driven data center design philosophy. Trade-offs have been made, understandably, so that operators and owners can be comfortable in the knowledge that consistent design across sites reduces the risk of misalignment or miscalculation.
Whilst data centers are, on the whole, more efficient than they have ever been, there is still a vast amount of improvement to be made, in particular in both the energy consumed for cooling and the water consumed for adiabatic cooling.
One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW can be 68,000 litres of water a day. Whilst public information is scarce, a very conservative figure for water usage in the Indian market is around 34 million litres of water a day used for cooling, based on 500MW of cooling capacity across the country.
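As a rough check of these figures (again assuming 365 operating days per year):

$$500\,\mathrm{MW} \times 68{,}000\,\mathrm{L/MW/day} = 34\,\mathrm{million\ L/day}$$
$$34\,\mathrm{million\ L/day} \times 365 \approx 12.4\,\mathrm{billion\ L/year}$$

broadly consistent with the 12.37 billion litre annual figure used in the conclusion below.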
What is adiabatic cooling?
Water is traditionally used in these data center cooling solutions to obtain the lowest possible air temperature entering the external plant, thereby extracting as much heat as possible from the data center using the natural environment before the mechanical chiller needs to be engaged.
In any body of air there are two temperature points, Dry Bulb (DB) and Wet Bulb (WB). The dry bulb is what you would feel if you were dry; the best way to describe wet bulb is the chill you feel when you walk out of the shower before you manage to get to the towel! The water on your body evaporates into the air, cooling your skin towards the wet bulb temperature, which depends on the relative humidity and is always equal to or lower than the DB temperature.
For example, if the DB temperature in a room is 20°C/68°F and the WB temperature is 14°C/57°F, then a wet object, or air pushed through a wet area or membrane, could potentially reach the WB temperature, at least until the object is heated or dried.
Why is this usage so high?
Water usage is inversely related to the temperature of the water supplied to the data centre’s internal cooling equipment: the lower the flow temperature into the data centre, the higher the water usage by the external plant. Traditional plant has a typical water flow temperature of 7°C/45°F, which means the highest ambient temperature at which that flow temperature can be achieved naturally, without mechanical assistance, is around 5°C/41°F.
How can you improve this usage?
The best way to reduce this usage is to elevate the water temperature that the data centre requires in order to cool the equipment efficiently and effectively. The rear door cooler is a great example of this because, unlike traditional CRAC systems, instead of using colder air to mix with warm air to provide an ambient temperature, it neutralises the heated air itself, so a higher water temperature can be used to obtain the same result. The graphs below show the average high temperature for DB and WB over a thirty-year period.
As you can see above, India provides a challenging environment for any cooling requirement, with high DB temperatures and relatively high WB temperatures to match.
The important factor here is that anything above the blue line can utilise the DB alone and therefore requires no additional water usage. Anything between the blue line and the orange line can be cooled using an adiabatic system, and this is where the water usage comes in. Anything beneath the orange line would require additional mechanical cooling, such as a traditional chiller system, which uses maximum water plus additional power for the mechanical equipment.
What happens when you implement a ColdLogik rear door?
In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in Grey.
In India it is clear to see that, by utilising the traditional approach, you would on average need the adiabatic system for the whole year and would also require mechanical cooling for the whole year at varying load. However, as most chillers have a minimum run of 25%, less free cooling may be used.
By utilising the ColdLogik door, on average you would not require any additional mechanical cooling on site for standard operation. Mechanical cooling here means chillers with refrigeration circuits; whilst these systems would normally remain on site to maintain redundancy in case of exceptional need, they would not be required on a regular basis. Water usage would also be lower for 6 months of the year on the ColdLogik system, which would most likely account for a drop in water usage across this period of around 20%.
Conclusion
In conclusion, considering the lower water usage across those 6 months, the ColdLogik door would likely save a minimum of 10% of the water that would otherwise be consumed by traditional cooling methods.
Translating this into physical water usage over the year, and based on the publicly available information for India, this could drop the current projected usage figure of 12.37 billion litres of water down to 11.13 billion litres, a 10% drop. In the future, as the ASHRAE guidelines are pushed further into the allowable limits, the potential water savings become even greater.
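The 10% saving works through as:

$$12.37\,\mathrm{billion\ L} \times 0.90 \approx 11.13\,\mathrm{billion\ L}$$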
If you are looking to improve your water usage with a product that is tried and tested and deployed into the market worldwide then get in touch with USystems today.
Conventional air cooling consumes significant energy when using mechanical chillers; one way to reduce, and potentially eliminate, this additional energy wastage is by utilising adiabatic cooling. Whilst this significantly improves efficiency on one hand, it greatly increases water usage in order to support evaporative cooling. The major downside, however, is the growing scarcity of water in certain geographical locations. A typical large-scale Data Center consumes the equivalent of 2,500 people’s water usage, which is putting pressure on local governments to reduce water consumption.
By utilising liquid cooling you can effectively increase the water temperature to the point where adiabatic cooling is no longer needed, giving the best of both worlds: no excess water wasted and better energy efficiency, with a simpler site set-up and requirement. It really is a WIN-WIN-WIN.
The Nordics
With the current situation the world finds itself in, one thing has become abundantly clear: data centers have provided people a safe haven in their own homes whilst lockdowns have been enforced across the globe. At one point, half of the human race was in lockdown in one form or another.
Both positives and negatives have arisen from this, from the ‘key worker’ status held by data center employees and their primary suppliers, highlighting how governments across the world perceive the industry and its importance, through to a sterner examination from the wider world of energy consumption and water usage.
Uptime and reliability have always driven data center design philosophy. Trade-offs have been made, understandably, so that operators and owners can be comfortable in the knowledge that consistent design across sites reduces the risk of misalignment or miscalculation.
Whilst data centers are, on the whole, more efficient than they have ever been, there is still a vast amount of improvement to be made, in particular in both the energy consumed for cooling and the water consumed for adiabatic cooling.
One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW can be 68,000 litres of water a day. Whilst public information is scarce, a very conservative figure for water usage in the Nordics is around 20 million litres of water a day used for cooling. Importantly, however, a large proportion of data centre owners have utilised the area’s climate to reduce the mechanical power requirement, which, whilst increasing water usage, provides greater overall efficiency for traditional systems.
What is adiabatic cooling?
Water is traditionally used in these data center cooling solutions to obtain the lowest possible air temperature entering the external plant, thereby extracting as much heat as possible from the data center using the natural environment before the mechanical chiller needs to be engaged.
In any body of air there are two temperature points, Dry Bulb (DB) and Wet Bulb (WB). The dry bulb is what you would feel if you were dry; the best way to describe wet bulb is the chill you feel when you walk out of the shower before you manage to get to the towel! The water on your body evaporates into the air, cooling your skin towards the wet bulb temperature, which depends on the relative humidity and is always equal to or lower than the DB temperature.
For example, if the DB temperature in a room is 20°C/68°F and the WB temperature is 14°C/57°F, then a wet object, or air pushed through a wet area or membrane, could potentially reach the WB temperature, at least until the object is heated or dried.
Why is this usage so high?
Water usage is inversely related to the temperature of the water supplied to the data centre’s internal cooling equipment: the lower the flow temperature into the data centre, the higher the water usage by the external plant. Traditional plant has a typical water flow temperature of 7°C/45°F, which means the highest ambient temperature at which that flow temperature can be achieved naturally, without mechanical assistance, is around 5°C/41°F.
How can you improve this usage?
The best way to reduce this usage is to elevate the water temperature that the data centre requires in order to cool the equipment efficiently and effectively. The rear door cooler is a great example of this because, unlike traditional CRAC systems, instead of using colder air to mix with warm air to provide an ambient temperature, it neutralises the heated air itself, so a higher water temperature can be used to obtain the same result. The graphs below show the average high temperature for DB and WB over a thirty-year period.
As you can see above, the Nordic region provides very low dry and wet bulb temperatures for a large proportion of the year, which helps with efficiency as a whole.
The important factor here is that anything above the blue line can utilise the DB alone and therefore requires no additional water usage. Anything between the blue line and the orange line can be cooled using an adiabatic system, and this is where the water usage comes in. Anything beneath the orange line would require additional mechanical cooling, such as a traditional chiller system, which uses maximum water plus additional power for the mechanical equipment.
What happens when you implement a ColdLogik rear door?
In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in Grey.
In the case of the Nordic region it is clear to see that, by utilising the traditional approach, you would on average need the adiabatic system for two thirds of the year and would also require mechanical cooling for just under half of the year at varying load. However, as most chillers have a minimum run of 25%, less free cooling could be available.
By utilising the ColdLogik door, on average you would not need to use any additional water for 9 months of the year to provide adiabatic cooling, and you would not require any mechanical assistance through the remaining 3 months either. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside of the average occurs; however, the chillers may never need to be run, giving an additional operational saving.
Conclusion
In conclusion, even without considering the lower water usage across the remaining 3 months, which could be substantial, the ColdLogik door would likely save a minimum of 50% of the water that would otherwise be consumed by traditional cooling methods.
Translating this into physical water usage over the year, and based on the publicly available information for the Nordic region, this could drop the current projected usage figure of 4.86 billion litres of water down to 2.43 billion litres, a massive 50% drop. This is the equivalent of filling the famous Blue Lagoon in Iceland a whopping 270 times, which really puts it into perspective.
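The 50% saving works through as:

$$4.86\,\mathrm{billion\ L} \times 0.50 = 2.43\,\mathrm{billion\ L}$$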
If you are looking to improve your water usage with a product that is tried and tested and deployed into the market worldwide then get in touch with USystems today.
Conventional air cooling consumes significant energy when using mechanical chillers; one way to reduce, and potentially eliminate, this additional energy wastage is by utilising adiabatic cooling. Whilst this significantly improves efficiency on one hand, it greatly increases water usage in order to support evaporative cooling. The major downside, however, is the growing scarcity of water in certain geographical locations. A typical large-scale Data Center consumes the equivalent of 2,500 people’s water usage, which is putting pressure on local governments to reduce water consumption.
By utilising liquid cooling you can effectively increase the water temperature to the point where adiabatic cooling is no longer needed, giving the best of both worlds: no excess water wasted and better energy efficiency, with a simpler site set-up and requirement. It really is a WIN-WIN-WIN.
London
With the current situation the world finds itself in, one thing has become abundantly clear: data centers have provided people a safe haven in their own homes whilst lockdowns have been enforced across the globe. At one point, half of the human race was in lockdown in one form or another.
Both positives and negatives have arisen from this, from the ‘key worker’ status held by data center employees and their primary suppliers, highlighting how governments across the world perceive the industry and its importance, through to a sterner examination from the wider world of energy consumption and water usage.
Uptime and reliability have always driven data center design philosophy. Trade-offs have been made, understandably, so that operators and owners can be comfortable in the knowledge that consistent design across sites reduces the risk of misalignment or miscalculation.
Whilst data centers are, on the whole, more efficient than they have ever been, there is still a vast amount of improvement to be made, in particular in both the energy consumed for cooling and the water consumed for adiabatic cooling.
One of the major active equipment manufacturers has openly said that a realistic figure for water use per MW can be 68,000 litres of water a day. Even if you only take the 13 largest data centre operations in the UK, this would equate to 58,412,000 litres of water used each day.
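As a rough check (assuming 365 operating days per year), that daily figure gives the annual baseline used in the conclusion below:

$$58{,}412{,}000\,\mathrm{L/day} \times 365 \approx 21.32\,\mathrm{billion\ L/year}$$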
What is adiabatic cooling?
Water is traditionally used in these data center cooling solutions to obtain the lowest possible air temperature entering the external plant, thereby extracting as much heat as possible from the data center using the natural environment before the mechanical chiller needs to be engaged.
In any body of air there are two temperature points, Dry Bulb (DB) and Wet Bulb (WB). The dry bulb is what you would feel if you were dry; the best way to describe wet bulb is the chill you feel when you walk out of the shower before you manage to get to the towel! The water on your body evaporates into the air, cooling your skin towards the wet bulb temperature, which depends on the relative humidity and is always equal to or lower than the DB temperature.
For example, if the DB temperature in a room is 20°C/68°F and the WB temperature is 14°C/57°F, then a wet object, or air pushed through a wet area or membrane, could potentially reach the WB temperature, at least until the object is heated or dried.
Why is this usage so high?
Water usage is inversely related to the temperature of the water supplied to the data centre’s internal cooling equipment: the lower the flow temperature into the data centre, the higher the water usage by the external plant. Traditional plant has a typical water flow temperature of 7°C/45°F, which means the highest ambient temperature at which that flow temperature can be achieved naturally, without mechanical assistance, is around 5°C/41°F.
How can you improve this usage?
The best way to reduce this usage is to elevate the water temperature that the data centre requires in order to cool the equipment efficiently and effectively. The rear door cooler is a great example of this because, unlike traditional CRAC systems, instead of using colder air to mix with warm air to provide an ambient temperature, it neutralises the heated air itself, so a higher water temperature can be used to obtain the same result. The graphs below show the average high temperature for DB and WB over a thirty-year period.
As someone who lives in the UK, I can safely say that our weather isn’t always the best; however, this gives a wonderful opportunity to eliminate excess water use.
The important factor here is that anything above the blue line can utilise the DB alone and therefore requires no additional water usage. Anything between the blue line and the orange line can be cooled using an adiabatic system, and this is where the water usage comes in. Anything beneath the orange line would require additional mechanical cooling, such as a traditional chiller system, which uses maximum water plus additional power for the mechanical equipment.
What happens when you implement a ColdLogik rear door?
In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in Grey.
In the case of the United Kingdom, and in particular the London area, it is clear to see that, by utilising the traditional approach, you would on average need the adiabatic system all year round and would also require mechanical cooling for over half of the year at varying load. However, as most chillers have a minimum run of 25%, less of the free cooling is available.
By utilising the ColdLogik door, on average you would not need to use any additional water for 8 months of the year to provide adiabatic cooling, and you would not require any mechanical assistance through the remaining 4 months either. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside of the average occurs; however, the chillers may never need to be run, giving an additional operational saving.
Conclusion
In conclusion, even without considering the lower water usage across the remaining 4 months, which could be substantial, the ColdLogik door would likely save a minimum of 66% of the water that would otherwise be consumed by traditional cooling methods.
Translating this into physical water usage over the year, and based on the 13 largest publicly known data centre operations in the UK, this could drop the current projected usage figure of 21.32 billion litres of water down to 7.11 billion litres, a 14.21 billion litre drop. This is the equivalent of filling 5,550 Olympic swimming pools, which would take up an area more than 130 times that which Windsor Castle and its grounds currently occupy.
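Taking the 66% saving as two thirds of the baseline, the figures work through as:

$$21.32\,\mathrm{billion\ L} \times \tfrac{2}{3} \approx 14.21\,\mathrm{billion\ L}, \qquad 21.32 - 14.21 = 7.11\,\mathrm{billion\ L}$$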
If you are looking to improve your water usage with a product that is tried and tested and deployed into the market worldwide then get in touch with USystems today.
Conventional air cooling consumes significant energy when using mechanical chillers; one way to reduce, and potentially eliminate, this additional energy wastage is by utilising adiabatic cooling. Whilst this significantly improves efficiency on one hand, it greatly increases water usage in order to support evaporative cooling. The major downside, however, is the growing scarcity of water in certain geographical locations. A typical large-scale Data Center consumes the equivalent of 2,500 people’s water usage, which is putting pressure on local governments to reduce water consumption.
By utilising liquid cooling you can effectively increase the water temperature to the point where adiabatic cooling is no longer needed, giving the best of both worlds: no excess water wasted and better energy efficiency, with a simpler site set-up and requirement. It really is a WIN-WIN-WIN.
San Francisco Bay Area
With the current situation the world finds itself in, one thing has become abundantly clear: data centers have provided people a safe haven in their own homes whilst lockdowns have been enforced across the globe. At one point, half of the human race was in lockdown in one form or another.
Both positives and negatives have arisen from this, from the ‘key worker’ status held by data center employees and their primary suppliers, highlighting how governments across the world perceive the industry and its importance, through to a sterner examination from the wider world of energy consumption and water usage.
Uptime and reliability have always driven data center design philosophy. Trade-offs have been made, understandably, so that operators and owners can be comfortable in the knowledge that consistent design across sites reduces the risk of misalignment or miscalculation.
Whilst data centers are, on the whole, more efficient than they have ever been, there is still a vast amount of improvement to be made, in particular in both the energy consumed for cooling and the water consumed for adiabatic cooling.
In 2014, Lawrence Berkeley National Laboratory in California issued a report stating that 639 billion liters of water were used in the USA alone on data center cooling; for 2020, the forecast usage figure was predicted to be a startling 674 billion liters of water.
What is adiabatic cooling?
Water is traditionally used in these data center cooling solutions to obtain the lowest possible air temperature entering the external plant, thereby extracting as much heat as possible from the data center using the natural environment before the mechanical chiller needs to be engaged.
In any body of air there are two temperature points, Dry Bulb (DB) and Wet Bulb (WB). The dry bulb is what you would feel if you were dry; the best way to describe wet bulb is the chill you feel when you walk out of the shower before you manage to get to the towel! The water on your body evaporates into the air, cooling your skin towards the wet bulb temperature, which depends on the relative humidity and is always equal to or lower than the DB temperature.
For example, if the DB temperature in a room is 20°C/68°F and the WB temperature is 14°C/57°F, then a wet object, or air pushed through a wet area or membrane, could potentially reach the WB temperature, at least until the object is heated or dried.
Why is this usage so high?
Water usage is inversely related to the temperature of the water supplied to the data centre’s internal cooling equipment: the lower the flow temperature into the data centre, the higher the water usage by the external plant. Traditional plant has a typical water flow temperature of 7°C/45°F, which means the highest ambient temperature at which that flow temperature can be achieved naturally, without mechanical assistance, is around 5°C/41°F.
How can you improve this usage?
The best way to reduce this usage is to elevate the water temperature that the data centre requires in order to cool the equipment efficiently and effectively. The rear door cooler is a great example of this because, unlike traditional CRAC systems, instead of using colder air to mix with warm air to provide an ambient temperature, it neutralises the heated air itself, so a higher water temperature can be used to obtain the same result. The graphs below show the average high temperature for DB and WB over a thirty-year period.
What happens when you implement a ColdLogik rear door?
In the graphs above you can see the marked difference between using a traditional cooling system, which is marked in yellow, and the ColdLogik cooling requirement, marked in Grey.
In the case of San Francisco and the Bay Area it is clear to see that, by utilising the traditional approach, you would on average need the adiabatic system all year round and would also require mechanical assistance all year round at varying load. However, as most chillers have a minimum run of 25%, less free cooling could be available.
By utilising the ColdLogik door, on average you would not need to use any additional water for 7 months of the year to provide adiabatic cooling, and you would not require any mechanical assistance through the remaining 5 months either. Chillers would normally remain on site to provide redundancy on the rare occasions that a heat wave outside of the average occurs; however, the chillers may never need to actually be run, providing an energy saving too.
Conclusion
In conclusion, even without considering the lower water usage across the remaining 5 months, which could be substantial, the ColdLogik door would likely save a minimum of 58% of the water that would otherwise be consumed by traditional cooling methods.
Translating this into physical water usage over the year, this could drop the current projected figure of 674 billion liters of water down to 283 billion liters, a 391 billion liter drop. This is the equivalent of filling 156,400 Olympic swimming pools, which would take up an area 1.5 times that of the city of San Francisco.
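The 58% saving works through as (an Olympic swimming pool holds roughly 2.5 million litres):

$$674\,\mathrm{billion\ L} \times 0.58 \approx 391\,\mathrm{billion\ L}, \qquad 674 - 391 = 283\,\mathrm{billion\ L}$$
$$391\,\mathrm{billion\ L} \div 2.5\,\mathrm{million\ L/pool} \approx 156{,}400\ \mathrm{pools}$$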
If you are looking to improve your water usage with a product that is tried and tested and deployed into the market worldwide, then get in touch with USystems today.
Conventional air cooling consumes significant energy when using mechanical chillers; one way to reduce, and potentially eliminate, this additional energy wastage is by utilising adiabatic cooling. Whilst this significantly improves efficiency on one hand, it greatly increases water usage in order to support evaporative cooling. The major downside, however, is the growing scarcity of water in certain geographical locations. A typical large-scale Data Center consumes the equivalent of 2,500 people’s water usage, which is putting pressure on local governments to reduce water consumption.
By utilising liquid cooling you can effectively increase the water temperature to the point where adiabatic cooling is no longer needed, giving the best of both worlds: no excess water wasted and better energy efficiency, with a simpler site set-up and requirement. It really is a WIN-WIN-WIN.
In this episode Sam Allen and the Chief Operating Officer at 4D Data Centers discuss the ways 4D have increased their value-added customer service and adapted their operation in their Tier Three Data Centers throughout the pandemic. They also talk about the interesting challenges faced with the latest cooling systems, such as immersion cooling, and the drivers behind efficient cooling systems.
Today, and increasingly in the future, more and more traditionally non-connected everyday items are becoming connected to the internet. In this episode Freddy Briffitt talks about how we are ever increasing our data usage and reliance.
In this episode, Freddy Briffitt discusses, with Simon Allen (Executive Director at Infrastructure Masons), the work IMasons are doing to help unite the ‘builders’ of the digital age. IMasons is focusing on several strategic priorities: the talent crisis, scholarships, sustainability and addressing the digital divide. You will hear more about how digital infrastructure will continue to contribute to the economy and society without harming the planet.
In this episode Paul Surdykowski talks to Brandon Peccoralo, General Manager at DataBank, Ltd., a Data Center provider supporting Cloud, Network and Cyber Security. The Midtown Atlanta Data Center is a purpose-built, high-performance, high-compute facility which has chosen USystems liquid cooling technology. They discuss the cost of cooling and the desire to do it in the most efficient and effective way: you want to bring the cooling technology as close to the heat source as possible. As with a 6-pack of cola, you wouldn’t cool the entire kitchen to keep it cool; you would put it in the fridge. With a new Data Centre like ATL1 you may only have three or four customers in the back, and by using traditional methods like CRAC units you are cooling a largely empty room. The benefit of the USystems ColdLogik Rear Door Cooler is the ability to isolate each individual cabinet by its exact usage, meaning you go from average PUE levels of 1.6-1.8 right down to 1.2. Paul and Brandon also discuss how Rear Door Cooling enables you to have a 100% recycled, closed-loop water system.
USystems’ Freddy Briffitt and guest star Garry Connolly, Founder and President of Host In Ireland, An Industry of Substance, talk about the why of data and the how of centres. Data is vital in today’s world, and people in every aspect of the Data Centre industry should be proud of the work we do. Data and people have always been the two most important assets that mankind has ever possessed. There is a very real carbon challenge, and as that confluence comes together, data is the only solution and data needs to be in centres. There is therefore nearly a perfect storm developing, creating a world where there won’t be a non-green data center functioning for any of the large players. If there is in 10 years, we will have failed as a global industry.