Gartner Says Companies Can Save 1 Million Kilowatt Hours by Implementing 11 Best Practices in the Data Centre

Analysts to Examine Power and Cooling Strategies at the Gartner Data Center Conference, December 2-5 in Las Vegas

(PresseBox) (Stamford, CT)
In a conventional data centre, 35 per cent to as much as 50 per cent of the electrical energy consumed is for cooling versus 15 per cent in best-practice "green" data centres, according to Gartner, Inc.

"Virtually all data centres waste enormous amounts of electricity using inefficient cooling designs and systems," said Paul McGuckin, research vice president at Gartner. "Even in a small data centre, this wasted electricity amounts to more than 1 million kilowatt hours annually that could be saved with the implementation of some best practices."

The overriding cause of the waste in conventional data centre cooling is the unconstrained mixing of cold supply air with hot exhaust air. "This mixing increases the load on the cooling system and the energy used to provide that cooling, and it reduces the efficiency of the cooling system by lowering the delta-T (the difference between the hot return temperature and the cold supply temperature). Maintaining a high delta-T is a core principle of efficient cooling," Mr McGuckin said.
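The delta-T point follows from the sensible-heat relation Q = m·cp·ΔT: for a fixed heat load, the airflow a cooling system must move is inversely proportional to delta-T, so mixing that lowers delta-T forces more fan work. A minimal sketch of that relationship, using approximate air properties (not figures from the release):

```python
# Sensible cooling: Q = m_dot * c_p * delta_T.
# For a fixed heat load, required airflow is inversely proportional to
# delta-T, which is why air mixing (a lower delta-T) makes the cooling
# system move more air for the same job.
C_P_AIR = 1005.0   # J/(kg*K), specific heat of air (approximate)
RHO_AIR = 1.2      # kg/m^3, air density (approximate, sea level)

def required_airflow_m3s(heat_load_w, delta_t_k):
    """Volumetric airflow needed to remove heat_load_w at a given delta-T."""
    mass_flow = heat_load_w / (C_P_AIR * delta_t_k)  # kg/s
    return mass_flow / RHO_AIR                       # m^3/s

# Example: a 100 kW room at progressively better (higher) delta-T values.
for dt in (5.0, 10.0, 15.0):
    print(f"delta-T {dt:>4.1f} K -> {required_airflow_m3s(100_000, dt):.1f} m^3/s")
```

Doubling the delta-T halves the airflow required, which is the efficiency lever the best practices below are protecting.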

Gartner has identified 11 best practices which, if implemented, could save millions of kilowatt hours annually.

Plug Holes in the Raised Floor - Most raised-floor environments exhibit cable holes, conduit holes and other breaches that allow cold air to escape and mix with hot air. Sealing these breaches is a single low-tech retrofit that can save as much as 10 per cent of the energy used for data centre cooling.

Install Blanking Panels - Any unused position in a rack should be covered with a blanking panel, which manages airflow within the rack by preventing the hot exhaust air leaving one piece of equipment from entering the cold-air intake of other equipment in the same rack. When the panels are used effectively, supply air temperatures are lowered by as much as 22 degrees Fahrenheit, greatly reducing the electricity consumed by fans in the IT equipment and potentially alleviating hot spots in the data centre.

Coordinate CRAC Units - Older computer room air-conditioning (CRAC) units operate independently in cooling and dehumidifying the air. These units should be tied together with newer technologies so that their efforts are coordinated, or humidification duties should be removed from them altogether and handled by a dedicated newer unit.

Improve Underfloor Airflow - Older data centres typically have constrained space beneath the raised floor that serves not only as a plenum for distributing cold air but also as a run for data and power cables. Many older data centres have accumulated such a tangle of these cables that airflow is restricted, so the underfloor should be cleared out to improve it.

Implement Hot Aisles and Cold Aisles - In traditional data centres, racks were set up in what is sometimes referred to as a "classroom style," where all the intakes face in a single direction. This arrangement causes the hot air exhausted from one row to mix with the cold air being drawn into the adjacent row, thereby increasing the cold-air-supply temperature in uneven and sometimes unpredictable ways. Newer rack layout practices instituted in the past ten years demonstrate that organising rows into hot aisles and cold aisles is better at controlling the flow of air in the data centre.

Install Sensors - A small number of individual sensors can be placed in areas where temperature problems are suspected. Simple sensors store temperature data that can be manually collected and transferred into a spreadsheet, where it can be further analysed. Even this minimal investment in instrumentation can provide great insight into the dynamics of possible data centre temperature problems, and can provide a method for analysing the results of improvements made to data centre cooling.
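As an illustration of the spreadsheet-style analysis such sensor logs allow, the following sketch (sensor names and the hot-spot threshold are hypothetical, not from the release) computes a per-sensor mean and counts readings above a threshold:

```python
from statistics import mean

def summarise(rows, threshold_f=80.0):
    """Per-sensor mean temperature and count of readings above threshold.

    rows: iterable of (sensor_name, temperature_fahrenheit) pairs, as might
    be collected manually from simple standalone sensors.
    """
    by_sensor = {}
    for sensor, temp in rows:
        by_sensor.setdefault(sensor, []).append(float(temp))
    return {
        s: {"mean": mean(temps),
            "hot_readings": sum(1 for t in temps if t > threshold_f)}
        for s, temps in by_sensor.items()
    }

# Hypothetical readings from two rack locations.
readings = [("rack-a1", 74.5), ("rack-a1", 82.1), ("rack-b3", 71.0)]
print(summarise(readings))
```

Running the same summary before and after a change (say, adding blanking panels) gives a simple way to quantify the improvement.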

Implement Cold-Aisle or Hot-Aisle Containment - Once a data centre has been organised around hot aisles and cold aisles, dramatically improved separation of cold supply air and hot exhaust air through containment becomes an option. For most users, hot-aisle containment or cold-aisle containment will have the single largest payback of any of these energy efficiency best practices.

Raise the Temperature in the Data Centre - Many data centres are run colder than an efficient standard. The American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) has increased the top end of allowable supply-side air temperatures from 77 to 80 degrees Fahrenheit. Not all data centres should be run at the top end of this temperature range, but a step-by-step increase, even to the 75 to 76 degrees Fahrenheit range, would have a beneficial effect on data centre electrical use.

Install Variable Speed Fans and Pumps - Traditional CRAC and computer room air handler (CRAH) units contain fans that run at a single speed. Emerging best practice suggests that variable speed fans be used whenever possible. A reduction of 10 per cent in fan speed yields a reduction in the fan's electrical use of approximately 27 per cent, and a 20 per cent speed reduction yields electrical savings of approximately 49 per cent.
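These speed-to-power figures follow from the fan affinity laws, under which fan power scales roughly with the cube of speed. A quick sketch checks the arithmetic:

```python
# Fan affinity laws: power scales approximately with the cube of speed.
def fan_power_fraction(speed_fraction):
    """Fraction of full-speed power drawn at a given fraction of full speed."""
    return speed_fraction ** 3

for reduction in (0.10, 0.20):
    speed = 1.0 - reduction
    savings = 1.0 - fan_power_fraction(speed)
    print(f"{reduction:.0%} slower -> {savings:.0%} less fan power")
```

The cubic relationship is why even modest speed reductions pay off so strongly: 0.9 cubed is about 0.73, and 0.8 cubed is about 0.51, matching the 27 and 49 per cent savings cited above.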

Exploit "Free Cooling" - "Free cooling" is the general name given to any technique that cools air without the use of chillers or refrigeration units. The two most common forms of free cooling are air-side economisation and water-side economisation. The amount of free cooling available depends on the local climate, and ranges from approximately 100 hours per year to more than 8,000 hours per year.

Design New Data Centres Using Modular Cooling - Traditional raised-floor-perimeter air distribution systems have long been the method used to cool data centres. However, mounting evidence strongly points to the use of modular cooling (in-row or in-rack) as a more-energy-efficient data centre cooling strategy.

"Although most users will not be able to immediately implement all 11 best practices, all users will find at least three or four that can be immediately implemented in their current data centres," said Mr McGuckin. "Savings in electrical costs of 10 to 30 per cent are achievable through these most readily available techniques. Users committed to aggressively implementing all 11 best practices can achieve an annual savings of 1 million kilowatt hours in all but the smallest tier of data centres."

Additional information can be found in the Gartner report "How to Save a Million Kilowatt Hours in your Data Center." This report can be found on Gartner's website at http://www.gartner.com/....

Mr McGuckin will provide additional analysis of the data centre of the future at the 27th Annual Gartner Data Center Conference, taking place December 2-5 in Las Vegas. The Gartner Data Center Conference is the most comprehensive compilation of sessions and advice on the future of the data centre ever held. It offers the latest actionable insights and best practices in all areas affecting the data centre, from real-time infrastructure to servers and storage to business continuity and disaster recovery. The event sits at the intersection of strategic planning and tactical advice for IT organisations as they look to implement new technologies and run the most efficient data centre possible. Additional information is available at www.gartner.com/us/datacenter.