Six steps to a green data center

Environmental concerns, amplified by media coverage and consumer demand, have put IT departments under pressure to develop “green” data centers. The drive to become green is most often motivated by the need to cut energy consumption and carbon dioxide emissions in large data centers. A recent report from the United States Environmental Protection Agency (EPA) highlights the need for data center efficiency and underscores the growing pressure on IT departments.

A green data center is defined as one in which the mechanical, lighting, electrical and computer systems are designed for maximum energy efficiency and minimum environmental impact. The construction and operation of a green data center involve advanced technologies and strategies.

Some examples include:

- reducing the energy consumption of the data center;
- minimizing building footprints;
- maximizing cooling efficiency;
- using low-emission building materials, carpets and paints;
- installing catalytic converters on backup generators; and
- using alternative energy technologies such as photovoltaics, heat pumps and evaporative cooling.

The consumption of energy is considered the dominant factor in determining whether or not a facility is green. According to a 2007 Gartner report entitled 2006 Data Center Polling Results: Power and Cooling, the power demands of equipment have grown fivefold or more over the last five years. In fact, companies spend more on the power to run a server over its lifetime than they do on the capital expense of purchasing it.
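To put that claim in perspective, here is a back-of-the-envelope sketch. Every figure in it (server price, average draw, electricity rate, cooling overhead and service life) is an illustrative assumption, not a number from the article or the Gartner report.

```python
# Back-of-the-envelope estimate of lifetime power cost vs. purchase price.
# All figures below are illustrative assumptions, not values from the article.

server_price_usd = 3000          # assumed purchase price of a commodity server
avg_draw_watts = 500             # assumed average draw including inefficiencies
electricity_usd_per_kwh = 0.10   # assumed utility rate
cooling_overhead = 2.0           # assume ~1 W of cooling per 1 W of IT load
lifetime_years = 4               # assumed service life

hours = lifetime_years * 365 * 24
kwh = avg_draw_watts / 1000 * hours * cooling_overhead
power_cost = kwh * electricity_usd_per_kwh

print(f"Lifetime power + cooling cost: ${power_cost:,.0f}")
print(f"Purchase price:                ${server_price_usd:,.0f}")
```

Even with these conservative assumptions, the electricity and cooling bill for a single server over four years comes out higher than its purchase price.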

IT executives therefore need to start investigating alternative ways to build an energy-efficient data center. By following these six simple steps, IT executives can come closer to achieving their vision of a green data center:

STEP 1: Virtualize and consolidate. The basic concept of virtualization is simple: encapsulate computing resources and run them on shared physical infrastructure in such a way that each appears to exist in its own separate physical environment. This is accomplished by treating storage and computing resources as an aggregate pool from which networks, systems and applications can draw on an as-needed basis.

In addition, measurements indicate that a single server often uses only 5 to 15 percent of its capacity to service applications. With virtualization, the consolidation of under-utilized servers is seamless to the end user and significantly reduces power consumption.
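As a rough illustration of the consolidation math, the sketch below uses the 5 to 15 percent utilization range cited above; the fleet size and the target utilization for a virtualized host are assumptions.

```python
import math

# Rough consolidation estimate based on the 5-15% utilization figure above.
# Fleet size and target utilization are assumptions for illustration only.

physical_servers = 100        # assumed size of the existing fleet
avg_utilization = 0.10        # midpoint of the 5-15% range cited above
target_utilization = 0.60     # assumed safe utilization for a virtualized host

# Total demand expressed in "fully utilized server" equivalents.
total_demand = physical_servers * avg_utilization
hosts_needed = math.ceil(total_demand / target_utilization)

print(f"Estimated hosts after consolidation: {hosts_needed}")
print(f"Servers (and their power draw) removed: {physical_servers - hosts_needed}")
```

Under these assumptions, roughly 100 lightly loaded servers collapse onto fewer than 20 virtualized hosts, with a corresponding drop in power and cooling load.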

STEP 2: Determine your cooling requirements. Most data center cooling systems in service today were deployed on the assumption that the load would be spread out uniformly; that is, that the load in any given area would never be far greater than its relative share of the total data center space.

However, according to the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), the typical data center that employs dense form-factor servers and storage averages 5,000 watts per square foot of equipment space. Compare that figure to the 2,000 watts used in the same amount of space in 2002: energy use and heat density have more than doubled in five years. The increases show no sign of slowing, and more heat requires more cooling, which in turn consumes additional energy.
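To see what such a heat density means for the cooling plant, the sketch below converts the ASHRAE figure quoted above into required cooling capacity using standard conversion factors; the floor area is an assumed value.

```python
# Translate the heat density quoted above into required cooling capacity.
# Conversion factors are standard; the floor area is an assumption.

watts_per_sq_ft = 5000        # figure cited above for dense equipment space
equipment_area_sq_ft = 100    # assumed area of dense equipment space

heat_load_watts = watts_per_sq_ft * equipment_area_sq_ft
btu_per_hour = heat_load_watts * 3.412        # 1 W = 3.412 BTU/hr
tons_of_cooling = btu_per_hour / 12000        # 1 ton of refrigeration = 12,000 BTU/hr

print(f"Heat load: {heat_load_watts / 1000:.0f} kW")
print(f"Cooling required: {btu_per_hour:,.0f} BTU/hr (~{tons_of_cooling:.0f} tons)")
```

A mere 100 square feet of dense equipment space at that density works out to roughly 140 tons of cooling, which illustrates why heat density, not floor space, now drives cooling design.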

STEP 3: Determine optimal placement of your equipment. Even if you have enough cooling capacity for the equipment installed today, you must still consider where that equipment is placed.

The Uptime Institute surveyed 19 data centers and reported that, on average, only 40 percent of the cold air supplied went directly to cooling the servers in the room. Adopting an alternating hot aisle/cold aisle layout is optimal and can correct many of the cooling problems in a typical data center. Correct placement of vented tiles and closely coupled cooling systems are other techniques that can improve cooling efficiency.
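A quick sketch shows why that 40 percent figure matters: the less cold air that bypasses the IT load, the less air the CRAC units have to move. The airflow requirement and the improved delivery fraction below are assumptions for illustration.

```python
# Illustrate the impact of bypass airflow using the Uptime Institute figure above.
# The servers' airflow requirement and the improved fraction are assumed values.

servers_need_cfm = 4000           # assumed airflow the IT equipment actually needs
fraction_delivered = 0.40         # survey average cited above
improved_fraction = 0.80          # assumed delivery after hot/cold aisle fixes

supply_before = servers_need_cfm / fraction_delivered
supply_after = servers_need_cfm / improved_fraction

print(f"CRAC supply needed at 40% delivery: {supply_before:,.0f} CFM")
print(f"CRAC supply needed at 80% delivery: {supply_after:,.0f} CFM")
print(f"Reduction in air that must be moved: {1 - supply_after / supply_before:.0%}")
```

Under these assumptions, doubling the fraction of cold air that actually reaches the servers halves the volume of air the cooling units must push, and with it a large share of the fan energy.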

STEP 4: Adopt chilled water as a cooling method. Water can move up to 3,500 times the amount of heat that the same volume of air can. However, the use of chilled water as a data center cooling method will take time to gain acceptance, both physically and culturally, as many find it difficult to fathom pipes of running water snaking through the plenums of their data centers.
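That ratio can be sanity-checked from textbook volumetric heat capacities (approximate room-temperature values, not figures from the article):

```python
# Sanity-check the "up to 3,500 times" figure from volumetric heat capacity.
# Property values are textbook approximations at roughly room conditions.

water_density = 1000.0      # kg/m^3
water_specific_heat = 4186  # J/(kg*K)
air_density = 1.2           # kg/m^3
air_specific_heat = 1005    # J/(kg*K)

water_volumetric = water_density * water_specific_heat   # J/(m^3*K)
air_volumetric = air_density * air_specific_heat         # J/(m^3*K)

print(f"Heat carried per cubic metre per degree, water vs. air: "
      f"{water_volumetric / air_volumetric:,.0f}x")
```

The calculation lands at roughly 3,500, consistent with the figure quoted above.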

Rack-level water cooling can be implemented in a number of ways. In some cases, heat exchangers are located within a rack, and water is circulated in a closed loop within the rack and cooled by the traditional forced air from computer room air conditioning (CRAC) units. In other designs, the heat exchanger is located outside of the rack.

Another approach uses a completely external chiller system and cools the electronics within a rack through a closed-loop chilled water system, with separate supply and return lines usually routed through the data center floor. These close-coupled systems can result in shorter air paths that require less fan power. Close-coupled heat removal all but eliminates the mixing of cool and hot air, since the airflow is completely contained within the row or rack.

STEP 5: Buy from a green vendor. It is essential to seek out a vendor that has power and cooling at the forefront of its research and development strategies. Selecting equipment based on lifecycle costs that take into account the energy usage of servers will become an important part of the procurement process for IT equipment in the near future.

STEP 6: Enable green controls. Enabling green controls within servers also helps ensure that they throttle the power they consume based on actual load. These controls can step down processor frequency during periods of low load, which translates into lower power consumption.
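As a concrete example, modern operating systems expose this frequency scaling directly. The minimal sketch below reads the Linux cpufreq sysfs interface; it assumes a Linux host with cpufreq enabled, and simply reports the active governor and current clock speed.

```python
# Minimal sketch: inspect Linux CPU frequency scaling (DVFS) via the cpufreq
# sysfs interface. Assumes a Linux host with cpufreq enabled.

from pathlib import Path

def read_cpufreq(cpu: int, attr: str) -> str:
    """Read a cpufreq attribute for one CPU from sysfs."""
    path = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/{attr}")
    return path.read_text().strip()

if __name__ == "__main__":
    try:
        governor = read_cpufreq(0, "scaling_governor")   # e.g. "ondemand", "powersave"
        cur_khz = int(read_cpufreq(0, "scaling_cur_freq"))
        print(f"cpu0 governor: {governor}, current frequency: {cur_khz / 1000:.0f} MHz")
    except (FileNotFoundError, PermissionError):
        print("cpufreq sysfs interface not available on this host")
```

A power-aware governor such as "ondemand" or "powersave" lets the processor drop its clock speed when the load is light, which is exactly the behavior this step asks IT departments to enable.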

Other areas that offer environmental as well as economic benefits today are information lifecycle management (ILM), data de-duplication and the use of archiving to reduce the amount of storage on the floor. Companies usually start these projects to save money, but from an environmental standpoint the fewer disks they have spinning, the less energy they will use.
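To make the de-duplication idea concrete, here is a minimal, file-level sketch: hash every file and report the sets that share identical content. Production de-duplication systems typically work at the block or chunk level, and the directory path used here is hypothetical.

```python
# Minimal sketch of content-based de-duplication: hash each file and keep only
# one copy per unique hash. Real systems dedupe at the block level, but the
# principle (store unique content once, spin fewer disks) is the same.

import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by SHA-256 digest and return the duplicate sets."""
    by_hash: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    dupes = find_duplicates("/var/archive")   # hypothetical directory
    reclaimable = sum((len(p) - 1) * p[0].stat().st_size for p in dupes.values())
    print(f"Duplicate sets: {len(dupes)}, reclaimable bytes: {reclaimable:,}")
```

Every byte reclaimed this way is a byte that no longer needs to be stored, powered and cooled, which is where the environmental benefit comes from.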
