Green Business Part II: The Data Center

Earlier this year, I touched on the topic of “Green Business” and offered practical steps businesses can take to green up and, in many cases, improve productivity. As promised, I will now discuss some additional measures we are evaluating or undertaking in the data center.

Solar

We are currently evaluating a solar system that would help offset peak utilization. Solar energy systems are particularly beneficial during times of peak electrical demand, when the load for power, heating and cooling is at its highest and most expensive (in most cities this occurs at the hottest time of the day, normally in the late afternoon). Using solar photovoltaics (PV) to provide a percentage of our electrical generation will reduce the amount of electricity we must draw from the utility. While this is useful on its own, we are also designing the system to provide shading for the rooftop A/C units, creating an additional benefit through reduced cooling load and further energy savings. Though the pricing and energy yield of PV are still not where I would like to see them, there are some decent subsidies and credits available that can greatly reduce the payback period. Now is a good time to investigate what incentives are available in your geographic region.
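
For anyone running the numbers on a similar project, here is a minimal payback-period sketch; every figure in it (system cost, incentives, expected output, utility rate) is a hypothetical placeholder, not our actual pricing:

```python
# Rough solar PV payback estimate. All inputs are hypothetical
# placeholders; substitute your own quotes, rates, and incentives.

system_cost = 150_000.00        # installed cost of the PV system ($)
incentives = 45_000.00          # subsidies, rebates, and tax credits ($)
annual_kwh_generated = 60_000   # expected annual PV output (kWh)
utility_rate = 0.15             # blended utility rate ($ per kWh)

net_cost = system_cost - incentives
annual_savings = annual_kwh_generated * utility_rate
payback_years = net_cost / annual_savings

print(f"Net cost after incentives: ${net_cost:,.2f}")
print(f"Estimated annual savings:  ${annual_savings:,.2f}")
print(f"Simple payback period:     {payback_years:.1f} years")
```

This ignores panel degradation and rate escalation, but it is a quick first pass at whether local incentives bring the payback period into an acceptable range.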

Warmer Data Centers

Network equipment continues to improve. The average operating temperature of new gear is now 81 degrees Fahrenheit, up from an average of 72 degrees in 2004. At warmer temperatures you have many more options for cooling your equipment and can reduce the number of chillers. However, you still need to move air, so a reduction in cooling capacity offsets some of the benefit, and you will end up losing some density. Realize that running at higher temperatures also causes your equipment to work harder: fans must run faster, which negates a portion of your overall savings. Fan power consumption increases roughly as the cube of fan speed, so a 10% increase in fan speed can mean an increase in fan power usage of more than 30%. This isn’t necessarily a deal breaker. Higher temperatures have other advantages, such as limiting the need to build extra CRAC capacity for redundancy, since the equipment can tolerate higher temperatures in the event of a failure. Just realize that, as with anything, there are always unintended consequences.
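
To make the cube relationship concrete, here is a minimal sketch; the baseline wattage is a hypothetical placeholder, and only the cube scaling itself comes from the fan affinity law described above:

```python
# Fan affinity law: fan power scales roughly with the cube of fan speed.
# A modest bump in speed produces a much larger bump in power draw.

baseline_power_watts = 100.0  # hypothetical fan power at baseline speed

for speed_increase_pct in (5, 10, 20):
    speed_ratio = 1 + speed_increase_pct / 100
    power_ratio = speed_ratio ** 3
    new_power = baseline_power_watts * power_ratio
    print(f"+{speed_increase_pct}% speed -> {(power_ratio - 1) * 100:.0f}% more power "
          f"({baseline_power_watts:.0f} W -> {new_power:.0f} W)")
```

Running this shows the 10% case landing at roughly 33% more fan power, which is where the "more than 30%" figure comes from.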

Liquid cooling at the chip is an interesting technology, but it is not yet mature and is still too expensive to gain widespread adoption. For now, its deployment will be limited to supercomputing. Today, density is no longer the only driver: as you double compute density, you also greatly enlarge the footprint of the equipment needed to support it.

Hot and Cold Aisle Containment

In this configuration, drapes, partitions and/or doors are used to seal off hot or cold aisles in an effort to better control airflow and temperature. Many data centers are looking at containment, which allows roughly 25 kW to be used per rack.

Energy Saving Tips

1) Look for hot/cold aisle designs.

2) Review equipment specs when making new purchases. It’s amazing how much otherwise similar products from different vendors can differ in power consumption (see the cost-comparison sketch after this list). In particular, look for:

a. Variable speed fans

b. Energy-saving power supplies

c. Power management

3) Review equipment to see if you can operate in a warmer environment.

4) Consider replacing aging hardware.
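
As a simple way to act on tip 2, here is a minimal sketch that turns a vendor’s average power draw into an annual energy cost; the wattages, utility rate, and PUE are hypothetical placeholders, not measured values:

```python
# Compare the annual energy cost of two otherwise similar products.
# Wattages, utility rate, and PUE are hypothetical placeholders;
# pull real numbers from the vendors' spec sheets and your utility bill.

HOURS_PER_YEAR = 24 * 365
UTILITY_RATE = 0.15   # $ per kWh
PUE = 1.8             # power usage effectiveness (cooling/distribution overhead)

def annual_cost(avg_watts: float) -> float:
    """Annual energy cost for one device, including facility overhead via PUE."""
    kwh = avg_watts / 1000 * HOURS_PER_YEAR * PUE
    return kwh * UTILITY_RATE

vendor_a_watts = 450.0  # comparable product from vendor A
vendor_b_watts = 380.0  # comparable product from vendor B

savings = annual_cost(vendor_a_watts) - annual_cost(vendor_b_watts)
print(f"Vendor A: ${annual_cost(vendor_a_watts):,.2f}/yr")
print(f"Vendor B: ${annual_cost(vendor_b_watts):,.2f}/yr")
print(f"Choosing the lower-power unit saves about ${savings:,.2f}/yr per device")
```

Multiplying by PUE is a rough way to account for the cooling and power-distribution overhead on top of the device’s own draw, which is why a small difference in nameplate wattage adds up across a full rack.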

As opportunities arise to change and upgrade your equipment or make modifications to your infrastructure, consider doing so with an eye towards energy efficiency, energy innovation and cost savings. The changes and investments you make today can be the subtle differences that translate into greater productivity and competitiveness tomorrow.

Let us help improve your IT solutions today.
