More power to you

When an energy grid overload was widely suggested as the cause of last month’s mass blackout in Canada and the U.S., increasingly powerful computers were identified as one of the possible culprits. Current systems are stratospherically more powerful than those of 10 years ago, the argument goes, so they must be sucking up energy supplies at dangerously increasing levels.

“It’s an urban legend,” said Dr. Jonathan G. Koomey, a world-leading authority on hardware energy consumption at Colorado’s Rocky Mountain Institute, which specializes in power usage research. “There are lots of people running around saying that computers consume 13 per cent of all electricity and that it will soon rise to 50 per cent. In the U.S., it’s actually only about three per cent and will probably grow to around five per cent over the next 10 to 15 years,” said Koomey.

Koomey’s work has revealed that while computers and peripherals are exponentially more powerful than they were just a few years ago, manufacturers have made key innovations to keep the energy individual units consume in check. The adoption of LCD monitors, for example, has cut the energy usage of screens by two-thirds, and ever-smaller transistors have reduced the power processors draw.

According to Doug Cooper, country manager for Intel Canada, the world’s largest chip maker has been addressing the issue since the energy crisis of the 1970s, and it later signed up for the U.S.’s voluntary Energy Star power conservation program.

“The recent power outage was not a wake-up call for us. The idea of managing power has been at the core of what we’ve been doing over the last decade,” said Cooper. “Performance-wise, computers have increased from 6-8 MHz in the old 286 days to Pentium 4s running at 2-4 GHz today. Performance has increased 1,000-fold but power consumption has certainly not increased at the same rate.”

For Cooper, the finger-wagging at powerful computers has more to do with a widespread misconception about the way systems operate, one that manufacturers may have contributed to. “It’s true that as an industry, we typically use the metaphor of power for performance,” said Cooper.

The confusion is widespread, according to Howard Locker, a strategic technology executive with IBM in North Carolina. He agrees that systems continue to race ahead in performance terms (“Every 12 to 18 months, everything doubles,” he said), but overall energy consumption per unit has hardly changed. “A desktop uses around the same power as two 60-watt lightbulbs. This has moved maybe 20 or 30 watts over the years,” said Locker.

Conserving power is even more important for laptop, handheld and wireless devices, where manufacturers have to maximize battery longevity for systems that now complete more tasks than ever before, according to Palm Canada director of marketing Janet Gillespie.

“We introduced the Palm Pilot in 1996, but it’s a very different product today with more than 10,000 available applications. We moved from AAA batteries to rechargeable lithium batteries and we recently introduced our Power to Go snap-on battery pack,” said Gillespie. “Our design goal is to make our handhelds convenient and easy to use from an energy consumption perspective.”

But while individual units may be more energy efficient, ever-larger data centres are becoming the technology age’s leading power hogs, requiring growing amounts of energy to keep systems cool enough to function properly. The cooling infrastructure required for rooms full of concentrated transistors commonly consumes more energy than the computers themselves, and that consumption could rise dangerously if left unchecked.

A 2003 Rocky Mountain Institute study entitled Design Recommendations for High-Performance Data Centers (available online at http://www.rmi.org/sitepages/pid626.php) notes that “For every watt being consumed by a [data centre] computer, roughly two to three additional watts are being drawn from the utility to cool the computer and provide it with protected power…This formula is unsustainable.”
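To put the study’s ratio in concrete terms, the short sketch below works through the arithmetic for a hypothetical facility with a 500-kilowatt computing load; the load figure is invented for illustration, not drawn from the report. At a two-to-one or three-to-one overhead, only a quarter to a third of the electricity drawn from the grid actually reaches the computing equipment.

```python
# Rough illustration of the RMI study's ratio: for every watt of IT load,
# roughly two to three additional watts go to cooling and protected power.
# The 500 kW IT load below is a hypothetical figure, used only as an example.

it_load_kw = 500.0            # power drawn by the computers themselves
overhead_ratios = (2.0, 3.0)  # extra watts per IT watt, per the study's range

for ratio in overhead_ratios:
    overhead_kw = it_load_kw * ratio     # cooling and protected power
    total_kw = it_load_kw + overhead_kw  # total draw from the utility
    share = it_load_kw / total_kw        # fraction reaching the computers
    print(f"Overhead {ratio:.0f}:1 -> {total_kw:.0f} kW from the grid, "
          f"{share:.0%} of it reaching the computing equipment")
```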

Hewlett-Packard has begun addressing this energy-crisis-in-the-making with a new smart cooling solution for data centres, launched in March this year.

“It’s a question of power density,” said Chandrakant Patel, a principal scientist at the company’s key research laboratory in Palo Alto, California. “Chips produce around the same amount of power as ten years ago, but since they are much smaller, the heat is spread over a reduced surface area, creating higher temperature concentrations. It’s the work now required to extract the hot air and move it that requires more energy than before,” he said.
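Patel’s point can be illustrated with simple arithmetic. In the sketch below, the die sizes and wattage are hypothetical figures chosen only to show how roughly constant chip power spread over a smaller area raises the heat concentration; they are not numbers he cited.

```python
# Toy illustration of Patel's point: roughly constant chip power spread over a
# shrinking die means higher power density (heat flux). The die sizes and
# wattage below are hypothetical, chosen only to show the arithmetic.

chip_power_w = 50.0  # assume the chip dissipates about the same power as before

old_die_cm2 = 3.0    # hypothetical older, larger die
new_die_cm2 = 1.0    # hypothetical newer, smaller die

old_density = chip_power_w / old_die_cm2  # W/cm^2 on the older die
new_density = chip_power_w / new_die_cm2  # W/cm^2 on the newer die

print(f"Old die: {old_density:.1f} W/cm^2")
print(f"New die: {new_density:.1f} W/cm^2")
print(f"Same power, {new_density / old_density:.1f}x the heat concentration")
```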

Hewlett-Packard set out to dramatically reduce this energy consumption and cut business costs by an estimated 25 per cent. Its solution uses computational fluid dynamics, similar to that used to improve airplane design, to create a 3D model of temperature distribution throughout a data centre, then recommends strategic placement of computing resources and air conditioning equipment to optimize cooling-related energy use.
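HP has not detailed its algorithm in this article, so the sketch below is only a toy illustration of the general idea described: given a modelled temperature map of the machine room, favour the coolest locations when deciding where to put the heaviest computing loads. The grid of temperatures is invented for the example.

```python
# Toy illustration of the general idea described above: given a modelled
# temperature map of a machine room, favour the coolest locations when placing
# computing load. This is NOT HP's algorithm; the 3x4 grid of temperatures
# below is invented purely for illustration.

room_temps_c = [                      # modelled inlet temperatures by rack position
    [22.0, 24.5, 27.0, 29.5],
    [21.5, 23.0, 26.5, 30.0],
    [22.5, 25.0, 28.0, 31.5],
]

positions = [
    (row, col, temp)
    for row, row_temps in enumerate(room_temps_c)
    for col, temp in enumerate(row_temps)
]

# Recommend the three coolest rack positions for the most heavily loaded servers.
coolest = sorted(positions, key=lambda p: p[2])[:3]
for row, col, temp in coolest:
    print(f"Rack position ({row}, {col}): modelled inlet temperature {temp:.1f} C")
```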

“If computing is going to be about millions of centres and trillions of services, these data centres will obviously have a major impact on the grid,” said Patel.
