Computer energy efficiency increases

Tuesday, September 27, 2011

Computers can do amazing things, but they are often viewed as energy hogs.  Centralized server operations like large data centers, for example, can consume as much power as heavy industrial manufacturing.  Whether a facility runs on the default electricity mix or on purely renewable power, crunching numbers at scale still means drawing a lot of electricity, which is why computer makers and customers alike push for greater energy efficiency in their computing activities.

Newly published research suggests that the energy efficiency of computers doubles roughly every 18 months.  A team of researchers led by Stanford professor Jonathan Koomey examined the computational performance and peak power consumption of electronic computing devices ranging from 1946's ENIAC to the present.  ENIAC, which the U.S. Army used to calculate artillery trajectories, occupied 1,800 square feet (bigger than the average U.S. house at the time) and could consume up to 150 kilowatts of electricity.  Modern computers, and even smartphones, can now outperform ENIAC computationally while demanding far less power to carry out a fixed set of calculations.  According to what is now being called "Koomey's Law", in the years since ENIAC first powered up, the number of computations computers can perform per kilowatt-hour of electricity has doubled roughly every 18 months.
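
To make the pace of that trend concrete, here is a back-of-the-envelope sketch of the arithmetic an 18-month doubling implies.  The start and end years and the helper function efficiency_gain are illustrative assumptions for this post, not figures taken directly from the study.

# Rough arithmetic implied by Koomey's Law: a constant 18-month doubling
# in computations performed per kilowatt-hour. Years chosen here are
# illustrative assumptions, not numbers from the study itself.

def efficiency_gain(years: float, doubling_months: float = 18.0) -> float:
    """Return the cumulative efficiency improvement factor over `years`."""
    doublings = years * 12.0 / doubling_months
    return 2.0 ** doublings

if __name__ == "__main__":
    span = 2011 - 1946  # from ENIAC's debut to the year of the study
    print(f"Doublings over {span} years: {span * 12 / 18:.1f}")
    print(f"Cumulative efficiency improvement: {efficiency_gain(span):.2e}x")

Run as written, the sketch gives roughly 43 doublings over 65 years, or an improvement factor on the order of ten trillion, which is why a smartphone can outperform a 150-kilowatt, room-sized machine.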

This finding follows Professor Koomey's July 2011 report on data center energy usage, which found that although data centers consume more electricity each year, their energy consumption is growing more slowly than their computing capacity.

