January 7, 2011 - data center power demands

As the volume of digital data we create and consume grows, how much electricity is required to store, manage, and analyze this information?  Smart grid technology has been described as relying on the "internet of things": a vision, now becoming real, of constant real-time data communications between the power grid and interconnected devices like home appliances, heating systems, and vehicles.  This will represent a multifold increase in the volume of data being produced - and, for the entities interested in analyzing that data, a likely increase in the amount of energy required to do so.

Even now, while smart grid communications remain a relatively small share of the total volume of data flying around the country, it can take a surprisingly large amount of electricity to run a data storage and analysis center.  In Utah, the National Security Agency has just broken ground on its Utah Data Center, a complex enclosing about 1 million square feet of space, 100,000 square feet of which will be devoted to computer hardware.  Sen. Orrin Hatch has described the data center as creating 100 to 200 jobs for information technology specialists and engineers.  The NSA describes the data center as a component of the Comprehensive National Cybersecurity Initiative, designed to help the intelligence community meet domestic cybersecurity requirements.

So how much power will the Utah Data Center consume?  Apparently up to 65 megawatts.  Indeed, the availability and cost of that much power was one factor behind siting the facility in Utah.  In 2006, the NSA reportedly came close to consuming the entire spare electric capacity of the Baltimore, Maryland power grid, prompting the agency to look elsewhere for this new computing capacity.  The relatively low cost of energy in Utah may also have been attractive: the EIA reports that Utah's average all-sector electricity price in September 2010 was just 7.42 cents per kilowatt-hour, significantly below the U.S. average of 10.24 cents per kWh for that period, let alone costlier markets like Washington, D.C. (13.74 cents/kWh), California (15.27 cents/kWh), or Connecticut (17.26 cents/kWh).
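To see why those price differences matter at this scale, here is a rough back-of-the-envelope calculation.  It assumes, purely for illustration, that the facility draws a constant 65 megawatts year-round (the actual figure is a maximum, and real loads vary), and applies the September 2010 rates quoted above:

```python
# Illustrative annual electricity cost for a hypothetical constant 65 MW load.
# The 65 MW figure is a reported maximum, not an average; this is a sketch only.
HOURS_PER_YEAR = 8760
load_mw = 65.0

# September 2010 average all-sector retail rates, cents per kWh (per EIA)
rates_cents_per_kwh = {
    "Utah": 7.42,
    "U.S. average": 10.24,
    "Washington, D.C.": 13.74,
    "California": 15.27,
    "Connecticut": 17.26,
}

# 65 MW -> 65,000 kW, times hours in a year = kWh consumed annually
annual_kwh = load_mw * 1000 * HOURS_PER_YEAR

for region, rate in rates_cents_per_kwh.items():
    annual_cost_dollars = annual_kwh * rate / 100  # cents -> dollars
    print(f"{region}: ${annual_cost_dollars:,.0f} per year")
```

Under these assumptions, a year of constant 65 MW draw works out to roughly $42 million at Utah's rate versus roughly $58 million at the national average rate - a difference of some $16 million per year, which suggests why power pricing can drive siting decisions.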

As society generates more and more data, can we expect to see more and more data centers?  Will they consume more and more electricity?  And because data can be directed to virtually any geographic location, do regions with less expensive power enjoy a relative advantage in competing for the economic development opportunities data centers offer?
