http://arstechnica.com/news.ars/post/20070215-8854.html

Anyone paying attention to recent technology headlines knows that buying servers is just one part of the total cost. It costs power to run them, power to cool them, and power costs money. AMD has just sponsored a study by Lawrence Berkeley National Laboratory staff scientist Jonathan Koomey that tries to answer the question: just how much power do US servers slurp down each year?
Koomey, who is also a consulting professor at Stanford, claims that his analysis is the most comprehensive to date and is based on the best available data from IDC. He concludes that in 2005, the total power consumption of US servers was 0.6 percent of overall US electricity consumption. When cooling equipment is added, that number doubles to 1.2 percent, about the same amount used by the nation's color televisions. (Note what that doubling implies: cooling and auxiliary gear burn roughly a watt for every watt the servers themselves draw.)
Between 2000 and 2005, server electricity use grew at a rate of 14 percent each year, nearly doubling over the five-year span. The 2005 estimate shows that servers and associated equipment drew roughly 5 million kW (5 GW) of power on a continuous basis, electricity that cost US businesses roughly $2.7 billion.
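Those figures hang together on a quick back-of-the-envelope check (the roughly six cents per kWh rate below is inferred from the study's own totals, not a number Koomey cites):

    1.14^5 ≈ 1.93, so five years of 14 percent annual growth comes to just shy of a doubling
    5 GW × 8,760 hours/year ≈ 44 billion kWh per year
    44 billion kWh × ~$0.06/kWh ≈ $2.7 billion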
Koomey notes that this represents the output of five 1 GW power plants. Or, to put it another way, it's 25 percent more than the total possible output of the Chernobyl plant, whose four reactors could generate 4 GW combined, back when it was actually churning out power rather than sitting there irradiating the countryside.