Average Power Use Per Server & Desktop

The Ars Technica folks published this informative breakdown of server power consumption in 2007, credited to "Intel and EXP Critical Facilities":


[Figure: Bar chart of power usage of server components]

Note that this does not appear to include cooling and other related energy use. Jonathan Koomey's landmark 2007 analysis suggests that an average US server directly consumes only 427W, with cooling and related overhead adding roughly the same again (1× direct consumption, i.e. a PUE of 2.0). A 2009 IBM analysis uses 425W at "average load", but this does not appear to include cooling.

The equation is being shifted by new technologies, ranging from small-scale servers that typically run under 100W to new volume servers that pack 500+ CPUs into a chassis consuming less than 2kW in total. As computing shifts to the more efficient cloud, PUEs are also improving: Koomey's 2010 work uses PUEs in the 1.83-1.92 range.

Server running costs

Thanks for the information. Our engineers have a rough rule of thumb that it costs double a server's running cost to extract the heat it generates. I have used the information here, plus that rule of thumb, to create a virtualisation calculator.

Check it out here:
http://www.abtecnet.com/virtualisation-calculator-c203.aspx
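
To make the rule of thumb concrete, here is a minimal Python sketch of the calculation it implies. The 427W and $0.107/kWh inputs are carried over from the figures on this page; this is an illustration, not the Abtec calculator's actual model.

```python
# Minimal sketch of the rule of thumb above: extracting the heat costs
# double a server's direct running cost, so total cost = 3x direct.
# Inputs (427W, $0.107/kWh) are carried over from the figures on this page.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_direct_cost(watts, price_per_kwh):
    """Direct electricity cost of running a load 24/7 for one year."""
    return watts / 1000 * HOURS_PER_YEAR * price_per_kwh

def annual_total_cost(watts, price_per_kwh):
    """Direct cost plus 2x that again for heat extraction (rule of thumb)."""
    return 3 * annual_direct_cost(watts, price_per_kwh)

print(round(annual_total_cost(427, 0.107)))  # -> 1201 (~$1,200/year)
```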

Rated Vs. Actual Power Consumption

The actual power consumption of a computer during normal use can be well below its manufacturer's rating. Alex Bischoff of open4energy measured the actual power consumed by his laptop over a week. Its average consumption of ~30W was 46% of the 65W rating of the unit's power supply.
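
To make the comparison concrete, here is a minimal Python sketch of the ratio computation, with invented sample readings standing in for the actual metered data:

```python
# Sketch: average a week of metered readings and compare to the rated draw.
# The sample values below are invented for illustration; the actual
# measurement used a meter logging the laptop's real consumption.

readings_watts = [28, 31, 33, 29, 30, 27, 32]  # hypothetical daily averages
rated_watts = 65  # power supply rating

average = sum(readings_watts) / len(readings_watts)
print(f"average {average:.0f}W = {average / rated_watts:.0%} of rated")
# -> average 30W = 46% of rated
```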

Energy cost to run a server

Using Koomey's figure, a 427W server running 24/7 would directly consume 3,741 kWh of electricity annually, or ~$400 at the US commercial average price of $0.107/kWh. Factoring in the cooling load at 1× (PUE 2.0) puts the annual energy cost per server at ~$800.
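
The arithmetic is easy to verify; here is a minimal Python sketch using the figures above:

```python
# Verify the per-server figures above: 427W direct, 24/7 operation,
# $0.107/kWh, and cooling at 1x direct draw (PUE 2.0).

watts = 427
price_per_kwh = 0.107
pue = 2.0
hours = 24 * 365  # 8760

kwh_direct = watts / 1000 * hours           # ~3,741 kWh/year
cost_direct = kwh_direct * price_per_kwh    # ~$400/year
cost_total = cost_direct * pue              # ~$800/year including cooling

print(f"{kwh_direct:,.0f} kWh, ${cost_direct:,.0f} direct, ${cost_total:,.0f} total")
```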

The $400 figure shows up in a Gartner press release, where ambiguous wording has caused some confusion.

The release starts off by saying, "For example, removing a single x86 server will result in savings of more than $400 a year in energy costs alone." This assumes no significant offsetting increase from something else installed to replace that server's function, which is unlikely.

The release continues, "server rationalization will lower energy costs, typically more than $400 per server, per year." I imagine the author meant that the total energy cost per server is $400 (an unnecessary repeat of the earlier statement), but it can be read as implying that the energy savings from rationalization are $400 per server, per year.

So how might we calculate the energy impact of server rationalization? Let's assume we replace ten 427W servers with one 1030W virtualization server and use a PUE of 2.0. Including cooling, the ten servers draw 8,540W, which gets reduced to 2,060W, saving 6,480W, or 76%. So Gartner's implication that removing a single server will save 100% of its energy costs would apply to very few rationalization scenarios.
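
Here is the same scenario as a short Python sketch, making it easy to recompute the savings under different assumptions:

```python
# The rationalization scenario above: ten 427W servers consolidated
# onto one 1030W virtualization host, cooling included at PUE 2.0.

PUE = 2.0

def facility_watts(server_watts, count=1):
    """Total facility draw, including cooling, at the assumed PUE."""
    return server_watts * count * PUE

before = facility_watts(427, 10)  # 8540W
after = facility_watts(1030)      # 2060W
saved = before - after            # 6480W

print(f"saved {saved:.0f}W = {saved / before:.0%}")  # -> saved 6480W = 76%
```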

These illustrations are based on equipment specifications and industry averages. As always, actual power loads are a function of real-world equipment configurations and operating conditions. It's critical to establish a baseline before planning a server rationalization project.

Thanks to open4energy for alerting me to the potential confusion with the Gartner release.

Power consumption for a "typical" desktop

~100 watts seems a reasonable power consumption figure for a new desktop configuration of computer and LCD display for home or light office use. But older gear consumes more, as do CRTs, high-end machines used for gaming, media, and professional applications, and desktops with extensive peripherals and storage.

This paper from software vendor Faronics assumes 150W for a desktop configuration: www.faronics.com/doc/wp/PS_WP_ITandFacilities_EN.pd

(Update 2009.07.28) The most recent version of the Dell calculator referenced by David can be found at: http://www.dell.com/content/topics/topic.aspx/global/products/landing/en.... This version is reportedly consistent with the recently released EPA ENERGY STAR® 5.0 standards for PCs.

And here are some stats on desktop PCs

David Washburn

Check out the series of energy calculators on the Dell web site. Using these calculators, you can input variables relating to your environment and figure out how much you'd save in both carbon emissions and dollars, depending on what products you buy. According to Dell, a classroom with 30 Optiplex 745 computers with Pentium D processors, Energy Star power management, and 17-inch flat-panel monitors would save about $1,896 a year in energy costs. By replacing the Pentium D with a Core 2 Duo processor, which uses even less energy, this same classroom reportedly would save about $2,082 a year in energy costs.

http://www.eschoolnews.com/news/showStoryts.cfm?ArticleID=6942
www.aminfo.com
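
Dell's calculators bundle product-specific assumptions, but the underlying arithmetic is straightforward. Here is a generic Python sketch of that kind of fleet-savings computation; the wattages, hours, and electricity rate are hypothetical placeholders, not Dell's actual inputs:

```python
# Generic fleet-savings arithmetic of the kind these calculators perform.
# All inputs below are hypothetical placeholders, not Dell's figures.

def annual_fleet_cost(watts_per_unit, units, hours_per_year, price_per_kwh):
    """Annual electricity cost for a fleet of identical machines."""
    return watts_per_unit / 1000 * units * hours_per_year * price_per_kwh

old = annual_fleet_cost(150, 30, 2000, 0.107)  # e.g., older desktops + CRTs
new = annual_fleet_cost(80, 30, 2000, 0.107)   # e.g., newer PCs + LCDs
print(f"estimated annual saving: ${old - new:,.0f}")  # -> $449
```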
