Some time ago (it must have been over a year) I did some power measurements with a simple at-the-wall-socket power meter. For a short while these were on sale at electronics stores all over the place as part of a "know your power use" campaign.

I've been sick for the past week (again) and one of the therapeutic things I did was rearrange my home office; now the wall socket is easily accessible again and the servers are piled one on top of the other. I decided to do a little measurement and see what computation actually costs me. Bear in mind that this is totally unscientific: I ran the complete system through a power bar plugged into the meter, then watched the meter and guessed at averages.

The system under test is an Intel i7-860 with 8GB RAM. Possibly the most difficult part of this test is actually putting the system under load: I'm hard-pressed to get more than, say, 25% load out of one (single-threaded) simulation plus a dual-core virtual machine running OpenIndiana and compiling KDE. I tried to get kdesrc-build working quickly to see if that would load the machine too, but that was a bit of a disaster (aside from the fact that I'd forgotten it would take an hour of just network activity to get the sources up-to-date). Something for another blog entry.
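(For what it's worth, if the goal is just to peg the meter rather than to do anything useful, a busy loop per core will do it. A minimal sketch, purely illustrative and not what I was actually running for the measurements below:)

```python
# Hypothetical load generator: spin one busy-loop worker per hardware
# thread to push CPU utilization toward 100%. Stop it with Ctrl-C.
import multiprocessing

def burn():
    while True:
        pass  # busy-wait: keeps one core pegged until the process is killed

if __name__ == "__main__":
    workers = [multiprocessing.Process(target=burn)
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()
    try:
        for w in workers:
            w.join()  # run until interrupted
    except KeyboardInterrupt:
        for w in workers:
            w.terminate()
```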

Anyway, here's some brief power-use information for the complete system -- that is, including the monitors and other peripherals that are switched on when I'm at my desk.

Samsung 206: 10W when "off", 39W when on, 10W in standby.
Iiyama E430: 10W when "off", 33W when on, 10W in standby.
CPU: 10W when "off", 64W on boot.
Complete desktop: 140-190W while logging in (after cold boot), 130W at an idle desktop, 170-200W when under load.

There are two things I'm going to take away from this. First, at 200W total power draw for my desktop under load, that's a ridiculous amount of computation per watt. Second, I'm going to switch off the power bar when I switch "off" the machine in the evenings, because 30W for standby (10W each for the two monitors and the box) is a bit much.
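To put a rough number on that 30W, here's a back-of-the-envelope calculation. The hours-"off" figure and the electricity price are assumptions, not measurements; plug in your own:

```python
# Back-of-the-envelope cost of the 30W standby draw. The 16 hours "off"
# per day and the 0.22 EUR/kWh tariff are assumptions, not measurements.
STANDBY_WATTS = 30        # measured above: two monitors plus the box, ~10W each
HOURS_OFF_PER_DAY = 16    # assumed
EUR_PER_KWH = 0.22        # assumed; substitute your own tariff

kwh_per_year = STANDBY_WATTS / 1000 * HOURS_OFF_PER_DAY * 365
print(f"{kwh_per_year:.0f} kWh/year, roughly {kwh_per_year * EUR_PER_KWH:.0f} EUR")
# prints: 175 kWh/year, roughly 39 EUR
```

Not a fortune, but it's power spent on doing literally nothing, and flipping the power bar costs me one second a day.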