I wrote previously about my computer system's power consumption in standby mode, where I noted that my computer and its peripherals drew 13.5 watts while switched "off".
Next, I measured our media center (which is just a fancy name for our TV and the devices connected to it).
Here are the results:
| Device | Switched on | In standby |
|---|---|---|
| 46" plasma TV | 200 W | 0.5 W |
| Cable box | 10 W | 8 W |
| Stereo amplifier | varies | 7 W |
| Nintendo Wii | 15 W | 1 or 6 W |
| PS3 Slim | 64 W | 0.5 W |
| DVD/hard disk recorder | 31 W | 3 W |
| Total | 320+ W | 20 or 25 W |
The measurement for the TV is approximate, since a plasma TV's power draw varies with the brightness of whatever it is displaying. Its standby drain is just half a watt, which is very good. New European Union rules say that devices should not use more than 0.5 watts in standby mode, and since the TV is new, it complies.
The Wii's power consumption in standby depends on whether the standby connection is enabled. (It's in Wii Settings → Settings → WiiConnect24.)
The cable box is surprisingly bad: it hardly consumes any less energy in standby mode than when it's switched on. The stereo amplifier is also bad, but I don't use it that often, and it has a real, physical off switch, so I mind it less.
For the computer, I bought a power strip that automatically cuts power to the peripherals when the computer is switched off, but I'm wary of doing the same for the TV. It doesn't seem safe to cut power to the game consoles whenever the TV is switched off, accidentally or not. Also, the cable box takes up to a minute to boot after losing power. Our family watches TV a lot, so switching everything on and off has to be fast and easy. I have no good solution to this.