
One PC has two hard drives, a 700MHz AMD K7, 256MB of RAM, and a 350W power supply, no monitor. It runs MythTV and stays busy a lot.
The other has one hard drive, a 350MHz Pentium II, 192MB of RAM, and a 400W power supply, no monitor. It does nightly backups, acts as a web server, serves as a test environment for web apps, and handles many other things.
Neither server has modern power-management features like CPU speed stepping.
I've seen it mentioned that a running PC draws about as much power as a 150W light bulb.
I figure 150W x 24 hours x 30 days = 108kWh; 108kWh x $0.085/kWh = $9.18/mo, and x 2 = $18.36/mo for both.
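Here's a quick sanity check of that arithmetic as a Python sketch; the 150W constant draw and $0.085/kWh rate are just my assumptions from above:

    # Rough monthly electricity cost for an always-on PC.
    # Assumptions: constant 150W draw, 30-day month, $0.085/kWh rate.
    watts = 150
    rate_per_kwh = 0.085
    kwh_per_month = watts * 24 * 30 / 1000          # 108 kWh
    cost_per_month = kwh_per_month * rate_per_kwh   # ~$9.18
    print(f"{kwh_per_month:.0f} kWh/month, ${cost_per_month:.2f}/month per PC")
    print(f"${2 * cost_per_month:.2f}/month for both")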
I don't know an easy way to find out whether 150W is realistic for my servers, but it shouldn't be hard to get my hands on a desktop computer that draws 100W, is much faster, and has more storage space than my other two combined. (It'd be smaller, too.)
100W x 24 hours x 30 days = 72kWh; 72kWh x $0.085/kWh = $6.12/mo. Of course, at a savings of about $12.24/mo, it would take 2.5 or more years for the investment to break even. But maybe if I factor in the money I save by using CCFL bulbs...
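And a quick break-even sketch using those numbers; the $370 purchase price is just a placeholder I'm assuming, not a real quote:

    # Break-even estimate for replacing both servers with one 100W box.
    # The $370 purchase price is a placeholder assumption, not a real quote.
    old_cost = 18.36    # $/month for both existing servers at ~150W each
    new_cost = 6.12     # $/month for a single 100W replacement
    price = 370.00      # assumed purchase price of the replacement PC
    savings = old_cost - new_cost            # ~$12.24/month
    months_to_break_even = price / savings   # ~30 months, i.e. ~2.5 years
    print(f"${savings:.2f}/month saved, break even in {months_to_break_even:.1f} months")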
How are my math and my assumptions? Am I off my rocker?