I work in the electric power generation industry, so I have some (but not a lot of) insight into the differences overseas. I thought you meant for a single computer. If I had known it was for the farm I wouldn't have mentioned a word. :P
As a comparison: if your current power supplies are 80% efficient and delivering a combined 2 kW load, and you switched them all to 90% efficient units (the bottom of the Gold band), then at $0.10 per kWh you're looking at savings of approximately $250 per year. More if you include the cost of cooling that extra waste heat.
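That arithmetic is straightforward to check yourself. Wall draw is load divided by efficiency, so a quick sketch (using the numbers above, which are mine, not from anyone's actual farm):

```python
# Savings from a PSU efficiency upgrade at constant DC load.
# Wall draw = load / efficiency, so the power no longer wasted is
# load * (1/eff_old - 1/eff_new).

load_kw = 2.0         # combined DC load on the supplies
eff_old = 0.80        # current efficiency
eff_new = 0.90        # 80 Plus Gold floor
price_per_kwh = 0.10  # electricity price, $/kWh
hours_per_year = 8760

annual_savings = load_kw * (1 / eff_old - 1 / eff_new) * hours_per_year * price_per_kwh
print(f"${annual_savings:.2f} per year")  # roughly $243
```

Running continuously all year, that lands just under the $250 ballpark; any cooling savings come on top of it.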
I've attached a cool little spreadsheet I made that lets you:
1. Look at the cost of electricity for a theoretical computer you build, and see how changes in efficiency affect that cost.
2. Input the current values for your power supply and see how much simply upgrading it changes your cost of electricity over a year. I often find that for heavily loaded computers, the upgrade can pay for itself in less than two years.
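The payback math behind point 2 is just the upgrade cost divided by the yearly savings. A minimal sketch, where the $150 PSU price and other inputs are made-up illustrative values, not figures from the attached spreadsheet:

```python
# Hypothetical example: how long a PSU upgrade takes to pay for itself.
# All input values are illustrative.

def payback_years(upgrade_cost, load_kw, eff_old, eff_new, price_per_kwh, hours=8760):
    """Years until the electricity savings cover the upgrade cost."""
    annual_savings = load_kw * (1 / eff_old - 1 / eff_new) * hours * price_per_kwh
    return upgrade_cost / annual_savings

years = payback_years(150.0, 2.0, 0.80, 0.90, 0.10)
print(f"{years:.2f} years")  # well under two years for a heavily loaded machine
```

Lightly loaded desktops take far longer to break even, which is why the spreadsheet asks for your actual load.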
The chart is protected; only the green cells are editable. I did this for simplicity. There is no password if you want to unprotect or edit it. If you repost it, please give me credit for the original.
-Josh