Quote Originally Posted by Bo_Fox View Post
Enough of this emotional bickering!

What I was trying to do here was clear things up a little amid all the mess you're stirring up. For example, when the OP said that enabling Vsync makes no difference to power consumption, I pointed out that that's not what I've been experiencing.

I had a HIS 4850 1GB card that died on me after exactly 30 days of use. The display became permanently corrupted after it had been decoding hi-def video for a couple of hours across multiple screens, and I wonder whether that had anything to do with the cheap VRMs. Ever since I got an X1900XTX on launch day, I've noticed that VRM temperatures on ATI cards seem to run much higher than on Nvidia cards.
You do know that the VRMs on these reference 4870 and 4870 X2 cards are the same as the ones on the 65nm GTX 280/260, right? The 55nm versions use cheaper VRMs.