It seems to me there is something wrong with XbitLabs' power-measurement methodology.
1. Is it even possible to measure the current (amps) drawn through a specific slot?
2. The method assumes the PSU outputs exactly 3.3 V/5 V/12 V at all times, though real rails deviate within the ATX tolerance (see the sketch after this list).
3. A GPU that offloads calculations from the CPU looks more power-hungry than one without that capability, even though total system power consumption might be far better with the GPGPU card.
4. The method is complicated, needs calibration, is not widely accepted, and is prone to errors.
5. The results are not comparable across measurements, even on the same system.
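To illustrate point 2: computing card power as measured current times the nominal rail voltage pushes any real rail deviation straight into the reported wattage. A minimal sketch, assuming the ATX ±5% tolerance on the 12 V rail; the 15 A figure and the function names are hypothetical, not XbitLabs' data:

```python
ATX_TOLERANCE = 0.05  # ATX spec allows roughly +/-5% deviation on the 12 V rail

def reported_power(current_a: float, nominal_v: float = 12.0) -> float:
    """Power as computed under the 'rail is exactly nominal' assumption."""
    return current_a * nominal_v

def actual_power_band(current_a: float, nominal_v: float = 12.0,
                      tol: float = ATX_TOLERANCE) -> tuple[float, float]:
    """True power range if the rail sits anywhere within spec tolerance."""
    return (current_a * nominal_v * (1 - tol),
            current_a * nominal_v * (1 + tol))

current = 15.0  # hypothetical 12 V current drawn by a card, in amps
low, high = actual_power_band(current)
print(f"reported: {reported_power(current):.1f} W")   # 180.0 W
print(f"actual anywhere in: {low:.1f}-{high:.1f} W")  # 171.0-189.0 W
```

A ±5% rail error alone is an 18 W band on a 180 W card, the same order of magnitude as the spec-versus-measurement gaps discussed below.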
XbitLabs also did not publish total system power consumption alongside these figures.
The GTX285's power consumption appears to be far higher than the HD4890's:
XbitLabs measured 170 W for the HD4890 and 245 W for the GTX285 while playing Crysis Warhead,
yet the official "Max Power Draw" is 190 W for the HD4890 and 183 W for the GTX285.
Do they imply that Nvidia announced a max power spec well below actual consumption (62 W under the measured figure) while ATI announced one above it (20 W over)?
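Spelling out that arithmetic from the figures quoted above (the wattages are the ones cited in this post, nothing independently measured):

```python
measured = {"HD4890": 170, "GTX285": 245}  # W, XbitLabs, Crysis Warhead
spec     = {"HD4890": 190, "GTX285": 183}  # W, official "Max Power Draw"

for card in measured:
    delta = measured[card] - spec[card]
    print(f"{card}: measured {measured[card]} W, "
          f"spec {spec[card]} W, delta {delta:+d} W")
# HD4890: measured 170 W, spec 190 W, delta -20 W
# GTX285: measured 245 W, spec 183 W, delta +62 W
```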
Most other power benchmarks seem to give lower power consumption for the GTX285 than for the HD4890, though.
XbitLabs measured the GTX295 (55 nm dual GPU) and the HD5970 (40 nm dual GPU) but skipped the HD4870 X2 (55 nm dual GPU). Why?
I am not sure I can trust XbitLabs' power measurements.