GPU-Z for HWbot...?



Smartidiot89
07-02-2009, 11:53 AM
I am sorry if this has been brought up before, but is it possible that HWbot will implement GPU-Z at some point? We have CPU-Z, so why is there no GPU-Z?

Just an idea, I think this would really enhance HWbot.

W1zzard
07-02-2009, 12:14 PM
If HWbot is interested, I can certainly give them validation DB access to query submission data.

Splave
07-02-2009, 12:19 PM
What's the point, though? Max 3D clocks while in 2D?

BeepBeep2
07-02-2009, 06:35 PM
What's the point, though? Max 3D clocks while in 2D?

Seriously, this will prove nothing. I've had my 4850 up @ 800/2400, passing the AMD Overdrive test and running 5 seconds of FurMark... and I've validated it with GPU-Z. It's only stable @ 725/2250.

Smartidiot89
07-04-2009, 02:03 AM
What's the point, though? Max 3D clocks while in 2D?

Seriously, this will prove nothing. I've had my 4850 up @ 800/2400, passing the AMD Overdrive test and running 5 seconds of FurMark... and I've validated it with GPU-Z. It's only stable @ 725/2250.
By that logic, what's the point of having a Pentium 4/Celeron running 8 GHz+? It sure as hell isn't stable by any means - so why is CPU-Z on HWbot, if such results are useless and prove nothing?

Utroz
07-05-2010, 11:54 AM
On HWbot everyone knows that CPU-Z screenshots are sometimes max suicide screenshots, not an indication of stability. That's why they have HexusPiFast, SuperPi, wPrime, Prime95, OCCT, LinX, etc. to prove some level of stability. Why not do the same with GPU-Z? Maybe the screenshot has to show that the card passed FurMark or the OCCT GPU test, or something like that. The hard part is finding a program that everyone can agree on between the NV and ATI camps....

massman
07-05-2010, 03:43 PM
By that logic, what's the point of having a Pentium 4/Celeron running 8 GHz+? It sure as hell isn't stable by any means - so why is CPU-Z on HWbot, if such results are useless and prove nothing?

I don't understand your point of view.

1) You ask for GPU-Z max clocks to be added because it would enhance HWBOT
2) People point out that GPU-Z max clocks are a pointless comparison
3) You acknowledge this, but use the pointlessness of CPU-Z as an argument for adding GPU-Z.

=> "GPU-Z is, just like CPU-Z, a pointless benchmark, but should be added anyway"?

Anyway, I think the main reason people get a kick out of CPU-Z validations is that it's been around for so long now. That, and seeing the 'absolute' limit of the technology.

BeepBeep2
07-05-2010, 05:06 PM
I don't understand your point of view.

1) You ask for GPU-Z max clocks to be added because it would enhance HWBOT
2) People point out that GPU-Z max clocks are a pointless comparison
3) You acknowledge this, but use the pointlessness of CPU-Z as an argument for adding GPU-Z.

=> "GPU-Z is, just like CPU-Z, a pointless benchmark, but should be added anyway"?

Anyway, I think the main reason people get a kick out of CPU-Z validations is that it's been around for so long now. That, and seeing the 'absolute' limit of the technology.

May I ask why you need GPU-Z in 3D benches now? I was pushing an IGP and GPU-Z wasn't reporting accurate clocks; not only that, it was hard-locking my system. Now three of my results have been flagged and are no longer valid.

GPU-Z max clocks are not real max clocks either. The window just shows what is set for 3D mode. I can set 1500/1500 on my 5770s for 3D and GPU-Z will show it. As long as I don't drag a window, have too many open, have Flash running, etc., nothing will trigger 3D mode and I run away with a 1500 MHz GPU-Z validation.

I was able to "validate" my 4850 @ 800 MHz with 1.05 V. That's not right.

massman
07-06-2010, 12:58 AM
We need it to know which VGA and clocks were used. Just because one type of hardware is partially incompatible with the software doesn't mean we should abandon it completely.

Did you contact W1zzard about these issues? I'm sure he's able to fix them :)