
Thread: GPU-Z for HWbot...?

  1. #1
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,034

    GPU-Z for HWbot...?

    I am sorry if this has been brought up before, but is it possible that HWbot will implement GPU-Z at some point? We have CPU-Z, so why is there no GPU-Z?

    Just an idea, I think this would really enhance HWbot.
    SweClockers.com

    CPU: Phenom II X4 955BE
    Clock: 4200MHz 1.4375v
    Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
    Motherboard: ASUS Crosshair IV Formula
    GPU: HD 5770

  2. #2
    Xtreme Legend
    Join Date
    Jan 2003
    Location
    Stuttgart, Germany
    Posts
    929
    If HWbot is interested, I can certainly give them validation DB access to query submission data.
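    For illustration only, a lookup against such a validation database might look roughly like the sketch below. The schema (a `submissions` table with `gpu`, `core_clock_mhz`, and `mem_clock_mhz` columns) and the sample rows are entirely hypothetical; the real GPU-Z validation database layout is not public.

    ```python
    import sqlite3

    # Hypothetical schema -- stands in for whatever the real validation DB uses.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE submissions (
            validation_id  TEXT PRIMARY KEY,
            gpu            TEXT,
            core_clock_mhz INTEGER,
            mem_clock_mhz  INTEGER
        )
    """)
    conn.executemany(
        "INSERT INTO submissions VALUES (?, ?, ?, ?)",
        [
            ("abc123", "Radeon HD 4850", 800, 1200),
            ("def456", "Radeon HD 5770", 1000, 1400),
        ],
    )

    def lookup(validation_id):
        """Fetch the card and clocks behind a validation ID, or None if unknown."""
        return conn.execute(
            "SELECT gpu, core_clock_mhz, mem_clock_mhz "
            "FROM submissions WHERE validation_id = ?",
            (validation_id,),
        ).fetchone()

    print(lookup("abc123"))  # ('Radeon HD 4850', 800, 1200)
    ```

    This is the kind of query HWbot could run server-side to confirm what card and clocks were behind a submitted validation link, rather than trusting a screenshot alone.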

  3. #3
    PI in the face
    Join Date
    Nov 2008
    Posts
    3,083
    What's the point, though? Max 3D clocks while in 2D?
    Quote Originally Posted by L0ud View Post
    So many opinions and so few screenshots

  4. #4
    Xtreme 3D Team
    Join Date
    Jan 2009
    Location
    Ohio
    Posts
    8,499
    Quote Originally Posted by Bobbylite View Post
    What's the point, though? Max 3D clocks while in 2D?
    Seriously, this will produce nothing. I've had my 4850 up @ 800/2400 passing the AMD Overdrive Test and running 5 seconds of FurMark...and I've validated it with GPU-Z. It's only stable @ 725/2250.
    Smile

  5. #5
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,034
    Quote Originally Posted by Bobbylite View Post
    What's the point, though? Max 3D clocks while in 2D?
    Quote Originally Posted by BeepBeep2 View Post
    Seriously, this will produce nothing. I've had my 4850 up @ 800/2400 passing the AMD Overdrive Test and running 5 seconds of FurMark...and I've validated it with GPU-Z. It's only stable @ 725/2250.
    Using this logic, what's the point of having a Pentium 4/Celeron running 8GHz+? It sure as hell isn't stable by any means - so why is CPU-Z in HWbot at all, if such results are useless and produce nothing?

  6. #6
    Registered User Utroz's Avatar
    Join Date
    Nov 2002
    Location
    Maine
    Posts
    68
    On HWbot everyone knows that CPU-Z screenshots are sometimes max suicide screenshots, not an indication of stability. That's why they have HexusPifast, SuperPi, wPrime, Prime95, OCCT, LinX, etc. to prove some level of stability. Why not do the same with GPU-Z? Maybe the screenshot has to show that the card passed FurMark or the OCCT GPU test or something like that. The hard part is finding a program that everyone can agree on between the NV vs. ATI camps...
    File Server


    Super Old system
    http://valid.x86-secret.com/show_oc.php?id=371866

  7. #7
    I am Xtreme
    Join Date
    Jan 2005
    Posts
    4,714
    Quote Originally Posted by Smartidiot89 View Post
    Using this logic, what's the point of having a Pentium 4/Celeron running 8GHz+? It sure as hell isn't stable by any means - so why is CPU-Z in HWbot at all, if such results are useless and produce nothing?
    I don't understand your point of view.

    1) You ask GPU-Z max clock to be added because it would enhance HWBOT
    2) People point out that GPU-Z max clocks are a pointless comparison
    3) You acknowledge this, but use the pointlessness of CPU-Z as an argument for adding GPU-Z.

    => "GPU-Z is, just like CPU-Z, a pointless benchmark but should be added anyway" ?

    Anyway, I think the main reason people get a kick out of CPU-Z validations is that it's been around for so long now. That and seeing the 'absolute' limit of technology.
    Where courage, motivation and ignorance meet, a persistent idiot awakens.

  8. #8
    Xtreme 3D Team
    Join Date
    Jan 2009
    Location
    Ohio
    Posts
    8,499
    Quote Originally Posted by massman View Post
    I don't understand your point of view.

    1) You ask GPU-Z max clock to be added because it would enhance HWBOT
    2) People point out that GPU-Z max clocks are a pointless comparison
    3) You acknowledge this, but use the pointlessness of CPU-Z as an argument for adding GPU-Z.

    => "GPU-Z is, just like CPU-Z, a pointless benchmark but should be added anyway" ?

    Anyway, I think the main reason people get a kick out of CPU-Z validations is that it's been around for so long now. That and seeing the 'absolute' limit of technology.
    May I ask why you need GPU-Z in 3D benches now? I was pushing IGP and GPU-Z wasn't giving accurate clocks; not just that, it was hard locking my system. Now three of my results have been flagged and are no longer valid.

    GPU-Z max clocks are not max clocks either. The window simply shows whatever is set for 3D mode. I can set 1500/1500 on my 5770s for 3D and GPU-Z will show it. As long as I don't drag a window, have too many open, or have Flash running, nothing will trigger 3D clocks, and I can run away with a 1500 MHz GPU-Z validation.

    I was able to "validate" my 4850 @ 800 Mhz with 1.05v. That's not right.
    Last edited by BeepBeep2; 07-05-2010 at 04:09 PM.
    Smile

  9. #9
    I am Xtreme
    Join Date
    Jan 2005
    Posts
    4,714
    We need it to know what VGA and clocks were used. Just because one type of hardware is partially incompatible with the software doesn't mean we should abandon it completely.

    Did you contact W1zzard about these issues? I'm sure he's able to fix them.
