
Thread: NVIDIA's shady trick to boost the GeForce 9600GT

  1. #1
    Xtreme Member
    Join Date
    Dec 2006
    Location
    Bulgaria/Plovdiv
    Posts
    263

    NVIDIA's shady trick to boost the GeForce 9600GT


  2. #2
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    262
Another Nvidia scam for sales. But because they make billions, nothing will ever be done about it.

  3. #3
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Nice find.
It now resolves the case of 9600GT SLI on an Nvidia SLI board being better (in some games) than one HD 3870X2 on the same Nvidia board:
It is certainly nice for NVIDIA to see their GeForce 9600 GT reviewed on NVIDIA chipsets with LinkBoost enabled, where their card leaves the competition behind in the dust (even more). It could also send a message to customers that the card performs considerably better when used on an NVIDIA chipset. Actually, this is not the case: the PCI-Express frequency can be adjusted on most motherboards, and you will see these gains independent of Intel/AMD CPU architecture or Intel/NVIDIA/AMD/VIA chipset.
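In other words, the G94 derives its core clock from the PCI-Express clock rather than from a fixed crystal, so raising the PCI-E frequency raises the core clock with it on any chipset. A minimal sketch of that scaling, assuming the PCI-E/4 reference and fixed multiplier TechPowerUp describes (the function name here is mine, for illustration):

Code:
    # Sketch: the 9600 GT's core clock tracks the PCI-E bus instead of a
    # fixed 25 MHz crystal (reference design: 100 / 4 * 26 = 650 MHz).
    def core_clock_mhz(pcie_mhz, multiplier=26):
        return pcie_mhz / 4.0 * multiplier

    for pcie in (100, 110, 125):  # stock, mild bus OC, LinkBoost
        print(pcie, "MHz PCI-E ->", core_clock_mhz(pcie), "MHz core")
    # prints 650.0, 715.0, 812.5

That is why the gains show up on any board where the PCI-E clock can be raised, not just on NVIDIA chipsets.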

  4. #4
    Registered User
    Join Date
    Feb 2008
    Posts
    82
Interesting; this has huge implications for the review much discussed in this thread:
Clicky

So what are they doing then, basically automatically overclocking their cards when used on nForce chipsets? It would be somewhat amusing if this made Nvidia cards and chipsets incompatible (for cards that have little to no overclocking headroom in them... there are always some...).
    Serenity:
    Core2 E6600
    Abit IN9 32X-MAX
    Corsair PC2-6400C4D
    2x BFG OC2 8800GTS in SLI
    Dell 3007WFP-HC

  5. #5
    Xtreme Addict
    Join Date
    Dec 2005
    Location
    UK
    Posts
    1,713
So what are the actual default clocks for the 9600GT?
    TAMGc5: PhII X4 945, Gigabyte GA-MA790X-UD3P, 2x Kingston PC2-6400 HyperX CL4 2GB, 2x ASUS HD 5770 CUcore Xfire, Razer Barracuda AC1, Win8 Pro x64 (Current)

    TAMGc6: AMD FX, Gigabyte GA-xxxx-UDx, 8GB/16GB DDR3, Nvidia 680 GTX, ASUS Xonar, 2x 120/160GB SSD, 1x WD Caviar Black 1TB SATA 6Gb/s, Win8 Pro x64 (Planned)

  6. #6
    Xtreme Addict
    Join Date
    Oct 2005
    Location
    MA/NH
    Posts
    1,251
Or it could just be that RivaTuner reads the card wrong?

The Nvidia drivers and GPU-Z always read the card the same, but on every card I have, at least one of the clocks reads differently in RivaTuner's hardware monitor....

If they are all emanating from the same source, then who is wrong?

My shader clocks have been off by as much as 30MHz on all the G80s/G92s.
I haven't seen such a huge discrepancy in core clocks, though.

From what it looks like, they're saying a stock 650MHz 9600GT should read about 708 in RivaTuner then?
    Mpower Max | 4770k | H100 | 16gb Sammy 30nm 1866 | GTX780 SC | Xonar Essence Stx | BIC DV62si | ATH AD700 | 550d | AX850 | VG24QE | 840pro 256gb | 640black | 2tb | CherryReds | m60 | Func1030 |
    HEAT

  7. #7
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    617
Slightly misleading of nVidia, but not so much of a problem once people are aware of it...

But it implies that every 9600 GT can handle a 25% overclock on stock volts.
You put a 650MHz 9600 GT in a LinkBoost-enabled board and it clocks it to 125/100 * 650MHz = 812.5MHz... is that gonna work?
Or are a lot of nForce 590 boards gonna be mysteriously buggy while running 9600 GTs in SLI?
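That 25% figure is just LinkBoost's 125MHz PCI-E clock fed through the same linear scaling. A quick sanity check (a sketch; the max-stable clocks below are hypothetical, only to show where a card with little headroom would fall over):

Code:
    # LinkBoost raises PCI-E from 100 to 125 MHz, so a core clock derived
    # from it scales by the same 25%. Stability limits are made-up examples.
    LINKBOOST_PCIE = 125.0
    for stock_mhz, max_stable_mhz in ((650, 830), (700, 780)):
        effective = stock_mhz * LINKBOOST_PCIE / 100.0
        verdict = "fine" if effective <= max_stable_mhz else "likely unstable"
        print(stock_mhz, "MHz card ->", effective, "MHz under LinkBoost:", verdict)
    # 650 -> 812.5 (fine), 700 -> 875.0 (likely unstable)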
    Last edited by hollo; 02-29-2008 at 10:01 AM.

  8. #8
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Posts
    668
I believe LinkBoost has been gone since the 680i chipset launched.

  9. #9
    Banned
    Join Date
    Dec 2005
    Location
    Everywhere
    Posts
    1,715
    Quote Originally Posted by DMH View Post
I believe LinkBoost has been gone since the 680i chipset launched.
Exactly. I can overclock my 9600GT to 780MHz; by this theory my GPU would be running well above 850MHz, and that is not possible!

  10. #10
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by Syn. View Post
So what are the actual default clocks for the 9600GT?

According to techpowerup, it depends on the PCI-E frequency.
    regards
    Last edited by mascaras; 02-29-2008 at 10:15 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  11. #11
    Banned
    Join Date
    Dec 2005
    Location
    Everywhere
    Posts
    1,715
    Quote Originally Posted by mascaras View Post
    it depends on PCI-E frequency


    regards
Does it happen when LinkBoost is disabled, too?

  12. #12
    Xtreme Member
    Join Date
    Jun 2003
    Location
    Italy
    Posts
    351
So to sum up, every 9600GT review on an NV chipset with LinkBoost enabled should be trashed?
Is G94 the only affected chip so far?
    Last edited by Tuvok-LuR-; 02-29-2008 at 09:47 AM.
    3570K @ 4.5Ghz | Gigabyte GA-Z77-D3H | 7970 Ghz 1100/6000 | 256GB Samsung 830 SSD (Win 7) | 256GB Samsung 840 Pro SSD (OSX 10.8.3) | 16GB Vengeance 1600 | 24'' Dell U2412M | Corsair Carbide 300R

  13. #13
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by DMH View Post
    I believe linkboost is out since the 680i chipset was launched
This feature was pioneered with the NVIDIA nForce 590 SLI chipset and is present in the NVIDIA 680i chipset too
    ...

  14. #14
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by OBR View Post
Does it happen when LinkBoost is disabled, too?
I really don't know, OBR. Someone try it!

I will receive a BFG 9600GT OC for testing next Monday, but I have an Asus P5E; I will test with different PCI-E frequencies, see if the 3DMark score goes up, and then report!


    Last edited by mascaras; 02-29-2008 at 09:49 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  15. #15
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
This explains a whole lot.

  16. #16
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
Interesting... I thought its awesome performance could be due to a tweaked architecture, or maybe came at the cost of IQ, but never because of this.

  17. #17
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    There's no place like 127.0.0.1, Brazil
    Posts
    888
Very nice find, dude... a bad trick from nV.

  18. #18
    Registered User
    Join Date
    May 2005
    Posts
    3,691
LinkBoost has been around for quite some time.... ATi has a form of it in their RD600 as well. All it does is overclock the PCI-E bus automatically when it senses the card brand of its choice. All this really is, is NVidia finally making a board that can take advantage of the PCI-E bus clocking feature that's been around for almost two years now.

So, if this is true and not shens, does that mean the cards in reviews that overclock them to 800MHz+ are actually closer to 1000MHz? After all, 850MHz plus 25% would be about 1062MHz.

Calling this shady, I don't know about that one. It boosts the card's performance for the end user without worry of voiding the warranty or requiring any work at all. That's not shady, that's called increasing performance. Nothing wrong with that at all.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  19. #19
    Xtreme Member
    Join Date
    Jun 2003
    Location
    Italy
    Posts
    351
Is LinkBoost related only to SLI'd cards, or also to a single card?
    3570K @ 4.5Ghz | Gigabyte GA-Z77-D3H | 7970 Ghz 1100/6000 | 256GB Samsung 830 SSD (Win 7) | 256GB Samsung 840 Pro SSD (OSX 10.8.3) | 16GB Vengeance 1600 | 24'' Dell U2412M | Corsair Carbide 300R

  20. #20
    Xtreme Member
    Join Date
    Jun 2003
    Location
    Italy
    Posts
    351
    Quote Originally Posted by DilTech View Post
LinkBoost has been around for quite some time.... ATi has a form of it in their RD600 as well. All it does is overclock the PCI-E bus automatically when it senses the card brand of its choice. All this really is, is NVidia finally making a board that can take advantage of the PCI-E bus clocking feature that's been around for almost two years now.
The performance increase has nothing to do with LinkBoost itself, with wider bandwidth, or with a higher PCI-E frequency as such; it's just a GPU overclock.
Quote Originally Posted by DilTech View Post
So, if this is true and not shens, does that mean the cards in reviews that overclock them to 800MHz+ are actually closer to 1000MHz? After all, 850MHz plus 25% would be about 1062MHz.

Calling this shady, I don't know about that one. It boosts the card's performance for the end user without worry of voiding the warranty or requiring any work at all. That's not shady, that's called increasing performance. Nothing wrong with that at all.
Actually, there's no problem in bringing the average Joe an auto-overclocking, dummy-proof performance increase.
What makes it a cheat and a **** is:
- it's not documented/advertised by nvidia
- the driver reports the non-overclocked frequency
So it basically looks like nvidia wants to hide this and make people think their cards at stock frequencies are faster than they actually are.
    Last edited by Tuvok-LuR-; 02-29-2008 at 10:11 AM. Reason: swearing
    3570K @ 4.5Ghz | Gigabyte GA-Z77-D3H | 7970 Ghz 1100/6000 | 256GB Samsung 830 SSD (Win 7) | 256GB Samsung 840 Pro SSD (OSX 10.8.3) | 16GB Vengeance 1600 | 24'' Dell U2412M | Corsair Carbide 300R

  21. #21
    Admin
    Join Date
    Feb 2005
    Location
    Ann Arbor, MI
    Posts
    12,338
    I hope this is implemented on 9800GX2 and GTX

    I think it's pretty sweet.

  22. #22
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    There's no place like 127.0.0.1, Brazil
    Posts
    888
    Quote Originally Posted by Tuvok-LuR- View Post
Actually, there's no problem in bringing the average Joe an auto-overclocking, dummy-proof performance increase.
What makes it a cheat and a **** is:
- it's not documented/advertised by nvidia
- the driver reports the non-overclocked frequency
So it basically looks like nvidia wants to hide this and make people think their cards at stock frequencies are faster than they actually are.

  23. #23
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
A lot of folks buy OC'd cards and pay extra for them; this is basically no different, other than not paying extra for an OC edition card.

It's free performance, so I personally don't see anything wrong or shady about it.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  24. #24
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    617
    Quote Originally Posted by mascaras View Post
it depends on the PCI-E frequency

regards
The default clock is 650MHz:
    http://www.techarp.com/article/Deskt...idia_4_big.png

But in that article they must have been using an overclocked 9600 GT model.

Actual clock = 25MHz (dependent on PCI-E frequency) * 29 = 725MHz (reported by GPU-Z and RivaTuner's overclocking tab).
There's also 27MHz * 29 = 783MHz from RivaTuner's monitoring, which the author says is incorrect.
    Quote Originally Posted by techpowerup
Please also note that RivaTuner's monitoring clock reading is wrong. It uses 27 MHz for its calculation, which is incorrect. When the PCI-E bus is 100 MHz, the core clock is indeed 650 MHz on the reference design. A RivaTuner update is necessary to reflect GPU clock changes caused by PCI-E clock properly though.
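So the discrepancy between the tools is just two different reference clocks fed into the same multiplier. A quick check with the numbers quoted above (a sketch; the variable names are mine):

Code:
    # The TechARP card uses multiplier 29: the correct 25 MHz reference
    # (PCI-E 100 / 4) gives 725 MHz; RivaTuner's wrong 27 MHz gives 783 MHz.
    multiplier = 29
    print("GPU-Z / drivers (25 MHz ref):", 25 * multiplier, "MHz")  # 725
    print("RivaTuner monitor (27 MHz ref):", 27 * multiplier, "MHz")  # 783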

  25. #25
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,176
    Quote Originally Posted by hollo View Post
Slightly misleading of nVidia, but not so much of a problem once people are aware of it...

But it implies that every 9600 GT can handle a 25% overclock on stock volts.
You put a 650MHz 9600 GT in a LinkBoost-enabled board and it clocks it to 125/100 * 650MHz = 812.5MHz... is that gonna work?
Or are a lot of nForce 590 boards gonna be mysteriously buggy while running 9600 GTs in SLI?
I can confirm this.
I have a P35 chipset, and my card runs in-game at 820MHz stable on the stock cooler. A cheap Zotac card.

