
Thread: NVIDIA's shady trick to boost the GeForce 9600GT

  1. #76
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by kryptobs2000 View Post
If this were a 'trick', I don't see how it really benefits NVIDIA. All the same, it seems like 90% of the people posting here didn't actually read why it does this. I'm not going to bother retyping or requoting; so many of you seem to just be ignoring the numerous other explanations, so whatever.
I feel exactly the same.

    Quote Originally Posted by kryptobs2000 View Post
My question is, now that we know why this happens: to those claiming this is a benefit, how so? It's not like you couldn't overclock the cards otherwise. You were never limited by the crystal's clock before, so this makes no difference. If your card can reach 800 MHz, it doesn't matter how you achieve that clock; 800 MHz is 800 MHz (for video cards, anyway). Am I wrong?
Sure, there may be a difference between overclocking at 25 × 27 and at 27.5 × 27. I don't know yet, but you could achieve higher frequencies this way, as the internal multiplier remains the same (27, in destr0yer's case).
    Are we there yet?

  2. #77
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by WeStSiDePLaYa View Post
I think something is wrong with TPU's bench system.

They are the only people who have this "problem".

Also, LinkBoost support has been gone since the 590 SLI.

Also, since when did LinkBoost increase the card clocks?

LinkBoost never increased the card clocks, only the PCI-E and chipset buses.

Also, since when is the core clock dependent on the PCI-E bus?

I don't think TPU knows much, and they are just spreading misinformation.
So, now that you have trolled enough, go and grab a 9600 GT to confirm it yourself. Jeez...
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  3. #78
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by grimREEFER View Post
Here's the part I don't understand: don't reviewers leave the PCI Express frequency at stock?
    Yes but since most reviewers are testing SLI as well, they will be testing the cards in Nvidia motherboards.
    Linkboost AUTOMATICALLY adjusts the PCI-e frequency which, as a result, AUTOMATICALLY overclocks the core clock.

  4. #79
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by LordEC911 View Post
    Yes but since most reviewers are testing SLI as well, they will be testing the cards in Nvidia motherboards.
    Linkboost AUTOMATICALLY adjusts the PCI-e frequency which, as a result, AUTOMATICALLY overclocks the core clock.
That is not totally true; when I had 8800GT SLI, the PCI-Express frequency was at 100 MHz in the BIOS, using the rig in my sig.

LinkBoost does not exist anymore on the 680i, AFAIK...
    Are we there yet?

  5. #80
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    617
A 25% overclock would instantly crash a lot of factory-overclocked 9600 GTs. We can pretty much assume NVIDIA wouldn't allow that, so the 9xxx series probably isn't LinkBoost-enabled.

  6. #81
    Banned
    Join Date
    Feb 2008
    Posts
    213
I don't really see what the problem is. When I saw the specs for the 9600GT, I knew NVIDIA had to use a tweak or something to pull it off with this card. It's not like we're talking about the high end here; this is just a boost to a mid-level card to make it run decently. Manufacturers have used many similar tricks in the past to raise performance in video cards, and I never heard anyone complaining back then. As long as the card performs better with this feature, I'm all for it. And by the way, who said that RivaTuner is the ultimate tool for NVIDIA cards? It's just a program made by a Russian dude to overclock cards. If no other tool out there confirms this, I'm not buying it...

  7. #82
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by BullGod View Post
And by the way, who said that RivaTuner is the ultimate tool for NVIDIA cards? It's just a program made by a Russian dude to overclock cards. If no other tool out there confirms this, I'm not buying it...

3dmark 2006 Fill Rate Multi-Texturing with different PCI-E frequency and same clocks = different score

    http://www.xtremesystems.org/forums/...3&postcount=74


    Btw:

3dmark 2006 with 675 MHz & PCI-E auto = 10900 marks

3dmark 2006 with "675 MHz" & PCI-E 110 MHz = 11600 marks

3dmark 2006 with 743 MHz & PCI-E auto = 11500 marks

3dmark 2006 with "743 MHz" & PCI-E 110 MHz (817 MHz) = Crash
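A quick sanity check of these numbers (my arithmetic, using the PCIe-derived clock explained in the next post): 675 × 1.10 = 742.5 MHz real, which is why the "675 MHz" @ PCI-E 110 run lands right next to the true 743 MHz run, while 743 × 1.10 = 817.3 MHz real is past what this card can do, hence the crash.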


    Last edited by mascaras; 02-29-2008 at 04:20 PM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  8. #83
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
For those asking themselves how to calculate the internal multiplier and the final REAL clock, this might be helpful:

Using 9600GT reference clocks:

PCI-Express = 100 MHz
GPU = 650 MHz
Internal core clock = 100 MHz / 4 = 25 MHz
Internal core multiplier = 650 / 25 = 26

PCI-Express = 110 MHz
GPU = 650 MHz (fake, as you will see below)
Internal core clock = 110 MHz / 4 = 27.5 MHz
Internal core multiplier = 26 (it remains unchanged compared to PCI-Express at 100 MHz unless you manually overclock using RivaTuner or ATITool, where it increases by 1 point for every 25 MHz)
REAL CORE CLOCK = 26 × 27.5 = 715 MHz

A much easier way to calculate the real core clock, using the example above: 650 × 1.10 = 715.

The shader clock is not affected.
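If it helps, here is a minimal Python sketch of that same formula (the function and parameter names are mine, purely for illustration):

Code:
def real_core_clock(nominal_mhz, pcie_mhz, extra_mult=0):
    """Real 9600GT core clock, per the formula above.

    The internal base clock is PCIe/4; the multiplier is fixed at
    nominal/25 (i.e. derived assuming PCIe = 100 MHz) and only moves
    if you overclock manually (+1 per 25 MHz step in RivaTuner/ATITool).
    """
    base_clock = pcie_mhz / 4.0      # 100 MHz -> 25 MHz, 110 MHz -> 27.5 MHz
    multiplier = nominal_mhz / 25.0  # 650 MHz -> 26
    return (multiplier + extra_mult) * base_clock

print(real_core_clock(650, 100))  # 650.0 -> stock really is stock
print(real_core_clock(650, 110))  # 715.0 -> "650" at PCIe 110 is really 715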
    Last edited by Luka_Aveiro; 02-29-2008 at 04:19 PM.
    Are we there yet?

  9. #84
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    102
HOLY S! I was wondering why I couldn't clock my cards much past 700. My PCI-E bus was set to 110 originally, and to 105 as of right now. If this is the case, I would rather set it to 100 and clock the cards manually to affect the shader clock too. I noticed a discrepancy in the Hardware Monitor plugin, but brushed it off as a bug.

Also, I am trying to see if there's an option for a BIOS vmod, but I can't open NiBiTor in Vista x64. Any help would be appreciated.
    EVGA 780I P02bios
    E8400@3.6
    BFG 9600gt SLI 710/1000
    2x2g GSkill ddr2
    Sceptre 20.1 naga
    Antec SP500/TT Sli psu

  10. #85
    Xtreme Member
    Join Date
    Jun 2003
    Location
    Italy
    Posts
    351
    Quote Originally Posted by BullGod View Post
    And btw, who said that RivaTuner is the ultimate tool for Nvidia cards? It's just a program made by a russian dude to overclock cards. If no other tool out there confirms this I'm not buying it...
The whole thing is not based on RivaTuner's readings (which are actually wrong as well); those only raised the first doubts, which were then confirmed by the fill rate benchmarks, so there's no way this can all be false.
I don't see how this can be a good thing for users: if they are going to set an auto overclock on their cards, it means they can all run at that speed... and if they can all run at that speed, why not release the cards with that clock?
    Last edited by Tuvok-LuR-; 02-29-2008 at 04:31 PM.
    3570K @ 4.5Ghz | Gigabyte GA-Z77-D3H | 7970 Ghz 1100/6000 | 256GB Samsung 830 SSD (Win 7) | 256GB Samsung 840 Pro SSD (OSX 10.8.3) | 16GB Vengeance 1600 | 24'' Dell U2412M | Corsair Carbide 300R

  11. #86
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by op1e View Post
HOLY S! I was wondering why I couldn't clock my cards much past 700. My PCI-E bus was set to 110 originally, and to 105 as of right now. If this is the case, I would rather set it to 100 and clock the cards manually to affect the shader clock too. I noticed a discrepancy in the Hardware Monitor plugin, but brushed it off as a bug.

Also, I am trying to see if there's an option for a BIOS vmod, but I can't open NiBiTor in Vista x64. Any help would be appreciated.
You CAN overclock the shaders unlinked from the core frequency.

I cannot help you with NiBiTor though; hope you find your way.

    Quote Originally Posted by Tuvok-LuR- View Post
I don't see how this can be a good thing for users: if they are going to set an auto overclock on their cards, it means they can all run at that speed... and if they can all run at that speed, why not release the cards with that clock?
Because if they released it with a higher stock core clock, then raising the PCI-Express frequency would push the real core clock even higher, leading to crashes.

It is always a nice thing to know, and it might be helpful to know your limits and how to get past them. It is nice to know how the game is made, so you can play it well.
    Last edited by Luka_Aveiro; 02-29-2008 at 04:36 PM.
    Are we there yet?

  12. #87
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by Frank M View Post
Think about reviews, then. Without the reviewers and the readers knowing it, they are comparing an overclocked 9600GT with the other cards, which are still at stock speeds. It's easy to perform well that way. In the sports world, this would be called doping.
    Um, they would need a 590i chipset to do it! Not even all 680i chipsets support Linkboost (if any)! Everyone else is running the card at stock speeds.

    You are blowing this WAY out of proportion.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  13. #88
    Xtreme Legend
    Join Date
    Mar 2005
    Location
    Australia
    Posts
    17,242
Have a look at their mobile GPUs, guys.

They have a similar thing going on.

I don't think they change the PCIe frequency on laptops, do you?
    Team.AU
    Got tube?
    GIGABYTE Australia
    Need a GIGABYTE bios or support?



  14. #89
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by Tuvok-LuR- View Post
If they are going to set an auto overclock on their cards, it means they can all run at that speed... and if they can all run at that speed, why not release the cards with that clock?
    yeah good point.

    anybody?

  15. #90
    Xtreme Monster
    Join Date
    May 2006
    Location
    United Kingdom
    Posts
    2,182
I do not feel there is any shady trick in it if it turns out to be the LinkBoost feature. Newer cards, starting from the 7900 GTX, have the feature. It boosts by 25%. I'm not sure, but I think only NVIDIA motherboards have this feature in their BIOS.

    Quote Originally Posted by AnandTech

    LinkBoost

    One of the features unique to the nForce 590SLI and 680i SLI MCP is a system called LinkBoost. If a GeForce 7900 GTX or GeForce 8800 is detected on either MCP then LinkBoost will automatically increase the PCI Express and MCP HyperTransport (HT) bus speeds by 25%. This increases the bandwidth available to each PCI Express and HT bus link from 8GB/s to 10GB/s.

    Since this technology increases the clock speed of the PCI Express bus by 25% to the x16 PCI Express graphics slots, NVIDIA requires certification of the video card for this program to work automatically. In this case, the 7900GTX and 8800 series are the only compatible cards offered, although you can manually set the bus speeds and achieve the same results depending upon your components. We feel this feature is worthwhile for those users who do not want to tune their BIOS and go through extensive test routines to find the best possible combination of settings.


    In essence, NVIDIA is guaranteeing their chipset's PCI Express and HT interconnect links are qualified to perform up to 125% of their default speeds without issue. While LinkBoost is an interesting idea, the 25% increase in PCI Express x16 slots and HT bus speeds yielded virtually the same performance as our system without LinkBoost enabled in most cases.

    Its actual implementation did not change our test scores in single video card testing but did provide a 1%~2% difference in SLI testing at resolutions under 1600x1200 in several game titles. The reason for the minimal increases at best is that the performance boost is being applied in areas that have minimal impact on system performance as the link to the CPU/Memory subsystem is left at stock speed thus negating the true benefits of this technology.
    Source

So probably some people have this feature enabled, or it may be that the drivers enable it when installed, after a hard restart.

I think it should show the exact clocks even after they have been raised by the LinkBoost feature. Well, some software does not report correctly anyway.

    Metroid.

  16. #91
    Registered User
    Join Date
    Feb 2008
    Posts
    82
    Quote Originally Posted by WeStSiDePLaYa View Post
I think something is wrong with TPU's bench system.

They are the only people who have this "problem".

Also, LinkBoost support has been gone since the 590 SLI.

Also, since when did LinkBoost increase the card clocks?

LinkBoost never increased the card clocks, only the PCI-E and chipset buses.

Also, since when is the core clock dependent on the PCI-E bus?

I don't think TPU knows much, and they are just spreading misinformation.
Please re-read the article. Your question "since when did LinkBoost increase the card clocks" (in all the different ways you phrase it) is answered: the answer is "since the 9600GT was released", and that was in fact the whole point of the article. They even test this with an 8800GT and show that the 8800GT shows no change with PCI-E frequency; only the 9600GT does.

As far as this being a major tweak goes, it allows no more overclocking than could be achieved using normal methods, so I can hardly see any advantage.

The disadvantage I can see is that in many reviews it will make the card look like it performs far better at stock settings than it really does, as the NVIDIA chipset will automatically overclock the card, potentially to the point of instability for some samples. It would also, in a review of chipsets, make an NVIDIA chipset look like it performs far better than an Intel one, simply by automatically applying the same overclock that could be applied manually on an Intel platform.

I can confirm that LinkBoost exists on the 680i chipset, and that its function is indeed to increase the PCI-E speed when an NVIDIA card is used, unless you take direct control of that frequency.
    Serenity:
    Core2 E6600
    Abit IN9 32X-MAX
    Corsair PC2-6400C4D
    2x BFG OC2 8800GTS in SLI
    Dell 3007WFP-HC

  17. #92
    Xtreme Enthusiast
    Join Date
    Aug 2007
    Location
    Madison, TN
    Posts
    934
    Quote Originally Posted by Tuvok-LuR- View Post
The whole thing is not based on RivaTuner's readings (which are actually wrong as well); those only raised the first doubts, which were then confirmed by the fill rate benchmarks, so there's no way this can all be false.
I don't see how this can be a good thing for users: if they are going to set an auto overclock on their cards, it means they can all run at that speed... and if they can all run at that speed, why not release the cards with that clock?
I can only see one reason that NVIDIA would do this and then keep it quiet: to try to sell more cards, or more likely to increase chipset sales. The unknowing public would think they had to have an NVIDIA-based motherboard to get the most from their video card.

I see no problem in doing this. In many respects it's a nice feature, but they should have been up front about it. Personally, I'd rather OC the card myself.

  18. #93
    Xtreme Legend
    Join Date
    Mar 2005
    Location
    Australia
    Posts
    17,242
The 8600/8700 mobile GPUs are the same, as per the screenshot.

    Team.AU
    Got tube?
    GIGABYTE Australia
    Need a GIGABYTE bios or support?



  19. #94
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by NotFred View Post
Please re-read the article. Your question "since when did LinkBoost increase the card clocks" (in all the different ways you phrase it) is answered: the answer is "since the 9600GT was released", and that was in fact the whole point of the article. They even test this with an 8800GT and show that the 8800GT shows no change with PCI-E frequency; only the 9600GT does.

As far as this being a major tweak goes, it allows no more overclocking than could be achieved using normal methods, so I can hardly see any advantage.

The disadvantage I can see is that in many reviews it will make the card look like it performs far better at stock settings than it really does, as the NVIDIA chipset will automatically overclock the card, potentially to the point of instability for some samples. It would also, in a review of chipsets, make an NVIDIA chipset look like it performs far better than an Intel one, simply by automatically applying the same overclock that could be applied manually on an Intel platform.

I can confirm that LinkBoost exists on the 680i chipset, and that its function is indeed to increase the PCI-E speed when an NVIDIA card is used, unless you take direct control of that frequency.
Well, assuming that LinkBoost increases the PCI-Express frequency by 25%:

650 MHz × 1.25 = 812.5 MHz

Holy crap, 9600GTs have been running at 812.5 MHz during reviews!

Do you really believe it, or do you WANT to believe it?

And, oh, I don't know about you, but my EVGA 680i SLI doesn't have the LinkBoost option, so can you still confirm it exists? Did you know it was an option in the first BIOS for 680i boards? And now it is not?

I can confirm I had 2 8800GTs in SLI and the PCI-Express frequency was 100 MHz on slots 1 and 2, so LinkBoost still exists?

Come on man, do you really think those cards were running at an 800 MHz core? That would be awesome, wouldn't it?

Peace
    Are we there yet?

  20. #95
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    102
    Quote Originally Posted by Luka_Aveiro View Post
You CAN overclock the shaders unlinked from the core frequency.

I cannot help you with NiBiTor though; hope you find your way.



Because if they released it with a higher stock core clock, then raising the PCI-Express frequency would push the real core clock even higher, leading to crashes.

It is always a nice thing to know, and it might be helpful to know your limits and how to get past them. It is nice to know how the game is made, so you can play it well.
And it leads to people like me staying up until midnight on a work night, bashing their heads into the keyboard, thinking they got horribly binned cards because they didn't know about this little hook.
    EVGA 780I P02bios
    E8400@3.6
    BFG 9600gt SLI 710/1000
    2x2g GSkill ddr2
    Sceptre 20.1 naga
    Antec SP500/TT Sli psu

  21. #96
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    I'm confused here. Does this mean that the 9600GT has been having an unfair advantage over 8800GT in terms of core clocks?

  22. #97
    Registered User
    Join Date
    Apr 2005
    Location
    Portugal
    Posts
    65
Some 3dmark 2006 results (everyday Win XP, no special tweaks, LODs, etc.):

    3dmark links:

Core 675, PCI-E 110 MHz -> 11602 http://service.futuremark.com/compare?3dm06=5499640

Core 743, PCI-E 100 MHz -> 11523 http://service.futuremark.com/compare?3dm06=5499628

Core 675, PCI-E 100 MHz -> 10971 http://service.futuremark.com/compare?3dm06=5499566

Core 743, PCI-E 110 MHz -> won't run!

It's really faster than the 8800 GTS 320 MB!
    Last edited by destr0yer; 02-29-2008 at 04:58 PM.
    Phenom II X4 805 @ 3500 | Phenom II X4 965 BE @ 4 ghz cooled by Noctua NH-U12P - ASUS M4A79T dlx - Asrock M3790GX - 3x 2048 Gskill Trident 2000 - 3x 2048 OCZ platinium 1600 - ASUS 4870 X2 TOP - Corsair TX850

  23. #98
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by op1e View Post
And it leads to people like me staying up until midnight on a work night, bashing their heads into the keyboard, thinking they got horribly binned cards because they didn't know about this little hook.
You are right about that. I would be pissed off as well if that happened to me, but I guess I would try a "load default values" in the BIOS and then test it, as I have already done with some cards. Still, I would never have figured out that the internal clock was tied to the PCI-Express frequency and that it was holding me back.

    Quote Originally Posted by jaredpace View Post
    I'm confused here. Does this mean that the 9600GT has been having an unfair advantage over 8800GT in terms of core clocks?
If the PCI Express frequency is above 100 MHz, probably YES.
    Are we there yet?

  24. #99
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    eu/hungary/budapest.tmp
    Posts
    1,591
    Quote Originally Posted by WeStSiDePLaYa View Post
I don't think TPU knows much, and they are just spreading misinformation.
Nice. Some days ago you were saying that johnnyGURU doesn't know much, today you are saying w1zzard doesn't know much, and maybe tomorrow you'll be saying Charles is a noob.

    Quote Originally Posted by grimREEFER View Post
Here's the part I don't understand: don't reviewers leave the PCI Express frequency at stock?
It's the chipset automatically increasing the clock; that's what started this whole debate.

    Quote Originally Posted by Cybercat View Post
    Um, they would need a 590i chipset to do it! Not even all 680i chipsets support Linkboost (if any)! Everyone else is running the card at stock speeds.

    You are blowing this WAY out of proportion.
    I think you have some reading up to do... start with the article, for example.
    Usual suspects: i5-750 & H212+ | Biostar T5XE CFX-SLI | 4GB RAndoM | 4850 + AC S1 + 120@5V + modded stock for VRAM/VRM | Seasonic S12-600 | 7200.12 | P180 | U2311H & S2253BW | MX518
    mITX media & to-be-server machine: A330ION | Seasonic SFX | WD600BEVS boot & WD15EARS data
    Laptops: Lifebook T4215 tablet, Vaio TX3XP
    Bike: ZX6R

  25. #100
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
Wait, isn't LinkBoost similar to what ASUS PEG Link did? From my understanding (correct me if I am wrong here), LinkBoost bumps the PCI-Express clock from 100 MHz to 125 MHz. It also increased the PCI-E link from 2500 MHz to 3125 MHz and the SPP<->MCP HT bus from 1000 MHz to 1250 MHz.

    And

Why are people saying the 780i doesn't use LinkBoost when The Tech Report's review clearly implies that the 780i is using "an extreme version" of LinkBoost?

    Using the nForce 200 seems like a convoluted way to bring PCIe 2.0 connectivity to the 780i SLI. New chipsets from AMD and Intel put PCIe 2.0 right into the north bridge and offer full end-to-end 5.0GT/s signaling rates without the need for a third chip. So why is Nvidia using the nForce 200? I suspect it's because the nForce 780i SLI SPP isn't really a new chip at all. Nvidia MCP General Manager Drew Henry told us the 780i SLI SPP is an "optimized version of a chip we've used before," suggesting that it's really a relabeled nForce 680i SLI SPP.

    If you recall the last couple of Nvidia SPP chips, you'll remember a feature called LinkBoost, which cranked up the link speed for the chipset's PCI Express lanes. Nvidia was adamant that this wasn't overclocking since the chipset had been fully validated to run at higher speeds. I think we're seeing an extreme version of LinkBoost in action here, with the 780i SPP simply being a 680i SPP whose 16-lane PCIe 1.1 link has been coaxed into running at 4.5GT/s and validated at that speed. This approach would be fitting considering that second-generation PCI Express is really just gen one cranked up to a faster signaling rate. But it's a shame Nvidia didn't manage to nail 5.0GT/s on the button.
    techreport
    Last edited by Eastcoasthandle; 02-29-2008 at 05:06 PM.
