A 25% overclock would instantly crash a lot of factory-overclocked 9600 GTs. We can pretty much assume Nvidia wouldn't allow that, so the 9xxx series probably isn't LinkBoost-enabled.
I don't really see what the problem is. When I saw the specs for the 9600GT, I knew Nvidia had to do a tweak or something to pull it off with this card. It's not like we're talking about the high end here; this is just a boost to a mid-level card to make it run decently. Manufacturers have used many similar tricks in the past to raise video card performance, and I never heard anyone complaining back then. As long as the card performs better with this feature, I'm all for it. And by the way, who said that RivaTuner is the ultimate tool for Nvidia cards? It's just a program made by a Russian dude to overclock cards. If no other tool out there confirms this, I'm not buying it...
3DMark06 Fill Rate (Multi-Texturing) with different PCI-E frequencies and the same clocks = different scores
http://www.xtremesystems.org/forums/...3&postcount=74
Btw:
3DMark06 with 675 MHz & PCI-E auto = 10900 marks
3DMark06 with "675 MHz" & PCI-E 110 MHz = 11600 marks
3DMark06 with 743 MHz & PCI-E auto = 11500 marks
3DMark06 with "743 MHz" & PCI-E 110 MHz (817 MHz) = crash
Last edited by mascaras; 02-29-2008 at 04:20 PM.
[Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
[Review] ASUS HD4870X2 TOP » Here!! « .....[Review] EVGA 750i SLi FTW » Here!! «
[Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
[Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «
For those who are asking themselves how to calculate the internal multiplier and final REAL clock, this might be helpful:
Using 9600GT reference clocks:
PCI-Express = 100 MHz
GPU = 650 MHz
Internal core clock = 100 MHz / 4 = 25 MHz
Internal core multiplier = 650 / 25 = 26

PCI-Express = 110 MHz
GPU = 650 MHz (fake, as you will see below)
Internal core clock = 110 MHz / 4 = 27.5 MHz
Internal core multiplier = 26 (it remains unchanged compared to the PCI-Express 100 MHz case; only if you manually overclock with RivaTuner or ATITool does it increase, by 1 point for every 25 MHz)
REAL CORE CLOCK = 26 x 27.5 = 715 MHz
A much easier way to calculate the real core clock, using the example above: 650 x 1.10 = 715.
The shader clock is not affected.
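To make the arithmetic above concrete, here is a minimal sketch in Python (the function name is mine, and it ignores the 25 MHz multiplier granularity mentioned above; it just restates the ratio):

```python
def real_core_clock(requested_mhz: float, pcie_mhz: float) -> float:
    """Effective 9600GT core clock when the multiplier is programmed as if
    the PCI-Express bus were running at the standard 100 MHz."""
    multiplier = requested_mhz / (100 / 4)   # multiplier chosen against a 100 MHz bus
    actual_reference = pcie_mhz / 4          # reference clock the card really sees
    return multiplier * actual_reference     # equivalent to requested_mhz * pcie_mhz / 100

print(real_core_clock(650, 100))   # 650.0 -> stock
print(real_core_clock(650, 110))   # 715.0 -> the hidden overclock described above
print(real_core_clock(743, 110))   # ~817.3 -> the crashing combination from earlier
```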
Last edited by Luka_Aveiro; 02-29-2008 at 04:19 PM.
HOLY S! I was wondering why I couldn't clock my cards much past 700. My PCI-E bus was set to 110 originally, then 105 as of right now. If this is the case, I would rather set it to 100 and clock the cards manually, so the shader clock gets raised as well. I noticed a discrepancy in the Hardware Monitor plugin, but brushed it off as a bug.
Also, I am trying to see if there's an option for a BIOS vmod, but I can't open NiBiTor in Vista x64. Any help would be appreciated.
EVGA 780I P02bios
E8400@3.6
BFG 9600gt SLI 710/1000
2x2g GSkill ddr2
Sceptre 20.1 naga
Antec SP500/TT Sli psu
The whole finding is not based on RivaTuner's readings (which are actually wrong as well); those only raised the first doubts, which were then confirmed by the fill-rate benchmarks, so there's no way this can all be false.
I don't see how this can be a good thing for users: if they are going to set an automatic overclock on their cards, that means the cards can all run at that speed... and if they can all run at that speed, why not release them with that clock?
Last edited by Tuvok-LuR-; 02-29-2008 at 04:31 PM.
3570K @ 4.5Ghz | Gigabyte GA-Z77-D3H | 7970 Ghz 1100/6000 | 256GB Samsung 830 SSD (Win 7) | 256GB Samsung 840 Pro SSD (OSX 10.8.3) | 16GB Vengeance 1600 | 24'' Dell U2412M | Corsair Carbide 300R
You CAN overclock the shaders unlinked from the core frequency.
I cannot help you with NiBiTor though; hope you find your way.
Because if they released it with a higher core clock, then raising the PCI-Express frequency would push the core clock even higher, leading to crashes.
It is always a nice thing to know, and it might help you understand your limits and how to get past them. It is nice to know how the game is made, so you can play it well!
Last edited by Luka_Aveiro; 02-29-2008 at 04:36 PM.
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
Have a look at their mobile GPUs, guys. They have a similar thing going on, and I don't think they change the PCI-E frequency on laptops, do you?
I don't feel there is any shady trick in this if it turns out to be the LinkBoost feature. Newer cards, starting from the 7900 GTX, have had the feature; it boosts by 25%. I'm not sure, but I think only Nvidia motherboards have this feature in their BIOS.
Originally Posted by AnandTech (source)
So some people probably have this feature enabled, or it may be that the drivers enable it when they are installed, after a hard restart.
I think it should show the exact clocks even after the clocks have been raised by the LinkBoost feature. Well, some software doesn't work correctly anyway.
Metroid.
Please re-read the article. Your question "Since when did LinkBoost increase the card clocks?" (in all the different ways you phrased it) is answered. The answer is "since the 9600GT was released", and that was in fact the whole point of the article. They even test this with an 8800GT and show that the 8800GT shows no change with PCI-E frequency; only the 9600GT does.
As far as this being a major tweak goes, it allows you no more overclocking than could be achieved using normal methods, so I can hardly see any advantage.
The disadvantage I can see is that in many reviews it will make the card look like it performs far better at stock settings than it really does, since the Nvidia chipset will automatically overclock the card, potentially to the point of instability for some samples. In a chipset review, it would also make an Nvidia chipset look like it performs far better than an Intel one, simply by automatically applying the same overclock that could be applied manually on an Intel board.
I can confirm that LinkBoost exists on the 680i chipset, and that its function is indeed to increase the PCI-E speed when an Nvidia card is used, unless you take direct control of that frequency.
Serenity:
Core2 E6600
Abit IN9 32X-MAX
Corsair PC2-6400C4D
2x BFG OC2 8800GTS in SLI
Dell 3007WFP-HC
I can only see one reason that Nvidia would do this and then keep it quiet: to try to sell more cards, or more likely to increase chipset sales. The unknowing public would think they had to have an Nvidia-based motherboard to get the most from their video card.
I see no problem in doing this. In many respects it's a nice feature, but they should have been up front about it. Personally, I'd rather overclock the card myself.
The 8600/8700 mobile chips are the same, as per the screenshot.
Well, assuming that LinkBoost increases the PCI-Express frequency by 25%:
650 MHz x 1.25 = 812.5 MHz
Holy crap, 9600GTs have been running at 812.5 MHz during reviews!
Do you really believe it, or do you just WANT to believe it?
And, oh, I don't know about you, but my EVGA 680i SLI doesn't have the LinkBoost option, so can you still confirm it exists? Did you know it was an option in the first BIOS for 680i boards, and that it is not there any more?
I can confirm I had two 8800GTs in SLI and the PCI-Express frequency was 100 MHz on slots 1 and 2, so does LinkBoost still exist?
Come on man, do you really think those cards were running at an 800 MHz core? That would be awesome, wouldn't it?
Peace
EVGA 780I P02bios
E8400@3.6
BFG 9600gt SLI 710/1000
2x2g GSkill ddr2
Sceptre 20.1 naga
Antec SP500/TT Sli psu
I'm confused here. Does this mean that the 9600GT has had an unfair advantage over the 8800GT in terms of core clocks?
Some 3DMark06 results (everyday Windows XP, no special tweaks, LODs, etc.)
3DMark06 links:
Core 675, PCI-E 110 MHz -> 11602 http://service.futuremark.com/compare?3dm06=5499640
Core 743, PCI-E 100 MHz -> 11523 http://service.futuremark.com/compare?3dm06=5499628
Core 675, PCI-E 100 MHz -> 10971 http://service.futuremark.com/compare?3dm06=5499566
Core 743, PCI-E 110 MHz -> won't run!
It's really much faster than the 8800 GTS 320 MB!
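Those scores line up with the multiplier math from earlier in the thread: 675 MHz on a 110 MHz bus works out to roughly the same effective clock as 743 MHz on a 100 MHz bus, which is why the two scores are nearly identical. A rough sanity check (the assumption that the score tracks the effective core clock is mine):

```python
def effective_clock(requested_mhz, pcie_mhz):
    # Same relationship as earlier in the thread: real clock = requested x PCI-E / 100
    return requested_mhz * pcie_mhz / 100

runs = [
    ("Core 675, PCI-E 110", 675, 110, "11602"),
    ("Core 743, PCI-E 100", 743, 100, "11523"),
    ("Core 675, PCI-E 100", 675, 100, "10971"),
    ("Core 743, PCI-E 110", 743, 110, "would not run"),
]
for label, core, pcie, score in runs:
    print(f"{label}: ~{effective_clock(core, pcie):.1f} MHz effective, score {score}")
```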
Last edited by destr0yer; 02-29-2008 at 04:58 PM.
Phenom II X4 805 @ 3500 | Phenom II X4 965 BE @ 4 ghz cooled by Noctua NH-U12P - ASUS M4A79T dlx - Asrock M3790GX - 3x 2048 Gskill Trident 2000 - 3x 2048 OCZ platinium 1600 - ASUS 4870 X2 TOP - Corsair TX850
You are right about that. I would be pissed off as well if that happened to me, but I guess I would try a "load default values" in the BIOS and then test it, as I have already done with some cards. Still, I would never have figured out that the internal clock was tied to the PCI-Express frequency and that it was holding me back.
If the PCI-Express frequency is above 100 MHz, probably YES.
Nice.
Some days ago you were saying that johnnyGURU doesn't know much, today you are saying w1zzard doesn't know much, and maybe tomorrow you'll be saying Charles is a noob.
It's the chipset automatically increasing the clock; that's what started this whole debate.
I think you have some reading up to do... start with the article, for example.
Usual suspects: i5-750 & H212+ | Biostar T5XE CFX-SLI | 4GB RAndoM | 4850 + AC S1 + 120@5V + modded stock for VRAM/VRM | Seasonic S12-600 | 7200.12 | P180 | U2311H & S2253BW | MX518
mITX media & to-be-server machine: A330ION | Seasonic SFX | WD600BEVS boot & WD15EARS data
Laptops: Lifebook T4215 tablet, Vaio TX3XP
Bike: ZX6R
Wait, isn't LinkBoost similar to what the Asus PEG Link did? From my understanding (correct me if I am wrong here), LinkBoost bumps the PCI-Express link from 100 MHz to 125 MHz. It also increased the PCI-E buses from 2500 MHz up to 3250 MHz and the SPP<->MCP HT bus from 1000 to 1250 MHz.
And
Why are people saying the 780i doesn't use LinkBoost when the techreport review clearly implies that the 780i uses "an extreme version" of LinkBoost?
Originally Posted by techreport
Using the nForce 200 seems like a convoluted way to bring PCIe 2.0 connectivity to the 780i SLI. New chipsets from AMD and Intel put PCIe 2.0 right into the north bridge and offer full end-to-end 5.0GT/s signaling rates without the need for a third chip. So why is Nvidia using the nForce 200? I suspect it's because the nForce 780i SLI SPP isn't really a new chip at all. Nvidia MCP General Manager Drew Henry told us the 780i SLI SPP is an "optimized version of a chip we've used before," suggesting that it's really a relabeled nForce 680i SLI SPP.
If you recall the last couple of Nvidia SPP chips, you'll remember a feature called LinkBoost, which cranked up the link speed for the chipset's PCI Express lanes. Nvidia was adamant that this wasn't overclocking since the chipset had been fully validated to run at higher speeds. I think we're seeing an extreme version of LinkBoost in action here, with the 780i SPP simply being a 680i SPP whose 16-lane PCIe 1.1 link has been coaxed into running at 4.5GT/s and validated at that speed. This approach would be fitting considering that second-generation PCI Express is really just gen one cranked up to a faster signaling rate. But it's a shame Nvidia didn't manage to nail 5.0GT/s on the button.
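To put the quoted numbers in perspective, here is a quick comparison (my own framing; the 4.5 and 5.0 GT/s figures come from the quote above, 2.5 GT/s is the standard PCIe 1.1 signaling rate, and the 25% figure is the LinkBoost bump discussed earlier in the thread):

```python
pcie_1_1 = 2.5                        # GT/s, standard PCI Express 1.1 signaling rate
classic_linkboost = pcie_1_1 * 1.25   # 3.125 GT/s, the ~25% LinkBoost bump
nforce_780i_spp = 4.5                 # GT/s, the "extreme version" per the quote
pcie_2_0 = 5.0                        # GT/s, a true second-generation link

print(f"Classic LinkBoost: +{(classic_linkboost / pcie_1_1 - 1) * 100:.0f}% over PCIe 1.1")
print(f"780i SPP link:     +{(nforce_780i_spp / pcie_1_1 - 1) * 100:.0f}% over PCIe 1.1")
```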
Last edited by Eastcoasthandle; 02-29-2008 at 05:06 PM.