I did read the article. You have to manually change the PCIe frequency for this to kick in. Otherwise Linkboost will do it, but again, not a lot of reviews would be affected by that.
I can confirm that on my Abit IN9 32X-MAX board, Linkboost is an option (with latest bios) and does influence pci-e frequency if this is not set to manual control. What else can anyone do apart from test things on the hardware they have access to or simply repeat things they have read elsewhere?
No, I do not believe these cards have been running at 800-odd MHz during reviews; I believe they have been running at the speeds stated in the review, which were notably different from the values advertised for the card. I believe this is due to a different PCI-e bus speed. Others on this thread seem to have confirmed this behaviour, which is not seen on previous cards.
So they disabled it on 680i and left it on 780i, to make people think the 780i has some real advantage apart from PCI Express 2.0?
lmao would be too lame if true
This could explain why, when going from a 9600GT to 9600GT SLI, the performance looks soooo much better than SLI setups on the 7 or 8 series.
Perhaps this is why GeForce 9 SLI setups scale so well? Nvidia looks shadier and shadier as each day passes leading up to this launch.
They shouldn't even call it a launch, more like a "squirt" or something.
So clocks go up with the PCI-e MHz?? Is this what's happening??
I don't see a real problem here, just gives easy overclocking for novice users?
I would rather be able to set the clock myself, then run the highest possible PCIe frequency though, for a few extra 3DMark points. ;)
So let me get this clear... if I am reading this correctly:
It's a pretty good card (for the price) with an auto-overclock "feature" when used with compatible boards? However, this "feature" can lead to overclocks that are too high to be stable?
But you can just not use the auto "feature" and manually overclock it as much as you can until it becomes unstable, then back it down till it's stable?
Does that make sense?
-yonton228/timmy
That's pretty interesting. Gonna try the stock clocks (675) with the 110 bus. Can't seem to do much better than 750 right now (700 set in Riva).
UPDATE: Ran at 680 (read 734) and saw some artifacting, but posted my highest 06 so far, breaking the 17k barrier. Overall it was less than a 100-point increase from what I was running before (700, shown as 756, with PCI-E at 100). Maybe the artifacting was due to the vram; I'm not sure how far the mem on these can go. Currently at 1100.
Well, at least with this discovery we can now set the GPU to intermediate clocks instead of going by steps... find the limit at a 100 MHz bus, then slowly raise the bus freq.
i.e. limit at 100 MHz -> 25*30 = 750 (775 unstable)
raise bus to 101 -> 25.25*30 = 757.5 (stable)
raise bus to 102 -> 25.5*30 = 765 (unstable)
raise bus to 105 -> 26.25*29 = 761.25 (stable! you fine-tuned your GPU clock and gained 11.25 MHz ^^)
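In case it helps, here is the same math as a tiny Python sketch. It assumes the core clock really is derived as (PCI-e bus / 4) x a fixed multiplier, which is what the numbers above suggest; the multiplier values are just illustrative, not something read off the card.
Code:
# Minimal sketch of the clock math above; assumes the 9600GT core PLL multiplies
# (PCIe bus clock / 4) by a fixed integer multiplier. Values are illustrative.

def core_clock(pcie_mhz: float, multiplier: int) -> float:
    """Effective core clock in MHz for a given PCIe bus clock and multiplier."""
    return (pcie_mhz / 4.0) * multiplier

print(core_clock(100, 30))  # 750.0  -> limit found at 100 MHz bus (775 unstable)
print(core_clock(101, 30))  # 757.5  -> stable
print(core_clock(102, 30))  # 765.0  -> unstable
print(core_clock(105, 29))  # 761.25 -> stable, +11.25 MHz over the 100 MHz limit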
Part of the issue is that Nvidia only allowing SLI on their chipsets means reviewers are required to use the Nvidia chipsets to test. When you place an ATI card in the board it does not get the boost, whereas the Nvidia card does. Had the tests been on Intel chipsets (like the overwhelming majority of single cards), the 9600's scores would have been lower.
The point being made is that on Intel those boosts do not exist, and the consumer who buys the card based on the fact it beat the ATI card by a few percent winds up getting the slower card. It is shady like W1zzard said; is it wrong... no. They should have simply called it a feature and acknowledged it. Instead they deny that it is true.
The score of the Radeon HD3870X2 also gets bigger with a higher PCI Express frequency ...
Maybe it is a new standard - read the reference clock from PCIe instead of from a crystal on the board ...
I don't see this as a problem, more of an added feature. Judging from what some of you have posted, it's a nice way to establish a base overclock prior to tweaking. Think of it this way: how many of you will set up your overclock in the motherboard BIOS and then use SetFSB, MemSet etc. to gain that little bit extra for a SuperPi run? I would consider this to be similar, and just like clocking your FSB past its boot limits, if you go too far with this you'll crash. However, once you're in Windows it may actually be a benefit to balance the PCIe overclock with a driver-level overclock and get better results. I wish I'd waited now and got the 9600 instead of 8800s, but I'd like to see some tweaking to find out whether a PCIe- and driver-balanced overclock can get you a better 3DMark score.
Free performance? Didn't the end user buy the card to start with?
So it's not free; it's like having a card at stock speeds then OCing it yourself. It's not free performance, you just unlocked some of what's already there, which you paid for when you handed over your money :)
It's time to stop buying Nvidia boards.
Tests should be fair and offer the same variables when testing.
Unless they do, performance can't be known.
I will from now on never trust a review using Nvidia boards.
Honestly, I have LinkBoost DISABLED on every nForce board! Because some graphics cards hate it ... any experienced user turns this feature off every time. And since this is the first GFX card that takes its reference clock from PCIe, I don't want to sell my nForce boards ...
What they are saying is that NForce 680i (the silicon, not the platform) can handle a much higher PCI-E frequency than 100MHz. On the 590 and 680i platforms LinkBoost took advantage of this; on the 780i, instead of overclocking PCI-E to 125MHz when an Nvidia card is present, they overclocked PCI-E to 180MHz and connected an NForce 200 chip. They ditched LinkBoost in the 780i in order to get enough bandwidth out of PCI-E 1.1 to handle two PCI-E 2.0 cards.
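Rough back-of-the-envelope on that, purely under the assumption that the per-lane transfer rate scales linearly with the PCI-E reference clock (figures are illustrative, not from any Nvidia doc):
Code:
# Assumes per-lane rate scales linearly with the PCI-E clock: 2.5 GT/s at 100 MHz.
def x16_bandwidth_gbps(ref_mhz: float, gt_per_s_at_100: float = 2.5) -> float:
    """Effective x16 bandwidth in GB/s (8b/10b encoding leaves 80% usable)."""
    per_lane_gt = gt_per_s_at_100 * (ref_mhz / 100.0)
    return per_lane_gt * 0.8 * 16 / 8   # GT/s -> usable Gb/s -> GB/s over 16 lanes

print(x16_bandwidth_gbps(100))       # ~4.0 GB/s - plain PCI-E 1.1 x16
print(x16_bandwidth_gbps(125))       # ~5.0 GB/s - LinkBoost on 590/680i
print(x16_bandwidth_gbps(180))       # ~7.2 GB/s - 780i link feeding the NF200
print(x16_bandwidth_gbps(100, 5.0))  # ~8.0 GB/s - true PCI-E 2.0 x16, for comparison
If that scaling holds, a 180MHz PCI-E 1.1 link lands close to PCI-E 2.0 x16 bandwidth, which fits the explanation above for why LinkBoost had to go.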
Quote:
Originally Posted by Eastcoasthandle
FYI, read this review. Made with an X38 chipset, compare XFX9600GT to HD3870. No nVidia tricks here :rolleyes:
That review was nothing but Nvidia tricks, get real! What kind of review doesn't even show CPU clock speeds, or use the 8.2 drivers which were available when the test was run? The 06 scores were 2,000 points higher than TPU's. That would be like me reviewing 3850s and showing that they do 23,000 in CrossFire in 06 without mentioning the overclock. Besides the CPU speed, they do not mention PCI-e clocks either.
Besides the obvious, what did you expect that review to prove to me? How is it in any way related to the subject of this thread? Did the author even mention the clock speed change..... NO. Quit spreading your crap and trying to draw people away from the issue. The issue here is clearly stated in W1zzard's review.
My friend did a test on an Intel motherboard, raising the PCI-e frequency with a 9600GT at stock.
It works on Intel P35 & X38 as well. 3DMark scores go up, yet the core clock reads the same:
OK, I've just done 2 benchies of 3DMark06. Here's the setup:
Q6600 @ 3.2Ghz
2048MB DDR2 XMS2
9600GT @ Stock 650/1625/900
P35C DS3R Rev 1.1 Intel P35 chipset.
Scores with PCI-e @ 100MHz
3dMark score 11527
SM2.0 4683
SM3.0 4387
CPU 5091
Score with PCI-e @ 110MHz
3dMark Score 12176
SM2.0 5003
SM3.0 4667
CPU 5081
So a little jump in performance there
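Just running the ratios on those numbers (and, if the same (PCI-e / 4) x multiplier behaviour applies here, 650 MHz stock would be 25 x 26, so a 110 MHz bus would imply roughly 715 MHz core - that part is my guess, not measured):
Code:
# Ratios of the posted scores; implied core clock assumes (PCIe/4) * multiplier.
runs = {  # (PCI-e 100 MHz, PCI-e 110 MHz)
    "3DMark": (11527, 12176),
    "SM2.0":  (4683, 5003),
    "SM3.0":  (4387, 4667),
    "CPU":    (5091, 5081),
}

print("implied core @ 110 MHz bus:", (110 / 4) * 26, "MHz")   # 715.0
for name, (base, oc) in runs.items():
    print(f"{name}: {100 * (oc / base - 1):+.1f}%")
# 3DMark +5.6%, SM2.0 +6.8%, SM3.0 +6.4%, CPU -0.2%
The GPU-bound sub-scores climb 6-7% while the CPU score stays flat, which looks a lot more like a GPU clock bump than extra bus bandwidth.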