source:http://www.techpowerup.com/reviews/N...Shady_9600_GT/
Another Nvidia scam for sales. But because they make billions, nothing will ever be done about it.
Nice find.
This now explains the case where 9600 GT SLI on one Nvidia SLI board is better (in some games) than one HD 3870X2 on the same Nvidia board:
Quote:
It is certainly nice for NVIDIA to see their GeForce 9600 GT reviewed on NVIDIA chipsets with LinkBoost enabled where their card leaves the competition behind in the dust (even more). Also it could send a message to customers that the card performs considerably better when used on an NVIDIA chipset? Actually this is not the case, the PCI-Express frequency can be adjusted on most motherboards, you will see these gains independent of Intel/AMD CPU architecture or Intel/NVIDIA/AMD/VIA chipset.
Interesting; this has huge implications for the review much discussed in this thread:
Clicky
So what are they doing then, basically automatically overclocking their cards when used on nForce chipsets? It would be somewhat amusing if this made Nvidia cards and chipsets incompatible (for cards that have little to no overclocking headroom in them... there are always some...)
So what are the actual default clocks for 9600GT?
Or it could just be that RivaTuner reads the card wrong?
Nvidia drivers and GPU-Z always read the card the same, but on every card I have, at least one of the clocks reads differently in RivaTuner's hardware monitor...
If they are all emanating from the same source, then who is wrong?
My shader clocks have been off by as much as 30 MHz on all the G80s/G92s.
I haven't seen such a huge discrepancy in core clocks though.
From what it looks like, they're saying a stock 650 MHz 9600 GT should read about 708 MHz in RivaTuner then?
Slightly misleading of nVidia, but not so much of a problem once people are aware of it...
But it implies that every 9600 GT can handle a 25% overclock on stock volts.
You put a 650 MHz 9600 GT in a LinkBoost-enabled board and it clocks it to 125/100 * 650 MHz = 812.5 MHz... is that gonna work????
Or are a lot of nForce 590i boards gonna be mysteriously buggy while running 9600 GTs in SLI :rofl:
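To put some numbers on it, here's a rough sketch assuming the core clock really does scale linearly with the PCI-E reference clock, as the techpowerup article claims (the 125 MHz figure is what LinkBoost reportedly sets, 100 MHz is stock):
Code:
# rough sketch: assumes the 9600 GT core clock scales linearly with the
# PCI-E reference clock (per the techpowerup findings); not an official formula
def effective_core_clock(stock_mhz, pcie_mhz, pcie_stock_mhz=100.0):
    return stock_mhz * pcie_mhz / pcie_stock_mhz

print(effective_core_clock(650, 100))  # 650.0 MHz - what the driver reports
print(effective_core_clock(650, 125))  # 812.5 MHz - with LinkBoost pushing PCI-E to 125 MHz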
I believe LinkBoost has been gone since the 680i chipset was launched.
According to techpowerup, it depends on the PCI-E frequency:
http://img264.imageshack.us/img264/9220/pciaa3.jpg
regards
So to sum up, every 9600 GT review on an NV chipset with LinkBoost enabled should be trashed?
Is the G94 the only affected chip so far?
This explains a whole lot.
Interesting... I thought its awesome performance could be due to a tweaked architecture, or maybe came at the cost of IQ, but never because of this.
Very nice find, dude... a bad trick from nV :shakes:
LinkBoost has been around for quite some time... ATi has a form of it in their RD600 as well. All it does is overclock the PCI-E bus automatically when it senses the card brand of its choice. This is merely Nvidia finally making a board that can take advantage of the PCI-E bus clocking feature that's been around for almost two years now.
So, if this is true and not shens, does that mean the cards in reviews where they overclock and still hit 800 MHz+ are actually closer to 1000 MHz? After all, 850 MHz * 1.25 would be 1062.5 MHz.
Calling this shady, I don't know about that one. It boosts the card's performance for the end user without any worry of voiding the warranty or requiring any work at all. That's not shady, that's called increasing performance. Nothing wrong with that at all.
Is LinkBoost related only to SLI'd cards, or also to a single card?
The performance increase has nothing to do with LinkBoost itself, with wider bandwidth, or with a higher PCI-E frequency; it's just a GPU overclock.
Actually, there's no problem in bringing the average Joe an auto-overclocking, dummy-proof performance increase.
What makes it a cheat and a **** is:
- it's not documented/advertised by nvidia
- the driver reports the non-overclocked frequency
So basically it looks like Nvidia wants to hide this and make people think their cards at stock frequencies are faster than they actually are.
I hope this is implemented on 9800GX2 and GTX :D
I think it's pretty sweet. :party:
A lot of folks buy OC'd cards and pay extra for them; this is basically no different, other than not paying extra for an OC edition card.
It's free performance so I don't personally see anything wrong or shady about it.
The default clock is 650 MHz:
http://www.techarp.com/article/Deskt...idia_4_big.png
but in that article they must have been using an overclocked model of the 9600 GT
actual clock = 25 MHz (dependent on PCI-E frequency) * 29 = 725 MHz (reported by GPU-Z and RivaTuner overclocking)
there's also 27 MHz * 29 = 783 MHz from RivaTuner monitoring, which the author says is incorrect
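If that's right, the whole thing boils down to a 25 MHz step times a fixed multiplier. Here's an illustrative sketch; the "PCI-E reference divided by 4" step is my reading of those numbers, not anything nVidia documents:
Code:
# illustrative only: core clock = (PCI-E reference / 4) * multiplier,
# based on the numbers quoted above; the /4 divider is an assumption
def core_clock(pcie_mhz, multiplier):
    return (pcie_mhz / 4.0) * multiplier

print(core_clock(100, 26))  # 650.0 MHz - a reference-clocked card at stock 100 MHz PCI-E
print(core_clock(100, 29))  # 725.0 MHz - the overclocked model in that article
print(core_clock(108, 29))  # 783.0 MHz - the 27 MHz * 29 RivaTuner monitoring reading (said to be wrong)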
Quote:
Originally Posted by techpowerup