Another Nvidia scam to drive sales. But because they make billions, nothing will ever be done about it.
Nice find.
This resolves the case of why 9600 GT SLI on an Nvidia SLI board is better (in some games) than one HD 3870 X2 on the same Nvidia board:
It is certainly nice for NVIDIA to see their GeForce 9600 GT reviewed on NVIDIA chipsets with LinkBoost enabled, where their card leaves the competition in the dust (even more). It could also send customers the message that the card performs considerably better when used on an NVIDIA chipset. Actually this is not the case: the PCI-Express frequency can be adjusted on most motherboards, and you will see these gains regardless of Intel/AMD CPU architecture or Intel/NVIDIA/AMD/VIA chipset.
Interesting; this has huge implications for the much-discussed review in this thread:
Clicky
So what are they doing then, basically automatically overclocking their cards when using nForce chipsets? It would be somewhat amusing if this made Nvidia cards and chipsets incompatible (for cards that have little to no overclocking headroom in them... there are always some...)
Serenity:
Core2 E6600
Abit IN9 32X-MAX
Corsair PC2-6400C4D
2x BFG OC2 8800GTS in SLI
Dell 3007WFP-HC
So what are the actual default clocks for 9600GT?
TAMGc5: PhII X4 945, Gigabyte GA-MA790X-UD3P, 2x Kingston PC2-6400 HyperX CL4 2GB, 2x ASUS HD 5770 CUcore Xfire, Razer Barracuda AC1, Win8 Pro x64 (Current)
TAMGc6: AMD FX, Gigabyte GA-xxxx-UDx, 8GB/16GB DDR3, Nvidia 680 GTX, ASUS Xonar, 2x 120/160GB SSD, 1x WD Caviar Black 1TB SATA 6Gb/s, Win8 Pro x64 (Planned)
Or it could just be that RivaTuner reads the card wrong?
Nvidia drivers and GPU-Z always read the card the same, but on every card I have, at least one of the clocks reads differently in RivaTuner's hardware monitor...
If they are all emanating from the same source, then who is wrong?
My shader clocks have been off by as much as 30 MHz on all the G80s/G92s.
I haven't seen such a huge discrepancy in core clocks, though.
From the looks of it, they're saying a stock 650 MHz 9600 GT should read about 708 MHz in RivaTuner then?
Mpower Max | 4770k | H100 | 16gb Sammy 30nm 1866 | GTX780 SC | Xonar Essence Stx | BIC DV62si | ATH AD700 | 550d | AX850 | VG24QE | 840pro 256gb | 640black | 2tb | CherryReds | m60 | Func1030 |
HEAT
Slightly misleading of nVidia, but not so much of a problem once people are aware of it...
But it implies that every 9600 GT can handle a 25% overclock on stock volts.
Put a 650 MHz 9600 GT in a LinkBoost-enabled board and it clocks it to 125/100 * 650 MHz = 812.5 MHz... is that going to work?
Or are a lot of nForce 590i boards going to be mysteriously buggy while running 9600 GTs in SLI?
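The arithmetic above can be sketched as follows. This is a minimal illustration under the thread's assumptions (a 125 MHz LinkBoost PCI-E clock and a core clock that scales linearly with it); the helper function is hypothetical, not an Nvidia API:

```python
# Effective 9600 GT core clock when the board raises the PCI-E
# reference clock, assuming the core clock scales linearly with it
# (as this thread suggests; hypothetical helper, not an Nvidia tool).
def effective_core_clock(stock_mhz: float, pcie_mhz: float) -> float:
    """Scale the stock core clock by the PCI-E clock relative to 100 MHz."""
    return stock_mhz * pcie_mhz / 100.0

# A stock 650 MHz card on a LinkBoost board running PCI-E at 125 MHz:
print(effective_core_clock(650, 125))  # 812.5
```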
Last edited by hollo; 02-29-2008 at 10:01 AM.
I believe LinkBoost was dropped when the 680i chipset launched.
Last edited by mascaras; 02-29-2008 at 10:15 AM.
[Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
[Review] ASUS HD4870X2 TOP » Here!! « .....[Review] EVGA 750i SLi FTW » Here!! «
[Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
[Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «
So to sum up, every 9600 GT review on an NV chipset with LinkBoost enabled should be trashed?
Is G94 the only affected chip so far?
Last edited by Tuvok-LuR-; 02-29-2008 at 09:47 AM.
3570K @ 4.5Ghz | Gigabyte GA-Z77-D3H | 7970 Ghz 1100/6000 | 256GB Samsung 830 SSD (Win 7) | 256GB Samsung 840 Pro SSD (OSX 10.8.3) | 16GB Vengeance 1600 | 24'' Dell U2412M | Corsair Carbide 300R
Last edited by mascaras; 02-29-2008 at 09:49 AM.
This explains a whole lot.
Interesting... I thought its awesome performance could be due to a tweaked architecture, or maybe came at the cost of IQ, but never because of this.
Very good find, dude... Bad trick from nV!
LinkBoost has been around for quite some time... ATi has a form of it in their RD600 as well. All it does is overclock the PCI-E bus automatically when it senses the card brand of its choice. This is merely NVidia finally making a board that can take advantage of the PCI-E bus clocking feature that's been around for almost 2 years now.
So, if this is true and not shens, does that mean the reviews where they overclock the cards and still hit 800 MHz+ are actually closer to 1000 MHz? After all, 850 MHz plus 25% would be 1062.5 MHz.
Calling this shady, I don't know about that one. It boosts the card's performance for the end user without worry of voiding the warranty or requiring any work at all. That's not shady, that's called increasing performance. Nothing wrong with that at all.
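The 850 MHz question above is just the same 25% scaling applied on top of a manual overclock. A quick sanity check of that arithmetic (an assumption from the poster's question, not a measured result):

```python
# Hypothetical: a manually overclocked 850 MHz card with the ~25%
# LinkBoost PCI-E bump applied on top (assumption from the question
# above, not a verified measurement).
manual_oc_mhz = 850
linkboost_factor = 1.25  # 125 MHz / 100 MHz PCI-E reference

print(manual_oc_mhz * linkboost_factor)  # 1062.5
```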
Is LinkBoost related only to SLI'd cards, or also to a single card?
The performance increase has nothing to do with LinkBoost itself, with wider bandwidth, or with a higher PCI-E frequency; it's just a GPU overclock.
Actually, there's no problem in bringing the average Joe an auto-overclocking, dummy-proof performance increase.
What makes it a cheat and a **** is:
- it's not documented/advertised by Nvidia
- the driver reports the non-overclocked frequency
So basically it looks like Nvidia wants to hide this and make people think their cards at stock frequencies are faster than they actually are.
Last edited by Tuvok-LuR-; 02-29-2008 at 10:11 AM. Reason: swearing
I hope this is implemented on 9800GX2 and GTX
I think it's pretty sweet.
A lot of folks buy OC'd cards and pay extra for them; this is basically no different, other than not paying extra for an OC edition card.
It's free performance, so I don't personally see anything wrong or shady about it.
Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
3x2048 GSkill pi Black DDR3 1600, Quadro 600
PCPower & Cooling Silencer 750, CM Stacker 810
Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
3x4096 GSkill DDR3 1600, PNY 660ti
PCPower & Cooling Silencer 750, CM Stacker 830
AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
2x2gb Patriot DDR2 800, PowerColor 4850
Corsair VX450
The default clock is 650 MHz:
http://www.techarp.com/article/Deskt...idia_4_big.png
But in that article they must have been using an overclocked model 9600 GT.
Actual clock = 25 MHz (dependent on PCI-E frequency) * 29 = 725 MHz (reported by GPU-Z and RivaTuner's overclocking tab).
There's also 27 MHz * 29 = 783 MHz from RivaTuner's monitoring, which the author says is incorrect.
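The figures in this post fit a simple model (an assumption pieced together from the numbers quoted here, not a confirmed G94 spec): the core clock is a multiplier times one quarter of the PCI-E reference clock, while RivaTuner's monitor appears to assume a fixed 27 MHz base instead:

```python
# Assumed clock model from the numbers in this post (not a confirmed
# G94 datasheet): core clock = (PCI-E reference / 4) * multiplier.
def core_clock(pcie_mhz: float, multiplier: int) -> float:
    return pcie_mhz / 4.0 * multiplier

# RivaTuner's monitoring seems to assume a fixed 27 MHz base instead,
# which would explain the bogus reading:
def rivatuner_reading(multiplier: int) -> float:
    return 27.0 * multiplier

# The overclocked review card: 100 MHz PCI-E, 29x multiplier.
print(core_clock(100, 29))    # 725.0 (matches GPU-Z)
print(rivatuner_reading(29))  # 783.0 (the incorrect monitor value)
```

Under this model, raising the PCI-E clock raises the core clock proportionally, which is exactly the LinkBoost effect discussed in this thread.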