According to reports, the NV200 SLI chip gets quite hot, yes.
It was the sole reason you needed to put a fan on the NB of the 780i mobo. It needs to be replaced by a more efficient chip.
Is it known exactly what this chip even does aside from enabling SLI? Does it do anything that improves performance over simply shipping drivers with SLI unlocked?
It does absolutely nothing other than ticking a notch in NVIDIA drivers to support SLI. That's why many here (me included) call them 'SLI tax chips'. They're just PCI-E bridge chips to uphold the 'SLI needs special NVIDIA SLI technology' facade, even though that was busted with hacked drivers years ago.
That's what I thought before hearing how hot it gets. WTH could make it heat up so much, and if it were just something to enable SLI, why would it be a silicon chip at all instead of some much cheaper to produce flash chip or something like that?
I mean, it's got to be doing some kind of workload to generate heat like that, unless it really is designed that badly?
They sure are highly priced tax chips. A single NV200 chip costs $30; you can get them for $20 each if you go "exclusive", and by that I mean that every X58 model you make must feature an NV200 chip.
Although the source I'm using for those prices is not exactly reliable :rolleyes:
Some options from the P6T Deluxe X58 BIOS
There are more options like Timings, CPU & IOH Clock Skews, DRAM Ref. Voltage, etc.
Quote:
CPU Voltage:
Min = 0.85000V
Max = 2.10000V(*)
Standard = By CPU
Increment = 0.00625V
QPI/DRAM Core Voltage:
Min = 1.20000V
Max = 2.10000V(*)
Standard = 1.20000V
Increment = 0.00625V
ICH PCIE Voltage:
Min = 1.50V
Max = 1.80V
Standard = 1.50V
Increment = 0.10V
IOH PCIE Voltage:
Min = 1.50V
Max = 2.78V
Standard = 1.50V
Increment = 0.02V
ICH Voltage:
Min = 1.10V
Max = 1.40V
Standard = 1.10V
Increment = 0.10V
IOH Voltage:
Min = 1.10V
Max = 1.70V
Standard = 1.10V
Increment = 0.02V
DRAM Bus Voltage:
Min = 1.50V
Max = 2.46V
Standard = 1.50V
Increment = 0.02V
CPU PLL Voltage:
Min = 1.80V
Max = 2.50V
Standard = 1.80V
Increment = 0.02V
CPU Differential Amplitude:
700mV ~ 1000mV (100mV steps)
BCLK Frequency (Internal Base Clock):
Valid input values = 100 - 500
DRAM Timing Mode:
- 1N (It might accelerate DRAM performance)
- 2N / 3N (It might enhance DRAM overclocking ability)
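If you want to sanity-check the step counts in the listing above, it's simple arithmetic. A throwaway sketch (the helper names and dictionary are made up for illustration, not any ASUS tool), using the (min, max, increment) values quoted from the P6T Deluxe BIOS:

```python
# Voltage ranges copied from the P6T Deluxe BIOS listing above:
# (min volts, max volts, increment per step)
RANGES = {
    "CPU Voltage":   (0.85000, 2.10000, 0.00625),
    "QPI/DRAM Core": (1.20000, 2.10000, 0.00625),
    "IOH PCIE":      (1.50, 2.78, 0.02),
    "DRAM Bus":      (1.50, 2.46, 0.02),
}

def steps(vmin, vmax, inc):
    """Number of selectable values from vmin to vmax inclusive."""
    return round((vmax - vmin) / inc) + 1

def is_valid(setting, volts):
    """True if 'volts' lands exactly on a BIOS step for 'setting'."""
    vmin, vmax, inc = RANGES[setting]
    if not (vmin <= volts <= vmax):
        return False
    k = round((volts - vmin) / inc)           # nearest step index
    return abs(vmin + k * inc - volts) < 1e-9 # tolerate float noise

print(steps(*RANGES["CPU Voltage"]))  # 201 selectable vcore values
print(is_valid("DRAM Bus", 1.64))     # True: 1.50 + 7 * 0.02
```

So vcore alone has 201 settings at 6.25mV granularity, which is why overclockers like this board's BIOS.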
damn, that bold white PCI slot ruins the motherboard :/
I wonder when PCI will finally die... so old...
775 + X58... how would you want to hook up the chipset to a 775 CPU?
You need QPI on the CPU, because there's no more FSB... I don't think X58 has an FSB interface... and even if it did, it doesn't have a memory controller like X48 does, so you'd end up with a CPU and a chipset both lacking a memory controller... that would slightly impact performance :P
12GB at 1333 runs fine, haven't tried more yet :D
I live in Taipei, I'm not travelling to Taiwan :D
I'm working for Foxconn and moved down here to make sure they get their OC boards done right :P heheh
Many people still use PCI sound cards, PCI TV cards, etc... nobody wants to piss off their customers, so they continue to offer PCI slots... :D
There's also a growing number of people who don't need any PCI slots but would have use for more PCI-e slots, like an x8 slot for a RAID card.
RIGHT! All the functionality uses the peer-to-peer writes protocol that's built right into the hardware already... the same one that CF uses. It seems effing stupid that we need a chip to authorize SLI for use on a mobo, but I guess NVIDIA wants to make their nickel any time SLI is used... apparently the sale of the second video card just isn't enough. :P
I guess I'll never understand people. Some time ago most people were saying something like: "If only we could have SLI on an Intel chipset." Now, after a lot of drama and almost a war between these companies, somehow it is possible. BUT blah, it sucks! Even though none of you have tried it out...
Perhaps it isn't a completely dumb chip and does some kind of bus scheduling between cards: dynamic lane allocation, transaction queueing, etc. It would need to be doing something like this to produce the stupid amount of heat it does. There's also the possibility it provides a crystal oscillator source as the external PLL clock generator reference for their more recent cards, since those do some weird things when the PCI-Express frequency runs below the 108MHz base clock their internal frequencies are generated from. All just speculation, but no chip creates that much heat without doing anything at all. A PLL clock generator would explain the amount of heat they produce. Shame NVIDIA are so secretive about these "useless" chips.
People were more interested in SLI on Intel boards when nVidia was in the lead and any serious gamer was using nVidia hardware. Now someone looking for the ultimate in performance is likely going ATI anyway.
It's nice to have the option of SLI on an Intel board, but it would be a lot nicer if the nF 200 chip did not have to be involved.
Who has access to a logic analyzer to figure out what these NF200 chips actually do while in SLI mode! :>
LOL, so true man, so true... I personally am going to buy whichever X58 has the nForce chip. SLI on Intel is awesome. Believe it or not, ATI won't be on top forever; just because ATI has a good year doesn't mean it's going to be like that forever. You think NVIDIA is going to sit and take the abuse??? lol, I don't think so. I'm going to love having the option of SLI and Crossfire.
I'll definitely be doing the same; I love the idea of having the option to use SLI if I want. I've never been much of a fan of NVIDIA chipsets. Why, you may ask? I've been using Intel chipsets for as long as I can remember and they have always been excellent. I'm not brand-specific, but they do what they're expected to do, and Intel designed the architecture, so why would I use such a critical component as the chipset from another company without the same long-term experience in that area?
I have always used NVIDIA cards since back in the day when they first introduced the TNT2 2D/3D cards; it was a whole lot easier than having a separate 2D card and 3D card (oh 3dfx, did we have good times!). NVIDIA have done a lot of dumb things over the years and had their own problems, but their cards have nearly always had good driver support, or at least good enough that I don't have a long-winded tale of despair and disappointment to sing to the tune of it.
I know I'll live with the slightly increased premium of an SLI-capable board, as I'm sure most of the others here will too, for the convenience and the option. The only real reason I can think of that would be worth making noise about is if whatever these bridge chips do causes instability for devices or components connected via the Southbridge. Otherwise we'll be happy campers with our Crossfire/SLI-compatible boards, and NVIDIA will no doubt be happy with the success of their SLI tax.
That's the way I view it. I won't want the SLI chip since I don't game much [no point in multi-GPU setups], but holy moly, if you do, then the option to have the best of both worlds is worth the extra cost and heat.
You could always take up reverse engineering and figure out a way to add specific chipset support, modify the detection routines in the drivers or plug in two cards and make believe you actually have SLI working ;>
We can't always have what we want, guess the saying goes "tough s**t". SLI without bridge chips is probably a pipe dream at least in the near future. Some have accepted this, others refuse to. I personally wouldn't lose any sleep either way. Just having SLI on an Intel board that works properly is more than I could have asked for.