Before R700, this would have mattered. Now.... who cares?
Extelleron, by the time Nehalem is out in full force, R700 will be replaced by faster parts... So, if you think that because of one part (which, btw, I don't think ANYONE wants the headaches of quad-fire (or quad-SLI for that matter)) people are suddenly not going to want SLI ever again during the course of Nehalem's lifespan... that's just outright crazy talk. That's like thinking during the R300 days that NVidia would never make another good card.
As for NVidia forcing a chip onto Intel mobos, while it sucks, it's really not a surprise. I mean, Intel's pretty much forcing them out of the chipset game, so to keep SLI on Intel chipsets, NVidia has to make some profit somewhere if they're going to essentially leave the chipset market completely. They'd be stupid not to. On a side note, this will probably cause quite a jump in NVidia stock, seeing as it means NVidia will profit on a lot of Intel boards with very little money spent.
NVidia tax pictured :rofl::rolleyes:
well this sucks.
I read it would most likely not be implemented on the first X58 boards shipping in December, as most manufacturers have already completed and frozen the design of their first boards. So that will probably come a bit later, in 2009...
:p: :D :up:
I think this is a good business move for nVidia; they won't be losing too much money. In fact, they'll probably gain money, because Intel chipsets sell a lot better than nVidia's do.
There will probably be more than one SKU from mobo manufacturers.
I doubt you'd be forced to buy one that has the bridge. This is good for consumers - more options.
OK, clear this one up for me. If that chip were "useless", why would it even be on the 7-series chipsets and the 9800GX2? Is it just a marker chip to say "hey, it's OK to allow SLI on this system"? That seems like a costly way to go about that.
The reason I ask is really because of its presence on the 9800GX2. Chips get added to a graphics card sparingly, so it seems the chip is actually intended to form the bridge between the two cores in SLI rather than being "Nvidia DRM". Even AMD has been hard at work upgrading the bridge on its X2 series, because the communications lag between the two GPU cores affects the performance of CF as well.
I'll grant you that I may have gotten confused in all this, but I did think it served a purpose. So SLI "works" without it, but performs better with the BR04 chip.
All the high-end boards will most likely have them. And you know how the mobo companies work: a new chipset comes out and they release a semi-high-end board, then 1-2 months later they release a high-end one, and sometimes (Asus) the same interval after that they even release "the real high end" one.
For that price it almost doesn't make sense to have two separate lines now, since one board with the chip covers all the options. They would probably lose more money making two separate high-end board lines when that little add-on chip is the only difference.
Works for me; I was planning on jumping to the new platform soon after release anyway, postponing it just long enough to grab a second-generation board, which usually has a lot of fixes from things they learned on the first one but couldn't implement because the design was pretty much frozen.
Now, to be honest, like another poster said in this thread, I've also wondered whether this PCIe chip is just NVidia's version of a "have you got the CD in the drive?" check, or whether it makes SLI work better (knowing from misc sources that SLI could work on any Intel chipset board, with no additional chip, if the NVidia drivers "allowed" it)... I've never read anything on the topic, actually...
SLI worked with PCI-E 1.0 back on the i975. No bridge chip was needed. It was broken in drivers, and nVidia had the nerve to try and blame Intel for chipset instability, LOL! The real use of the bridge chip for SLI is that, unlike on Intel and AMD chipsets, the second PCI-E 16X slot is routed through the south bridge or MCP part of nVidia's chipset. Even after nVidia moved to 32X they needlessly held on to the BC. Some folks wondered why; now we know.
Since nVidia, Intel and/or AMD don't need to route through the lower end of the chipset, there's ZERO need for this chip other than DRM, or an nVidia tax for SLI. For two 16X slots even nVidia doesn't need it, see for yourself:
http://www.bit-tech.net/hardware/200...0i_ultra_sli/1
No one else routes any of the PCI-E like nVidia does.
http://www.bit-tech.net/hardware/200...pset_preview/1
http://www.hexus.net/content/item.php?item=13268
http://www.firingsquad.com/media/art...ge.asp/2127/01
Newer or older, none of them use any kind of bridge chip. No one but nVidia's fans should see this as OK, since it's a tax on the whole market. Even folks buying nVidia boards will have to pay for an unnecessary part.
I mean, geesh, a whole new way to squeeze the consumer. Imagine a new Intel or AMD coprocessor that serves no purpose but must be present to boot the processor.:down:
OK, first off, in the back of my mind I'm thinking a $30 difference in the cost of a mobo these days isn't the end of the world. I think many folks spend more than that on tubing...
But back on the subject at hand. Whether it's microstuttering or simply scaling with 2-4 GPUs, there are performance issues that I "think" the chip was meant to deal with. Said differently, SLI works without this chip on any chipset, but has issues that we didn't pay any attention to in the past. Now, when SLI/CF "issues" are getting more attention, this chip is meant to help out. Tech Report indicated that it was supposed to reduce traffic over the FSB by replicating certain commands.
I'm not really able to find lots more on the chip...yet...
Those are the commands being referred to. Quote:
It includes the same two technologies: the Posted-Write Shortcut (PWShort), which allows the GPUs in SLI to remain in sync with fewer commands, cutting out the need to visit main memory, and GPU Broadcast, with which the CPU only has to send the data once to address multiple GPUs.
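To picture what that broadcast part buys you, here's a toy model (the names and numbers are mine, purely for illustration; this is not nVidia's driver code):

Code:
/* Toy model of the "GPU Broadcast" idea quoted above.
 * The command count is a made-up figure, just to show the ratio. */
#include <stdio.h>

#define NUM_GPUS 2
#define SYNC_COMMANDS_PER_FRAME 1000  /* hypothetical */

int main(void)
{
    /* Without broadcast: the CPU writes each command to each GPU
     * separately, so every command crosses the bus once per GPU. */
    long unicast = (long)NUM_GPUS * SYNC_COMMANDS_PER_FRAME;

    /* With GPU Broadcast: the CPU sends each command once and the
     * bridge chip fans it out to every GPU behind it. */
    long broadcast = SYNC_COMMANDS_PER_FRAME;

    printf("unicast:   %ld CPU-side bus transactions per frame\n", unicast);
    printf("broadcast: %ld CPU-side bus transactions per frame\n", broadcast);
    return 0;
}

With two GPUs that halves the CPU-side traffic for those sync commands; with quad-SLI the saving would be bigger still, which may also be why the chip shows up on the GX2 cards.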
LOL, you can run an SLI config on an Intel mobo, heck, even on an ATi mobo, with a hacked/modded driver, so it's quite clear that SLI is truly software based, and that bridge chip is just another means for nVidia to extort some money from enthusiast users, because their Intel chipset business is basically dead with the emergence of the Nehalem CPU generation.
Sounds like this chip is gonna make them even more unpopular than the nForce chipsets did...:D Even if the additional cost is minimal on a mobo, with the chip being basically useless, it's just unacceptable :down:
http://bp1.blogger.com/_kttUtY0Z_zw/...bloomfield.gif
According to that diagram, those commands would instead have to go back through the IOH, across to the core, then to the memory controller and out to memory, and then back again. Seems to me even two regularly used commands for syncing GPUs operating in SLI would be helped by not going through that entire path.
Am I just out in left field?
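Just to spell out what I mean, here's the round trip as I read it off that diagram (the hop labels are my own, and this only counts steps; it says nothing about actual latency):

Code:
/* Hop count for one GPU-to-GPU sync write, with and without the
 * bridge chip's posted-write shortcut. Hop labels are my own reading
 * of the Bloomfield diagram, not anything official. */
#include <stdio.h>

static void show(const char *label, const char *path[], int hops)
{
    printf("%s (%d hops):", label, hops);
    for (int i = 0; i < hops; i++)
        printf("%s%s", i ? " -> " : " ", path[i]);
    printf("\n");
}

int main(void)
{
    /* Without the shortcut: the write takes the long way through
     * the CPU and main memory. */
    const char *long_path[] = {
        "GPU0", "IOH", "QPI", "CPU/memory controller", "DRAM",
        "CPU/memory controller", "QPI", "IOH", "GPU1"
    };
    /* With PWShort: the bridge chip forwards the posted write
     * directly between the GPUs. */
    const char *short_path[] = { "GPU0", "bridge chip", "GPU1" };

    show("long way", long_path, (int)(sizeof long_path / sizeof *long_path));
    show("shortcut", short_path, (int)(sizeof short_path / sizeof *short_path));
    return 0;
}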
Er. We ALL already know this. Nvidia can't make mainboards properly, so this is the closest thing to them building an SLI-capable board without actually doing it.
They make money from the bridge chips, and we don't have to suffer the useless crap they box up, overprice, and then sell to us. Everyone wins. :D
What I'm not clear on is whether anyone has tested a performance difference between getting SLI to "work" on another chipset without the bridge and running it on a chipset WITH the bridge. The distinction I'm trying to make is that getting something to "work" and getting similar performance out of that method compared to the official one are two different things. Up till now, the only thing we've ever worried about is getting it to work. What if that "working" was pretty pathetic but we never knew it?
Go back to the beginning days of CF and SLI and no one cared as long as it worked and you got FPS out of it. Now the subtleties, the "issues" if you will on image quality and performance/$ are coming out and we are becoming more concerned with "how" the product works, not just THAT it works.
*the fine print*
Take these thoughts with a strong dose of salt. SLI could work with multiple screens. SLI could work on other chipsets. At least 50% of my mind is utterly sure this is marketing BS, but I'm asking the questions because that may not be 100% of the story.
That's just what I was talking about in one of my posts above: I've never read any comments or tests on the topic. No tests, nothing...
edit: the only platform where it would have been possible to test this easily, though expensively, is the 5400 chipset, since the Intel Skulltrail version has the NVidia chip and the Asus one doesn't...
Expensive test but yes that's the kind of thing we'd have to do.