Is there any notable difference?
Is there any notable difference?
And how will these two perform in Tri-crossfire or Quad-crossfire?
Strange how little info is out on this stuff.
2.0 is capable of twice the bandwidth, hence the headroom to run tri and quad CrossFire. 1.0 will not support them.
If using only a single pcie 2.0 card in a 1.0 slot, will there be virtually no performance hit?
http://www.xbitlabs.com/images/news/.../3-way-sli.jpg
Quote:
Initially Nvidia plans to enable triple SLI support for the top-of-the-range GeForce 8800 GTX and Ultra graphics cards, however, eventually it may support 3-way configurations of other GPUs as well. Systems with three graphics cores will be powered by Nvidia nForce 680i as well as nForce 780i platforms, with the former supporting PCI Express 1.1/1.0a, whereas the latter will feature PCI Express 2.0 along with a special “BR04” switch for more efficient multi-GPU operation.
;)
call me crazy, but I still think we need to work on single cards more than on making dual/triple/quad card solutions work... :rolleyes: :cool: :p:
Someone did a test of an 8800GT at x16 and at x8 and there was <1% difference between the two. The difference between x16 1.0 and x16 2.0 should be even slimmer.
But you have 150W through the slot vs 75W in a normal PCI-E x16 slot.
Power-wise, 2.0 makes it possible to get rid of the 2nd power plug on the graphics card.
OK, so I'm not making a compromise if I buy a PCI-E 2.0 card and put it in my P965 chipset board with PCI-E 1.0?
correct @ above post
So if I put a pci-e 2.0 card in my 1.0, is it going to work how it's supposed to or is it going to have less bandwidth?
It'll be backward compatible with PCI-E 1.0; the 2.0 card will just be limited to the bandwidth the v1.0 slot is capable of providing, as I understand it--kind of like plugging SATA 3Gb/s drives into SATA 1.5Gb/s headers, although no hard drive saturates the original 1.5Gb/s pipe as of yet.
Plus, PCI-E 2.0 slots can deliver 150w of power through the slot itself as opposed to 75w on PCI-E 1.0 slots.
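To put those slot-power numbers in context, here's a minimal sketch using the limits quoted in this thread (75W from a 1.x slot, 150W from a 2.0 slot, and 75W per 6-pin plug); the figures and names are this thread's assumptions, not quotes from the spec:

```python
# Rough power budget for a graphics card, using the limits
# quoted in this thread (assumptions, not spec text):
#   PCI-E 1.x slot: 75W, PCI-E 2.0 slot: 150W, 6-pin plug: 75W.

SLOT_WATTS = {"1.x": 75, "2.0": 150}   # assumed slot delivery limits
SIX_PIN_WATTS = 75                     # per 6-pin PCI-E power connector

def card_budget(slot_gen, six_pin_connectors):
    """Total power available to a card from the slot plus its plugs."""
    return SLOT_WATTS[slot_gen] + SIX_PIN_WATTS * six_pin_connectors

# A card drawing ~110W needs a 6-pin plug in a 1.x slot (75W alone
# isn't enough), but could in principle run off a 2.0 slot by itself,
# which is the "get rid of the 2nd power plug" point made above.
print(card_budget("1.x", 1))  # 150
print(card_budget("2.0", 0))  # 150
```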
Is there a bench with PCI-E 2.0 compliant cards in a PCI-E 1.0 slot vs a 2.0 slot?
I highly doubt there are any cards out there saturating the bandwidth of a PCI-E x16 v1.0 slot at the moment.
Is the 8800gt the first ever PCI-E 2.0 consumer video card?
No, it's the first official PCI-E 2.0 card; the 2900XT does PCI-E 2.0 also, just as an AMD-approved OC.
I had the same exact question. Is an 8800GT card going to work in an ABIT IP35 Pro LGA 775 Intel P35 ATX motherboard?
I just read in an internet review of the 8800GT that it's compatible with a PCI-E slot.
The difference between PCI-E 1.x and 2.0 is that 1.x signals at 2.5GT/s per lane and delivers 75W through the slot, while 2.0 signals at 5GT/s and can deliver 150W, and most cards with external power will be PCI-E 1.x compatible.
Hmm, someone here found this and I grabbed it... sorry, I can't remember your username, but thanks anyway.
The big question, if this is correct http://www.bit-tech.net/news/2007/06...ports_pcie_2/1, makes me wonder: if P35 is PCI-E 2.0 compatible, are all the P35-based motherboards PCI-E 2.0 ready? I mean, is it possible to enable PCI-E 2.0 via a BIOS update or something like that? I'd say that would be cool, even if the advantage isn't so big.
ps. waiting for my Q6600 G0 :rocker:
It's from June, so if there's still no news about it, it's safe to say P35 is PCI-E 1.1 only (not 1.0--I don't think that even reached production mobos...). And PCI-E 2.0 supports 150W, not 200W. Finally, for any posters in the future that AGAIN feel the need to ask: yes, cards designed for either slot are compatible with both standards.
you will not lose any performance whatsoever by running a PCI-E 2.0 card in a PCI-E 1.1 slot.