Those are the same reviews as posted on page 1 of this thread. :p:
Let's see more coming in today (hopefully)....
This card has much more performance than the 8800 Ultra, more features, and it's cheaper.
Deal with it Mr. Intel/Nvidia biased :ROTF:
Quote:
Performance difference of the R680 relative to the 8800 Ultra
Bioshock 1280x1024 = 4% Slower
Bioshock 1920x1200 = 24% Faster
Bioshock 2560x1600 = 39% Faster
COJ 1280x1024 = 11% Faster
COJ 1920x1200 = 24% Faster
COJ 2560x1600 = 10% Faster
COJ 1280x1024 4AA 16AF = 13% Faster
COJ 1920x1200 4AA 16AF = 23% Faster
Lost Planet 1280x1024 = 27% Slower
Lost Planet 1920x1200 = 30% Slower
Lost Planet 2560x1600 = 37% Faster
Lost Planet 1280x1024 4AA 16AF = 18% Slower
Lost Planet 1920x1200 4AA 16AF = 18% Slower
Lost Planet 2560x1600 4AA 16AF = 10% Faster
Crysis 1280x1024 = 13% Slower
Crysis 1920x1200 = 2% Slower
Crysis 2560x1600 = 12% Slower
COD4 1280x1024 = 42% Faster
COD4 1920x1200 = 32% Faster
COD4 2560x1600 = 26% Faster
COD4 1280x1024 4AA 16AF = 27% Faster
COD4 1920x1200 4AA 16AF = 20% Faster
COD4 2560x1600 4AA 16AF = 16% Faster
NFS: Pro 1280x1024 = 32% Faster
NFS: Pro 1920x1200 = 38% Faster
NFS: Pro 1280x1024 4AA 16AF = 72% Slower
NFS: Pro 1920x1200 4AA 16AF = 67% Slower
Serious Sam 2 1280x1024 HAA 16AF = 30% Faster
Serious Sam 2 1920x1200 HAA 16AF = 45% Faster
Serious Sam 2 2560x1600 HAA 16AF = 78% Faster
UT3 1280x1024 = 7% Faster
UT3 1920x1200 = 24% Faster
UT3 2560x1600 = 37% Faster
F.E.A.R. 1600x1200 = 20% Faster
F.E.A.R. 2048x1536 = 20% Faster
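For anyone who wants to boil that list down to one number, here's a minimal Python sketch; the deltas are just the figures from the quote above, signed so that "Slower" is negative (a simple average across runs, not a proper geometric mean over raw FPS):

Code:

# Signed % deltas for HD3870X2 vs 8800 Ultra, copied from the quote above
# (positive = X2 faster, negative = X2 slower).
deltas = [
    -4, 24, 39,                    # Bioshock
    11, 24, 10, 13, 23,            # COJ
    -27, -30, 37, -18, -18, 10,    # Lost Planet
    -13, -2, -12,                  # Crysis
    42, 32, 26, 27, 20, 16,        # COD4
    32, 38, -72, -67,              # NFS: Pro
    30, 45, 78,                    # Serious Sam 2
    7, 24, 37,                     # UT3
    20, 20,                        # F.E.A.R.
]
print(f"mean delta: {sum(deltas) / len(deltas):+.1f}%")  # works out to about +12%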
But you can get a factory-OC'd GTX for about the same money though. Hopefully it's just driver problems.
As for the claim that Crysis can't use more than 128 shaders: what about people running 8800GTs in SLI? That's 224 shaders total, and they seem to get a big performance boost in SLI.
@ V-RR, FPS isn't everything; we need to know if there are any stutters while playing games, like with HD3870 CrossFire.
With HD3870 CF I get high max FPS and a good average FPS, but I also get a lot of stutters! (That's the real "problem" of CF & SLI.)
Average FPS and max FPS aren't the most important things to me; what matters is whether games are 100% playable without stutters (driver optimizations), and unfortunately we can't see that in the reviews!!
The HD3870X2 has a new chip, the PEX8547, that's supposed to optimize the CrossFire; let's wait for more game tests (from XS users).
regards
The FPS Labs review uses a Foxconn N68S7AA (nvidia 680i) SLI motherboard, which I believe uses PCIe 1.1, not 2.0 (correct me if I am wrong). From what I read, the 3870X2 needs a PCIe 2.0 slot. Also, this is no different than the 7950GX2 vs X1950XTX last time around.
http://i11.photobucket.com/albums/a1...1916224385.jpg
Source
Quote:
As you know, the ATI Radeon HD 3870 X2 board carries two graphics processors. According to the available information, the reference ATI Radeon HD 3870 X2 board will be fitted with 1 GB of GDDR3 memory running at 2 GHz, with each processor connected over a 256-bit bus. The product is designed for the PCI Express 2.0 bus, with a special bridge chip handling that interconnect. However, it seems the chip used in this role will be the PEX8548, not the PEX6347 as expected. In the diagram shown, the bridge sits at the centre of the board between the two processors, and four memory chips can be found near each processor.
The following illustration shows that the transition from PCI Express 1.1 to PCI Express 2.0 provides a noticeable increase in performance - around 20-30%, depending on the application and the screen resolution.
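For context on why the slot matters here, the raw bandwidth numbers are easy to derive; a minimal sketch using the standard PCIe figures (2.5 GT/s per lane for 1.1, 5 GT/s for 2.0, both with 8b/10b encoding):

Code:

# Theoretical per-direction bandwidth of a PCIe x16 link.
# PCIe 1.1: 2.5 GT/s per lane, PCIe 2.0: 5.0 GT/s per lane,
# both using 8b/10b encoding (8 payload bits per 10 bits transferred).
def x16_bw_gbyte_s(gt_per_s: float) -> float:
    lanes = 16
    encoding = 8 / 10                        # 8b/10b line-coding overhead
    return gt_per_s * encoding * lanes / 8   # /8 converts bits to bytes

print(f"PCIe 1.1 x16: {x16_bw_gbyte_s(2.5):.0f} GB/s")  # 4 GB/s
print(f"PCIe 2.0 x16: {x16_bw_gbyte_s(5.0):.0f} GB/s")  # 8 GB/s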
so? there are loads and loads of ultras still available...
does it matter? it's making things even worse for nvidia, as the GTS will be their fastest card and the 3870x2 will beat that one even more :D
no, not very small... the difference between all those cards is relatively small... you pay for only 10 or 20% extra performance, that's why i never buy high end cards :D
but if you want the best of the best and have a high res display then you need a high end card. compare high res and AA and you will see the ultra pull away. that's where the 3870x2 shines as well.
i'd say yes... BUT for that you need a buggy POS overpriced 680 or 780 board ;)
i expect the 3870x2 price to go down; atm two 3870s cost below 350 euros in germany, and that's with GDDR4! i think the 3870x2 will go down to 350 euros in a few weeks... well i hope so! :D
The results are, more or less, as expected. In my opinion, they are not bad, not bad at all, but don't forget the "age" of the ULTRA.
a) Having two dies, and thus improving YIELD in the manufacturing process, is better than going for a monolithic design with 30% yield. ATI probably learned this from their R600.
b) The CrossFire interconnect seems to be handled internally. The drivers take care of all that - it's not like Crysis or other games that handle CF badly will say "Oh no, 3870 X2, time to stop working." Windows sees the card as a single card, not a crossfired set of GPUs.
ATI has 320 SPs @ 741 MHz (1 MADD/clock) = 474 GFlops
nV has 128 SPs @ 1350 MHz (1 MADD/clock) = 345 GFlops
The ATI chip uses a VLIW architecture = 5 ops per clock cycle to execute shaders in parallel (instruction pairing).
The nV chip uses 2 ops per clock cycle to execute shaders in parallel (don't remember off hand).
Vector hardware (ATI) = needs a good shader compiler to extract parallelism from the code
Scalar hardware (nV) = shader compiler efficiency is not as critical
Is this pretty much the situation?
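The peak figures above just come from counting a MADD (multiply-add) as 2 flops; a quick sketch of that arithmetic, using the clocks quoted in the post:

Code:

# Peak shader throughput, counting a MADD as 2 flops per unit per clock.
def peak_gflops(shader_units: int, clock_mhz: float) -> float:
    flops_per_unit_per_clock = 2   # one MADD = one multiply + one add
    return shader_units * clock_mhz * flops_per_unit_per_clock / 1000.0

print(f"ATI: {peak_gflops(320, 741):.1f} GFlops")   # 474.2, matches the 474 above
print(f"nV:  {peak_gflops(128, 1350):.1f} GFlops")  # 345.6, matches the 345 above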
Of course Windows "sees" the card as a single card (the CF is internal); Windows only "sees" a dual card if you have two HD3870X2s in CF. But in games it seems that a single HD3870X2 is the same thing as HD3870 CrossFire (the game needs to support CF).
A good example of that:
Quote:
HD3870X2 VS 8800Ultra
NFS: Pro 1280x1024 4AA 16AF = 72% Slower
NFS: Pro 1920x1200 4AA 16AF = 67% Slower
Another example:
My friend has been testing one since yesterday. In Lost Coast (video stress test) a single HD3870 gives the same performance as the HD3870X2.
regards
ROFL, nvidia fan here everyone - in denial as always. hahahaha
Disappointed? Use some Windex on those glasses dude bcuz u dont know what ure talking about hahaha. He said disappointed. Have you looked at the Ultra prices dude? You complain about the extra 50 bucks, but yet the Ultra is what, 700? Man, u just started my day on the mean side... IGNORANCE man.
Eh?
So having a bigger PCB, double the memory, a bridge chip, and extra I/O etc. on each chip is "better yield"?
Who is saving money? Because it's not you and me :p:
Also, GPUs are easy to get high yields from, even if massively huge.
Also, CF, even over the bridge chip as it is, is still CrossFire with all its ups and downs. So yes, Windows actually sees it as a CrossFire setup, and dongleless as well. Just because CF is always on in the drivers and you can't set it in CCC doesn't mean it's any different.
I think I read somewhere, googling up 'Why does ATI have more SP but does worse?', that there was a post in the nvidia forums that went like this:
Gah, I can't find it.
The point was, if I remember correctly, that if the drivers are written right, then in the best-case scenario ATI's R600 architecture can do twice the work of a G80, but in 'normal' scenarios the G80 does more operations per second.
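One rough way to see that claim, assuming the R600's 320 SPs are organised as 64 VLIW units with 5 slots each (the slot-fill fractions below are made-up illustrations, not measurements):

Code:

# Effective shader throughput under VLIW slot utilization.
# R600: 64 VLIW units x 5 slots = 320 SPs; how many of the 5 slots the
# shader compiler actually fills per clock decides the real throughput.
def r600_effective_gflops(slots_filled: float, clock_mhz: float = 741) -> float:
    vliw_units = 64
    return vliw_units * slots_filled * clock_mhz * 2 / 1000.0  # MADD = 2 flops

for filled in (5.0, 3.0, 2.0):   # best case vs. hypothetical "normal" cases
    print(f"{filled}/5 slots filled: {r600_effective_gflops(filled):.0f} GFlops")
# 5/5 -> ~474 GFlops (the peak figure above); once fewer than roughly 3.7 of
# 5 slots fill, it drops under the G80's ~346 GFlops - one story for why
# more SPs can still lose.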
Yeah, of course it's better yield. Yield focuses on the actual core itself. It's easier to slap two cores together than to engineer a brand-new 'native' dual core, which due to its size is likely to run into a ton of problems in the manufacturing process. If you can make a bunch of lesser chips at a higher yield rate, why not slap two together and get more performance if you do it right? I assume you know the rudiments of chip manufacturing, so I won't lecture you about that. That's what my whole point was based on anyway.
I told you already, the first samples of the R600 were somewhere in the 20-30% yield range due to a too-ambitious design and the sheer size of the chip.
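For the yield intuition, here's a minimal sketch using the textbook Poisson yield model; the defect density and die areas are made-up numbers purely to show the shape of the argument:

Code:

import math

# Poisson yield model: P(die is good) = exp(-defect_density * die_area).
def die_yield(defects_per_cm2: float, area_cm2: float) -> float:
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.5            # defects/cm^2, illustrative only
small = 2.0        # cm^2 per RV670-class die, illustrative only
big = 2 * small    # hypothetical monolithic die with the same total logic

y_small = die_yield(D, small)   # each small die: ~37% good
y_big = die_yield(D, big)       # one big die: ~14% good

# Silicon spent per working product: two good small dies can be paired up
# from the good-die pool, so a bad die doesn't drag a partner down with it.
print(f"dual-die:   {2 * small / y_small:.1f} cm^2 per product")   # ~10.9
print(f"monolithic: {big / y_big:.1f} cm^2 per product")           # ~29.6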
What I don't understand is how people judge this card based only on a game that struggles to even run smoothly on triple SLI with 3 Ultras, and then you come in here and claim it isn't good enough. Lmao, is Crysis the only game people play? I bought the game the first week it came out and I only played it a few times, finished it, and I thought it wasn't all that great considering the hype... Besides, the game was rushed out, so the performance should have been a lot better...
Crysis fans, you don't count, trust me. And btw, I happen to own an 8800GTX and I'm not a fanboy of either company. But it's funny how some of you defend your cards as if you're worshipping them. It's just hilarious to me.
GPUs just don't like to be paired with one or more others; it's in their nature. For the same reason, SLI/CF will most likely forever make you pay for cores x memory size while only giving you one card's worth of usable memory, plus the CF/SLI overhead penalty. So X2 cards from AMD and nVidia are simply yet another "let the customer pay".
If they only had 20-30% yield, it would be for the top speed bin of a single-bin card only, due to speed and power usage, because you can easily reuse GPUs that don't make the top bin. For the G80, the Ultra, GTX, and GTS series are a nice example of that. Or the GT and GTS512.
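On the cores x memory point: in AFR-style CF/SLI each GPU mirrors the whole working set, so installed memory doesn't add up. A tiny sketch (assuming the X2 behaves like a standard CF pair):

Code:

# In CF/SLI alternate-frame rendering, each GPU keeps its own full copy
# of textures and framebuffer data, so memory does not sum across GPUs.
def effective_vram_mb(per_gpu_mb: int, num_gpus: int) -> int:
    installed = per_gpu_mb * num_gpus   # what you pay for
    usable = per_gpu_mb                 # what games can actually address
    print(f"{num_gpus} GPUs: {installed} MB installed, ~{usable} MB usable")
    return usable

effective_vram_mb(512, 2)   # HD 3870 X2: 1024 MB on the box, ~512 MB effective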
Soon as 3870 X2 hits stateside, I will have 2 of em on Asus Maximus :up: