http://www.pconline.com.cn/diy/graph...1/1210234.html
if true, looks worth the money :)
too bad there isn't a comparison with a regular CrossFire card
Looks like we have a new king, and this thing can game, 3D render, and play videos.
http://img3.pconline.com.cn/pconline...0120_thumb.jpg
The pic is at 1280x1024; the 3870 X2 does a lot better at higher resolutions, as you'll see when you go to the link.
It's not like having two GPUs makes it not count; the GX2 counted.
What will this do in CrossFire? I bet it will break 30K without trying.
Kinda disappointed. It beats out the 8800 Ultra in older games, but not in new games like Crysis where we actually need the performance boost. If it were $399 it would be a great deal, but at $449 I might as well get an 8800GTX and OC that.
http://img3.pconline.com.cn/pconline...a128_thumb.jpg
http://img3.pconline.com.cn/pconline...a256_thumb.jpg
http://img3.pconline.com.cn/pconline...1280_thumb.jpg
http://img3.pconline.com.cn/pconline...2560_thumb.jpg
http://img3.pconline.com.cn/pconline...noaa_thumb.jpg
http://img3.pconline.com.cn/pconline...noaa_thumb.jpg
http://img3.pconline.com.cn/pconline...2Haa_thumb.jpg
http://img3.pconline.com.cn/pconline...noaa_thumb.jpg
http://img3.pconline.com.cn/pconline...0120_thumb.jpg
Oh really, it can't use more than 128 shaders? Hmm... :shakes:
Since patch 1.1 the game works well with CF too (or that's what I've heard from CF users on this forum), but here's a review of the game's 1.0-version performance with SLI. As you can see, an 8800GT scores 27.3 FPS average single and 45.2 in SLI, and an 8800GTS 21.2 single and 38.8 in SLI. I think that's quite a hefty boost...
So it CAN use more than 128 SPs, it's not terribly programmed, and it quite likes multi-GPU (or at least the NVIDIA version does...).
Seems to be a roughly 10-20% increase in performance across the board vs. the 8800 Ultra, depending on the game and the level of multi-GPU support.
Another review: http://www.fpslabs.com/reviews/video...3870-x2-review
A 20K single-card 06 score is about to be no big deal.
I'd better crack out the dice sharpish! :p:
You need a 3.6GHz+ CPU to get a good score. They're running a 2.8GHz quad, so the scores are lower than in the first review with a 3.6GHz chip.
FPS Labs also killed the bandwidth by using a 680i board, so the test is skewed towards NV: the NV cards get the extra bandwidth while the ATI card (a PCIe 2.0 card) only gets a degraded 16-lane PCIe 1.x slot. The first reviewer used a DQ6 X48, which is neutral, with full-bandwidth PCIe 2.0 slots.
If the R680 has roughly the same performance as the Ultra, it's comparable to the GTS-512, because the difference between the Ultra and the GTS-512 is small... and the GTS-512 is cheaper than this R680...
You can choose: an R680 with brutal heat, Ultra-like performance (i.e. an OC'd GTS-512), and a noisy cooler for $600, or a GTS-512 OC'd to the same performance with a great cooler for $400... which is better?
An OC'd GTS can only touch the GTX, not the Ultra. Anyway, great results from a GPU based on the R600; they made a new king, pretty impressive.
The Ultra is a discontinued product at EOL! Today NVIDIA really only has the GTS-512, and the difference between the Ultra and the GTS-512 is very small.
http://i29.tinypic.com/2vbod9h.png
The difference is a few percent...
...OK, the R680 is better than the Ultra... but two GTs in SLI are better for the same price as the R680...
At an MSRP of $450 US, it appears to be the card to own. OBR, you need to put on spectacles :rofl:
Ultra isn't quite EOL yet - in fact, Nvidia still touts it as their performance king and toes the line that Ultra > GTX > GTS 512 > GT.
As far as those prices go, EUR == USD prices with regard to video cards once you take taxes and all that into account. Even with the US dollar falling, it's been fairly close to that trend. They're expecting the 3870X2 at $449 at release, so it'll be a great price, especially for people who don't have dual-PCIe boards.
Reality:
The R680 kicks ass.
Cheaper and way better than the 8800 Ultra.
Next... :D
Bring on the 8800GX2.
Yeah, the Ultra is only like 10-20% faster than the new GTS with AA, and the 3870X2 is only like 10-20% faster than the Ultra.
That combined 20-40% is really not worth mentioning :rolleyes:
Wow, you found a shop that sells it for 420€... so what?
There's also one that already has it for 380€, and it's not even released!
I suppose they haven't read the " HD3870X2 = hardlaunch (available 24th January in Dutch shop for €399) ( 1 2 3 4 5) " thread...
Anyway. Card looks promising (from the OP's review). :yepp: I want to see more (correctly done on x38) reviews though. :)
Kickass card? It's CF on a card. You could just buy two HD3870s; they'd even be cheaper. And the heat... ouch...
It's like putting two G92s on a card and calling it kickass. It just ain't. Living with all the CF/SLI bugs, especially with multiple monitors, is too horrible.
Now give me a single-GPU card instead. And not another R600 heat monster.
It needs to be the same price as two HD3870 cards, or cheaper. Not more...
The RV670 runs cooler than the G92...
It's a single PCB, a single card. People complain they want a single-card solution, and now you go "oh no no, it has to be a single GPU too."
That's silly; for a high-end card this thing is nice. And since it's a single card, it probably uses less power too.
A little disappointed. It does beat the Ultra overall, but if it can't win in Crysis then it's not going to get my vote. That's the one game people would like to play at a decent framerate, and it's just not cutting it. Let's just hope the 8800GX2 (not worthy of being called 9800) can do better.
I do agree that this card's design is better than the GX2's: having one PCB means it will be a lot easier for those of us who want to watercool our GPUs.
Not quite... the RV670 runs cooler when idle, but uses more power under load.
http://www.firingsquad.com/hardware/...mages/load.gif
Of course it's a single card: roughly the same performance as two 3870s, a price that seems the same as two 3870s (cheaper than the competing brand), and with two of these cards you get QuadFire.
I really, really do not understand why people moan if they can't play Crysis at 20,000 FPS on a cinema screen with ULTRA SUPER UBER POWER AA + AF. I've played it on my 8800GTS on High with slight lag in some spots, and hell, my bro has played it on his dual Xeon with an AGP 6800GS (modded to Ultra).
I play it on Low with no AA or AF, not at native res, and I'm happy :)
Plus, according to a test by Maximum PC, ATI's CrossFire produces better images.
For most of us the gameplay of Crysis was lacking, so we need the uber-high graphics to make up for it. It looks like complete crap when it's not at native res. I've only played through the game once, and I have no intention of playing it again until I can run it at my monitor's native res of 1920x1200 with at least decent settings and framerates.
Because some people look at video cards like a sports war. They wanna see a "winner," so they're always moaning about one brand against the other. Some always moan about both for, well, for the sake of moaning I suppose.
The only thing that matters is whether the product is good for you and worth your money. It can't perform worse than one 3870 (I have one now and it's fine; in all my games it performs roughly 30% better than my old X1900XT, and I've played through Crysis 3 times @ High 1600x1200), and with two 3870X2s we may have the highest current gaming performance for 800-850 euros. :up:
Well, let's be honest: the only popular game available right now that can't be maxed on a single-card setup at 16x10+ is Crysis. Every other title there shows an improvement, sure, but for those of us looking to finally max out Crysis, it's still a disappointment that it's the one title where this card doesn't beat the Ultra. Every other game out there is playable at quite high resolutions with good levels of AA/AF on $200-$300 cards.
Now I know it's cheaper than the Ultra, but it's still disappointing nonetheless.
Crysis is an NVIDIA game, like Doom. They don't even support CF; it's a polite way of saying "we are working for NVIDIA."
Seems like you're not any better at getting the facts right than you are at making predictions or testing CPUs.
It is already at the same price or cheaper than 2 HD3870 and fortunately most people won't rely on you to define what is kickass and what is not.
I'm only really interested in seeing this card in crossfire compared with 8800U's in TRI-SLi. 4 GPU's vs 3...
VR-Zone reckons the launch has been pushed back to the original date... :shrug:
Maybe AMD is finalizing new drivers... :D
I wonder if they ran Crysis 1.1 with all the hotfixes installed. I would think the Crysis numbers will be going up. Smoking performance in some of those other games, though.
Nice 3dmark 06, looks like a fun card to benchmark with.
Edit:
Check out the performance in Call of Juarez, it's actually playable in DX10 for the first time!
http://www.fpslabs.com/images/storie...s_r680_coj.jpg
Yeah, pretty obvious that ATI will never perform that well in Crysis, not for a while at least. It has a damn NVIDIA splash screen. Kinda obvious when it's the only bench the ATI card doesn't win. :up: I hate hearing the NVIDIA fanboys using Crysis as their only safety net.
More like finalizing the hardware and then work on the drivers.
I believe you can agree that the whole obvious point of an X2 card is that you can have QuadFire. It means AMD has a so-called "enthusiast" performance card, which really is, as you say, 3870 CrossFire on a single card.
So why would you then argue about the heat, which is the same as with a normal CrossFire setup? And complaining about the pain of CrossFire/SLI bugs is pointless, as people here have accepted the whole dual-card concept. If you believe things will go back to the way they were, with single powerful GPUs, then you are the one being delusional.
Point being, the X2 is a kickass card because it lets us run QuadFire, end of story.
You are on XtremeSystems' forum; for once, can you please understand that. The key word, of course, being Xtreme.
I'm surprised it performs as well as it does....but, really I don't see this driving people to sell off their GTX and ultra cards unless they want to do quadfire.
People keep saying that this card will have the same drawbacks as normal Crossfire and SLi. Hopefully some more reviews will tell us if that's true or not.
Edit: And by the way, the Crysis bench in the OP linked review is without AA or AF. I always run AF no matter what. As well, we don't know if they used medium settings or what.
Games need to support CF to benefit from the HD3870X2 (it behaves like two HD3870s in CrossFire mode).
btw:
+ 2 reviews
http://www.fpslabs.com/reviews/video...-review/page-2
http://en.expreview.com/?p=219
http://www.techpowerup.com/index.php?50373
regards
i just sit back and wait till 'official' reviews come out.
Reviews with old drivers aren't worth my time.
This card has much more performance than the 8800 Ultra, more features, and is cheaper.
Deal with it, Mr. Intel/NVIDIA bias :ROTF:
Quote:
Performance difference, R680 vs. 8800 Ultra:
Bioshock 1280*1024 = 4% Slower
Bioshock 1920x1200 = 24% Faster
Bioshock 2560*1600 = 39% Faster
COJ 1280*1024 = 11% Faster
COJ 1920x1200 = 24% Faster
COJ 2560*1600 = 10% Faster
COJ 1280*1024 4AA 16AF = 13% Faster
COJ 1920x1200 4AA 16AF = 23% Faster
Lost Planet 1280*1024 = 27% Slower
Lost Planet 1920x1200 = 30% Slower
Lost Planet 2560*1600 = 37% Faster
Lost Planet 1280*1024 4AA 16AF = 18% Slower
Lost Planet 1920x1200 4AA 16AF = 18% Slower
Lost Planet 2560*1600 4AA 16AF = 10% Faster
Crysis 1280*1024 = 13% Slower
Crysis 1920x1200 = 2% Slower
Crysis 2560*1600 = 12% Slower
COD4 1280*1024 = 42% Faster
COD4 1920x1200 = 32% Faster
COD4 2560*1600 = 26% Faster
COD4 1280*1024 4AA 16AF = 27% Faster
COD4 1920x1200 4AA 16AF = 20% Faster
COD4 2560*1600 4AA 16AF = 16% Faster
NFS: Pro 1280*1024 = 32% Faster
NFS: Pro 1920x1200 = 38% Faster
NFS: Pro 1280*1024 4AA 16AF = 72% Slower
NFS: Pro 1920x1200 4AA 16AF = 67% Slower
Serious Sam 2 1280*1024 HAA 16AF = 30% Faster
Serious Sam 2 1920x1200 HAA 16AF = 45% Faster
Serious Sam 2 2560*1600 HAA 16AF = 78% Faster
UT3 1280*1024 = 7% Faster
UT3 1920x1200 = 24% Faster
UT3 2560*1600 = 37% Faster
F.E.A.R. 1600*1200 = 20% Faster
F.E.A.R. 2048*1536 = 20% Faster
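Just to sanity-check that wall of numbers, here's a rough Python sketch (my own, not from any review) that turns the quoted "% Faster / % Slower" lines into speedup ratios and takes their geometric mean. The list below is only a hand-picked subset of the quote above:

```python
import math
import re

# A few of the quoted R680 vs. 8800 Ultra deltas, verbatim:
results = [
    "Bioshock 2560*1600 = 39% Faster",
    "Crysis 1280*1024 = 13% Slower",
    "COD4 2560*1600 = 26% Faster",
    "Lost Planet 1920x1200 = 30% Slower",
]

def to_ratio(line):
    """Parse 'N% Faster/Slower' into a relative-performance ratio."""
    pct, direction = re.search(r"(\d+)% (Faster|Slower)", line).groups()
    frac = int(pct) / 100.0
    return 1.0 + frac if direction == "Faster" else 1.0 - frac

ratios = [to_ratio(line) for line in results]
# Geometric mean: the right average for ratios against different baselines
geo_mean = math.prod(ratios) ** (1 / len(ratios))
```

Extending `results` to all the quoted lines gives an overall figure; ratios are the right unit here because percent deltas on different baselines don't average cleanly.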
But you can get a factory-OC'd GTX for about the same money, though. Hopefully it's just driver problems.
As for Crysis not being able to use more than 128 shaders: what about people running 8800GTs in SLI? That's 224 shaders total, and they seem to get a big boost in performance in SLI.
@V-RR: the FPS isn't everything; we need to know if there's any stuttering while playing games, like with HD3870 CrossFire.
With HD3870 CF I get high max FPS and a good FPS average, but I also get a lot of stutters! (That's the real "problem" of CF & SLI.)
Average FPS and max FPS aren't the most important thing for me; what matters is whether the games are 100% playable without stuttering (driver optimizations), and unfortunately we can't see that in the reviews!!
The HD3870X2 has a new chip, the PEX8547, that is supposed to optimize CrossFire; let's wait for more game tests (from XS users).
regards
The FPS Labs review uses a Foxconn N68S7AA (NVIDIA 680i) SLI motherboard, which I believe uses PCIe 1.1, not 2.0 (correct me if I'm wrong). From what I've read, the 3870X2 needs a PCIe 2.0 slot. Also, this is no different than the 7950GX2 vs. X1950XTX last time around.
http://i11.photobucket.com/albums/a1...1916224385.jpg
Source
Quote:
As you know, the ATI Radeon HD 3870 X2 board carries two graphics processors. According to the available information, the reference ATI Radeon HD 3870 X2 board will get 1 GB of GDDR3 memory operating at 2 GHz, connected to the processors over 256-bit buses. The product connects to the PCI Express 2.0 bus, with a special bridge chip handling the interaction. It seems the chip used in this role will not be the PEX6347, as expected, but the PEX8548. In the pictured layout, the bridge sits at the center of the board between the two processors, and four memory chips can be found near each processor.
The following illustration shows that the transition from PCI Express 1.1 to PCI Express 2.0 provides a noticeable increase in performance - around 20-30%, depending on the application and the screen resolution.
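For what it's worth, the raw link bandwidth behind that PCIe 1.1 vs. 2.0 comparison is easy to work out from the standard per-lane rates (after 8b/10b encoding overhead). A quick sketch, not from the article:

```python
def pcie_bandwidth_gbs(lanes, gen):
    """Peak bandwidth in GB/s per direction for PCIe 1.x/2.0.

    Gen1: 2.5 GT/s with 8b/10b encoding -> 0.25 GB/s per lane
    Gen2: 5.0 GT/s with 8b/10b encoding -> 0.50 GB/s per lane
    """
    per_lane = {1: 0.25, 2: 0.50}
    return lanes * per_lane[gen]

gen1_x16 = pcie_bandwidth_gbs(16, 1)  # x16 slot on a 680i-class board
gen2_x16 = pcie_bandwidth_gbs(16, 2)  # x16 slot on an X38/X48 board
```

So an x16 PCIe 2.0 slot offers twice the link bandwidth of an x16 PCIe 1.1 slot (8 GB/s vs. 4 GB/s each way), which is what the bridge chip on the X2 is built to exploit.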
So? There are loads and loads of Ultras still available...
Does it matter? It makes things even worse for NVIDIA: the GTS will be their fastest card, and the 3870X2 will beat that one by even more :D
No, not very small... the difference between all those cards is relatively small... you pay for only 10 or 20% extra performance; that's why I never buy high-end cards :D
But if you want the best of the best and have a high-res display, then you need a high-end card. Compare at high res with AA and you'll see the Ultra pull away. That's where the 3870X2 shines as well.
I'd say yes... BUT for that you need a buggy, overpriced POS 680 or 780 board ;)
I expect the 3870X2's price to go down. ATM two 3870s cost below 350 euros in Germany, and that's with GDDR4! I think the 3870X2 will go down to 350 euros in a few weeks... well, I hope so! :D
The results are, more or less, as expected. In my opinion they're not bad, not bad at all, but don't forget the "age" of the Ultra.
a) Having two dies, and thus better YIELD in the manufacturing process, beats going for a monolithic design with 30% yield. ATI probably learned this from their R600.
b) The CrossFire interconnect seems to be handled internally. The drivers take care of all that; it's not like Crysis or other games that handle CF badly will go "oh no, 3870 X2, time to stop working." Windows sees the card as a single card, not a crossfired set of GPUs.
ATI has 320 SPs @ 741 MHz (1 MADD/clock) = 474 GFLOPS.
NV has 128 SPs @ 1350 MHz (1 MADD/clock) = 345 GFLOPS.
The ATI chip uses a VLIW architecture = 5 ops per clock cycle to execute shaders in parallel (instruction pairing).
The NV chip uses 2 ops per clock cycle to execute shaders in parallel (don't remember offhand).
Vector hardware (ATI) = needs a good shader compiler to extract parallelism from the code.
Scalar hardware (NV) = shader compiler efficiency isn't as critical.
Is this pretty much the situation?
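Those GFLOPS figures check out if you count one MADD as two floating-point ops (multiply + add) per SP per clock. A quick sketch of the arithmetic:

```python
def peak_gflops(shader_count, clock_mhz, flops_per_clock=2):
    """Theoretical peak throughput: 1 MADD/clock = 2 FLOPs."""
    return shader_count * clock_mhz * flops_per_clock / 1000.0

rv670 = peak_gflops(320, 741)   # ATI RV670: ~474.2 GFLOPS
g80   = peak_gflops(128, 1350)  # NV G80 shader domain: ~345.6 GFLOPS
```

The catch, as the post says, is that ATI's number assumes the VLIW units are kept fully packed, which is the compiler's job; NV's scalar units get much closer to their peak without help.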
Of course Windows "sees" the card as a single card (the CF is internal); Windows only "sees" dual cards if you have two HD3870X2s in CF. But in games, a single HD3870X2 seems to behave the same as HD3870 CrossFire (the game needs to support CF).
a good example of that :
Quote:
HD3870X2 VS 8800Ultra
NFS: Pro 1280*1024 4AA 16AF = 72% Slower
NFS: Pro 1920x1200 4AA 16AF = 67% Slower
Another example:
My friend has been testing one since yesterday; in Lost Coast (the video stress test) the HD3870X2 gives the same performance as a single HD3870.
regards
ROFL, NVIDIA fan here, everyone - in denial as always. Hahahaha.
Disappointed? Use some Windex on those glasses, dude, because you don't know what you're talking about hahaha. He said disappointed. Have you looked at the Ultra prices, dude? You complain about the extra 50 bucks, but the Ultra is what, $700? Man, you just started my day on the mean side... IGNORANCE, man.
Eh?
So having a bigger PCB, double the memory, a bridge chip, and extra I/O etc. for each chip is "better yield"?
Who's saving money? Because it's not you and me :p:
Also, GPUs are easy to get high yields from, even when massively huge.
And CF, even over the bridge chip as it is here, is still CrossFire with all its ups and downs. So yes, Windows actually sees it as a CrossFire setup - a dongleless one, as well. Just because CF is always on in the drivers and you can't toggle it in CCC doesn't mean it's any different.
I think I read somewhere, googling up 'Why does ATI have more SP but does worse?' there was a post in the nvidia forums that went like this:
Gah, I can't find it.
The point was, if I remember correctly, if the drivers are written right in the best case scenario ATI's R600 architecture can do twice the work of a G80, but in 'normal' scenarios the G80 does more operations per second.
Yeah, of course it's better yield. Yield is about the core itself. It's easier to pair two dies than to engineer a brand-new "native" dual-core chip, which, due to its size, is likely to run into a ton of problems in manufacturing. If you can make a bunch of smaller chips at a higher yield rate, why not put two together and get more performance, if you do it right? I assume you know the rudiments of chip manufacturing, so I won't lecture you about that; it's what my whole point was based on anyway.
I told you already: the first samples of the R600 were somewhere in the 20-30% yield range due to an overly ambitious design and the sheer size of the chip.
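On the yield argument: the usual back-of-envelope tool is the Poisson die-yield model, Y = exp(-D*A), which shows why two smaller dies can beat one big one. The defect density and die areas below are purely illustrative guesses, not real fab numbers:

```python
import math

def poisson_yield(defects_per_cm2, die_area_cm2):
    # Classic Poisson die-yield model: Y = exp(-D * A)
    return math.exp(-defects_per_cm2 * die_area_cm2)

D = 0.5  # defects per cm^2 -- assumed for illustration only
big_die   = poisson_yield(D, 4.2)  # one large monolithic die (~420 mm^2, assumed)
small_die = poisson_yield(D, 1.9)  # one RV670-class die (~190 mm^2, assumed)

# An X2 card needs two good small dies:
two_small = small_die ** 2
```

With these made-up numbers, even requiring two good small dies per card (`small_die ** 2`) comes out ahead of the single big die, which is the yield argument in a nutshell; the counterargument in the thread (binning lets you salvage imperfect big dies) is exactly what this simple model ignores.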
What I don't understand is how people judge this card based only on a game that struggles to even run smoothly on triple SLI with three Ultras, and then come in here and claim it isn't good enough. Lmao, so Crysis is the only game people play? I bought the game the first week it came out and only played it a few times, finished it, and thought it wasn't all that great considering the hype... Besides, the game was rushed out, so the performance should have been a lot better...
Crysis fans, you don't count, trust me. And BTW, I happen to own an 8800GTX, and I'm not a fanboy of either company. But it's funny how some of you defend your cards as if you're worshipping them. It's just hilarious to me.
GPUs just don't like being paired up; it's in their nature. It's the same reason SLI/CF will most likely forever mirror memory across the cores (cores × memory size on the box, one core's worth usable), plus any CF/SLI overhead penalty. So X2 cards from AMD and NVIDIA are simply yet another "let the customer pay."
If they only had 20-30% yield, it would be for one speed bin of cards only, due to clock speed and power draw, because you can easily reuse GPUs that don't make the top bin. For the G80, the Ultra, GTX, and GTS series are a nice example of that; likewise the GT and GTS512.
Soon as 3870 X2 hits stateside, I will have 2 of em on Asus Maximus :up:
Actually, as stated in the review, they used the pre-release demo of Crysis, which was known to have CrossFire issues.
I want to see a REAL review. Someone please review the R680 on Crysis using the latest drivers, plus Crysis 1.1 Retail, plus the hotfixes installed.
Where are all these hotfixes located? Are they just Windows updates?
Sorry to burst anyone's bubble, but Quad CF mode is not scaling properly so don't get your hopes up.
No surprise really, 7950GX2 quad was a joke as well.
So what happens if you run the card on a non PCIE 2.0 board? Will it not provide enough power or will bandwidth be limited or both?
On XP, it should be exactly the same scenario as Quad-SLI. Tri-SLI should perform excellently (if the reasons for Quad-SLI's failures were truthful) and so should a 3 GPU ATI setup.
Performance in Vista could be a crapshoot.
Decreased performance due to DX10 being lame vs. Increased performance due to DX9 limitations being eliminated.
Which one is greater?
I still want to see how Quad-SLI performs in Vista under DX9 compared to XP DX9. That should give some answers.
"The most important game since the original Half Life" LOL
For crying out loud, this "game" is unplayable on ANY system in eye-candy mode at HD res.
And let's be real: if half of NVIDIA's dev support team worked on this game, and it was built on NV hardware, do ATI's Catalyst wizards have any chance to optimize their code for this tech demo... ahem... game?
The only thing that can help is patches with appropriate shader code that bring out the best of ATI's math-heavy architecture... and if you imagine the amount of money NVIDIA has spent on Crysis, I sincerely doubt that will ever happen!
Which you could only do on an NVIDIA-chipped mobo. That would be a huge client base. :rofl: :owned:
The majority of newer motherboards are based on Intel chipsets, so it really doesn't matter if two 8800GT cards can beat a 3870X2. ATI will still sell a crapload of cards. And if they get the scaling up to par with two 3870X2 cards, those 8800GT cards will mean completely nothing to the majority of users.
If NVIDIA actually had a larger share of the market with their buggy a$$ motherboards, I would agree with you.
I love ATI, but I'm not sold on this. I'll buy an X2 video card when they build it the way dual-core and quad-core CPUs are built. The technology is there; the link would be instantaneous, no need for a bridge. That's really where the GPU market needs to head: forget these dual-card setups and dual-GPU boards, give me a card with a dual-core GPU, or even a quad-GPU in a single socket. Hell, why not make a socket with a removable GPU and sell the GPU separately from the graphics board? It works for the CPU manufacturers...
Heck, why not let us buy the parts separately - memory, GPU, graphics board - so we can tailor it to our own needs, just like we do with the rest of the PC? Video cards with DDR DIMMs and a socket...
Look at a review of 3870 CF against 8800GT SLI... the NVIDIA part is already owned on price, performance, and power :welcome:
http://en.expreview.com/?p=53
It's funny how ATI keeps releasing cards that can only match the 8800 cards... which came out what, a year and a half ago?
This X2 will be interesting to see... but not much else.
You can't think of GPUs the way you think of CPUs, so they'd never make such a dual-core or quad.
Anyway, a socket is a good idea and it's possible; it also came up with the MXM boards. Memory is a bigger issue, though, due to speed and latency: you simply can't put it in, say, a DIMM slot without huge penalties.
But imagine such a graphics card: it would basically be a new mobo, with that many more VRM modules and whatnot.
When we get Fusion and value Nehalem, we're basically starting from scratch again, with IGP performance (maybe x2 or x3) and building up from there.
Otherwise the only option is MXM-approach boards.