OK, from that shot it is quite clear they are a bit bigger than usual. Still don't agree with the "close to 50%" figure though; I'd say maybe less than 40% ;)
I think they are giving Fermi too much credit in performance. They don't need three-slot, dual 8-pin monsters. Where did AMD's idea of efficient chips go? At this rate, with three-slot coolers, they might as well slap three 5870s on a card, add five 8-pin power connectors and call it a day.
Well, if a single-GPU Fermi is significantly faster than a 5870, then to counter a dual-Fermi threat they have to give these cards as much juice as they can.
Of course this is assuming Nvidia won't be seeking PCI-E certification for a dual Fermi (the spec caps a certified card at 300W: 75W from the slot, 75W from a 6-pin and 150W from an 8-pin). If it did, a dual Fermi would be pointless: 250W single vs. 300W dual leaves little room for a performance difference, plus the latter would be SLI on top of that.
The guy that said "hey, why don't we make the card a little taller instead of a foot and a half long" should get kudos in my opinion. Hopefully others will follow suit.
Maybe I'm just pessimistic, but I don't think ATI has to do anything to the 5xxx series to compete against Fermi.
Sure hope one could get a 5890 for a decent price later on though...
It kind of reminds me of the single-PCB GTX 295... maybe it's the cooler :shrug:
The 480 will be 5-10% faster than the 5870, if not less.
These recent Nvidia benchmarks done at 2560x1600 show an advantage, but it comes from VRAM, not from raw GPU power (the thing that matters). Why don't they put a 5870 2GB in that comparison?
Easy answer:
The HD5870 2GB destroys Fermi.
Well....
Figures for the Nvidia spermi (GTX 480) are not known.... Some say 3-5% (for example Charlie from SemiAccurate, I spoke with him at CeBIT), others say 20%.... The vendors I have been talking with say a tad less than 30% above the HD5870.....
My thinking is just.... hmmm... Nvidia spermi will top out at 3 GPUs..... CFX is still 4 GPUs..... So I actually think Nvidia's overhyped crap will be left eating dust...
Btw..... Nvidia is firing people from partners just because of their crap product and for leaking details..... Nvidia had guards around what they say is Fermi at CeBIT.... you weren't even allowed to see the backside..... I in fact know 2 guys that have held the Fermi in their hands at CeBIT.... So it was actually Fermi in the cases, however there was no plexiglass for the public to see it through.... I had a good time asking Nvidia staff why they used Hemlock for their demos.... for some reason they didn't consider it funny.... ;)
I personally don't think there is even a point in doing a dual-GPU Fermi at 40nm, due to the aforementioned power issues. Such a feat (350+ watts, if not more...) would be the only viable option performance-wise, and I don't see that happening. However, I don't see Nvidia sitting idle either. If they can do something to beat the 5970, they will do it. As far as when, how and for how much... who knows... (the May rumours sound like total bull if you ask me)
From a technical standpoint, I see a dual 470 with reduced clocks and low-leakage cores being possible. This *might* match a 5970 in *some* rare cases, but likely not often enough to justify it (e.g. they wouldn't hold the crown). As for low-clocked 480s on 40nm? Good #%@$ing luck!
It will be interesting to see how things unfold in the coming months, that is for sure.
@ Jowy
M was just being a smart ass with them lol
If only it could fold....:mad:
:D See post #136 :)
The thread title should be: AMD's REAL answer to the AMD Radeon 5970: AIB custom 5970s ;)
My monitor's max resolution is 1920x1080, I believe. Aren't these cards overkill for a monitor with that max resolution?
Haha yeah you have a point.
I've got a problem with Stalker CoP (my IQ settings don't stick for some reason, and I hate Stalker games anyway), and Crysis plays just fine with everything maxed out if you don't apply AA.
I don't get what you're laughing at, like my card was an 8800 GT or something. :shrug:
I played through Stalker CoP at 1920x1200 without AA (which is shader-based only, due to the renderer, and in turn causes a huge performance hit with single-GPU configs) and without tessellation. It was quite playable, but if you want to use both AA and tessellation you need at least two 5850s to stay playable at this resolution throughout the game. The nice thing, however, is that the DX11 renderer runs AND looks better than DX10 even without tessellation.
I wouldn't call a 5970 overkill for 1920x1200. You'll be able to run older titles with edge-detect transparency AA and have nice performance, and play most newer games with at least 4x transparency AA. The 5870 can't accomplish this in everything. For the most part though, unless you like higher IQ (edge detect, transparency AA), a single 5870 should do well enough. There are a few (but not a lot of) games that can't handle AA at 1920x1200 on a 5870.
As far as Crysis goes, I've played through both games multiple times with my 4870X2 and my 5870, and interestingly enough I felt the 5870 did a tad better. I played through them both with a custom Very High config, running the 64-bit exe in DX9 with 2x AA. Plenty playable for single player. A 5970 will do the same feat at stock Very High settings with 4x-8x AA (until VRAM limitations potentially set in).
These custom 5970s will be nice for Eyefinity at extreme resolutions (3x 19x12 or 25x16), where more VRAM and higher clocks will shine. A reference 5970 should be fine on a single display for a good while yet.
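For anyone wondering why VRAM becomes the wall at those resolutions, here's a rough back-of-envelope I use. The helper below is just my own hypothetical Python snippet, not an official figure from AMD or anyone else, and it only counts naive uncompressed render targets (real hardware compresses its MSAA buffers, so treat it as a worst case):

# Naive MSAA render-target footprint: colour + depth/stencil,
# each storing one value per sample per pixel. Worst case only;
# actual hardware uses compression, and textures/geometry add more.
def render_target_mb(width, height, msaa_samples, bytes_per_pixel=4):
    one_buffer = width * height * msaa_samples * bytes_per_pixel
    return 2 * one_buffer / (1024 ** 2)  # colour + depth/stencil

for w, h, label in [(1920, 1200, "1x 1920x1200"),
                    (2560, 1600, "1x 2560x1600"),
                    (3 * 2560, 1600, "Eyefinity 3x 2560x1600")]:
    print(label, round(render_target_mb(w, h, 8)), "MB at 8x AA")

That works out to roughly 140 MB, 250 MB and 750 MB respectively at 8x AA, before textures and geometry even enter the picture, which is why the 1GB-per-GPU reference 5970 runs out of headroom at Eyefinity resolutions while the 2GB custom boards don't.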