So is Crysis playable at 1920x1200 AA/AF (at least 30-40fps) on this SINGLE card as it was rumoured before?
People are saying it's both a disappointment and an improvement. Obviously it's the fastest single GPU out there, but it still won't conquer the elusive Crysis at 2560x1600 without multiple GPUs, so take it for what it's worth. Also, I'm hearing rumors that in some games, especially at lower resolutions, the gap between say a GX2 and a GTX 280 won't be that big, whereas at high resolutions (1920x1200+) you'll see it really shine. That somewhat makes sense, since those are some ridiculous specs, so the card might be well future-proofed. And that's not a bad thing - having better-than-GX2 performance in a single GPU is quite amazing. The bigger issue to me is whether the cost and power usage are worth the performance vs. other alternatives, of which there aren't many at my resolution.
Obviously it makes sense: besides Crysis, no game out there is really so demanding that a G80 GTX/Ultra or G92 can't handle it at 1600x1200 and below, and the full power of the GTX 280 may not even be used at mid-range resolutions and lower.
yeah, honestly at 19x12 with no AA there are no jaggies for me @ very high
the tree leaves are still the worst I've seen in any game though
PS1 games had less fugly, jaggy leaves
Sorry guys, for the MILLIONTH time: what is the OFFICIAL RELEASE date of the 9900GTX card? Everything is changing so much I lost track. Thanks
The NDA lifts on the 17th; on the 18th the cards should pop up in the shops
They were pretty spot on with G92 and RV670
well yeah,
but I was assuming that the searcher isn't a complete narutard with no common sense.
To restate without assuming that:
Search Chinese sites; disregard any old charts/graphs and any no-name, poorly labelled charts, and if the site has the words Inquirer or Fudzilla on it, then ignore it.
Use Google's exclusion feature to block the Fud and the Inq,
and also vr-zone and vrzone, as the (now old) VR-Zone stuff is EVERYWHERE.
Remember to search Taiwan too ;)
and happy bench hunting
Edit: to give you a basic idea,
mess about with a template similar to this,
changing the country and the in-page settings.
code for the lazy - http://www.google.co.uk/search?hl=en..._nhi=&safe=off
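That link is truncated, so here's a hedged sketch of the sort of query such a template builds. The -site: exclusion operator and the as_qdr date-restrict parameter are standard Google search features, but this exact template is my own guess, not a reconstruction of the link above:

Code:
# A sketch of the described search template: recent results only, with the
# rumor mills excluded. The parameter choices here are assumptions, not the
# contents of the truncated link above.
import urllib.parse

params = {
    "q": "GTX 280 benchmark -site:fudzilla.com -site:theinquirer.net -site:vr-zone.com",
    "hl": "en",
    "as_qdr": "w",   # restrict to the past week, so stale charts drop out
    "safe": "off",
}
print("http://www.google.co.uk/search?" + urllib.parse.urlencode(params))

Swap the country domain and the q terms to taste.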
Or just check the first charts here: http://www.benchzone.com/Charts/gtx280/
..........
............
................ haha, gotcha :D
Must....read.......full......post:party:
Flinch :D :rotf:
Sorry bud, I was tempted to do that :D
My eyes stopped reading once I saw the link :D
After the NDA lift (17th for NVIDIA, I think) :p:
regards
http://img501.imageshack.us/img501/5638/40920312ol2.png
POST AWAY, THE NDA IS LIFTED!!! The GTX 280 whips the 9800GX2 in Crysis, and this is even on early drivers!! IMAGINE 3 of these babies in 3-way SLI! zomggggg... goodness :P Judging by the kind of FPS the 9800GX2 gets in Crysis, I'm guessing the settings were Very High at 1680x1050.
One thing hits me though: why couldn't NVIDIA make a card like this one year ago?
Already posted. http://www.xtremesystems.org/forums/...postcount=1163
Oh sorry! You can zoom the picture... But why would you want a lesser version of the same graph? Don't get it...
he was pointing out Crysis on the graph... note the added black lines.
Any actual real results after 50 pages of bi*ching? :)
Hi,
try running the game at 2560x1600 or 1920x1200, set everything to maximum including all draw distances, and use 16xAF and 16xQAA.
Then go on one of the 24-man raids where you have tons of NPCs and particle effects flying around the screen.
Those settings and that environment will be a slideshow on a single 9800GTX.
Well, if the cards really have had their NDA lifted now, NVIDIA must have a lot of confidence in them.
16xQAA will bring pretty much any card to a slideshow on pretty much any game at high res, but who actually needs that? I personally can't tell 4xAA from 8x from 8xQ from 16x from 16xQ at 2560x1600 with a 9800GX2 (other than through the plummeting framerates as AA increases), and can barely tell 4x from 2x at that resolution. I am planning to get a GTX 280 (or two) due to the 9800GX2's memory bandwidth leading to terrible performance dropoff at high res, but I don't have any expectations of 16xQ suddenly becoming usable.
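For a rough sense of why 16x-class AA turns into a slideshow at 2560x1600, here's a back-of-envelope sketch. It's a deliberate simplification: it ignores framebuffer compression and the coverage-sample tricks the CSAA/Q modes actually use, so real costs will differ:

Code:
# Rough framebuffer cost of MSAA at 2560x1600 (simplified: no compression,
# no CSAA coverage-sample savings; treats every mode as full multisampling).
WIDTH, HEIGHT = 2560, 1600
BYTES_COLOR = 4   # 8-bit RGBA per sample
BYTES_DEPTH = 4   # 24-bit depth + 8-bit stencil per sample

for samples in (1, 2, 4, 8, 16):
    mb = WIDTH * HEIGHT * samples * (BYTES_COLOR + BYTES_DEPTH) / 2**20
    print(f"{samples:2d}x: ~{mb:5.0f} MB for color + depth buffers")

At 16 samples that's roughly 500 MB for the buffers alone, which is essentially an entire 512MB card before a single texture is loaded.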
yeah, can't wait for benches... how come the Chinese web sites haven't leaked 'em yet? :}
I just ordered a 9800 GX2, but the 9900 GX2 will only have a smaller die (55nm) and, because of that, higher clocks, right?
Try to cancel or not? (aww, I just want to play games :()
CANCEL now!! There's more to it than just higher clock rates: more RAM, better bandwidth... it will give better performance than your current GX2 for sure... and if it doesn't, you can get a 9800GX2 cheap a few weeks from now... If it's the eVGA card you can still step up... but you will be without a GFX card for a few weeks for sure... (Step-Up works fine, you just need to be patient)
You chose the worst possible moment to buy a GFX card... CANCEL IT, GO GO GO :up:
I agree. Even if it is an eVGA card, definitely cancel it, because there's no guarantee on what's coming from either side; who knows, the 4870 might be on par with the 9800GX2, or perhaps the 4870X2 might have to be OC'd just to keep up with the GTX 260.
I agree it's bad timing for buying a GFX card; wait till the benchmarks come from both sides and then decide for yourself (although unless the micro-stuttering issue is solved with the R700, even if it's more powerful than the GTX 280 you may want to take the GTX over it).
No, it actually only has 512MB of usable RAM. This is a common misconception, but to make it short: due to the way SLI works, each GPU needs its own dedicated 512MB with everything mirrored between the two, meaning you really only get 512MB to use. The R700 is supposed to change this, but no one is sure yet whether that's true.
Besides, the 9800GX2 is supposed to be EOL right now anyway; the last rumors were saying that once the GTX series is out, NVIDIA will phase it out (probably quickly, since it can use the G92s for the 9800GTX).
Right, production-cost-wise it has 1GB of RAM, but my point is you still only get to USE 512MB of it.
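To put that in plain arithmetic, a minimal sketch (my own illustration; "mirrored" simply means each GPU holds its own full copy of the working set):

Code:
# Effective VRAM under mirrored SLI/CrossFire: textures, geometry and the
# framebuffer are duplicated on every card, so capacities don't add up.
def effective_vram_mb(per_gpu_mb: int, num_gpus: int, shared_pool: bool = False) -> int:
    if shared_pool:
        # hypothetical shared-pool design (what the R700 rumors describe)
        return per_gpu_mb * num_gpus
    return per_gpu_mb  # mirrored: you only ever get one card's worth

print(effective_vram_mb(512, 2))                    # 9800GX2 today: 512
print(effective_vram_mb(512, 2, shared_pool=True))  # rumored shared pool: 1024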
The 2 cards are not sharing 512MB, so with AFR each card effectively renders half the frames - isn't that how it works?
Which would mean the calculations for only half the output are utilising each 512MB memory buffer, but I don't really know how it works, assuming AFR does work effectively in some apps.
The point being that if AFR works properly, 512+512MB should be available, but I don't know what happens with real-life GPU/buffer sharing under dual-card loads. I'm guessing that some apps work well and others not at all, or badly.
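For what it's worth, here's a toy sketch of why AFR doesn't make 512+512MB available (my own illustration, not how any driver is actually written): each GPU renders every other complete frame, so each one still needs the entire scene resident in its own memory:

Code:
# Toy AFR (Alternate Frame Rendering) dispatch: GPUs take turns rendering
# whole frames, so every GPU needs the full set of scene assets locally.
SCENE_ASSETS = {"textures", "geometry", "shaders"}  # the full working set

def afr_schedule(num_frames: int, num_gpus: int = 2):
    for frame in range(num_frames):
        gpu = frame % num_gpus  # frames simply alternate between GPUs
        # each GPU renders complete frames, so it must hold ALL assets
        yield frame, gpu, SCENE_ASSETS

for frame, gpu, needed in afr_schedule(4):
    print(f"frame {frame} -> GPU {gpu}, needs resident: {sorted(needed)}")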
Not really - in fact, not at all. He was talking about what would happen if the 9800GX2 were outperformed by the GTX 260, and in terms of performance that makes a difference.
Doesn't matter; the framebuffer has to be mirrored in memory. That's why people always say that even with SLI you really only have the same amount of memory as a single card, and that's why people with SLI and CrossFire setups often opt for the higher-memory version even if they don't otherwise need it.
if this is true then it seems silly, and to have redundant mirrored information is also silly.

Quote: "the framebuffer has to be mirrored in memory"

load sharing is how SLI works, not load mirroring... but I guess the info has to be combined... where? All in one of the cards' memory buffers?
mmm, a shared memory pool would be good,
I guess.
Still woefully limited and expensive, IMO.
then it is woefully limited by redundant mirrored info.

Quote: "No."
easier to sit back and watch others create the problems, rather than fix them.
Even easier to be a fanboy... actually, maybe not.
More memory.
haha, I'll pretend to be a frame buffer and just be redundant :ROTF: half redundant, anyway.
^^ :rofl: :ROTF: ^^
:up:
** As Dr. Evil would say ** --> You can put frikin' LAZERS on their heads!
Well, the thing to remember is that these new cards are going to have 1GB of memory, aren't they? So even with 2 GTX 280s in SLI you'll still 'only' have a 1GB frame buffer, but I'm thinking this should be enough to cope with Crysis @ high res, assuming the rest of the card can keep up :confused:
It's not redundant; if it wasn't mirrored, you'd have tons more microstuttering than we already do.
Exactly. Why do you think there's so much argument over whether the R700 is an MCM or not? At this point I'm thinking it isn't, but it is still possible to have a shared memory pool as long as the GPUs have an external memory controller. The problem with that is you then run into possible latency issues. See why something like this is so difficult to fix?
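To make that trade-off concrete, here's a toy latency model. All the numbers are invented purely for illustration; real memory-hierarchy figures for these parts aren't public:

Code:
# Toy model: a shared pool removes mirroring, but accesses to the far side
# of the pool pay an extra hop through the external controller/bridge.
LOCAL_VRAM_NS = 400   # invented round-trip cost to a GPU's own memory
BRIDGE_HOP_NS = 150   # invented extra cost through an external controller

def avg_access_ns(shared_pool: bool, remote_fraction: float = 0.5) -> float:
    if not shared_pool:
        return float(LOCAL_VRAM_NS)  # mirrored copies: every access is local
    # shared pool: some fraction of accesses pays the bridge penalty
    return LOCAL_VRAM_NS + remote_fraction * BRIDGE_HOP_NS

print(avg_access_ns(shared_pool=False))  # 400.0 ns
print(avg_access_ns(shared_pool=True))   # 475.0 ns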
Honestly, you won't really see any difference; the only real advantages the 9800GTX has are the better-binned G92 that can OC higher and the faster memory.
Methinks the folks at VR-Zone really need to learn how to use a camera...
:P
I think they did it on purpose to highlight the PCI-E pins
I think it was intentional, to show the 6+8-pin requirement too; they have other pics that show the blurred area more clearly.
Someone posted this at B3D... looks like it's from PCinlife
http://bbs.chiphell.com/attachments/...HFvBdoLGz9.jpg
graphs with no photos are fake IMO, but those numbers are real close ;)
I'm so tired of meaningless 3DMarks...
:(
this explains nothing.

Quote: "It's not redundant; if it wasn't mirrored, you'd have tons more microstuttering than we already do."
why and how is information mirrored in each frame buffer?
I think there are problems combining the input from more than one GPU into the single screen output signal, and the execution is still inherently flawed.
And if more people understood the real problems, then fewer crappy outcomes with pi$$poor drivers would result.
Mirroring would be good if I were viewing a dynamic inkblot :rolleyes:
Is the info mirrored for synchronisation? Other reasons?
More information, please.
2 GPUs on one PCB - each has an onboard mem controller, I assume?

Quote: "...but it is still possible to have a shared memory pool as long as the GPUs have an external memory controller. The problem with that is you then run into possible latency issues."
They'll just bung in more memory and hope that the buffers don't max out, I suppose.
Go to the 4xxx series thread; people have been ranting about the problems of multi-GPU for the past ~15 pages. I'm sure you can figure out what you need.
And 2 GPUs on 1 PCB means exactly that: 2 GPUs on 1 PCB. You'd need a GPU designed for an external memory controller (or at least a disabled onboard memory controller plus an external one) for a shared pool to happen. There's nothing really special about the 3870X2, and the 4870X2 really only has a superior bridge chip. However, that superior bridge chip may be exactly what ATI needed to fix a lot of the 3870X2's problems.
Sync problems remain unsolved either way.
Call it ranting if you will,
but time will tell whether a "superior" bridge chip solves anything.
Any 4870X2s with 2 x 1GB of memory?
I hope it works.
Now I'm thinking 2 x 4870 1GB CF vs GTX 280 :/ :lol:
Not true. See, if you have a shared memory pool, you don't have to mirror the frame buffer anymore, and if your drivers work right you can align the frames properly (as in even spacing between frames; that's the big issue behind micro-stuttering - for many reasons, the frames aren't lined up/rendered at an even rate), and thus micro-stuttering is gone. The new bridge chip is supposed to offer 160 GB/s, so that may be enough to keep an MCM happy, or at least to eliminate/greatly reduce the latency between the GPUs and the memory pools.
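A quick illustration of what "even spacing between frames" means (toy timestamps of my own, just to show why average FPS hides micro-stutter):

Code:
# Micro-stutter in a nutshell: two frame streams with the SAME average FPS
# but very different spacing. All timestamps (ms) are invented.
even    = [0, 25, 50, 75, 100]  # steady 40 fps
stutter = [0, 5, 50, 55, 100]   # also 40 fps on average, but frames arrive in pairs

def frame_gaps(timestamps):
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

print(frame_gaps(even))     # [25, 25, 25, 25] -> feels like 40 fps
print(frame_gaps(stutter))  # [5, 45, 5, 45]   -> the 45 ms gaps feel like ~22 fps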
As for a 2GB 4870X2, that's up to the AIBs. The R600 had 1GB versions (even a 2GB workstation edition), but AFAIK the 3870 only had 512MB versions, so once again that's up to the third-party manufacturers. Regardless, if the new bridge chip works as planned, you're better off buying a 4870X2 over 2x 4870 for both price and performance reasons.
Wow, that really sucks if true.
Only 60% faster than the 8800 Ultra at Extreme? How is that even possible given the specs?
The 48xx number appears to be true from the rumors we have heard, but I never saw it compared to the Ultra, so I didn't realize how crappy a score it was.
I'm slowly becoming less and less excited about these cards :(
That's comforting to hear.
I never put much stock in 3DMark, but when those are the only (approximate) numbers available, it's hard not to pass judgment.
I'm glad it's good in games - I'm looking at it for my next upgrade ;)
You're already playing with it. How fortunate. :)
GAR, I hate you... in a very nice way. And since you're in LA, I may have to visit so I can see the card... erm... I mean, to see how you are doing :D