so 280GTX is 2x8800GTX?
nice.
gtx280 > 2 x 8800gtx i think, at highest settings, in crysis, anyway.
an additional 8800gtx gives +20% in crysis fps according to some reviews...
http://en.expreview.com/2007/11/01/s...0-performance/
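rough numbers to put that +20% in perspective (the single-card baseline below is just an assumed figure for illustration, not from the review):

```python
# back-of-envelope SLI scaling: the +20% figure is from the review above,
# the single-card baseline is a made-up number for illustration
single_fps = 25.0   # assumed single 8800GTX average (not a measured figure)
sli_gain = 0.20     # +20% from adding the second card, per the review

sli_fps = single_fps * (1 + sli_gain)
print(f"single: {single_fps:.1f} fps -> SLI: {sli_fps:.1f} fps")
# single: 25.0 fps -> SLI: 30.0 fps, i.e. nowhere near 2x one card
```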
Very high @ 1920 x 1200 was unplayable ( 15 - 26fps ) for me on a 2 x 8800gtx + E2180 @ 3.40ghz system.
SLI 8800gtx's only gained me about 5-7fps over a single card.
If the game is playable ( 30 - 45fps ) @ 1920 x 1200, then the card is a monster; 9800GX2 Quad SLI gets about 45fps.
http://www.maxishine.com.au/document..._quad_sli.html
yeah just look at some scores @ computerbase -> http://www.computerbase.de/artikel/h...schnitt_crysis
2x9800gtx with a 4ghz quad scores just 26.8fps @ 1920x1200.
if this is really true, this card will fly in every other game on the market and everything coming out over the next year.
interesting.
Those Expreview scores look dodgy Adam, 25fps @ 12 x 10? The game was chugging along well enough on my 2 x 8800gtx at that res.
still waiting for more info.
at what settings?
This is utterly ridiculous. DX10 performance in Crysis is on par with DX9 performance (+/- 10%), and you can run much higher settings than that article states. 10x7? Are they serious? That's completely CPU-bound; how could SLI help you there?
Their numbers are too low even for a single 8800GTX, though, so I think their test is broken somehow.
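to spell out the CPU-bound point, here's a toy model where each frame costs whichever is slower, CPU or GPU; all the millisecond numbers are made up:

```python
# toy model: frame cost = max(cpu work, gpu work) per frame, in ms
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 20.0  # fixed CPU cost per frame -> a 50fps ceiling

print(fps(CPU_MS, 10.0))      # 50.0  low res (10x7): CPU-bound
print(fps(CPU_MS, 10.0 / 2))  # 50.0  SLI halves GPU time, zero gain
print(fps(CPU_MS, 40.0))      # 25.0  high res: GPU-bound
print(fps(CPU_MS, 40.0 / 2))  # 50.0  here a second card actually helps
```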
What kind of bogus comment is this? Of course I'm aware of SpeedStep, but I've never seen it on by default, and I was pretty sure CPU-Z reads the clock in the spec line as the "stock rated" clock, the way some BIOS screens do, i.e. as a description.
What a bunch of douchebags you are, perusing the forums looking for people to mock when they make a simple mistake... pretty sad really.
"LOLZ EPIC FAILZ OMFRG LOL"
Are you guys preteens? Bunch of snobs.
ohhh noezzz my QX9650 only runs at 2ghz.
it is a Q6700 running idle... i don't know where you got the idea that it runs crysis at 1.6ghz...
I've never used SpeedStep, simple as that; it's always been disabled, as I've now said *twice* (but it doesn't seem to register for you).
I know of games where SpeedStep was identified as the cause of people showing up as "speedhacking" on certain MMOs I've played; they narrowed it down to SpeedStep on laptops at the time. People I know were getting hassled by the GMs for it.
It's on by default on every Intel mobo I've used... and it's one of the first things to go as soon as I start overclocking.
And besides, it only drops the multiplier depending on CPU load, and the CPU is certainly at its correct speed when playing a game.
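easy enough to check for yourself; a minimal sketch using the third-party psutil package, assuming your platform reports a live clock (psutil.cpu_freq() can return nothing on some systems):

```python
# compare the reported CPU clock at idle vs. under load to see
# SpeedStep/C1E in action (pip install psutil)
import time
import psutil

def spin(seconds):
    """Busy-loop so the governor ramps the clock back up."""
    end = time.time() + seconds
    while time.time() < end:
        pass

print("idle:", psutil.cpu_freq().current, "MHz")
spin(2.0)
print("load:", psutil.cpu_freq().current, "MHz")
# with SpeedStep/C1E enabled, idle should sit at the lowest multiplier
# (e.g. ~1600 MHz) while the load reading should be near the stock clock
```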
as zerazax said, it's on by default, and that's why everyone ignores/laughs at your statement.
On every PC I have encountered so far, C1E was on. Hence the comment Luka_Aveiro made about "noobs" with their 1.6ghz problem.
If you disable it, fine for you, but that doesn't mean it's off by default.
The screen at pczilla shows a stock Q6700 @ 2.66ghz, and according to certain people here crysis is only "half multithreaded", so a quad should be more than enough to satisfy crysis.
i have just tested crysis at 1920x1200, no AA, very high, 64-bit DX10 on my rig and i get around ~59 FPS max, ~24 FPS min and ~37 FPS avg.
so i guess the GTX 280 is 2x 8800GTX
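for reference, this is roughly how max/min/avg figures like those get derived from a benchmark run; the frame times below are made-up sample data, not my actual log:

```python
# log per-frame times during a run, then summarize
frame_times_ms = [17.0, 24.5, 41.8, 26.3, 30.1, 22.7, 35.4]  # sample data

fps_samples = [1000.0 / t for t in frame_times_ms]
# avg FPS is total frames over total time, not the mean of per-frame FPS
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
print(f"max ~{max(fps_samples):.0f} / min ~{min(fps_samples):.0f} "
      f"/ avg ~{avg_fps:.0f} FPS")
```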
that's useful to know.