Page 55 of 77
Results 1,351 to 1,375 of 1917

Thread: GeForce 9900 GTX & GTS Slated For July Launch

  1. #1351
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    so 280GTX is 2x8800GTX?

    nice.

  2. #1352
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
gtx280 > 2 x 8800gtx i think, at highest settings, in crysis, anyway.
an additional 8800gtx gives +20% in crysis fps according to some reviews...

    http://en.expreview.com/2007/11/01/s...0-performance/
    Last edited by adamsleath; 06-09-2008 at 01:28 AM.
    i7 3610QM 1.2-3.2GHz

  3. #1353
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    332
    Quote Originally Posted by AliG View Post
    meh, according to many reviews, without any aa dx10 crysis doesn't really tax your gfx all that much more than dx9 crysis, and playable is up to the user's discretion. you don't need graphs, numbers are just fine


    Very high @ 1920 x 1200 was unplayable ( 15 - 26fps ) for me on a 2 x 8800gtx E2180 @ 3.40ghz system.

SLI 8800GTXs only gained me about 5-7fps more than a single card

    If the game is playable ( 30 - 45fps ) @ 1920 x 1200, then the card is a monster, 9800GX2 Quad Sli gets about 45fps..

    http://www.maxishine.com.au/document..._quad_sli.html
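For what it's worth, the scaling claims being thrown around here (an extra 8800GTX giving roughly +20%, SLI netting 5-7fps) reduce to simple percentage arithmetic. A quick Python sketch, with illustrative figures rather than measurements:

```python
def sli_gain_pct(single_fps: float, sli_fps: float) -> float:
    """Percentage frame-rate gain of a multi-GPU setup over a single card."""
    return (sli_fps - single_fps) / single_fps * 100.0

# Illustrative only: a 5 fps gain on a 25 fps baseline is a 20% improvement,
# which is the kind of scaling the reviews above are quoting.
print(sli_gain_pct(25.0, 30.0))  # -> 20.0
```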

  4. #1354
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by adamsleath View Post
    gtx280 > 2 x 8800gtx i think, at highest settings, in crysis, anyway.
    yeah just look at some scores @ computerbase -> http://www.computerbase.de/artikel/h...schnitt_crysis

2x9800gtx with a 4ghz quad scores just 26.8fps @ 1920x1200.


if this is really true, this card will fly in every other game on the market and everything being released in the next year.

  5. #1355
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    interesting.
    i7 3610QM 1.2-3.2GHz

  6. #1356
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    332
Those Expreview scores look dodgy Adam, 25fps @ 12 x 10; the game was chugging along well enough on my 2 x 8800gtx at that res..

  7. #1357
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    more waiting for more info.

    Those Expreview scores look dodgy Adam, 25fps @ 12 x 10, game was chugging on well enough on my 2 x 8800gtx at that res..
    at what settings?
    i7 3610QM 1.2-3.2GHz

  8. #1358
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by adamsleath View Post
    This is utterly ridiculous. DX10 performance in Crysis is on par with DX9 performance (+/-10%) and you can run much higher settings than that article states. 10x7? Are they retarded? That's completely CPU-bound.. how could SLI help you there?

    Their numbers are too low even for a single 8800GTX though so I think their test is broken somehow..

  9. #1359
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by mao5 View Post
    Japan's IT Media website today brings us the Crysis 1920x1200 Very High test result of NVIDIA's next-generation flagship, the GeForce GTX 280 graphics card.

    According to IT Media, NVIDIA and an anonymous motherboard manufacturer held a secret presentation outside Computex 2008 to show off the performance of the GTX 280.

    Visitors said the demonstration room was very dimly lit. Besides demonstrating the GTX 280 itself, the presentation also showed some of the motherboards compatible with the GTX 280; they had simply been placed on the windowsill of the room.

    IT Media had the opportunity to run GPU-Z, CPU-Z and the Crysis benchmark on the GTX 280 demo system. From the photos, we can clearly see that the presentation system used a 2.66GHz Intel Core 2 Quad, and that the Crysis benchmark at 1920x1200 Very High settings showed the GTX 280 reaching an average of 36.81fps!

    source
    Since I made a mistake and have pre-teens from third-world countries flaming me, I'll delete my post.
    Last edited by Sr7; 06-09-2008 at 02:52 AM. Reason: pre-teens

  10. #1360
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Sr7 View Post
    Look at the cpu-z chart, they are running their CPU cores at 1.6GHz? Their multiplier is too low to even reach the 2.6GHz stock clocks they claim, unless I'm missing something here.

    Based on the L2 cache size, it looks to be a Q6700, so they should have a multiplier of 10, not 6

    CPU-bound @ 1.6GHz anyone?
    lol epic fail....


    try search for EIST

  11. #1361
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Hornet331 View Post
    lol epic fail....


    try search for EIST
    "LOLZOL EPIC FAILZ!11!"

    Wow kid.

    Speedstep is disabled by default on most motherboards... do you have proof that it was on?

    By the way that's a really cool avatar. (?)

  12. #1362
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by Sr7 View Post
    Look at the cpu-z chart, they are running their CPU cores at 1.6GHz? Their multiplier is too low to even reach the 2.6GHz stock clocks they claim, unless I'm missing something here.

    Based on the L2 cache size, it looks to be a Q6700, so they should have a multiplier of 10, not 6

    CPU-bound @ 1.6GHz anyone?


    Have you ever heard of C1E or Enhanced Intel SpeedStep Technology? When idling, the CPU multiplier lowers to 6x to save energy.

    You just reminded me of all the n00bs: "oh noes, i bought a new Core2Duo but cpu-z says it's 1.6ghz halp plzzzz"
    Are we there yet?
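The multiplier arithmetic behind the CPU-Z confusion is easy to sketch: Core 2 chips run at FSB base clock x multiplier, and EIST/C1E drops the multiplier to 6x at idle. The Q6700 figures below are the ones assumed in this thread:

```python
def core_clock_ghz(fsb_mhz: float, multiplier: int) -> float:
    """Effective core clock in GHz for a given FSB base clock and multiplier."""
    return fsb_mhz * multiplier / 1000.0

FSB_MHZ = 266.67  # Q6700 front-side bus base clock (1066 MT/s quad-pumped)

stock = core_clock_ghz(FSB_MHZ, 10)  # stock 10x multiplier -> ~2.67 GHz
idle = core_clock_ghz(FSB_MHZ, 6)    # EIST idle 6x multiplier -> ~1.6 GHz
print(f"stock: {stock:.2f} GHz, EIST idle: {idle:.2f} GHz")
```

So a CPU-Z screenshot showing 1.6GHz on a 2.66GHz-rated quad is consistent with the chip simply idling under SpeedStep, not with it being underclocked for the benchmark run.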

  13. #1363
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Sr7 View Post
    "LOLZOL EPIC FAILZ!11!"

    Wow kid.

    Speedstep is disabled by default on most motherboards... do you have proof that it was on?
    yes proof is the cpu-z screen.

  14. #1364
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Luka_Aveiro View Post


    Have you ever heard about C1E or Enhanced Intel Speedstep Technology? When idling, CPU multiplier lowers to 6x to save energy

    You just reminded me all the n00bs "oh noes, i bought new Core2Duo buts cpu-z says it 1.6ghz halp plzzzz"
    LMFAORLAOMRAL read up a post genius.

  15. #1365
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by Sr7 View Post
    Speedstep is disabled by default on most motherboards... do you have proof that it was on?
    Of course it is not disabled on most boards; where did you get that idea?
    Are we there yet?

  16. #1366
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Hornet331 View Post
    yes proof is the cpu-z screen.
    Isn't the "Specification" field just reading from some preset "stock clock" reading, or is it the actual clocks it's set at?

  17. #1367
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Luka_Aveiro View Post
    Of course it is not disabled on most boards, wth did you get that idea?
    Every nForce board I've had or seen had it disabled. I also had an Intel motherboard on which it was disabled by default.

  18. #1368
    Banned
    Join Date
    Apr 2008
    Location
    Brisbane, Australia
    Posts
    3,601
    Quote Originally Posted by Hornet331 View Post
    lol epic fail....


    try search for EIST
    lol. Oh well, they won't ever forget this moment.

  19. #1369
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by MomijiTMO View Post
    lol. Oh well, they won't ever forget this moment.
    What kind of bogus comment is this? Of course I'm aware of SpeedStep.. but I've never seen it on by default, and I was pretty sure CPU-Z read the clock in the spec line as the "stock rated" clocks, the way some BIOS screens do.. i.e. a description.

    What a bunch of douchebags you are, perusing the forums looking for people to mock when they make a simple mistake... pretty sad really.
    "LOLZ EPIC FAILZ OMFRG LOL"

    Are you guys preteens? ing snobs
    Last edited by Sr7; 06-09-2008 at 02:46 AM.

  20. #1370
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    ohhh noezzz my QX9650 only runs at 2ghz.

    it is a Q6700 running idle... i don't know where you got the idea that it runs crysis at 1.6ghz...
    Attached Thumbnail: CPU-Z.jpg (91.3 KB)

  21. #1371
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Hornet331 View Post
    ohhh noezzz my QX9650 only runs at 2ghz.

    it is a Q6700 running idle... i dont know where you get the idea that it runs crysis at 1,6ghz...
    I've never used SpeedStep. Simple as that.. it's always been disabled, as I've now said *twice* (but it doesn't seem to register for you; maybe it takes three times for people in third-world countries).

    I know of games where SpeedStep was identified as the cause of people showing up as "speedhacking" on certain MMOs I've played. They narrowed it down to SpeedStep in laptops at the time. People I know were getting hassled by the GMs for it.

  22. #1372
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    It's on by default on every Intel mobo I've used... and it's one of the first things to go as soon as I start OC'ing.

    Besides, it raises and lowers the multiplier depending on CPU load, so the chip is certainly at the correct speed when playing a game.

  23. #1373
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Sr7 View Post
    I've never used speedstep. Simple as that.. it's always been disabled as I've now said *twice* (but it doesn't seem to register for you, maybe it takes 3 times for people in third world countries).

    I know games where speedstep was denoted as the cause for people showing up as "speedhacking" on certain MMOs I've played. They narrowed it down to speedstep in laptops at the time. People I know were getting hassled by the GMs for it.
    as zerazax said, it's on by default, and that's why everyone ignores/laughs at your statement.

    On every pc I have encountered so far, C1E was on. Hence the comment Luka_Aveiro made about "noobs" with their 1.6ghz problem.

    If you disable it, fine for you, but that doesn't mean it's off by default.

    The screen at pczilla shows a stock Q6700 @ 2.66ghz, and according to certain people here Crysis is only "half multithreaded", so a quad should be more than enough to satisfy Crysis.
    Last edited by Hornet331; 06-09-2008 at 03:11 AM.

  24. #1374
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Kuwait
    Posts
    616
    i have just tested crysis at 1920x1200, no AA, Very High, 64-bit DX10 on my rig and i get around ~59fps max, ~24fps min and ~37fps avg

    so i guess the GTX 280 is 2x 8800GTX

  25. #1375
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    i have just tested crysis at 1920x1200, no AA, Very High, 64-bit DX10 on my rig and i get around ~59fps max, ~24fps min and ~37fps avg
    that's useful to know.
    i7 3610QM 1.2-3.2GHz
