
Thread: Crysis demo with G8800GTX benchmark - Is this true ???

  1. #26
    Xtreme Member
    Join Date
    Jun 2006
    Posts
    289
    Guys, Jesus, can't you see past the test?

    What are the demos running on at the live demonstrations of Crysis at exhibitions if 88's can barely do 40 fps, lol?

    The demos are run at like 2560x1600 on 30" screens at uber-high fps, so either they're running not-yet-released R600s or the above test is bull/incomplete/incorrectly compiled!

  2. #27
    Registered User
    Join Date
    May 2006
    Posts
    58
    but wasn't the demonstration dx9??

  3. #28
    Xtreme Member
    Join Date
    Aug 2006
    Posts
    132
    There is nothing to worry about. The lead designer has an AMD Athlon 64 3500+ with 2GB RAM and a 1900XT; he has played the game on his rig and said it ran great with most settings on. It has been put up on the website that he said it ran very well at reasonably high settings (still unoptimized).

    Fast single cores with a 1900XT/XTX, 7900GTX etc. can expect anything from 1024x768 High to 1280x1024 near-max settings without AA, with low physics/sound.

  4. #29
    Xtreme Addict
    Join Date
    May 2006
    Location
    Herbert's House in Family Guy
    Posts
    2,381
    1. The newest Nvidia Vista driver, 100.54, doesn't even support SLI on Vista, let alone 100.31. So that means it was running on one card only, just a single GeForce 8800GTX.

    2. Vista gaming performance is slower than XP's; give it some time.

    3. They didn't try running it on XP. Why?

    Quote Originally Posted by akshayt
    There is nothing to worry about. The lead designer has an AMD Athlon 64 3500+ with 2GB RAM and a 1900XT; he has played the game on his rig and said it ran great with most settings on. It has been put up on the website that he said it ran very well at reasonably high settings (still unoptimized).

    Fast single cores with a 1900XT/XTX, 7900GTX etc. can expect anything from 1024x768 High to 1280x1024 near-max settings without AA, with low physics/sound.
    Link??
    E6600 @ 3.6
    IN9 32x MAX
    EVGA 8800Ultra
    750W

  5. #30
    Xtreme Member
    Join Date
    Mar 2005
    Location
    Snowdonia
    Posts
    166
    Core 2 QX6700 2.66GHz ... $1000
    2x 8800GTX ... $1200
    4GB DDR2 ... $800

    Running Crysis on cutting-edge hardware at 1280x1024 at 17.1 fps ... Priceless

  6. #31
    Xtreme Member
    Join Date
    Apr 2006
    Location
    Taupo
    Posts
    493
    Oh god, I hope that isn't the case... I was looking forward to getting away with an 8600U at 1440x900.
    Intel i5 2500k @ 4GHz || Gigabyte Z68 UD3 || Vengeance 8GB 8-8-8-24 1T || CM Stacker 810
    ATI 4850 || Samsung Spinpoint F3 x 2 || SilverStone ST60F
    Asus DVD RW +- || Logitech G15 || MS Intel-eye 1.1|| CMV 19" Wide-Screen || Scythe Ninja

  7. #32
    Xtreme Enthusiast
    Join Date
    Dec 2004
    Location
    United Kingdom, South East England Kent
    Posts
    741
    Come on guys, we all know Crytek made this game with NO compromises. It's not meant to work well at the highest quality on the average guy's PC. It's meant to beat the hell out of the best. And the average guy can play at 640 res :P

  8. #33
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,838
    Quote Originally Posted by PallMall
    does this game benefit from quad core? otherwise bottleneck for sure.
    Yes, it is optimised for quad core.
    Also, the game doesn't have Ageia PPU support, but the PC they used just happened to have a PPU.
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis

  9. #34
    Xtreme Member
    Join Date
    Jun 2006
    Posts
    289
    @ //porre, they've had both DX9 and DX10 demos for a while now.

    @ DeltZ, CRYTEK made the game to be scalable to any hardware. If they made it run properly only on high-end systems they'd lose A LOT of cash, and EA simply doesn't allow that.

  10. #35
    all outta gum
    Join Date
    Dec 2006
    Location
    Poland
    Posts
    3,390
    I call that review bull.
    All the evidence speaks against it.
    www.teampclab.pl
    MOA 2009 Poland #2, AMD Black Ops 2010, MOA 2011 Poland #1, MOA 2011 EMEA #12

    Test bench: empty

  11. #36
    Xtreme Member
    Join Date
    Dec 2003
    Location
    England, south east
    Posts
    181
    Quote Originally Posted by Fatal Error
    Core 2 QX6700 2.66GHz ... $1000
    2x 8800GTX ... $1200
    4GB DDR2 ... $800

    Running Crysis on cutting-edge hardware at 1280x1024 at 17.1 fps ... Priceless
    LOL, so true.

  12. #37
    Xtreme Enthusiast
    Join Date
    Jan 2004
    Location
    London, UK
    Posts
    650
    Quote Originally Posted by grimREEFER
    Yes, it is optimised for quad core.
    Link for that info? I didn't know it was quad-core optimised. Is it quad-core specific or just multi-core? (i.e. 1 core < 2 < 4, etc.)

    G

  13. #38
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Bay Area, CA
    Posts
    1,331
    Quote Originally Posted by akshayt
    There is nothing to worry about. The lead designer has an AMD Athlon 64 3500+ with 2GB RAM and a 1900XT; he has played the game on his rig and said it ran great with most settings on. It has been put up on the website that he said it ran very well at reasonably high settings (still unoptimized).
    Oh yeah, nothing to worry about; it's not like the "lead designer" has any stake in the game or anything.

    Do you really think the devs will say their game runs poorly? They will ALWAYS say it runs well at "reasonable" settings, as "reasonable" is completely arbitrary.

    Quote Originally Posted by DTU_XaVier
    We should remember, as seen in the past, 8800GTX does NOT take a very big hit when applying AF and AA...
    Maybe not "very big", but it is definitely substantial, especially in non-SLI configs.



    Quote Originally Posted by korda
    Not really, SLI ain't working in Vista yet. Well, perhaps with newer drivers, but not with 100.30.
    Yeah, even with the newer 100.5X drivers SLI still isn't enabled. Essentially these graphs don't give us much insight into how well Crysis will run on an 8800GTX SLI config. The graphs may not even be representative of single-8800GTX performance...
    Last edited by J-Mag; 01-29-2007 at 12:00 PM.

  14. #39
    Xtreme Enthusiast
    Join Date
    Dec 2004
    Location
    United Kingdom, South East England Kent
    Posts
    741
    Quote Originally Posted by n-sanity
    @ //porre, they've had both DX9 and DX10 demos for a while now.

    @ DeltZ, CRYTEK made the game to be scalable to any hardware. If they made it run properly only on high-end systems they'd lose A LOT of cash, and EA simply doesn't allow that.
    Fair enough... except Crytek doesn't belong to EA... at least not the last time I knew...

  15. #40
    Xtreme Enthusiast
    Join Date
    Apr 2006
    Location
    Wisconsin, USA
    Posts
    599
    Quote Originally Posted by //porre
    but wasn't the demonstration dx9??
    No, did you even watch that last trailer? It was definitely DX10

  16. #41
    Xtreme Cruncher
    Join Date
    Nov 2005
    Location
    Rhode Island
    Posts
    2,740
    Quote Originally Posted by DeltZ
    Fair enough... except Crytek doesn't belong to EA... at least not the last time I knew...
    EA is the publisher for Crysis though.
    Fold for XS!
    You know you want to

  17. #42
    Registered User
    Join Date
    May 2006
    Posts
    58
    Quote Originally Posted by Drunner611
    No, did you even watch that last trailer? It was definitely DX10
    I was talking about the demonstration #27 was referring to...

  18. #43
    Xtreme Mentor
    Join Date
    Aug 2006
    Location
    HD0
    Posts
    2,646
    I'd wait until better drivers are out before making judgements...

  19. #44
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by J-Mag
    Oh yeah, nothing to worry about; it's not like the "lead designer" has any stake in the game or anything.

    Do you really think the devs will say their game runs poorly? They will ALWAYS say it runs well at "reasonable" settings, as "reasonable" is completely arbitrary.

    Maybe not "very big", but it is definitely substantial, especially in non-SLI configs.

    Yeah, even with the newer 100.5X drivers SLI still isn't enabled. Essentially these graphs don't give us much insight into how well Crysis will run on an 8800GTX SLI config. The graphs may not even be representative of single-8800GTX performance...
    Not only that, but the current Vista driver is also an average of 30 fps lower in every game compared to the current XP driver.

    I'm claiming 100% BS on these benchmarks, considering every showing of the game as of late has been on 8800s.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  20. #45
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Bay Area, CA
    Posts
    1,331
    Quote Originally Posted by DilTech
    Not only that, but the current Vista driver is also an average of 30 fps lower in every game compared to the current XP driver.

    I'm claiming 100% BS on these benchmarks, considering every showing of the game as of late has been on 8800s.
    Have you seen any good gaming comparisons of Vista and XP? Most forum posts I have seen are people posting 3dmark comparisons or just Vista frame rates of games...

  21. #46
    Xtreme Enthusiast
    Join Date
    Apr 2004
    Posts
    703
    Why did they have the Ageia PhysX card in the setup? Crysis is supposed to use Crytek's own physics engine.
    A wise man once said, "If the Bible proves the existence of God, then comic books prove the existence of superheroes."

  22. #47
    Xtreme Addict
    Join Date
    Nov 2005
    Location
    Where the Cheese Heads Reside
    Posts
    2,173
    Quote Originally Posted by J-Mag
    Have you seen any good gaming comparisons of Vista and XP? Most forum posts I have seen are people posting 3dmark comparisons or just Vista frame rates of games...
    Agreed. I have no issues with Vista, and most games I've run play perfectly: Oblivion, Rainbow Six Vegas, GRAW, X3, HL2, AoE III, Dungeon Siege II, Neverwinter Nights 2, Supreme Commander. That's just a few; most hold their FPS compared to XP, and some (SC, that I know of) actually get a slight FPS boost.
    -=The Gamer=-
    MSI Z68A-GD65 (G3) | i5 2500k @ 4.5Ghz | 1.3875V | 28C Idle / 65C Load (LinX)
    8Gig G.Skill Ripjaw PC3-12800 9-9-9-24 @ 1600Mhz w/ 1.5V | TR Ultra eXtreme 120 w/ 2 Fans
    Sapphire 7950 VaporX 1150/1500 w/ 1.2V/1.5V | 32C Idle / 64C Load | 2x 128Gig Crucial M4 SSD's
    BitFenix Shinobi Window Case | SilverStone DA750 | Dell 2405FPW 24" Screen
    -=The Server=-
    Synology DS1511+ | Dual Core 1.8Ghz CPU | 30C Idle / 38C Load
    3 Gig PC2-6400 | 3x Samsung F4 2TB Raid5 | 2x Samsung F4 2TB
    Heat

  23. #48
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Grand Forks, ND (Yah sure, you betcha)
    Posts
    1,266
    I totally concur with the pre-alpha software + incomplete drivers (first DX10 driver + no SLI) thoughts, but it may still be cause for worry. Perhaps the 8800GTX doesn't handle DX10 well, or perhaps Crysis is just going to be a killer game on gfx cards. All possible.

    Need I remind y'all, though, that when Far Cry came out in March of 2004, the greatest cards available at that time (9800XT/5950) could barely run 1600x1200 (today's 1920x1200) over 30 FPS, if that, with no AA/AF. The generation that came after (X800/6800) could barely break the 40s, if that, with 4x/8x. It's not inconceivable that Crysis will bust the 8800GTX's (and R600's) balls, even when all the bugs are ironed out.

    But really, would you want it any other way? Crytek makes glorified playable tech demos, meant to be barely playable when released but pretty as hell to look at. Personally, I hope it's not playable at ultra-high rez until the following gen, for the simple reason that it could be gorgeous now, but more gorgeous later, just like Far Cry was then and games like Oblivion are now.
    Last edited by turtle; 01-29-2007 at 04:39 PM.
    That is all.

    Peace and love.

  24. #49
    Xtreme Cruncher
    Join Date
    Mar 2005
    Posts
    861
    QFT! I remember getting a 6800GT just to make Far Cry workable. The 7800GT and now the 8800GTS eat it up. I'm not worried; I'm quite certain that when it releases I can finally go SLI and it will run well.
    Bloodrage || 920 @ 3.2Ghz || TRUE Black
    3x 2GB HyperX 2000 || @ 2000Mhz 7.7.7
    2x 300GB WD VR Raid 0 || 2x 2TB Samsung F3 Raid 0
    LG 10x BD-R || LG 22x DVD/RW
    MSI GE 470 || LG 246WP
    Sonar X-Fi || Klipsch 5.1
    Lycosa || Mamba || Exact Mat
    CM ATCS 840 || Seasonic M12D
    Server 2008 R2 x64

  25. #50
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Bay Area, CA
    Posts
    1,331
    Quote Originally Posted by turtle
    But really, would you want it any other way? Crytek makes glorified playable tech demos, meant to be barely playable when released but pretty as hell to look at. Personally, I hope it's not playable at ultra-high rez until the following gen, for the simple reason that it could be gorgeous now, but more gorgeous later, just like Far Cry was then and games like Oblivion are now.
    Yeah, good call. I hate the whiners that pop out every time a big-name game is released, whining and moaning about performance. At first we only had High settings, but Ultra settings were introduced in D3 to avoid the whiners, because basically no one could play D3 at Ultra when it was released. This still didn't solve the problem, because you still see people complaining about not being able to run at the "highest" settings just because they have the newest card.

    I think ini editing is the way devs should expose "ultra" settings; this cons all the n00bs into thinking they are playing at the "highest" settings. Then later on down the road, when new hardware comes out, the devs could always provide a patch that adds official ultra settings...
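    Purely as an illustration of that idea (the file name, section, and preset names here are made up for the example, not anything Crytek actually ships), a launcher could read a preset name from the ini and quietly accept tiers the options menu never lists:
    Code:
    # Hypothetical sketch: the options menu only offers the presets it knows
    # about, while an extra hand-edited tier lives only in the ini file.
    from configparser import ConfigParser

    MENU_PRESETS = ["low", "medium", "high"]            # exposed in the menu
    HIDDEN_PRESETS = {"ultra": {"texture_detail": "4",  # ini-only tier
                                "shadow_quality": "4",
                                "view_distance": "8000"}}

    config = ConfigParser()
    config.read("game_settings.ini")                    # hypothetical file name

    preset = config.get("video", "preset", fallback="high")
    if preset in MENU_PRESETS:
        print(f"Menu preset selected: {preset}")
    else:
        # Anything outside the menu list is treated as a hand-edited tier;
        # a later patch could simply promote it into MENU_PRESETS.
        overrides = HIDDEN_PRESETS.get(preset, {})
        print(f"Hidden preset '{preset}' with overrides: {overrides}")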
