
Thread: Next Gen VS. Current Gen CPU: Crysis Benchmark

  1. #1
    Xtreme Member
    Join Date
    Jun 2007
    Posts
    324

    Next Gen VS. Current Gen CPU: Crysis Benchmark






    Final thoughts:

    Testing with the Crysis CPU benchmark is a new approach; we hope this test can show the CPUs' actual performance.

    With these scores, AMD can't win this fight easily. We know that the overclocking headroom of these retail AM2+ CPUs is limited: we cannot use the multiplier to OC, and overclocking via the FSB only pushes the frequency to around 2.6~2.7GHz. Please remember that a Phenom X4 that easily OCs to 3GHz is not what will be sold at retail.

    It is not frequency that limits the Phenom X4. Just read all the scores for reference: even overclocked to the same frequency, Phenom still cannot win. Compared to the dual-core E6850 there is still a performance gap.

    But the test also shows some bad news for Intel: its dual-core-to-quad-core improvement is much smaller than AMD's. AMD is catching up, and Intel is still riding its last-generation advantage.


    We want to ask AMD: where is the native quad core's advantage? Phenom X4 goes on sale Nov. 20; we will know more then.
    original post: http://www.expreview.com/news/hard/2...91d6785_1.html

  2. #2
    Banned
    Join Date
    Aug 2007
    Posts
    1,014
    Perhaps Expreview could also ask: what's the advantage of the QX9650? I can't see one at stock.
    Also, other non-Intel benches would be fine as well. I simply cannot trust a game which says "play to win with Intel" or "best played with NVIDIA"...
    too biased for me.

    3DMark06 or 3DMark05 for raw power would be good.
    Last edited by BeardyMan; 11-05-2007 at 01:57 AM.

  3. #3
    Xtreme Member
    Join Date
    Jun 2007
    Posts
    324
    Quote Originally Posted by BeardyMan View Post
    Perhaps Expreview could also ask: what's the advantage of the QX9650? I can't see one at stock.
    Also, other non-Intel benches would be fine as well. I simply cannot trust a game which says "play to win with Intel" or "best played with NVIDIA"...
    too biased for me.

    3DMark06 or 3DMark05 for raw power would be good.
    Also too biased for me. I think all those quads are a waste of money...
    dual still rules (at least for now).
    ...disappointing showing from all the quad cores.

  4. #4
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    Those are pretty poor memory timings, dude; AFAIK slow timings hurt AMD more than Intel. Why are you using such loose timings? Hell, even my crappy Corsair Value Select DDR2-667 can do 1000MHz @ 4-4-4-12 2T @ 2.3V... did you even try to tighten up the timings at all?

    Still, assuming a ~10% gain with better timings for AMD as opposed to only ~5% for Intel, the X4 would still be ~20% slower clock for clock than C2Q. Not good.
    Last edited by Epsilon84; 11-05-2007 at 02:32 AM.
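    As a quick sanity check of that estimate: taking the 3GHz averages quoted later in this thread (Phenom 64 vs Kentsfield 80) and applying the assumed ~10%/~5% timing gains, the adjusted gap comes out around 16~19% depending on which side you divide by, roughly in line with the ~20% figure.

```python
# Rough sanity check of the timings-adjusted clock-for-clock gap.
# The 64/80 averages are the 3GHz scores quoted in this thread;
# the 10%/5% gains from tighter timings are assumptions, not measurements.
phenom_avg, c2q_avg = 64.0, 80.0

phenom_adj = phenom_avg * 1.10   # assumed ~10% gain for AMD from tighter timings
c2q_adj = c2q_avg * 1.05         # assumed ~5% gain for Intel

x4_slower = (1 - phenom_adj / c2q_adj) * 100   # gap relative to C2Q
c2q_faster = (c2q_adj / phenom_adj - 1) * 100  # gap relative to X4
print(f"X4 slower by {x4_slower:.0f}%, C2Q faster by {c2q_faster:.0f}%")
```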

  5. #5
    Xtreme Cruncher
    Join Date
    Oct 2006
    Location
    1000 Elysian Park Ave
    Posts
    2,669
    Quote Originally Posted by BeardyMan View Post
    Perhaps Expreview could also ask: what's the advantage of the QX9650? I can't see one at stock.
    Also, other non-Intel benches would be fine as well. I simply cannot trust a game which says "play to win with Intel" or "best played with NVIDIA"...
    too biased for me.

    3DMark06 or 3DMark05 for raw power would be good.
    True, but it's nice to finally see some game benchies
    i3-8100 | GTX 970
    Ryzen 5 1600 | RX 580
    Assume nothing; Question everything

  6. #6
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
    Link is broken..:

    The requested URL /news/hard/2007-11-05/1194231291d6785_1.html was not found on this server.

  7. #7
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    Quote Originally Posted by BeardyMan View Post
    Perhaps Expreview could also ask: what's the advantage of the QX9650? I can't see one at stock.
    Also, other non-Intel benches would be fine as well. I simply cannot trust a game which says "play to win with Intel" or "best played with NVIDIA"...
    too biased for me.

    3DMark06 or 3DMark05 for raw power would be good.
    And when AMD loses in 3DMark as well... would you be calling for specfp_rate benches?

  8. #8
    Banned
    Join Date
    May 2005
    Location
    Belgium, Dendermonde
    Posts
    1,292
    What is the resolution of the test?

    I thought Crysis was GPU-bound?

  9. #9
    Banned
    Join Date
    Aug 2007
    Posts
    1,014
    Quote Originally Posted by Epsilon84 View Post
    And when AMD loses in 3DMark as well... would you be calling for specfp_rate benches?
    I couldn't care less about specfp_rate.

    I just hate it when sites use well-known biased games as a reference for performance.

  10. #10
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    The same as before: in the AMD test bed we are using the upcoming RD790 mobo, and in the Intel test bed we are using P35. All settings, including the games', are at default. We are using a GF 8800GTX with the 169.02 WHQL driver; the integrated sound card uses the Realtek 1.80 driver, and the OS is XP SP2 (DX9).
    Does it matter what res?

    The test systems use the same graphics card and the same settings.

    I think this is the fairest comparison I have seen yet.

    The bottom line for me:
    Phenom X4 @ 3.0GHz average: 64
    C2Q Kentsfield @ 3GHz average: 80
    C2Q Yorkfield @ 3GHz average: 81

    A 25 percent lead for the Intel platform vs the AMD platform, clock for clock.
    Last edited by adamsleath; 11-05-2007 at 02:58 AM.
    i7 3610QM 1.2-3.2GHz
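    The "25 percent" lead quoted above is just the ratio of the reported averages; a two-line check:

```python
# Clock-for-clock platform lead from the averages quoted above.
phenom, kentsfield, yorkfield = 64, 80, 81   # Crysis CPU bench averages @ 3GHz

print(f"Kentsfield lead: {(kentsfield / phenom - 1) * 100:.0f}%")  # 25%
print(f"Yorkfield lead:  {(yorkfield / phenom - 1) * 100:.0f}%")
```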

  11. #11
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Paris
    Posts
    275
    Makes me all the more decided to buy an E8400 rather than a Q9450 when they're released, if they OC like mad.
    ASUS P8P67
    i7 2600K 3.4GHz @ 4.6GHz
    Twintech 8800 GT 512Mo Samsung (vgpu modded)
    Crucial Ballistix DDR3 C7 2 * 2Go
    2 * WD VelociRaptor 150Go RAID 0
    2 * Samsung Spinpoint F3 1To RAID 0
    Creative Sound Blaster Audigy 2 ZS
    Seasonic S12 600HT


    WC :

    1A-SL2 CPU // 1A-SL2 GPU (home made fix)
    Eheim 1048 + magicool 25
    2 * Black Ice Pro 3 serial
    Tygon 3603 + glycoshell

  12. #12
    Xtreme Enthusiast
    Join Date
    Apr 2005
    Posts
    894
    In the AMD test bed we are using the upcoming RD790 mobo, and in the Intel test bed we are using P35. All settings, including the games', are at default. We are using a GF 8800GTX with the 169.02 WHQL driver; the integrated sound card uses the Realtek 1.80 driver, and the OS is XP SP2 (DX9).
    No resolution given, though...

    Edit: saw the above post.

    The normal setting is 1024*768.

    What's the point of testing at that res?
    Gaming: SaberThooth X79,3930k,Asus6970DCII_Xfire,32gb,120OCZV3MaxIOPS, ThermaTake Chaser MK1
    HTPC:AMD630,ATI5750,4gb,3TB,ThermalTake DH103
    Server: E4500,4GB,5TB
    Netbook: Dell Vostro 1440

  13. #13
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Yeah, I wonder what settings were used in the game; I'd prefer to see a 1024x768 or 1280x1024, low-quality comparison.

    Quote Originally Posted by gillll View Post
    normal setting is 1024*768.

    what's thee point of testing at those res ?
    This has been explained many times before. The lower the resolution, the less impact GPU performance has on the test and the more load is put on the CPU. If you want to do a proper CPU performance test, you don't want other factors to influence the result; you want to measure exactly that, CPU performance. There are many applications out there that don't use the GPU at all, so why run such a comparison with a GPU influence that shows a smaller gap between the competitors than actually exists, once GPU performance starts to skew the test?

    I just wish reviewers started using the PCSX2 PS2 emulator, which I'm a beta tester for. It is an excellent app for testing real CPU performance: the GPU's impact can be reduced to zero, and it doesn't even require the exact same graphics card, only one fast enough not to become the bottleneck (which depends on the CPU, the clock speed, and the game being tested). For some games, say Disgaea or Final Fantasy X, even a comparison such as an E6600 @ 2.4GHz paired with an X1900 Pro vs an AMD X2 6000+ with an 8800GTX would be fair in this emulator, since in both cases the GPUs are fast enough not to bottleneck and only CPU + system/mobo performance is measured. Mobo differences are something you'll never get away from, but in this emulator they amount to less than ~3%. The 8800GTX won't gain a single FPS over the X1900 Pro in this case, because these less GPU-demanding titles are bottlenecked by the CPU; in other words, either CPU or GPU performance is what's being measured. Increasing the render target to 4x and enabling bilinear filtering in the graphics plugin raises the GPU load significantly, and then the X1900 Pro would become the bottleneck.

    I also want to mention that the performance increase you get from overclocking in this emulator scales linearly with the overclock, as long as the GPU doesn't become the bottleneck. Comparing the Opteron 165 and Intel E6750 in my signature, there's about a ~40% clock-for-clock advantage for the Intel CPU; at Opteron 165 @ 2.8GHz vs E6750 @ 3.75GHz there's usually a 72~75% FPS difference (~34% clock-speed difference plus ~40% IPC advantage for the E6750). I seldom see another benchmark showing such an IPC advantage for Intel's Core 2 Duo over the AMD K8 dual-core architecture, but that's because PCSX2 is a pure CPU performance test.
    I will do a Conroe vs Penryn comparison in this application when I upgrade to Penryn, to see how big an IPC advantage it shows; if typical benchmarks show 5~10% differences, I wouldn't be too surprised if PCSX2 shows up to 12~15%.

    So, having said that, I prefer low-res, low-quality comparisons in PC games when testing CPU performance. I still don't think PC games are the optimal test, but at low res + low details they are at least a very good indication. There are apps you use that don't touch the GPU at all, so I don't think it's wise to test PC games at high res if you want a CPU performance comparison.

    If you want overall system performance, with all the hardware measured at once, then PC games at higher res and details are more interesting; but not when doing a REAL CPU performance test.

    The Crysis demo is so GPU-dependent that I think the GPU's impact is too big even at quite low resolutions.
    Last edited by RPGWiZaRD; 11-05-2007 at 03:34 AM.
    Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place
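    One nit on the arithmetic above: clock-speed and IPC advantages compound multiplicatively, not additively, so taking the reported 72~75% FPS gap at face value implies roughly a ~30% IPC advantage at those clocks (a separate ~40% IPC edge times a ~34% clock edge would compound to ~87%). A small sketch, using the clocks from the post and assuming the midpoint of the reported gap:

```python
# Clock-speed and IPC advantages compound multiplicatively, not additively.
opteron_ghz, e6750_ghz = 2.8, 3.75        # clocks from the post above
clock_ratio = e6750_ghz / opteron_ghz     # ~1.34, i.e. ~34% clock advantage

fps_ratio = 1.74                          # assumed midpoint of the 72~75% FPS gap
ipc_ratio = fps_ratio / clock_ratio       # implied clock-for-clock (IPC) advantage

print(f"clock advantage: {clock_ratio - 1:.0%}")       # ~34%
print(f"implied IPC advantage: {ipc_ratio - 1:.0%}")   # ~30%
```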

  14. #14
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    Doesn't testing at low res show off CPU power, since at low res the graphics card is less of a factor?

  15. #15
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
    Is anyone else getting a 404 when clicking on the news link?

  16. #16
    Banned
    Join Date
    Aug 2007
    Posts
    1,014
    Supreme Commander would be a good CPU indicator regardless of the resolution used.

  17. #17
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    Quote Originally Posted by RPGWiZaRD View Post
    Yea I wonder what settings were used in the game, I would prefer seeing 1024x768 - 1024x1280, low quality comparision.
    Well, if you open the link, it says they are using the built-in Crysis CPU benchmark @ default settings. I haven't downloaded the Crysis demo, so to those who have: what exactly are the default settings in the CPU benchmark? I'd assume it's a lower res like 1280x1024?

  18. #18
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Even at 1280x1024 this game won't be a good CPU performance indicator though.

  19. #19
    Xtreme Enthusiast
    Join Date
    Aug 2007
    Location
    Warren,MI
    Posts
    561
    i am seriously beginning to doubt this guy.
    cpu- Intel I7 3930K
    Asus P9x79 Deluxe
    2x HD7970
    32gb ddr3-1600
    corsair ax1200
    Corsair 800D
    Corsair H100 lapped
    2x 128gb M4 raid 0

  20. #20
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
    I found this interesting topic over at the TechPowerUp forums. It concerns the low minimum fps the AMD Phenom had in earlier news posts (around 8.6fps, if I recall correctly).
    Now look at this post from a C2D user with a CPU clocked way up to 3.6GHz:
    http://forums.techpowerup.com/showthread.php?p=509527

    The minimum fps occurs at almost exactly the same frame, and it is 8.89 fps!
    So C2D, we saw, has almost the same scores as C2Q (Penryn-based too), and in this benchmark this user got practically the same low minimum fps as the Phenom did in the Expreview benchmark. What gives??



    Now look at this post in the same thread:
    http://forums.techpowerup.com/showpo...4&postcount=20

    The user LiNKiN is running a dual-core (Toledo) Athlon X2 @ 2.85GHz and is getting a minimum fps in the CPU test locked at 38fps?? The trick seems to lie in the CPU/Med/Low/High settings. The user then posted another set of screens showing that the minimum frames per second depend heavily on the setting used in the CPU benchmark!
    Screenshot (min 38fps/Low):

    Look at the other shots and see for yourself how the minimum fps is all over the chart (depending on the settings in the demo)!

    More, this time a heavily OCed C2D @ 3.7GHz scoring 10 fps minimum in the CPU test, here:
    http://forums.techpowerup.com/showpo...3&postcount=28

    So we now have one Phenom @ 3GHz with a minimum frame rate of ~8fps, one C2D @ 3.6GHz with the same minimum, one Toledo @ 2.85GHz with 36.5fps in the same test, and a range of Intel C2Ds and C2Qs at Expreview that also get above 20fps in the minimum segment of the CPU test. It all seems very dependent on the in-demo setting used, no matter whether you ran the CPU or GPU "benchmark"!

    Anybody else seeing how "problematic" this "benchmark" is?
    I am still looking into the "GPU" part of the tests, trying to see if there are discrepancies there as well...
    Last edited by informal; 11-05-2007 at 03:44 AM.

  21. #21
    Xtreme Addict
    Join Date
    Nov 2006
    Posts
    1,402
    That game is optimized for C2Q; Crytek has already said as much.

    Nothing very interesting.

    And the demo isn't even optimized for the GPU...

    It is too early to say anything. I'm sure Intel can't be faster next year.

    Edit: and this is a fake, thx... that's very useful.

    The Phenom is always at HT1 too (only 4*200... lol)
    Last edited by madcho; 11-05-2007 at 03:48 AM.

  22. #22
    Xtreme Member
    Join Date
    Jun 2005
    Location
    Bulgaria, Varna
    Posts
    447
    It is obvious that such ridiculously loose memory timings are kneecapping the Phenom, but there is more on the table: the memory-partition access seems to have been set to 2*64-bit mode, which adds an additional latency penalty to the already hogged memory pathway! The 1*128-bit (legacy) mode should be used here, as it is better suited to streaming large chunks of data, whereas the new 2*64-bit mode is targeted at extremely multi-threaded environments, and CryEngine is certainly no such animal.

  23. #23
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
    Guys, just read my previous post...

  24. #24
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,366
    Quote Originally Posted by informal View Post
    Guys, just read my previous post...
    Just read the link in the first post:
    "Notice: We have done a re-test of these CPUs compared to the earlier article, due to driver and system-setting issues."

  25. #25
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    Quote Originally Posted by RPGWiZaRD View Post
    Even at 1280x1024 this game won't be a good CPU performance indicator though.
    So how do you explain the big spread in performance between the various CPUs?

    Any game benchmark that removes the GPU as the bottleneck is a 'good' CPU benchmark. Not 'real world', but it does show the potential of the CPU.

