
Thread: More Crysis 3 CPU benchmarks

  1. #26
    Xtreme Member
    Join Date
    Aug 2008
    Location
    Poland
    Posts
    199
    Quote Originally Posted by bhavv View Post
    And if you play at higher resolutions, CPUs still bottleneck GPU performance, so it's still a fair method of analysing CPU performance.

    Testing at these low resolutions is stupid, as are the reasons that people want me to believe as to why it is a valid way of testing CPU performance.

    No one puts a GTX 680 and a current 4-8 core CPU together only to play games at 1024x768. Show some real results from at least 1080p.
    So we're clear on that one. PCLab alone did tests in 3 places at 2 resolutions, while PCGH tested ONE location at ONE resolution, which is still useless. Now tell me more about "PCGH being no. 1". Ridiculous.

    EOT

  2. #27
    Xtreme Addict
    Join Date
    Mar 2010
    Posts
    1,079
    Quote Originally Posted by gosh View Post
    More threads = AMD is better
    One main thread that does most of the work = Intel is better
    The FX-8350 can handle up to eight threads (kind of).
    Intel's 3770K can handle eight threads.
    Care to explain why AMD is better when it comes to handling several threads?

    It's all about the coding. Crytek just coded the game for AMD's CPUs.
    As stated before, a patch for Intel CPUs might well turn the graphs.

    Although it would be cool if AMD were better, just for the sake of competition.

  3. #28
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by bhavv View Post
    And if you play at higher resolutions, CPUs still bottleneck GPU performance, so it's still a fair method of analysing CPU performance.

    Testing at these low resolutions is stupid, as are the reasons that people want me to believe as to why it is a valid way of testing CPU performance.

    No one puts a GTX 680 and a current 4-8 core CPU together only to play games at 1024x768. Show some real results from at least 1080p.
    Man... do we have to go over this in every goddamn benchmark thread?

    Both low and high resolution tests have a place. As said, a low resolution test is a pure CPU benchmark and has a right to exist. So let's see what happens when a new GPU comes out: one CPU is already close to its limit while the other can still provide more headroom and put the new GPU to better use.

  4. #29
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Quote Originally Posted by Hornet331 View Post
    As said, a low resolution test is a pure CPU benchmark
    I disagree. It's really not.

    If you want to purely test CPUs, then you use synthetic benchmarks, not games. If you want to test game performance, then you run settings that people would actually run.
    Last edited by Mungri; 02-24-2013 at 11:34 AM.

  5. #30
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Location
    Hawaii
    Posts
    611
    Quote Originally Posted by bhavv View Post
    I disagree. It's really not.

    If you want to purely test CPUs, then you use synthetic benchmarks, not games. If you want to test game performance, then you run settings that people would actually run.
    I believe he meant that it was pure as far as the game is concerned. If you reduce the graphical load, you ensure that the GPU will not be slowing your test.

  6. #31
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    As far as games are concerned, 1024x768 testing is meaningless as it doesn't show the real world performance that gamers will be getting at 1080p and up.

    And anyone that's running lower than 1080p isn't going to have specs like those in the test rig.

    In this one case for Crysis 3, I don't think that any of the results are valid, because it seems like another case of early release performance bugs and poor optimization, and the amount of difference between the tests is just too random to pinpoint a single reason.

    Quote Originally Posted by El Maño View Post
    It's all about the coding. Crytek just coded the game for AMD's CPUs.
    As stated before, a patch for Intel CPUs might well turn the graphs.
    Exactly, there doesn't seem to be as much optimization for Hyper-Threading. The results are not conclusive of which CPU is better, only of which CPU certain parts of the game are currently better optimized for.
    Last edited by Mungri; 02-24-2013 at 12:42 PM.

  7. #32
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by El Maño View Post
    The FX-8350 can handle up to eight threads (kind of).
    Intel's 3770K can handle eight threads.
    Care to explain why AMD is better when it comes to handling several threads?
    The 3770K handles threads better than the 3570K, and it is more expensive. Its cache is also bigger than the 3570K's, but not as large as the FX-8350's. Cache is important if games are using a lot of memory.

  8. #33
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Location
    Hawaii
    Posts
    611
    Quote Originally Posted by bhavv View Post
    As far as games are concerned, 1024x768 testing is meaningless as it doesn't show the real world performance that gamers will be getting at 1080p and up.

    And anyone that's running lower than 1080p isn't going to have specs like those in the test rig.

    In this one case for Crysis 3, I don't think that any of the results are valid, because it seems like another case of early release performance bugs and poor optimization, and the amount of difference between the tests is just too random to pinpoint a single reason.

    Exactly, there doesn't seem to be as much optimization for Hyper-Threading. The results are not conclusive of which CPU is better, only of which CPU certain parts of the game are currently better optimized for.
    Testing at 1024x768 is not supposed to show real world performance; it shows variable-limited performance. It's showing the effect of the CPU alone (as best they can) on frame rates. Looking at the results, it just seems like Crysis 3 loves it some threads.

  9. #34
    Xtreme Enthusiast
    Join Date
    Aug 2005
    Posts
    519
    Come on guys, we've been over this 123314 times before. If your CPU gets 100 fps at a lower resolution, it might get similar fps at a higher one if your GFX is strong enough. On the other hand, if it can only do 60 fps at a lower res, what do you think it will do at a higher res?

    If you are using a GF 8400 GS you might as well have a Celeron. As some guy called Keiichi Tsuchiya said, it's all about the balance.
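
    That reasoning can be written down as a tiny model. A minimal sketch (my own illustration, with hypothetical numbers): the frame rate you actually see is capped by whichever side is slower, the CPU's ceiling (what a low-res test measures) or the GPU's ceiling at your resolution and settings.

    # Hypothetical cap model: observed fps = the slower of the CPU ceiling and the GPU ceiling.
    def observed_fps(cpu_cap, gpu_cap):
        # cpu_cap: fps the CPU can drive when the GPU is never the limit (low-res test)
        # gpu_cap: fps the GPU can render at a given resolution/settings
        return min(cpu_cap, gpu_cap)

    # CPU A tops out at 100 fps, CPU B at 60 fps (both measured at low res).
    for gpu_cap in (40, 70, 120):  # weak, mid-range, and strong graphics card
        print(gpu_cap, observed_fps(100, gpu_cap), observed_fps(60, gpu_cap))
    # With the weak card both CPUs show 40 fps; the gap only opens up as the GPU ceiling rises.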
    2x Dual E5 2670, 32 GB, Transcend SSD 256 GB, 2xSeagate Constellation ES 2TB, 1KW PSU
    HP Envy 17" - i7 2630 QM, HD6850, 8 GB.
    i7 3770, GF 650, 8 GB, Transcend SSD 256 GB, 6x3 TB. 850W PSU

  10. #35
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Darakian View Post
    Testing at 1024x768 is not supposed to show real world performance; it shows variable-limited performance. It's showing the effect of the CPU alone (as best they can) on frame rates. Looking at the results, it just seems like Crysis 3 loves it some threads.
    Quote Originally Posted by R101 View Post
    Come on guys, we've been over this 123314 times before. If your CPU gets 100 fps at a lower resolution, it might get similar fps at a higher one if your GFX is strong enough. On the other hand, if it can only do 60 fps at a lower res, what do you think it will do at a higher res?

    If you are using a GF 8400 GS you might as well have a Celeron. As some guy called Keiichi Tsuchiya said, it's all about the balance.
    Well, at least some people get it, before whining "b-b-but it's not real world".

  11. #36
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Even if you're right, as many people have asked - WHERE are the 1080p+ CPU comparison results so they can at least be compared to the low resolution results?

    Without the comparison, these tests are meaningless and serve zero purpose to people playing at higher resolutions. I want to see how different CPUs compare at higher resolutions when they potentially start bottlenecking the GPU, not at lame low resolutions where the GPU isn't even being stressed.

    It makes a HUGE difference when your GPU is high end and potentially being bottlenecked by weak CPUs at high resolutions.

  12. #37
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Quote Originally Posted by El Maño View Post
    The FX-8350 can handle up to eight threads (kind of).
    Intel's 3770K can handle eight threads.
    Care to explain why AMD is better when it comes to handling several threads?

    It's all about the coding. Crytek just coded the game for AMD's CPUs.
    As stated before, a patch for Intel CPUs might well turn the graphs.

    Although it would be cool if AMD were better, just for the sake of competition.
    Actually, it's not just coding. Having native cores is always better than having virtual cores. Intel implements HT tech really well, but as the cores are not physical, HT has its flaws (as well as advantages). Also, it is not a rare situation that disabling HT increases gaming performance.


    When i'm being paid i always do my job through.

  13. #38
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by bhavv View Post
    Even if you're right, as many people have asked - WHERE are the 1080p+ CPU comparison results so they can at least be compared to the low resolution results?

    Without the comparison, these tests are meaningless and serve zero purpose to people playing at higher resolutions. I want to see how different CPUs compare at higher resolutions when they potentially start bottlenecking the GPU, not at lame low resolutions where the GPU isn't even being stressed.

    It makes a HUGE difference when your GPU is high end and potentially being bottlenecked by weak CPUs at high resolutions.
    CPU scaling benchmarks in relation to resolution and CPU importance on GPU bottlenecking? Been done, gets ugly.

    All along the watchtower the watchmen watch the eternal return.

  14. #39
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by c22 View Post
    Tell me something I don't know.

    Of course C3 is GPU limited. That's why PCLab made a test in 3 different places using 2 different resolutions.

    PCGH is showing us only 1/3 of the truth, but it doesn't matter because "PCGH is the no. 1 in CPU benchmarks", right? Time to look around and verify some vortals, because they're apparently not even close to being "no. 1". Far from it.
    Are you trolling? Didn't it occur to you that, due to the multithreaded nature of CryEngine 3, AMD could gain on Intel under some circumstances? I've read comments from a programmer working at Crytek regarding this. Both results are perfectly fine, theirs and yours. Again: different level, possibly different outcome. While I agree that they could have chosen more levels to bench, they have long-standing expertise in selecting demanding yet representative benchmark scenarios for their CPU tests.

    And you have yet to change your antiquated resolution for benchmarking. No one plays at 5:4 or 4:3 anymore.

    Quote Originally Posted by bhavv View Post
    As far as games are concerned, 1024x768 testing is meaningless as it doesn't show the real world performance that gamers will be getting at 1080p and up.
    No, they are not meaningless. What counts is framerate. 1080p and up would decrease performance further unless you are CPU bottlenecked. What purpose do 1080p benchmarks serve if the fps at those settings are too low for comfort? You can and should always lower resolution or settings to achieve the fps you want/need. What else are these options for? People always forget that, because reviews mostly benchmark with maximum graphics settings.

    Just look at the quoted benchmarks above. The FX4300 gets 22fps. You can cram 4 graphics cards in there, overclock them to kingdom come, but in this scene you will never ever get more than those 22fps. At 1080p the 3570K and FX4300 would likely seem on par at 20fps or so, but in reality they are not. The 3570K can push at least 30fps, the FX4300 only 20fps. You would not have that information if those benchmarks were conducted at 1080p or above. And more information is always better.
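
    To put numbers on that argument, here is a minimal sketch of the same cap idea using the figures quoted in this post (the ~20 fps GPU ceiling at 1080p max settings is the assumption stated above):

    # Hypothetical: a ~20 fps GPU ceiling at 1080p max settings hides the per-CPU ceilings
    # that a low-res run exposes (fps figures are the ones quoted in the post above).
    cpu_caps = {"3570K": 30, "FX-4300": 22}   # most fps each CPU can drive in this scene
    for name, cpu_cap in cpu_caps.items():
        low_res = cpu_cap                      # GPU effectively unlimited at 1024x768
        full_hd = min(cpu_cap, 20)             # both look "on par" behind the GPU cap
        print(name, low_res, full_hd)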

    So in conclusion:
    Low res benchmarks (16:9 or 16:10) can provide valuable information and are perfectly fine and in tune with "real gameplay" as long as the gamer is smart enough to adjust game settings to suit his/her needs.
    Last edited by boxleitnerb; 02-25-2013 at 03:49 AM.

  15. #40
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    I only just noticed that there were some 1080p benchmarks right there.

    It's a very similar case to BF3: the game hits GPU limitation very quickly; it's hardly CPU limited at all.

    I can't believe the claims I'm hearing that this low resolution testing is a valid method of testing CPU limitation in a graphics hungry game running with a GTX 680. If you want to test CPUs, then do it in an application that doesn't rely so heavily on the GPU. Any game is going to be biased based on which hardware it is coded to run best on. And it's very obvious: at 1080p neither Intel nor AMD is winning. Both their CPUs are too weak for such a graphics limited game.

    And no, you should NEVER lower the resolution below your current monitor's native resolution. The amount of blurring and loss of sharpness and IQ is never worth moving away from your monitor's native resolution. Reduce AA first, then graphics settings. The only exception is if you are playing very old games that don't support high resolutions, or look too uncomfortable at them.
    Last edited by Mungri; 02-25-2013 at 04:03 AM.

  16. #41
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by bhavv View Post
    I only just noticed that there were some 1080p benchmarks right there.

    It's a very similar case to BF3: the game hits GPU limitation very quickly; it's hardly CPU limited at all.

    I can't believe the claims I'm hearing that this low resolution testing is a valid method of testing CPU limitation in a graphics hungry game running with a GTX 680. If you want to test CPUs, then do it in an application that doesn't rely so heavily on the GPU. Any game is going to be biased based on which hardware it is coded to run best on. And it's very obvious: at 1080p neither Intel nor AMD is winning. Both their CPUs are too weak for such a graphics limited game.
    It doesn't matter if a game is graphics hungry or not. You act like in-game graphics options don't exist. Look at these GPU benchmarks:


    The 680 gets 38fps avg and 32fps min. Would you want to play at such low fps? I certainly wouldn't. I would turn down settings to achieve at least 50fps avg, better yet 50fps min, and exactly then will the difference between CPUs become more apparent. It seems you haven't read my explanation at all. It's about fps, not bottlenecks. Fps... if CPU A can do only 40fps, but CPU B can do 60, that is valuable information right there, no matter if at default settings you are GPU bottlenecked or not. Because settings aren't set in stone; they are changeable to cater to your demands. That is why they exist.
    Only when all CPUs can achieve highly playable fps, let's say 100+, do differences become irrelevant.
    Last edited by boxleitnerb; 02-25-2013 at 04:08 AM.

  17. #42
    Registered User
    Join Date
    Oct 2008
    Posts
    74
    The problem with Crysis 3 seems to be that one thread is continually maxed out, no matter how much clock or IPC you throw at it. This means that on every Intel / AMD CPU, one thread is permanently unavailable for graphics work in general. You can see in the benchmarks that clock doesn't matter that much; if there were an FPS limit tied to that one CPU-limited thread, it should scale differently...
    Of course, when a dual-core like the i3 is pushed to the max on one core (whatever causes this 100% single thread), there is only one core left to handle the other work, which is otherwise spread across threads quite well.
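
    A rough sketch of that frame-rate ceiling idea (my own illustration, not from these benchmarks): if one saturated main thread has a fixed amount of work to do per frame, it sets an upper bound on fps no matter how many other cores sit idle, and only clock/IPC gains on that one thread move the bound.

    # Hypothetical ceiling imposed by one saturated "main" thread (Amdahl-style reasoning).
    def fps_ceiling(serial_ms_per_frame, clock_scale=1.0):
        # serial_ms_per_frame: work the saturated thread must do for every frame
        # clock_scale: relative clock/IPC gain on that thread; extra cores don't help here
        return 1000.0 / (serial_ms_per_frame / clock_scale)

    # ~25 ms of per-frame work on the main thread caps the game at 40 fps on any core count;
    # a 20% clock bump only lifts the cap to 48 fps.
    print(fps_ceiling(25.0), fps_ceiling(25.0, clock_scale=1.2))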

    See here:

    [screenshots: EIST off, 6x 4.32 GHz; in-game settings; graphics + FPS]
    By the way, you need to disable vsync, otherwise the load on the GPU is lower because it is fixed at 30 FPS.
    Intel 990X ES @ 6x4,60Ghz@1,440V klick / DFI UT x58@ MIPS HD6950@70@1Ghz / 256GB M4 SSD / X-Fi / HK3.0 - Laing Ultra - Thermochill PA120.3 + TFC 120er + Cape Cora Ultra 8x/ Aquaero Steuerung / Lian Li A7110B / Seasonic Platinum 1kw / 2xDell IPS 23" / Win 7 + Server 2008 R2/ DSL 16K
    HTPC + HTPC Wiki + sysProfile + Clarkdale @<15W + Eigenbau-Ambilight + Ambilight PreOrder

  18. #43
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Quote Originally Posted by boxleitnerb View Post
    It doesn't matter if a game is graphics hungry or not. You act like in-game graphics options don't exist.
    Erm WHAT?

    Quote Originally Posted by bhavv View Post
    And no, you should NEVER lower the resolution below your current monitor's native resolution. The amount of blurring and loss of sharpness and IQ is never worth moving away from your monitor's native resolution. Reduce AA first, then graphics settings. The only exception is if you are playing very old games that don't support high resolutions, or look too uncomfortable at them.
    I said that you should reduce settings to maintain comfortable smoothness, but without changing from your monitor's native resolution.

    if CPU A can do only 40fps, but CPU B can do 60, that is valuable information right there, no matter if at default settings you are GPU bottlenecked or not.
    I agree with you, but that's not the case here at all. The difference in the results at 1080p with the latest CPUs is 33 FPS vs 31 FPS; there's virtually no clear-cut winner between Intel and AMD CPUs. Overclocking the CPUs by 1 GHz hardly makes any difference either.

    Also, in such a CPU comparison, why were they afraid to add an Intel hex-core CPU as well?
    Last edited by Mungri; 02-25-2013 at 06:32 AM.

  19. #44
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    You don't seem to understand.
    33 or 31fps, for many people both are unplayable. If you lower graphics quality so that you can achieve more fps, the difference becomes 34 vs 48 in the "Post-Human" level and 35 vs 48 in the "Root of All Evil" level (FX-8350 vs 3770K). Quite sizeable, isn't it? It doesn't really matter if you get these results by lowering resolution or by lowering graphics settings - the end result is that with a reduced graphics workload, the Intel systems are much faster and provide a better gaming experience than the AMD systems at actually playable framerates.

    How often do I have to say it?
    1080p or above doesn't matter if the fps achieved at that resolution are too low.

  20. #45
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Oh right, I get it now... So we need to see 1080p with reduced graphics settings as well then.

  21. #46
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    @bhavv

    There is no one perfect CPU gaming test; we probably agree on that.
    The point of low-res testing is to show the max fps you will get from each CPU.
    But that is still limited by the settings, so they should do a few: max physics and effects, medium, and lowest settings.
    Some of us have super powerful CPUs and want to see what speed is needed to get 60fps at max settings on both the CPU AND GPU side, so they have to split that.
    Others have limited resources and should see what settings are needed to enjoy the game on their CPU. There is no way I would be playing Crysis under 30fps, so I would sacrifice settings if it's CPU limited, or resolution and AA if it's GPU limited.

    It's hard to test "real world" because no configuration is the same. For me, I like to have 1920x1200 and will give up view distance and AA to keep it. Someone else might prefer 1280x800 and keep 4xAA on, and each has their benefits. Typically, though, the CPU speed needed is not affected heavily by resolution.
    Make sense now?
    2500k @ 4900mhz - Asus Maxiums IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  22. #47
    Registered User
    Join Date
    Jan 2007
    Location
    Serbia
    Posts
    36
    @c22

    Would you mind telling me why PCL used very high settings as well? Aren't CPU tests supposed to be done at low res + low settings? VH in C3 is very demanding on the GPU even at a 10x10 pixel resolution :d You are still very much GPU bound in those tests. I play C3 with an i3 2100 @ 1080p, with fps limited to 60 and a low/high combo of settings. Very rarely does fps go under 40, or even 55 tbh (except for the first levels due to that rope bug). In the PCGH and PCL tests, anything below an i5 CPU is pointless at VH settings, unless the goal was just to show which processors cannot handle VH at all o.O

  23. #48
    Xtreme Member
    Join Date
    Aug 2008
    Location
    Poland
    Posts
    199
    @gx-x

    But do you play at low res + low settings? Nope, so this test is pretty much pointless, but we do it anyway, to show the difference IF the title is bottlenecked by the GPU at 1920x1080.

    We're testing mostly in full HD, because that's actually how it should be done. It's a simple rule: you're not playing at 1024x768 or 1280x720. You're playing at 1680x1050 and higher. So you need to check CPU performance at 1920x1080 anyway. You can almost always find a place in the game that's CPU dependent, even at 1920x1080 max settings - it's just a matter of time and... knowledge.

    Look here: http://pclab.pl/art50000-14.html

    Pages from 14 to 28 - CPUs at stock settings.
    Pages from 51 to 71 - OCed settings.

    Battlefield 3, Max Payne 3, Metro 2033. They all say those titles are GPU dependent... Sure they are, unless you know how to find a proper place where CPU testing actually makes sense. And most of those review sites out there simply CAN'T find the place, while on the other hand we CAN.

    Another example: Far Cry 3 being GPU bottlenecked? That's funny, because it's not. It's CPU limited.




    It's a matter of finding a good place - just saying.
    Last edited by c22; 02-25-2013 at 01:42 PM.

  24. #49
    Registered User
    Join Date
    Jan 2007
    Location
    Serbia
    Posts
    36
    Quote Originally Posted by c22 View Post
    @gx-x

    But do you play at low res + low settings? Nope, so this test is pretty much pointless, but we do it anyway, to show the difference IF the title is bottlenecked by the GPU at 1920x1080.
    Maybe I wasn't clear enough: when testing/comparing CPUs, why very high details? I also don't play at 720p with very high settings; I'd rather go native res and lower details. BUT, when comparing CPU capabilities in a game, shouldn't you just eliminate the shoddy coding that often happens at very high settings (they don't care to optimize for that kind of setting) and just pair a lower res with some lower or mid settings? I mean, who pairs a GTX 680 with an Athlon X4 or an i3 540 and uses a lower res with ultra high settings? Remember, we are testing CPUs, not GPUs. I am OK with 1080p tests, but you should have included lower details so the lower-end lot in the test actually shows some "numbers" that can be compared. From your CPU test in C3 one would conclude that the game is not playable on something like an i3... which is far from the truth. I didn't pair my i3 with a TITAN, but with a 560 Ti 2GB... and I play it at 50-60fps (locked at 60) with a low/high combo.

  25. #50
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Games can be CPU and GPU bottlenecked, depending on the scene/level. If you disable MSAA in FC3 (a low setting), of course you will be more CPU bottlenecked.

    Quote Originally Posted by c22 View Post
    But do you play at low res + low settings? Nope, so this test is pretty much pointless, but we do it anyway, to show the difference IF the title is bottlenecked by the GPU at 1920x1080.
    Wrong. I thought we had that covered already. But have fun with your low fps...
