
Thread: 3.2GHz OC'd Core i7 940 vs. 3.16GHz Stock E8500 vs. 3.2GHz Stock QX9770 Complete Review

  1. #51
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
    With a stock 1333 FSB, the Core 2 chips would have choked at the highest settings if this were a multi-GPU, bandwidth-demanding setup. Not so much for the Core i7, of course.

    This is like comparing the top speeds of a 300 hp car and a 500 hp car when they're both capped at 250 km/h.

    Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
    32 GB Patriot Viper Steel 3733 CL14 (1.51v)
    RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
    Tons of NVMe & SATA SSDs
    LG 27GL850 + Asus MG279Q
    Meshify C white

  2. #52
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,247
    Quote Originally Posted by RPGWiZaRD View Post
    [...]
    A bit off-topic but useful info regarding this game: I use it to determine if my RAM & CPU are gaming-stable too, as it's sometimes better at finding instability than even Orthos or memtest (not kidding). I had UT3 crashing on me randomly and wondered why, since none of the other stress tests I used failed on me, but it turned out to be RAM instability, and now UT3 runs happily again.
    The UT series games were always kind of picky about unstable RAM/CPU. I've used the old-school UT to test my overclocks on the old Pentium III 450 MHz.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  3. #53
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    Yes, I can second that the UT3 engine is very good at finding errors in an OC, both GPU/CPU and even memory. A good stability test is to run the UT3 engine for a couple of hours!

  4. #54
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Texas
    Posts
    1,663
    Quote Originally Posted by xsbb View Post
    All this time i've been praying for the Core i7 to completely render my current Q6600 Overclocked obsolete.


    I'm sticking to my current CPU and just buying a new GPU and an SSD instead.
    +1 on that. I, on the other hand, need to use something other than this failing Opteron 185. I'll see what the prices turn out to be at the end of December or beginning of January. Hopefully the price gouging will settle down AFTER Christmas. Also, I gotta see what Deneb is like too.

  5. #55
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    Grab a Q6600 while you still can; they'll be EOL soon...
    i7 3610QM 1.2-3.2GHz

  6. #56
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    City of Lights, The Netherlands
    Posts
    2,381
    Quote Originally Posted by Mechromancer View Post
    +1 on that. I, on the other hand, need to use something other than this failing Opteron 185. I'll see what the prices turn out to be at the end of December or beginning of January. Hopefully the price gouging will settle down AFTER Christmas. Also, I gotta see what Deneb is like too.
    I'm pretty much in the same boat. I still have a Socket 939 AMD X2 3800+ and it's getting pretty slow for certain tasks. I'm not in that much of a hurry though; I think I'll wait until after summer 2009 for a full upgrade. I will basically have to buy a new mobo, RAM and CPU at the same time, while also upgrading my GPU, so I will have to decide wisely as I'm pretty much just a poor student. All I'm hoping for is a decently priced quad-core rig by that time, and I hope that either Deneb or Nehalem can give me just that (with something like an RV870, of course).
    "When in doubt, C-4!" -- Jamie Hyneman

    Silverstone TJ-09 Case | Seasonic X-750 PSU | Intel Core i5 750 CPU | ASUS P7P55D PRO Mobo | OCZ 4GB DDR3 RAM | ATI Radeon 5850 GPU | Intel X-25M 80GB SSD | WD 2TB HDD | Windows 7 x64 | NEC EA23WMi 23" Monitor |Auzentech X-Fi Forte Soundcard | Creative T3 2.1 Speakers | AudioTechnica AD900 Headphone |

  7. #57
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Toon
    Posts
    1,570
    Quote Originally Posted by RPGWiZaRD View Post
    A bit off-topic but useful info regarding this game: I use it to determine if my RAM & CPU are gaming-stable too, as it's sometimes better at finding instability than even Orthos or memtest (not kidding).
    Thanks for the tip! Also, is there a simple way (switch/batch/config file) to launch it as a stress test?
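    For what it's worth, here's a rough sketch of the kind of looping script people use for that, in Python rather than a batch file. Note that the install path, map name and command-line switches below are placeholders and assumptions on my part, not verified UT3 options, so check them against your own install before trusting a run.
    Code:
    # Minimal sketch: launch the UT3 flyby benchmark repeatedly as a stability test.
    # ASSUMPTIONS: the exe path, the map/options string and the extra switches are
    # placeholders -- verify the real ones for your UT3 install before relying on this.
    import subprocess

    UT3_EXE = r"C:\Program Files\Unreal Tournament 3\Binaries\UT3.exe"  # hypothetical path
    BENCH_MAP = "CTF-Strident?quickstart=1"                             # hypothetical map/options
    EXTRA_SWITCHES = ["-nosound", "-novsync"]                           # hypothetical switches

    for run in range(20):  # roughly a couple of hours, depending on run length
        print(f"Benchmark pass {run + 1} of 20")
        # If the game crashes, artifacts, or the machine locks up, the OC isn't stable.
        subprocess.run([UT3_EXE, BENCH_MAP, *EXTRA_SWITCHES], check=True)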
    Intel i7 920 C0 @ 3.67GHz
    ASUS 6T Deluxe
    Powercolor 7970 @ 1050/1475
    12GB GSkill Ripjaws
    Antec 850W TruePower Quattro
    50" Full HD PDP
    Red Cosmos 1000

  8. #58
    Xtreme Member
    Join Date
    Aug 2008
    Location
    Melbourne, Australia
    Posts
    190
    Now I don't know what to buy in November. Argh, this is making things hard for me.
    3570K @ 4.5GHz w/ EK Supreme HF
    ASUS MAXIMUS V GENE
    MSI OC 7970 w/ XSPC razor
    8GB DDR3 1600
    64 GB SSD & 2 TB HDD
    Fractal design Arc Midi w/ internal 3x140 SR1 rad

  9. #59
    Xtreme Cruncher
    Join Date
    Oct 2006
    Location
    1000 Elysian Park Ave
    Posts
    2,669
    Great, now we wait for Westmere. Gamers get a beefy GPU, stock up on 3-4GB of RAM and wait for affordable SSDs
    i3-8100 | GTX 970
    Ryzen 5 1600 | RX 580
    Assume nothing; Question everything

  10. #60
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Kingcarcas View Post
    Great, now we wait for Westmere. Gamers get a beefy GPU, stock up on 3-4GB of RAM and wait for affordable SSDs
    And enthusiasts get everything.

  11. #61
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Loque View Post
    Starting to lean towards a Q9550 for gaming; all the extra cash for the next gen's CPU, board and memory could be spent on a better video card or monitor. There's barely any game that uses 4 cores properly, so a quad core will last until the next console gen...
    This is exactly what I was asking about here:

    http://www.xtremesystems.org/forums/...4&postcount=76

    I don't get why there's so much Nehalem hype. If they don't improve gaming performance, they might as well throw in the towel, because the areas they can really make a difference in other than gaming are transcoding, image processing, and media playback, all of which, algorithmically, are better suited to a parallel processor like GPGPU or (in the future) Larrabee.

  12. #62
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    Quote Originally Posted by Sr7 View Post
    This is exactly what I was asking about here:

    http://www.xtremesystems.org/forums/...4&postcount=76

    I don't get why there's so much Nehalem hype. If they don't improve gaming performance, they might as well throw in the towel, because the areas they can really make a difference in other than gaming are transcoding, image processing, and media playback, all of which, algorithmically, are better suited to a parallel processor like GPGPU or (in the future) Larrabee.
    Only when games become more multi-threaded will we see big gains from Nehalem. The sooner people get that through their heads the better, and we can stop all this whining about the Core 2 level gaming performance.

  13. #63
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Sr7 View Post
    This is exactly what I was asking about here:

    http://www.xtremesystems.org/forums/...4&postcount=76

    I don't get why there's so much Nehalem hype. If they don't improve gaming performance, they might as well throw in the towel, because the areas they can really make a difference in other than gaming are transcoding, image processing, and media playback, all of which, algorithmically, are better suited to a parallel processor like GPGPU or (in the future) Larrabee.
    You could say that of any new CPU revision. Gaming will always be played with visual fidelity as the first and foremost requirement ... this puts the burden, and the cap, on the GPU.
    One hundred years from now It won't matter
    What kind of car I drove What kind of house I lived in
    How much money I had in the bank Nor what my cloths looked like.... But The world may be a little better Because, I was important In the life of a child.
    -- from "Within My Power" by Forest Witcraft

  14. #64
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Epsilon84 View Post
    Only when games become more multi-threaded will we see big gains from Nehalem. The sooner people get that through their heads the better, and we can stop all this whining about the Core 2 level gaming performance.
    Or when GPUs pole-vault past the bottlenecks on the GPU side.
    One hundred years from now It won't matter
    What kind of car I drove What kind of house I lived in
    How much money I had in the bank Nor what my cloths looked like.... But The world may be a little better Because, I was important In the life of a child.
    -- from "Within My Power" by Forest Witcraft

  15. #65
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    Quote Originally Posted by Nedjo View Post
    The only game engine out there that is truly multithreaded, in the sense that it isn't specifically dual- or quad-core optimized but genuinely multi-threaded, is Unreal Engine 3.

    Here is an illustration:

    These are four sessions of "CTF_strident_flyby" benchmarking, from left to right:
    1. 1280x1024, lowest settings
    2. 1280x1024, highest settings
    3. 1680x1050, highest settings
    4. 1920x1200, highest settings

    So what you can see is that the rendering engine is truly multithreaded (no AI, no physics involved).

    Regardless of GPU load, CPU load stays consistent, and that's the true sign of quality multithreaded optimization. For example, in Crysis you can detect dual-core optimization only at low gfx settings and in heavy physics scenarios... Crytek's graphics engine (CryEngine 2) is still single-threaded...

    Also, one curious thing is that UE3 scaling beyond three threads is really weak, which is expected, because multithreading in 3D engines is generally done optimally with three threads (logic, load, render).

    The logic thread is the main thread (it creates the window and the device) and runs the main loop (and the Windows procedure events).

    All rendering calls are done only from the rendering thread (VSYNC is enabled).

    And the load thread is basically self-explanatory...

    So the point is that the eight "virtual" threads of Nehalem will bring no benefit whatsoever for games!
    I would assume that the UT3 engine is optimized for 3 threads because the Xbox 360 has 3 CPU cores...

    And as someone else said 9800GTX+ = benchmarks irrelevant.
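    To make the logic/load/render split described in the quote above concrete, here's a minimal sketch of that three-thread pattern. This is plain Python, not UE3 code, and every name, queue and timing in it is invented purely to illustrate the idea.
    Code:
    # Illustrative three-thread game loop: logic (main), load, and render threads
    # communicating through queues. Purely a sketch of the pattern, not engine code.
    import queue
    import threading
    import time

    load_requests = queue.Queue()   # logic -> load: assets to stream in
    render_queue = queue.Queue()    # logic -> render: frames to draw
    running = True

    def load_thread():
        # Streams assets in the background so the logic thread never blocks on disk I/O.
        while running:
            try:
                asset = load_requests.get(timeout=0.1)
            except queue.Empty:
                continue
            time.sleep(0.05)                        # pretend to read from disk
            print(f"[load]   streamed {asset}")

    def render_thread():
        # The only thread that issues "draw calls" (and waits on VSYNC).
        while running:
            try:
                frame = render_queue.get(timeout=0.1)
            except queue.Empty:
                continue
            time.sleep(1 / 60)                      # pretend VSYNC caps us at 60 Hz
            print(f"[render] presented frame {frame}")

    threading.Thread(target=load_thread, daemon=True).start()
    threading.Thread(target=render_thread, daemon=True).start()

    # Logic/main thread: runs the game loop and feeds the other two threads.
    for frame in range(10):
        if frame % 4 == 0:
            load_requests.put(f"texture_{frame}")   # occasionally request an asset
        render_queue.put(frame)                     # hand every frame to the renderer
        time.sleep(1 / 60)

    time.sleep(0.5)   # let the worker threads drain their queues before exiting
    running = False
    Past those three roles there is often little left to split cleanly, which matches the weak scaling beyond three threads mentioned in the quote.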
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  16. #66
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Stukov View Post
    I would assume that the UT3 engine is optimized for 3 threads because the Xbox 360 has 3 CPU cores...

    And as someone else said 9800GTX+ = benchmarks irrelevant.
    UT3 does a pretty decent job taking advantage of all 4 cores, but it certainly does not get a very big pop going from 3 to 4...

    One hundred years from now It won't matter
    What kind of car I drove What kind of house I lived in
    How much money I had in the bank Nor what my cloths looked like.... But The world may be a little better Because, I was important In the life of a child.
    -- from "Within My Power" by Forest Witcraft

  17. #67
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    Looking at those charts, is anyone else not that impressed? I thought they were supposed to be "50% faster"? Or am I missing something?
    i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
    MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
    Corsair HX 620, CM 690, Win 7 Ultimate 64bit.

  18. #68
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Epsilon84 View Post
    Only when games become more multi-threaded will we see big gains from Nehalem. The sooner people get that through their heads the better, and we can stop all this whining about the Core 2 level gaming performance.
    That's not really true... if Nehalem's micro-arch changes had increased efficiency for the things games need, you could've at least seen *some* gain, but this obviously isn't going to be their focus.

    But in large part, yes you stand to gain far more by multi-threading a game, as I addressed here:

    http://www.xtremesystems.org/forums/...2&postcount=90

    The more threads you try to run in parallel without stalling each other, the more exponentially complex the programming becomes.

  19. #69
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Sr7 View Post
    That's not really true... if Nehalem's micro-arch changes had increased efficiency for the things games need, you could've at least seen *some* gain, but this obviously isn't going to be their focus.
    Nehalem could very well show those improvements computationally, but you won't see any gain if the GPU is capping the output. No amount of CPU power thrown into the equation will change the result in a GPU limited regime.
    One hundred years from now It won't matter
    What kind of car I drove What kind of house I lived in
    How much money I had in the bank Nor what my cloths looked like.... But The world may be a little better Because, I was important In the life of a child.
    -- from "Within My Power" by Forest Witcraft

  20. #70
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Glow9 View Post
    Looking at those charts, is anyone else not that impressed? I thought they were supposed to be "50% faster"? Or am I missing something?
    You are not missing anything. The gaming benchmarks are run at high settings on a weak GPU (by today's standards)... all CPUs will show the same FPS, which is essentially what is being shown.
    One hundred years from now It won't matter
    What kind of car I drove What kind of house I lived in
    How much money I had in the bank Nor what my cloths looked like.... But The world may be a little better Because, I was important In the life of a child.
    -- from "Within My Power" by Forest Witcraft

  21. #71
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by JumpingJack View Post
    Nehalem could very well show those improvements computationally, but you won't see any gain if the GPU is capping the output. No amount of CPU power thrown into the equation will change the result in a GPU limited regime.
    Of course, that's a given. But the sad part is I'm still on a 12x10 monitor. Even at those lower resolutions like 10x7 or 12x10, where the CPU is much more of a bottleneck, we see it doesn't seem to be beneficial to upgrade to Nehalem, according to these released benchmarks. This is the first time in quite a while that I can remember that being the case with a new generation of CPUs. Usually Intel and AMD specifically recommend that reviewers test at 8x6 or 10x7 to see the gains of their new processors.

    The whole purpose of benchmarking is to see how much you stand to gain in your experience with your computer by buying this product, right? So how much sense does it make to turn settings way down to check your CPU's gain in performance, if you don't see any of that additional performance when actually gaming at your normal/native resolutions and settings?

    It's like buying a car for commuting based on its top speed. You'll never see that speed in everyday use because you don't do that kind of driving.

    Obviously there's an exception if you're running multiple GPUs and therefore putting the bottleneck back on the CPU, but most people don't run those kinds of setups.
    Last edited by Sr7; 10-18-2008 at 10:03 PM.

  22. #72
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    Quote Originally Posted by JumpingJack View Post
    You are not missing anything. The gaming benchmarks are run at high settings on a weak GPU (by today's standards)... all CPUs will show the same FPS, which is essentially what is being shown.
    Ah, hrm... even so, I figured those scores would be maxed out with the new chip *shrug*
    i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
    MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
    Corsair HX 620, CM 690, Win 7 Ultimate 64bit.

  23. #73
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Sr7 View Post
    Of course, that's a given. But the sad part is I'm still on a 12x10 monitor. Even at those lower resolutions like 10x7 or 12x10, where the CPU is much more of a bottleneck, we see it doesn't seem to be beneficial to upgrade to Nehalem, according to these released benchmarks. This is the first time in quite a while that I can remember that being the case with a new generation of CPUs. Usually Intel and AMD specifically recommend that reviewers test at 8x6 or 10x7 to see the gains of their new processors.
    Depends on the game... Gen-1 games, yeah, I agree... but current DX10 games... even 1280x1024 with high-fidelity settings is GPU limited.

    WIC at 1280x800, high settings default, QX9650 @ 3.2 GHz:
    Min = 32, Avg = 58, Max = 154

    WIC at 1280x800, high settings default, QX9650 @ 2.66 GHz:
    Min = 28, Avg = 59, Max = 141

    This is on a 4870X2... Lost Planet is doing the same thing (everything high)... so this review ran GPU limited, yet tries to draw conclusions about the CPU (as do most others in this thread)... this is incorrect. EDIT: Note I ran XP DX9; DX10 will be even more the same...

    Does it make a difference? Nope... why? Because we like to play at those settings... however, I personally prefer not to buy a whole new system... the CPU is the lowest common denominator and revs every 1-2 years, while GPUs rev every 6-9 months, so if I want to future-proof I prefer the fastest CPU and then incrementally upgrade the GPU as needed... that's me... which is why I want to see both the high quality, high res results and the low res, lower quality results to ascertain the viability of the CPU...

    This review did not do that... whether Nehalem actually improves gaming is still a question mark... I do not expect a huge leap, and I suspect we'll see some games actually underperform... but the oddity of this data set is that all the CPUs compared bunched up to be roughly the same... this is GPU limited.
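    To put a rough number on that "bunched up" observation, here's a tiny back-of-the-envelope check using the WIC averages above. The 25% threshold is just an arbitrary cutoff for illustration, not a standard.
    Code:
    # Uses the WIC averages quoted above: 58 FPS @ 3.20 GHz vs 59 FPS @ 2.66 GHz.
    clock_hi, avg_hi = 3.20, 58
    clock_lo, avg_lo = 2.66, 59

    clock_gain = clock_hi / clock_lo - 1   # ~+20% CPU clock
    fps_gain = avg_hi / avg_lo - 1         # ~-2%, i.e. average FPS basically unchanged

    print(f"CPU clock up {clock_gain:.0%}, average FPS changed {fps_gain:+.0%}")
    if abs(fps_gain) < 0.25 * clock_gain:  # arbitrary illustrative threshold
        print("GPU limited: this run says almost nothing about the CPU")
    else:
        print("CPU scaling visible: the CPU still matters at these settings")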
    One hundred years from now It won't matter
    What kind of car I drove What kind of house I lived in
    How much money I had in the bank Nor what my cloths looked like.... But The world may be a little better Because, I was important In the life of a child.
    -- from "Within My Power" by Forest Witcraft

  24. #74
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by JumpingJack View Post
    Depends on the game... Gen-1 games, yeah, I agree... but current DX10 games... even 1280x1024 with high-fidelity settings is GPU limited.

    WIC at 1280x800, high settings default, QX9650 @ 3.2 GHz:
    Min = 32, Avg = 58, Max = 154

    WIC at 1280x800, high settings default, QX9650 @ 2.66 GHz:
    Min = 28, Avg = 59, Max = 141

    This is on a 4870X2... Lost Planet is doing the same thing (everything high)... so this review ran GPU limited, yet tries to draw conclusions about the CPU (as do most others in this thread)... this is incorrect. EDIT: Note I ran XP DX9; DX10 will be even more the same...

    Does it make a difference? Nope... why? Because we like to play at those settings... however, I personally prefer not to buy a whole new system... the CPU is the lowest common denominator and revs every 1-2 years, while GPUs rev every 6-9 months, so if I want to future-proof I prefer the fastest CPU and then incrementally upgrade the GPU as needed... that's me... which is why I want to see both the high quality, high res results and the low res, lower quality results to ascertain the viability of the CPU...

    This review did not do that... whether Nehalem actually improves gaming is still a question mark... I do not expect a huge leap, and I suspect we'll see some games actually underperform... but the oddity of this data set is that all the CPUs compared bunched up to be roughly the same... this is GPU limited.
    I see your point, but who is buying a Nehalem system in order to play CPU-limited DX7 and DX8 games that already run in the hundreds of FPS?

    The thing you *need* more performance for is current-day games, and if they're so GPU-bound at 12x10 on average, I'd say it's probably not worth the price to someone who wants gaming performance. Just my opinion though.

    Do you mean to imply that we don't know the CPU's gaming potential purely because they tested 12x10 with GPU-limited settings? If so, you might have problems in the future, because that trend is only going to continue, and the average resolution is going up, not down.

    Should they have bumped the resolution down to settings that no one plays at to gauge the CPU's "gaming performance"? Sure, if you're benchmarking by running a game at 800x600 you can call it "gaming performance", but it's not real-world gaming performance.

    I guess the fundamental question is... what is "gaming performance"?

  25. #75
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Glow9 View Post
    Ah, hrm... even so, I figured those scores would be maxed out with the new chip *shrug*
    Precisely. If you are looking for a CPU for gaming and that is all you want your CPU for... then the CPU in your signature line is plenty sufficient. However, if you are a hobbyist who likes to study the fundamental comp sci of the device, and how architectural differences play into the computational result... this data set produced by this site (the gaming results, not the others) is worthless.
    Last edited by JumpingJack; 10-18-2008 at 10:15 PM.
    One hundred years from now It won't matter
    What kind of car I drove What kind of house I lived in
    How much money I had in the bank Nor what my cloths looked like.... But The world may be a little better Because, I was important In the life of a child.
    -- from "Within My Power" by Forest Witcraft
