
Thread: Intel Q9450 vs Phenom 9850 - ATI HD3870 X2

  1. #326
    Muslim Overclocker
    Join Date
    May 2005
    Location
    Canada
    Posts
    2,786
    Jack, lots of good info, thanks.

    But we need more controlled tests before we can jump to conclusions:

    1. Don't use games to check CPU usage. You have no idea what a game is doing in its code; sound, network code, and other libraries may consume more CPU cycles than expected, at unpredictable times. To avoid this, I think it would be best to do this comparison in something such as 3DMark (Vantage or 06). I will do such tests as soon as I find the time.

    2. The issue of texture fetches cannot be ignored at higher resolutions, and it's a known problem with low-memory video cards: take a 4870 and cut its memory in half; do you think CPU and bus efficiency won't affect performance at 640 res vs. 1600 res? Much like running at a higher res vs. a lower res, wouldn't you agree?

    3. The graph you present comparing CPU utilization at different resolutions is flawed: the complexity introduced by dual-GPU solutions negatively impacts CPU performance. It's best to stick to one GPU for both tests to minimize the number of variables.



    @Boschwanza,

    If you want to do these comparisons, try setting the affinity of the game's process to a single core. This might give more predictable results.
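
    Something like this would do it (a minimal sketch in Python using psutil; the executable name is just a placeholder for whatever game you are testing):

    import psutil

    GAME_EXE = "lostplanet.exe"  # hypothetical name -- replace with the game's executable

    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == GAME_EXE:
            proc.cpu_affinity([0])  # pin the whole process to core 0
            print(f"Pinned PID {proc.pid} to core 0")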


    Other than that, interesting discussion!

    My watercooling experience

    Water
    Scythe Gentle Typhoons 120mm 1850RPM
    Thermochill PA120.3 Radiator
    Enzotech Sapphire Rev.A CPU Block
    Laing DDC 3.2
    XSPC Dual Pump Reservoir
    Primochill Pro LRT Red 1/2"
    Bitspower fittings + water temp sensor

    Rig
    E8400 | 4GB HyperX PC8500 | Corsair HX620W | ATI HD4870 512MB


    I see what I see, and you see what you see. I can't make you see what I see, but I can tell you what I see is not what you see. Truth is, we see what we want to see, and what we want to see is what those around us see. And what we don't see is... well, conspiracies.



  2. #327
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by ahmad View Post
    Jack, lots of good info, thanks.

    But we need more controlled tests before we can jump to conclusions:

    1. Don't use games to check CPU usage. You have no idea what a game is doing in its code; sound, network code, and other libraries may consume more CPU cycles than expected, at unpredictable times. To avoid this, I think it would be best to do this comparison in something such as 3DMark (Vantage or 06). I will do such tests as soon as I find the time.

    2. The issue of texture fetches cannot be ignored at higher resolutions, and it's a known problem with low-memory video cards: take a 4870 and cut its memory in half; do you think CPU and bus efficiency won't affect performance at 640 res vs. 1600 res? Much like running at a higher res vs. a lower res, wouldn't you agree?

    3. The graph you present comparing CPU utilization at different resolutions is flawed: the complexity introduced by dual-GPU solutions negatively impacts CPU performance. It's best to stick to one GPU for both tests to minimize the number of variables.



    @Boschwanza,

    If you want to do these comparisons, try setting the affinity of the game's process to a single core. This might give more predictable results.


    Other than that, interesting discussion!
    .....
    On point 1 ... what I want is a gross check of the concepts. I.e. the premise is that at low load on the GPU, the CPU is free to run amok, so it should show maximum utilization for that load (i.e. very high). At high GPU loads (bottleneck), the CPU will stall waiting on the GPU, and utilization will go down. It is not intended to systematically decompose the code, just to test the bottleneck theory.

    On point 2 ... which is why I put in a 1 GB card.... the most interesting article is here: http://www.yougamers.com/articles/13...ly_need-page2/ Texture thrashing is not a big deal; for most games 512 MB appears to be enough for the most part. However, with the recent release of the 4870 X2 (which I promptly refitted my baseline builds with for this very reason), two reviews show one game thrashing memory at uber-high res on the 512 MB cards ... all the others appear to have no issues:
    http://www.anandtech.com/video/showdoc.aspx?i=3372&p=9
    http://www.guru3d.com/article/radeon...w-crossfire/11
    Race Driver: GRID is able to fill the 512 MB cards at 2560x1600, as seen by the precipitous drop in FPS going from 1920x1200 to 2560x1600. Other than that, it would appear most games are fitting into memory OK...

    However, I have noticed some games are very poor at precaching textures (Doom 3 on some user-generated levels does not precache all the textures, for example -- somewhere in the thread I mention an experiment to observe the phenomenon) ... when I do observe it I will make sure I point that out -- so far, in the bits shown around here, I have not seen any texture thrashing that would skew the results.


    3. Again, the test was to show that when the GPU becomes the bottleneck, the CPU stalls and utilization goes down (the CPU sits around doing nothing). It was not a flawed test... it demonstrated the effect very nicely.
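
    As a back-of-the-envelope illustration of that stall argument (a toy model only, with made-up per-frame costs, not the actual measurement): the frame time is set by whichever unit is slower, so the reported CPU utilization is roughly the CPU's share of each frame.

    # Toy model: CPU work per frame is constant; GPU work grows with resolution/settings.
    cpu_ms = 8.0  # assumed CPU cost per frame, in milliseconds

    for label, gpu_ms in [("low res / low settings", 3.0), ("high res / high settings", 25.0)]:
        frame_ms = max(cpu_ms, gpu_ms)        # the slower unit sets the frame time
        fps = 1000.0 / frame_ms
        cpu_util = 100.0 * cpu_ms / frame_ms  # the CPU idles for the rest of the frame
        print(f"{label}: ~{fps:.0f} FPS, CPU ~{cpu_util:.0f}% busy")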

    jack
    Last edited by JumpingJack; 08-19-2008 at 08:58 PM.
    One hundred years from now it will not matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  3. #328
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Boschwanza View Post
    I never said that. Again: the FSB's physical latency always stays the same for an Intel Core, no matter how large the cache or how fast the FSB is. The traces, soldered onto the board, impose a fixed latency which the FSB protocol has to respect. That's the weak point in my view. If the prefetcher grabs the wrong data, the whole cache (no matter what size) is polluted with useless data, and a very, very long access (through all the PCB layers) to memory is necessary. In that case the great advantage of the Core architecture (getting the data fast and very close to the processing unit) is turned upside down.

    Jack, I just did a quick test with CoH and a 1950 Pro; unfortunately I cannot go higher with the resolution due to a 19" TFT.


    The first shows resolution 1280*1024, CPU settings max, GPU settings max.
    The second shows resolution 800*600, CPU settings max, GPU settings low.




    I have a lot more action on core 3 with the higher GPU settings; I'm confused. Maybe you can check this out. Also, can you give me a script or program which records the CPU utilization?

    I'll try to get more data later.
    It is interesting to see how the different cores behave with respect to thread scheduling; this, though, is a function of the OS ... run it a few more times and you will likely see the load shift to different cores.

    Some apps can force affinity to a particular core; try running FEAR and see where it puts its single thread ... then run SuperPi (as a counter-example) and see how the cores are utilized.
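
    As for a script that records the CPU utilization: a minimal sketch in Python using psutil (sample interval, duration, and file name are arbitrary choices) would be:

    import csv, time
    import psutil

    # Sample per-core CPU utilization once per second and write it to a CSV log.
    with open("cpu_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s"] + [f"core{i}" for i in range(psutil.cpu_count())])
        start = time.time()
        while time.time() - start < 300:  # log for 5 minutes while the game runs
            per_core = psutil.cpu_percent(interval=1.0, percpu=True)
            writer.writerow([round(time.time() - start, 1)] + per_core)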

    Jack
    One hundred years from now it will not matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  4. #329
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    OK, so for those entering the thread on this page or just starting to follow the conversation -- this post refers back to a post a page or two back. What I did was use the very extreme cases of ultra-low resolution and ultra-high resolution to force a GPU bottleneck, and demonstrated the CPU utilization in the case where the CPU was stalled waiting on the GPU. That run (again, two pages back) was done on a QX9650 clocked at 2.5 GHz using the latest 4870 X2 graphics card; details of the observation were written into the graph.

    Essentially, I am working the experiment with this particular game engine to show that two competing resources affect each other differently, by varying the load on one and observing the response on the other.

    This post is simply the Phenom 9850 clocked at 2.5 GHz version of the same experiment on Lost Planet. I try to box out the regions of interest with an explanation of what is what. Pardon the typos...

    One hundred years from now it will not matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  5. #330
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Is it possible to focus testing on low FPS? Are there games out there that have accurate testing showing MIN FPS?
    The thing that Boschwanza wrote could explain all this data:
    Quote Originally Posted by Boschwanza View Post
    So when you are at high resolutions, a graphics card acts like a frame limiter and the focus shifts to the low FPS; where there are more latency holes, frames will drop much more than with a K10, and you might get a better average score with the K10 because the better high-FPS scores of a Core 2 Quad are simply cut off.
    If the differences in GRID compared to Intel are small, it would be almost impossible to draw any conclusions unless there is a way to do exactly the same test. If the difference is larger, it could probably be done anyway.
    But one thing that could be tested is to take one game and try to get as low an FPS as possible. Select parts of games where you know the game can slow down or lag, and test that part to see how low you can go on each platform.

    In the FEAR test where memory was at 1067 and the processors were clocked to 3.0 GHz, min FPS was 48 while Intel had 40. Was this random, or is it the same every time?
    It would be interesting to see the MIN FPS score if you clock the Intel to 3.6 GHz or more.
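
    If you have a per-frame time log from a run (one frame time in milliseconds per line -- the file name and format here are just assumptions), the min/avg/max FPS are easy to pull out; a minimal sketch:

    # Compute min / average / max FPS from a per-frame time log (ms per line).
    frame_ms = [float(line) for line in open("frametimes.txt") if line.strip()]

    fps = sorted(1000.0 / ms for ms in frame_ms)
    avg_fps = len(frame_ms) * 1000.0 / sum(frame_ms)
    low_1pct = fps[int(0.01 * len(fps))]  # roughly the 1st-percentile frame

    print(f"min {fps[0]:.1f}  avg {avg_fps:.1f}  max {fps[-1]:.1f}  ~1% low {low_1pct:.1f}")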

  6. #331
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by gosh View Post
    Is FEAR single threaded?

    Testing a game that uses more threads (more memory, more detail) will probably show whether AMD slows down as much.
    For me in FEAR, the minimum FPS is 58 at the maximum resolution my monitor can take (1680*1050), with 4xFSAA, 16xAF and soft shadows.
    John
    Stop looking at the walls, look out the window

  7. #332
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Is it possible to test only MIN FPS in Crysis?

    In the test that I linked to in the first message, the difference was larger in MIN FPS compared to MAX:

    Game: Crysis
    Settings: 1280x1024, 1xAA/1xAF, DX9 everything HIGH
    Q9450 = Min FPS 22.46, Max FPS 61.71, Avg 41.96
    Phenom 9850 = Min FPS 29.82, Max FPS 63.20, Avg 47.89

    EDIT: Checked the documents from Jack and it seems hard to do anything with min there (results differ; better min at higher res...).
    Last edited by gosh; 08-20-2008 at 04:10 AM.

  8. #333
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    I am doing some testing.
    Results shall be ready to get published in the evening.
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  9. #334
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by BenchZowner View Post
    I am doing some testing.
    Results shall be ready to get published in the evening.
    If you are doing ANY benching on X38/X48-based boards, be sure to grab the latest available BIOS, either from the thread I linked or from the manufacturer; otherwise you will have the X38/X48 PCI-E 2.0 disadvantage at high res/high FSAA.
    John
    Stop looking at the walls, look out the window

  10. #335
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by JohnZS View Post
    If you are doing ANY benching on X38/X48-based boards, be sure to grab the latest available BIOS, either from the thread I linked or from the manufacturer; otherwise you will have the X38/X48 PCI-E 2.0 disadvantage at high res/high FSAA.
    John
    As far as I know/remember, the BIOSes addressed a "bug" with the 9800GX2, not exactly a PCI-Express 2.0 bus issue or a bandwidth-related issue.
    It doesn't even matter, since I'm running the latest betas on every board.

    I'm thinking of starting some tests with the Striker II Extreme though, because of the "thing" that I noticed during its review ( check my previous post ).
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  11. #336
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    178
    Just found something interesting. Intel is running a game contest to find the best-threaded game. There are plenty of demos available; maybe we can choose one of them and check it out on an Intel and an AMD platform.

    http://softwarecontests.intel.com/ga...hp?entryid=157

  12. #337
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by BenchZowner View Post
    As far as I know/remember, the BIOSes addressed a "bug" with the 9800GX2, not exactly a PCI-Express 2.0 bus issue or a bandwidth-related issue.
    It doesn't even matter, since I'm running the latest betas on every board.

    I'm thinking of starting some tests with the Striker II Extreme though, because of the "thing" that I noticed during its review ( check my previous post ).
    I look forward to your results

    John
    Stop looking at the walls, look out the window

  13. #338
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by demonkevy666 View Post
    gosh said you can't clock AMD's L3 cache; that's not true
    Interesting! There was another discussion about how bad AMD was, and they said that the L3 cache (or just the cache) was so slow and that it wasn't possible to overclock the L3 cache. No one in that discussion said otherwise. If it is possible, so much the better.

  14. #339
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
    Interesting! There was another discussion about how bad AMD was, and they said that the L3 cache (or just the cache) was so slow and that it wasn't possible to overclock the L3 cache. No one in that discussion said otherwise. If it is possible, so much the better.
    Gosh ... where was this discussion? (I don't get into every thread) This is just ludicrous.
    One hundred years from now it will not matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  15. #340
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    393
    Quote Originally Posted by gosh View Post
    Is it possible to focus testing on low FPS? Are there games out there that have accurate testing showing MIN FPS?
    The thing that Boschwanza wrote could explain all this data:
    Jack gave an excellent rebuttal to Boschwanza's "theory." Video cards are frame limiters at higher resolutions, but the CPU always works through its data at the same speed, no matter what the resolution. At higher resolutions the CPU is waiting on the GPU, so CPU utilization drops, which is what Task Manager shows.

    It makes perfect sense to me, keep up the good work Jack!
    Last edited by Clairvoyant129; 08-20-2008 at 07:15 PM.

  16. #341
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    178
    Quote Originally Posted by Clairvoyant129 View Post
    Jack gave an excellent rebuttal to Boschwanza's "theory." Video cards are frame limiters at higher resolutions, but the CPU always works through its data at the same speed, no matter what the resolution. At higher resolutions the CPU is waiting on the GPU, so CPU utilization drops, which is what Task Manager shows.

    It makes perfect sense to me, keep up the good work Jack!
    I wouldn't say it is just black and white and that simple; I mean, it will depend on the game.

    Call of Duty 4 and Lost Planet act exactly how Jack described. CoH gave me the same utilization no matter what resolution, and WiC is mixed: while there is not much action, utilization decreases with higher resolution, but during the heavy stuff processor load increases.

    There is one thing we have to be careful about: a GPU-bound situation does not exclude a CPU-bound situation.

  17. #342
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by JumpingJack View Post
    Gosh ... where was this discussion? (I don't get into every thread) This is just ludicrous.
    Not in this forum, and it was in another language.
    What I was told there was that the frequency of the L3 cache was locked and couldn't be changed. The discussion was about how good the i7 (Nehalem) was because (as one of the reasons) the L3 cache was so much faster and was running at the same frequency as the processor.

    For games there should be some speed to gain from clocking the L3.
    Last edited by gosh; 08-21-2008 at 02:09 AM.

  18. #343
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by Boschwanza View Post
    This is quite interesting: during the atomic bomb attack and the bomber raid, the CPU utilization increases.
    Is smoke etc. done with vertex data?
    If some types of graphics are in some way aware of the resolution, it could be that the processor needs to calculate more at higher resolutions. I don't know this; it's just speculation.

  19. #344
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
    Is smoke etc. done with vertex data?
    If some types of graphics are in some way aware of the resolution, it could be that the processor needs to calculate more at higher resolutions. I don't know this; it's just speculation.
    Yes, volumetric smoke gets vertices, is shaded, and is made translucent just like any other 3D object.

    Download the Toy Shop demo, run it at different resolutions, and look at the smoke generated from the chimneys; press W to see the wireframe ... the number of vertices does not change... it is fixed by the 3D model framework. At least this is what I have been able to gather in my reading.

    Some interesting details: http://developer.download.nvidia.com...gems3_ch30.pdf

    EDIT: Here is a very interesting one discussing 'threading' fluid simulations for 3D rendering: http://www.iam.unibe.ch/publikatione...le/at_download ... the workload here is on the CPU.
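
    A quick back-of-the-envelope on why the vertex side stays flat while the pixel side grows with resolution (the vertex count below is just an illustrative assumption, not taken from any demo):

    VERTS = 50_000  # assumed vertex count of a smoke mesh -- set by the model, not the screen

    for w, h in [(800, 600), (1280, 1024), (1920, 1200), (2560, 1600)]:
        pixels = w * h
        print(f"{w}x{h}: {pixels / 1e6:.2f} M pixels to shade per pass, vertex count still {VERTS}")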
    Last edited by JumpingJack; 08-21-2008 at 09:21 PM.
    One hundred years from now it will not matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  20. #345
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
    Not in this forum, and it was in another language.
    What I was told there was that the frequency of the L3 cache was locked and couldn't be changed. The discussion was about how good the i7 (Nehalem) was because (as one of the reasons) the L3 cache was so much faster and was running at the same frequency as the processor.

    For games there should be some speed to gain from clocking the L3.
    Send them here and I will straighten them out... I have had my northbridge clocked as high as 2.6 GHz.
    One hundred years from now it will not matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  21. #346
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by gosh View Post
    Is smoke etc. done with vertex data?
    If some types of graphics are in some way aware of the resolution, it could be that the processor needs to calculate more at higher resolutions. I don't know this; it's just speculation.
    What I think happened here is that instead of just changing the resolution and GPU-related load settings, he simply used presets to switch between high and low settings.

    Games from the past one or two generations give options for physics, particle debris, and smoke/shadows ... these are usually CPU duties.

    I can run an experiment to verify. But those bombing runs, at low settings, do not throw up debris ... at high settings the scene is covered with debris.
    One hundred years from now it will not matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  22. #347
    Xtreme Enthusiast
    Join Date
    Mar 2008
    Posts
    750
    By the way, Jack, which drivers are you using for your HD4870X2's? I have heard that reviews use 8.520.2?
    Motherboard: ASUS P5Q
    CPU: Intel Core 2 Quad Q9450 @ 3.20GHz (1.07v vCore! )
    RAM: 2GB Kingston HyperX 800MHz
    GPU: MSI Radeon HD 4870 @ 780/1000 (default)

  23. #348
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by RunawayPrisoner View Post
    By the way, Jack, which drivers are you using for your HD4870X2's? I have heard that reviews use 8.520.2?
    I updated to 8.8 shortly after they were released. No change in WiC, but Lost Planet sees a nice bump, about 15-20 FPS; HL2: Lost Coast also showed some gain. I will redo 3DMark @ 2.5 GHz on each and post the results.

    3DMark06 @ default values, 4870 X2 with shipped drivers (these scores were also posted earlier in this thread):
    Phenom 9850 @ 2.5 GHz: Score = 13297
    QX9650 @ 2.5 GHz: Score = 14471

    3DMark06 @ default values, 4870 X2 with Catalyst 8.8 (August release):
    Phenom 9850 @ 2.5 GHz: Score = 13400
    QX9650 @ 2.5 GHz: Score = 14820
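
    Turning those into percentage deltas (a trivial sketch, using only the numbers above):

    scores = {
        "Phenom 9850 @ 2.5 GHz": (13297, 13400),  # (shipped driver, Catalyst 8.8)
        "QX9650 @ 2.5 GHz": (14471, 14820),
    }

    for cpu, (old, new) in scores.items():
        print(f"{cpu}: {old} -> {new} (+{100.0 * (new - old) / old:.1f}% from the driver update)")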

    Jack
    Last edited by JumpingJack; 08-22-2008 at 12:23 AM.
    One hundred years from now it will not matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  24. #349
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    178
    Quote Originally Posted by JumpingJack View Post
    What I think happened here is that instead of just changing the resolution and GPU-related load settings, he simply used presets to switch between high and low settings.
    That's why I did the test again using the exact same "very high" preset for WiC and just changed the resolution to isolate the differences.

    Again



    Will do some more tests this weekend.

  25. #350
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by Boschwanza View Post
    That's why I did the test again using the exact same "very high" preset for WiC and just changed the resolution to isolate the differences.

    Again



    Will do some more tests this weekend.
    Is that with anti-aliasing and anisotropic filtering enabled or disabled?
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero
