
Thread: AMD's Phenom Processor @ lostcircuits.com

  1. #51
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    KTE-
    Yeah, I ran FRAPS during a run as well ... and as I experiment more with this game/bench, the behavior still looks odd overall.... I attached the raw data and the XLS file; if you open the XLS file you can see the STDEV function calculation, but I will check again. EDIT: I checked; no, my calculations are correct...

    EDIT:
    BIG discoveries, and I think they go a long way toward explaining the LC data and the error (and it is an honest error).... UT3BENCH is buggy, massively buggy... to check, run it and then click on the UT3 logo: an instance of Notepad appears with a directory error... this is just one example, but after you do that it won't launch UT3 at all.... ok, this is not that pertinent, but it demonstrates what a crappy utility it is....

    The big discovery is this ... if you run the UT3 bench utility and set only the map and the resolution (letting everything else default), it defaults to the 'lowest' quality settings per the utility (i.e. 1 for both options); however, when you run it... it actually runs UT3 at the highest quality settings. You must change the quality options at least once from 1 to something else, then back to 1, to actually run at low quality (note: all the data I have presented is sane, as I visually watched the run execute in low quality). Nonetheless, unless you look for it... you will 'think' you are running low quality. Frankly, the visual acuity of low quality is pretty darn good and a testament to the UT3 engine.

    here is the data/example of what I am talking about....

    Fresh run of UT3BENCH, only set res and map (do not touch quality, default to low):



    Change visual quality to 5 or high (see setting in screen shot):


    Finally, change the quality setting back to 1 and run again:



    It was tough to grab a screen at a point in the benchmark that was roughly the same, but you can see the lighting and coloring quality of the sign to the upper left. It is clear that unless you change the quality settings at least once, UT3BENCH will not properly set up the bench run to what you think it is.

    At least that explains LC's screen shot being of high quality settings and not low quality settings.


    UT3BENCH actually takes a copy of the UTEngine.ini file, alters it and runs the bench accordingly; obviously it does not make the appropriate edits at the defaults for the initial run ...

    Ok, this was a huge discovery ... it is probably best to set the game settings oneself and run from the shortcut command line I posted earlier.
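    (For anyone who wants to sanity-check what UT3BENCH actually wrote, a rough sketch, not from the original posts: diff the INI copy the utility generates against a known-good low-quality INI and print any keys that differ. The file names below are hypothetical placeholders; point them at your own copies.)

    Code:
    # Rough sketch: diff two UT3-style INI files and report differing keys,
    # so you can see whether the bench really ran at the settings you expect.
    # UE3 INIs have duplicate keys (e.g. repeated +Paths lines), so with
    # strict=False the comparison is approximate (last value wins).
    import configparser

    def load_ini(path):
        cp = configparser.ConfigParser(strict=False, interpolation=None)
        cp.optionxform = str  # keep key case exactly as written
        cp.read(path)
        return cp

    def diff_inis(bench_path, reference_path):
        a, b = load_ini(bench_path), load_ini(reference_path)
        for section in sorted(set(a.sections()) | set(b.sections())):
            keys_a = dict(a.items(section)) if a.has_section(section) else {}
            keys_b = dict(b.items(section)) if b.has_section(section) else {}
            for key in sorted(set(keys_a) | set(keys_b)):
                if keys_a.get(key) != keys_b.get(key):
                    print(f"[{section}] {key}: bench={keys_a.get(key)!r} reference={keys_b.get(key)!r}")

    # Placeholder file names: substitute the copy UT3BENCH produced and a
    # reference UTEngine.ini you saved after setting low quality in-game.
    diff_inis("UTEngine_benchcopy.ini", "UTEngine_lowquality_reference.ini")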

    Jack
    Last edited by JumpingJack; 03-10-2008 at 09:35 AM.
    One hundred years from now it won't matter what kind of car I drove, what kind of house I lived in, how much money I had in the bank, nor what my clothes looked like... but the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  2. #52
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    139
    Nice work Jack, thanks. Just shows how difficult it is to benchmark properly and to ensure the results are accurate.

  3. #53
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Thesavage View Post
    Nice work Jack, thanks. Just shows how difficult it is to benchmark properly and to ensure the results are accurate.
    Yep ... one or two misses on the details and it can throw the analysis completely off ...

    I would never have paid much attention to it until people started looking at it, then me challenging it, and others countering ... ultimately this thread has been very useful for me; I have picked up a lot of detail. If it weren't for this thread, I would not have given it a second thought.

    What I have also found is that the UT3 engine is a phenomenal piece of work, much better than CryEngine 2 in terms of utilizing the CPU. UT3 scales better with core count than anything I have seen (with maybe the exception of WIC, but I think it still does better). It is a great CPU and GPU stresser; I just need to understand all the nuances before I can do anything systematic with it.

    This is the advantage of (polite) confrontational debate on the DATA.

    Jack

  4. #54
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    139
    Have you had any further opportunity to test threading in the FEAR engine? I have looked at this myself on the same system as you are using (Phenom 9600 BE and Asus M3A32-MVP), but I found the results unusual. I use a Logitech G15 keyboard to monitor core and memory movement and usage. What I have found so far is that there is one core doing most of the work, but the other 3 are also in use, as much as 25% (this is, BTW, in XP 64).

  5. #55
    Xtreme Mentor
    Join Date
    May 2007
    Posts
    2,792
    LOL Jack, you've made many major edits all through the thread! You've basically changed what you had said in many places. What I read last time is not what I'm reading right now in many places, so it's a bit of a catch-22. So it's back to the top to read it through again... Good discovery though; there's not much I can do yet since I can't get it to run.

    UT3: I have not managed to get it to run. It keeps reporting: "modified executable code is not allowed". This is on XP 32-bit. Maybe SP3 has something to do with it...

    When I installed the full copy, it gave that error. Tried it 3-4 times. When I uninstalled and reinstalled the UT3 Demo with the latest patches, it gave the same error. It took over 2 days just to download the demo, so I can't really say much yet.

    Quote Originally Posted by JumpingJack View Post
    KTE-
    Yeah, I ran FRAPS during a run as well ... and as I experiment more with this game/bench, the behavior still looks odd overall.... I attached the raw data and the XLS file; if you open the XLS file you can see the STDEV function calculation, but I will check again. EDIT: I checked; no, my calculations are correct...
    Yeah, I looked high and low and I don't see an XLS file in this thread, so there are no calculations that I can find(?). Where is it supposed to be?

    All I can see is the benchmark text file outputs and the STDev and mean STDev figures you posted. Using the runs you posted in the code tags, I get different answers, so I'd like to take a look at the XLS.

    [Attached image: UT3 Res Scaling.png]

  6. #56
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by KTE View Post
    LOL Jack, you've made many major edits all through the thread! You've basically changed what you had said in many places. What I read last time is not what I'm reading right now in many places, so it's a bit of a catch-22. So it's back to the top to read it through again... Good discovery though; there's not much I can do yet since I can't get it to run.

    [Attached image: UT3 Res Scaling.png]
    Yeah, on that post I edited it about 4 times ... I did not change what I said so much as I added a bit more detail and attached the data (I forgot you could attach XLS files with vBulletin).

    It should be listed as an attachment at the bottom of the thread. It is in the Sheet 1 tab.

    EDIT (another edit) ... you are right, it is not there...

    I will attach it to this post. Oops, I was wrong... XLS files won't post... let me zip it.

    EDIT 2: Attached

    EDIT 3: Also, please check my translation from the raw output to the XLS file; I make typos often.

    EDIT 4: ... I did find one typo... Run 2 for 640x480 is 126.33, not 123.33 as I originally typed; all others I double-checked. How are you calculating sigma? This correction to the typo changes sigma for the 640x480 run to 15.86810816.

    EDIT 5: Friggin' A ... the file was zipped badly. Re-attached.
    Attached Files
    Last edited by JumpingJack; 03-10-2008 at 10:10 PM.

  7. #57
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Thesavage View Post
    Have you had any further opportunity to test threading in the FEAR engine? I have looked at this myself on the same system as you are using (Phenom 9600 BE and Asus M3A32-MVP), but I found the results unusual. I use a Logitech G15 keyboard to monitor core and memory movement and usage. What I have found so far is that there is one core doing most of the work, but the other 3 are also in use, as much as 25% (this is, BTW, in XP 64).
    I have done an extensive pre/post TLB-patch run on FEAR, and I also get weird results....

    To reduce your run-to-run variability: set your settings, apply them, exit the game, restart, then run the test.

    jack

  8. #58
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    KTE -- if you are really interested in chasing down this game as a CPU bench: I ran another experiment to check core scaling at 640x480 and 1280x1024, and I have attached the raw data and the XLS file in the ZIP below.

    Config: Phenom 9600 BE at stock, 8800 GTX, 2 GB DDR2-800 RAM, UT3 bench utility used, low quality settings ensured at the stated resolutions to put the workload back on the CPU.

    The results are shown below. Two items I found interesting: sigma generally goes up with frame rate (there are several possible interpretations of this observation), and 1280x1024 gives higher FPS in general, though with these statistics it is hard to make that conclusion solid.

    What is clear is that at 3 cores, even a very nicely threaded game such as UT3 is already sufficiently 'core saturated', so to speak... this bodes very well for the triple-core Phenom 8xxx series being good for gaming, even multithreaded gaming.

    Jack
    [Attached thumbnail: UT3 Core Scaling.JPG]
    Attached Files
    Last edited by JumpingJack; 03-10-2008 at 10:06 PM.

  9. #59
    Xtreme Mentor
    Join Date
    May 2007
    Posts
    2,792
    Quote Originally Posted by JumpingJack View Post
    Yeah, on that post I edited it about 4 times ... I did not change what I said so much as I added a bit more detail and attached the data (I forgot you could attach XLS files with vBulletin).

    It should be listed as an attachment at the bottom of the thread. It is in the Sheet 1 tab.

    EDIT (another edit) ... you are right, it is not there...

    I will attach it to this post. Oops, I was wrong... XLS files won't post... let me zip it.

    EDIT 2: Attached

    EDIT 3: Also, please check my translation from the raw output to the XLS file; I make typos often.

    EDIT 4: ... I did find one typo... Run 2 for 640x480 is 126.33, not 123.33 as I originally typed; all others I double-checked. How are you calculating sigma? This correction to the typo changes sigma for the 640x480 run to 15.86810816.

    EDIT 5: Friggin' A ... the file was zipped badly. Re-attached.
    Hahaha

    Jack, I was reading it real-time. Have you any idea of the number of edits in your posts within this thread?
    My PC had been infected by the Bavriax.exe virus, which attaches itself to trojan downloaders and then lets someone take over your system to act as a farm bot (warning: I downloaded the UT3 Demo from a legitimate server, extracted it, and this virus executed automatically; it was not picked up by AV/FW (fileplanet IIRC)). It's next to impossible to get rid of (it's new) and is only picked up by online updated scanners, since the virus is kernel level: Safe Mode won't help, it escapes all detection and has no visible processes running. It won't let you execute any software at all and rewires your keyboard mappings. At that time I knew it was sending and retrieving data online heavily, but while gearing up to deal with it and the hacker who kept writing text all over my screen in a large Notepad window and changing the screen size to 800x600, I was online on and off, reading this thread and trying to download the attached zip file. It was broken and giving me this error:
    [Attached image: error.png]

    So I couldn't read it. After that I couldn't execute anything at all, so no net access to read updates. It took me 7 hours of manual work to deal with it; no software worked. What a bloody PITA. It's working now but still won't be fully clean: it's botched my Firefox %AppData% dir, and everything I download is going somewhere it's not supposed to, a place I don't see. So it has added a lot of delay to my schedule and to what I can test right now. Basically, nothing for now until I get this all fixed.

    Thanks for the files. I've checked them out now, they're good.

    Re the sigma: I was using an experimental, completely JS-written scientific tool coded by a friend. It was supposed to be stable and to handle basic calcs fine. However, when I ran these values through it yesterday, it gave me the answers I had posted, repeatedly. Later on I had time to debug it, and it was a bug: I'm not sure how it's caused, but the code for n-1 kept defaulting to n, so the squared variance was dividing by 6 values when it should divide by 5 and then take the square root. That led to the error in the end values; apologies for the extra hydra. The results you posted I hand-checked early this morning, and they are correct, including the new values.
    Quote Originally Posted by JumpingJack View Post
    KTE -- if you are really interested in chasing down this game as a CPU bench: I ran another experiment to check core scaling at 640x480 and 1280x1024, and I have attached the raw data and the XLS file in the ZIP below.

    Config: Phenom 9600 BE at stock, 8800 GTX, 2 GB DDR2-800 RAM, UT3 bench utility used, low quality settings ensured at the stated resolutions to put the workload back on the CPU.

    The results are shown below. Two items I found interesting: sigma generally goes up with frame rate (there are several possible interpretations of this observation), and 1280x1024 gives higher FPS in general, though with these statistics it is hard to make that conclusion solid.

    What is clear is that at 3 cores, even a very nicely threaded game such as UT3 is already sufficiently 'core saturated', so to speak... this bodes very well for the triple-core Phenom 8xxx series being good for gaming, even multithreaded gaming.
    Jack
    Thanks. I'll look through it with better time, resources and opportunity - may well be a while yet though.

    Previously, I have benched UT3/Prey/Crysis/FarCry/FEAR on a Q6600 G0 and a Phenom 9600 BE using a 3870, but only at generally played resolutions, mid (1680x1050) to high (1920x1200). It was a mixed scenario since the GPU was bottlenecked many times, but the CPU had enough code being fed to still scale with 100MHz speed changes. Overall, at those resolutions UT3/Prey were very Intel-favoring while Crysis was near the same for both (it wasn't a CPU-dependent game, nor a quad-core optimized game, as this and a few other threads show), but FEAR favored Phenom. FarCry I don't really remember the output of, and I don't have the data drives with me here, but from cached instinct I would wager it was showing Core2/Penryn favoritism clock for clock.

    Add: I played FarCry just now with a Q6600 and it used 27-29% of the cores consistently at 1024x768 full-screen, and the same at 800x600 full-screen. FEAR was just a little less.

  10. #60
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    235
    Great work Jack, very informative.
    ---
    ---
    "Generally speaking, CMOS power consumption is the result of charging and discharging gate capacitors. The charge required to fully charge the gate grows with the voltage; charge times frequency is current. Voltage times current is power. So, as you raise the voltage, the current consumption grows linearly, and the power consumption quadratically, at a fixed frequency. Once you reach the frequency limit of the chip without raising the voltage, further frequency increases are normally proportional to voltage. In other words, once you have to start raising the voltage, power consumption tends to rise with the cube of frequency."
    +++
    1st
    CPU - 2600K(4.4ghz)/Mobo - AsusEvo/RAM - 8GB1866mhz/Cooler - VX/Gfx - Radeon 6950/PSU - EnermaxModu87+700W
    +++
    2nd
    TRUltra-120Xtreme /// EnermaxModu82+(625w) /// abitIP35pro/// YorkfieldQ9650-->3906mhz(1.28V) /// 640AAKS & samsung F1 1T &samsung F1640gb&F1 RAID 1T /// 4gigs of RAM-->520mhz /// radeon 4850(700mhz)-->TRHR-03 GT
    ++++
    3rd
    Windsor4200(11x246-->2706mhz-->1.52v) : Zalman9500 : M2N32-SLI Deluxe : 2GB ddr2 SuperTalent-->451mhz : seagate 7200.10 320GB :7900GT(530/700) : Tagan530w

  11. #61
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by KTE View Post
    Hahaha

    Jack, I was reading it real-time. Have you any idea of the number of edits in your posts within this thread?
    Yeah, the reasons are, first, I want to ensure I get it right, and second, my tone toward LC in the initial post was unwarranted and uncalled for... I should have been softer in my tone. It was obviously an honest mistake, especially as I categorize bugs in the utility he used. LC has always been one of my favorite sites, and I really appreciate his at-the-socket power measurements... those are always very informative.

    jack

  12. #62
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by KTE View Post
    Hahaha

    Add: I played FarCry just now with a Q6600 and it used 27-29% of the cores consistently at 1024x768 full-screen, and the same at 800x600 full-screen. FEAR was just a little less.
    I haven't gone through your entire last post, but this thread is gonna turn into 'the traps and tricks of benching with games' thread. Hang tight, I will grab some screenshots on UT3 as well since this is the current focus.... very enlightening. Also, I bolded the 109 FPS above for the 3-core run 1 because it was an obvious outlier... I need to repeat that run several more times to be confident in it.
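    (For anyone who wants a quick way to screen a suspect run, a leave-one-out check like the sketch below works; the FPS values in it are placeholders for illustration, not the attached data.)

    Code:
    # Crude leave-one-out outlier screen for a handful of benchmark runs.
    # The values below are hypothetical placeholders, not the attached results.
    from statistics import mean, stdev

    runs = [133.0, 131.5, 109.0, 134.2, 132.8]   # per-run average FPS (made up)

    for i, x in enumerate(runs, start=1):
        others = runs[:i - 1] + runs[i:]         # compare each run against the rest
        m, s = mean(others), stdev(others)
        if abs(x - m) > 2 * s:                   # 2-sigma screen; with so few runs,
            print(f"run {i}: {x} fps vs {m:.1f} +/- {s:.2f} -> repeat before trusting")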

    Jack

  13. #63
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by KTE View Post
    Hahaha
    Re the sigma: I was using an experimental, completely JS-written scientific tool coded by a friend. It was supposed to be stable and to handle basic calcs fine. However, when I ran these values through it yesterday, it gave me the answers I had posted, repeatedly. Later on I had time to debug it, and it was a bug: I'm not sure how it's caused, but the code for n-1 kept defaulting to n, so the squared variance was dividing by 6 values when it should divide by 5 and then take the square root. That led to the error in the end values; apologies for the extra hydra. The results you posted I hand-checked early this morning, and they are correct, including the new values.
    Yep, you lose that degree of freedom ... I make that mistake myself. I have become very reliant on Excel and Statistica for doing routine statistics.
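    (For anyone following along, the whole difference is just the denominator in the variance; a minimal sketch, where the six values are placeholders rather than the attached data.)

    Code:
    # Population vs. sample standard deviation: the n vs. n-1 issue above.
    from math import sqrt
    from statistics import pstdev, stdev

    fps = [126.3, 131.7, 119.4, 140.2, 128.8, 135.5]   # hypothetical run averages, n = 6
    n = len(fps)
    m = sum(fps) / n

    var_n   = sum((x - m) ** 2 for x in fps) / n        # divides by n      (the tool's bug)
    var_nm1 = sum((x - m) ** 2 for x in fps) / (n - 1)  # divides by n - 1  (what Excel's STDEV does)

    print(sqrt(var_n), sqrt(var_nm1))
    assert abs(pstdev(fps) - sqrt(var_n)) < 1e-9        # pstdev = population form (n)
    assert abs(stdev(fps) - sqrt(var_nm1)) < 1e-9       # stdev  = sample form (n - 1)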

    Quote Originally Posted by KTE View Post
    Previously, I have benched UT3/Prey/Crysis/FarCry/FEAR on a Q6600 G0 and a Phenom 9600 BE using a 3870, but only at generally played resolutions, mid (1680x1050) to high (1920x1200). It was a mixed scenario since the GPU was bottlenecked many times, but the CPU had enough code being fed to still scale with 100MHz speed changes. Overall, at those resolutions UT3/Prey were very Intel-favoring while Crysis was near the same for both (it wasn't a CPU-dependent game, nor a quad-core optimized game, as this and a few other threads show), but FEAR favored Phenom. FarCry I don't really remember the output of, and I don't have the data drives with me here, but from cached instinct I would wager it was showing Core2/Penryn favoritism clock for clock.
    I am seeing similar results, though I have not benched at Q6600 speeds; I need to go back to that, since FEAR is behaving differently for me. I have no doubt it holds at Q6600 speeds, as I did a run at Q6700 speeds and made the same observation.

    Crysis is turning out to be one heck of a trick. Not sure how much time I want to spend on fleshing it out; right now I have UT3 pretty well understood.... but Crysis is proving to be 'weird', to say the least. I'm not doing an Intel/AMD comparison, rather just a core scaling comparison.

    If I get enough data to put together a good presentation I will post on it, but if I do, I would ask that you or someone with similar HW repeat what I would show.... it is just bizarre. On that topic, the trap I think reviewers are falling into is simply clicking on the 'set all' drop-down option in the advanced tab. If you are going to use Crysis as a CPU bench, the physics and particle qualities should be set to high in order to put as much stress as possible on the CPU, while keeping the graphics-intensive options low to take the stress off the GPU. Of course these are not 'played at' settings, but I am looking at the game as a CPU test for the most part.
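    (As a concrete sketch of that kind of setup, my own illustration rather than anything from the posts above: the sys_spec_* CVars Crysis reads from its config files can be pushed in opposite directions, physics/particles high and the graphics-heavy groups low. Treat the exact CVar names and 1-3 values as assumptions to verify against your own system.cfg; the snippet just writes them into an autoexec.cfg.)

    Code:
    # Hypothetical helper: write an autoexec.cfg biased toward CPU load for
    # CPU-focused Crysis runs. CVar names follow the sys_spec_* convention in
    # Crysis' system.cfg (1 = low ... 3 = high); verify against your install.
    cvars = {
        "sys_spec_Physics": 3,        # physics high   -> more CPU work
        "sys_spec_Particles": 3,      # particles high -> more CPU work
        "sys_spec_Shading": 1,        # graphics-heavy groups kept low to
        "sys_spec_Shadows": 1,        #   keep the GPU out of the way
        "sys_spec_PostProcessing": 1,
        "sys_spec_Water": 1,
        "r_displayinfo": 1,           # on-screen FPS readout
    }

    with open("autoexec.cfg", "w") as cfg:
        for name, value in cvars.items():
            cfg.write(f"{name} = {value}\n")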

    Jack

  14. #64
    Xtreme Mentor
    Join Date
    May 2007
    Posts
    2,792
    Jack,

    I've been trying to replicate your runs but, sorry, my Phenom has been dead for 4 days now. No trick has worked and it errors at the memory initializing stage (C1). The LEDs on the Tracer start and then jam -- I tried 3 working boards and 3 working PSUs. To say the least, my 9500 and both of my last 9600 BEs did this randomly if I cleared CMOS, but they would all restore to normal operation when left off for 8-9 hours or put under cold. This one doesn't. Multiple other users have had the same problem when settings are reset after flashing any BIOS. Achim had his 9500 die too, out of the blue, and he gets the same errors. At the time of death mine was running stock (because I was testing old and new BIOSes). Prior to that, for the last two weeks it had been on low-ish volts: VCore 1.26-1.30V, HT 1.20V, NB 1.1V, IMC 1.038-1.10V, DIMM 2.0-2.2V. In particular, notice the delta between vDIMM and vIMC. From my experience, it would certainly appear that Phenom does indeed have major IMC problems, at least the retail ones. Whether that's from running 2.0V+ VDIMM, or from increasing the delta-V between IMC and DIMM... I'm not sure, because we have no way to tell unless a Phenom susceptible to this is tested at both low and high volts. I must admit I am strongly leaning toward it being dead because of running 2.2V DIMM at sub-1.2V IMC. Achim's Phenom lasted around 2.5 months (?) and mine lasted roughly just short of 2 months.
    Anyway, the gist of the circumstances: the replacement Phenom won't arrive for a while, and when it does, I won't be testing it at all. It will go to the department it belongs to at my uncle's firm. Quite unfortunate, since just this week I had installed and set up benchmarks for 3dsMax 2008, Sony Vegas, SPECap, SPECopc, Maya 6.5, TrueSpace 7.5 and many major benchmarks to test comparatively. Honestly, I have no plans to play with Phenom anymore.

    UT3: I have not been able to get it running on the Intel system. All 3 systems are giving me the same problem, and they are different CD installs, with and without the 1.1 and 1.2 patches. I get the same problem with UT 2003 and UT 2004. Looking at the documentation websites, it seems to be a known bug which previously only affected XP 64-bit (they released a hotfix for it in the patch).

    Crysis: I've been playing the game at the settings attached below, with High Physics and High Particles, at 1024x768 and lower resolutions. Higher resolutions than this require more video memory and will only bottleneck my GPU. I'm on a 3.7GHz Q6600/P35/PC2-8000/2600XT/Cat 8.2 (Cat 8.3 has up to a 15% DX10 performance improvement and a good deal for DX9). The maximum CPU usage it ever gives is 34% while the average is near the 30% mark (same power draw as FarCry). In 60 minutes of playtime, average FPS hovers at 55-58 (pretty smooth). The highest seen is 101 FPS and the lowest is 28 FPS. FRAPS doesn't work with it: it shows the FPS OK, but with the benchmark option it keeps writing a massive list of 1 and 0 FPS when there's 50+ FPS for sure, although it does catch a few of the higher FPS too. So I've been using the r_displayinfo = 1 command instead.

    At those settings, the CPU bench gives me an average FPS of 48 over 4 loops, while the GPU bench gives an average FPS of 79.

    Disabling one and then two cores gets the same play rate as four cores enabled.

    [Attached image: Crysissetting.jpg]

