Page 11 of 11
Results 251 to 267 of 267

Thread: AMD FX "Bulldozer" Review - (4) !exclusive! Excuse for 1-Threaded Perf.

  1. #251
    Xtreme Member
    Join Date
    Dec 2007
    Location
    South Africa
    Posts
    215
    For me, I've always loved AMD for its price-to-performance ratio, and after seeing what you did, I'm sure BD will give me the same. Now that they support SLI again, I'm happy to be back. As you indicated, chew-, in games BD is on par with SB, and that's all I want. BD performs better than my i7 setup in most of the tests.

    Thanks chew- and others here for all your tests and feedback - helped me make a good choice

  2. #252
    Xtreme Addict
    Join Date
    Oct 2006
    Posts
    2,136
    Quote Originally Posted by Oliverda View Post
    I asked because, according to the BIOS and Kernel Developer's Guide, BD's IMC is not capable of 1866 MHz with 4 modules.
    And older-gen AMD isn't officially capable of DDR-2000+ either
    Rig 1:
    ASUS P8Z77-V
    Intel i5 3570K @ 4.75GHz
    16GB of Team Xtreme DDR-2666 RAM (11-13-13-35-2T)
    Nvidia GTX 670 4GB SLI

    Rig 2:
    Asus Sabertooth 990FX
    AMD FX-8350 @ 5.6GHz
    16GB of Mushkin DDR-1866 RAM (8-9-8-26-1T)
    AMD 6950 with 6970 bios flash

    Yamakasi Catleap 2B overclocked to 120Hz refresh rate
    Audio-GD FUN DAC unit w/ AD797BRZ opamps
    Sennheiser PC350 headset w/ hero mod

  3. #253
    Xtreme Member
    Join Date
    Jan 2004
    Posts
    392
    Quote Originally Posted by dess View Post
    So you're willing to play at 640x480?

    ps. you don't need 120 fps for 120 Hz, if your goal is 3D. The GPU will render to two framebuffers at any rate and it's the two buffers that will be alternated at 120 Hz.

    so for you the only options are 640x480 or 1920x1080?!
    And here I thought this was a forum for enthusiasts who care to explore CPU performance in more ways than saying "it's enough for my games". The real world is far more complex than that: there are a lot of different games and situations within a game, and by lowering some settings you can more or less simulate a faster VGA at higher settings, or simply use it as a more valid CPU benchmark.
    If you're willing to test OC and fine-tuning gains, I also think it's valid to lower the settings and observe the gains more easily.

    I'm not saying that testing at these settings has no use. What I'm saying is: how much time does it take to run one or two more settings, since you already have everything set up? Three minutes? So why not? It's more information.

    I'm not talking about 3D; I'm talking about using a 120Hz screen with vsync on.

    And I'm just curious to see what happens when the GPU impact is lower and the CPU impact is higher.

    Also, when talking about price, don't forget the 2500K...

    I'm also curious about 4M/8C vs 4M/4C; from what I see on the other pages, only 2M/4C vs 4M/4C was tested.

    Anyway, thanks for the tests, but that's just my opinion.
    Last edited by Spectrobozo; 10-19-2011 at 09:36 PM.

  4. #254
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Alberta, Canada
    Posts
    1,263
    Quote Originally Posted by dess View Post
    So you're willing to play at 640x480?

    ps. you don't need 120 fps for 120 Hz, if your goal is 3D. The GPU will render to two framebuffers at any rate and it's the two buffers that will be alternated at 120 Hz.
    3D Vision generally requires 120fps 2D performance if you hope to have a fluid 60fps 3D experience. Sure, you can play at lower framerates, but sub-60fps 3D sucks as far as I'm concerned.

    And it's quite a stretch to assume you'd need to play at 640x480. There is a reason I'm running my system at only 1680x1050... Spectrobozo poses a very valid question.

    At 1920x1200 with full IQ (i.e. GPU-limited), this 580 performed roughly the same on both my 920 and my 2600K as far as sufficient 60Hz monitor performance goes. Bring 1680x1050 and the 120Hz performance ceiling into the equation (which is less GPU-limited) and the 2600K pulls ahead by a fair margin. (I also often run games with little to no AA so as to have consistently high performance.)

    Most reviews tend to do both extremes (low IQ and resolution, or high res and IQ) and not a middle ground. Techpowerup and HWC are two of the better outfits where this is concerned, so it's easier to see the full picture. I bought a 580 for 1680x1050 because it's capable of a much larger performance gap at this resolution relative to the other single-GPU options available.

    All this said, I'd like to see more middle-ground results comparing various architectures clock for clock. I want to believe the reason we don't see more of this is that 60Hz is still the de facto standard and people don't seem to care as much how things perform at lower resolutions (the whole "more than 60 frames per second is a waste" argument).

    EDIT:

    Quote Originally Posted by Spectrobozo View Post
    and I'm just curious to see what happens when the GPU impact is lower and the CPU impact higher,
    ^ This 100%
    Feedanator 7.0
    CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
    LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i

  5. #255
    Xtreme Member
    Join Date
    Feb 2010
    Posts
    138
    Comparisons are never straightforward, what with software acting as erratically as it does. Blender on Linux works so much better on AMD hardware than on Windows (refer to the review by Johan at AT); it makes you wonder how many things are borked in Windows. So everything is relative. If it works for you, good. If it doesn't, that doesn't mean it's not good, merely that it doesn't work for you. Buy what you must and stop pushing your opinion on others.

    All the Intel shills quibbling over a difference of 10-odd fps (yes, this is slightly exaggerated) at resolutions nobody plays at are starting to be a buzz-kill in an AMD thread. You want to buy Intel? Then do so. You want to talk about what's screwed up in BD? Then do so. However, please don't troll someone who's actually doing a whole lot of favours for all of us here at XS. There were also twerps trolling Movieman, and now I see some people hassling chew*. I'm dead certain there's a reason AMD called him for their world record run. He's being modest, but that doesn't stop me from praising him.
    Last edited by tifosi; 10-19-2011 at 09:54 PM.

  6. #256
    Xtreme Addict
    Join Date
    Feb 2008
    Posts
    1,209
    Well, Bulldozer won't be pulling ahead of Sandy Bridge in games, and 2500K or 2600K users would be making a mistake to switch to a BD setup. That's not the question right now.

    This thread is not about AMD vs Intel or what is better for gaming; it is about exploring BD, its bottlenecks, and its tweaking capabilities. It's still more of an analysis than a claim that "it's just as fast as Sandy". If I say such a thing, it's more of a joke; no one here is of the opinion that BD is actually the better processor. But it is the more interesting one.

    I, and I think most people here, are mostly interested in the tech... Seeing BD perform somewhat comparably under some tweaking gives an insight into where it is limited at stock settings. Nothing more, nothing less.

    Ah well, and it gives hope to the people who want one anyway that they'll actually be able to do some gaming and won't need a second rig for that, and shows what they can do about tweaking it...
    Last edited by Oese; 10-19-2011 at 10:06 PM.
    1. ASUS Sabertooth 990fx | FX 8320 || 2. DFI DK 790FXB-M3H5 | X4 810
    8GB Samsung 30nm DDR3-2000 9-10-10-28 || 4GB PSC DDR3-1333 6-7-6-21
    Corsair TX750W | Sapphire 6970 2GB || BeQuiet PurePower 450w | HD 4850
    EK Supreme | AC aquagratix | Laing Pro | MoRa 2 || Aircooled

  7. #257
    XIP
    Join Date
    Nov 2004
    Location
    NYC
    Posts
    5,580
    Like it or not... BD has already been released.

    Anyone can buy and test it, then give their opinion, good or bad, with real facts, not based only on reviews.

    Let BD be what it is: AMD's new chip, for good or bad, depending on your own experience with it.
    Last edited by Dumo; 10-20-2011 at 01:37 AM.

  8. #258
    Xtreme Enthusiast
    Join Date
    Oct 2008
    Posts
    678
    Quote Originally Posted by chew* View Post
    That's all fine and well; however, the point is that at around the same speed, on two almost identical systems with identical settings, the scores, gaming experience, and results are almost identical. Aka this is apples to apples, not grapefruits to pineapples.

    I'm also quite aware that if I ran at 1280x1024 the results would be different, but this ain't the '90s and that's not indicative of the real world.


    Put simply: do you think that guy with 570s hitting 150fps on a monitor capped at 60Hz is getting any better a gaming experience than I am on these 2 rigs?

    It's 64-bit Win 7, if you haven't noticed the RAM quantity, and the game and OS support DX11..........

    The goal here is not e-peen fps, just a direct comparison at real-world settings that most users will end up with for 24/7.

    btw your link = inconclusive. SLI? 2-way? 3-way? Not enough details imo
    Why are you even posting your results here, then? If you only bench these two systems to see if they can play DiRT over 60Hz, why post it here?
    I thought you posted here because you wanted to share the performance difference between BD and SB when tweaking the BD. If you hide that performance difference behind GPU limitations because you don't care about it, why bother posting at all? Don't you want to know how large a difference you can make with tweaking? That there is no real-world difference in one GPU-limited game doesn't mean there won't be differences in other games. It's not apples to apples if one apple is a pineapple viewed from another angle. You can make the pineapple look like an apple as much as you like, but it will never be an apple.
    Just as a PIII 800MHz and a 990X aren't apples to apples just because they score the same when you play Quake III at high resolutions with a GeForce 2 MX.

    Please, just run a bench at low resolutions to see how far behind BD actually is with these tweaks. It's very relevant to many of us.

  9. #259
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    581
    Full ack. At least run without AA/AF, since those don't have anything to do with the CPU. Btw, how do I give thanks for a post? I couldn't find any button in this forum.

  10. #260
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    249
    Lower left, just above "Reply to Thread".

  11. #261
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    103
    Quote Originally Posted by Spectrobozo View Post
    so for you the only options are 640x480 or 1920x1080?!
    No, but for 120fps in many recent games you may need to lower the settings that much.

    I'm also curious about 4M/8C vs 4M/4C; from what I see on the other pages, only 2M/4C vs 4M/4C was tested.
    Here you can find such tests as well, with several games; they showed there is no gain from running games with fewer than 5 threads in 4CU/8C "mode". It could even impact performance negatively in some cases. A particular game with 8 threads, of course, liked it.

    Quote Originally Posted by Chickenfeed View Post
    3D Vision generally requires 120fps 2d performance if you hope to have a fluid 60fps 3D experience. Sure you can play at lower framerates but sub 60fps with 3D is the suck as far as I'm concerned.
    For fluid 60fps, yes. But I had such a kit a few years back, and it wasn't that bad an experience at lower framerates, like 30fps. (And I'm not one of those guys who can't tell the difference between 30fps and 60fps; in fact, I can clearly distinguish 60fps from 120fps as well, on a 120Hz display.)

    And that's quite a stretch to assume you'd need to play at 640x480. There is a reason I'm running my system only at 1680x1050... Spectrobozo poses a very valid question.
    Of course, how much one needs to lower the settings all depends on the game. There are many very demanding games now, GPU-wise.

    Quote Originally Posted by Dumo View Post
    Let BD be what it is: AMD's new chip, for good or bad, depending on your own experience with it.
    Yes, it all depends on the workload. I just hate it when people make the general claim that "BD sucks!".

    Quote Originally Posted by DedEmbryonicCe1 View Post
    Lower left, just above "Reply to Thread".
    Is it Firefox that's swallowing it, or is it tied to a certain rank?

  12. #262
    Xtreme Enthusiast
    Join Date
    Jun 2011
    Location
    Norway
    Posts
    609
    Quote Originally Posted by boxleitnerb View Post
    Full ack. At least run without AA/AF because they don't have anything to do with the CPU. Btw, how do I give thanks for a post? I didn't find any button in this forum.
    I think you need to get past 100 posts before you can thank anyone.
    1: AMD FX-8150-Sabertooth 990FX-8GB Corsair XMS3-C300 256GB-Gainward GTX 570-HX-750
    2: Phenom II X6 1100T-Asus M4A89TD Pro/usb3-8GB Corsair Dominator-Gainward GTX 460SE/-X25-V 40GB-(Crucial m4 64GB /Intel X25-M G1 80GB/X25-E 64GB/Mtron 7025/Vertex 1 donated to endurance testing)
    3: Asus U31JG - X25-M G2 160GB

  13. #263
    Registered User
    Join Date
    Apr 2005
    Posts
    42
    If an application were to change affinity from odd to even cores at a rate equal to a thread's time slice, could this reduce the heat enough to get a higher average Turbo? Is it module temperature or core temperature that is used to decide whether to Turbo?
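    For what it's worth, the juggling itself is easy to sketch. Below is a rough, Linux-only illustration (the function names are mine, and the assumption that even/odd core IDs sit on opposite halves of each module is hypothetical; verify the mapping on your own system, e.g. via /proc/cpuinfo), using Python's os.sched_setaffinity:

```python
import os
import time

def split_cores(cores):
    """Partition a set of core IDs into even and odd halves. On an FX
    chip, IDs 0/1, 2/3, ... are assumed to share a module, so each half
    holds one core per module (hypothetical mapping; check your own
    system's topology)."""
    even = {c for c in cores if c % 2 == 0}
    odd = {c for c in cores if c % 2 == 1}
    return even, odd

def alternate_affinity(cycles, period_s):
    """Flip this process between the even and odd core sets every
    period_s seconds, so the two cores of each module take turns
    idling (and, in theory, cooling)."""
    allowed = os.sched_getaffinity(0)      # cores we may run on
    even, odd = split_cores(allowed)
    if not even or not odd:                # nothing to alternate between
        return
    for i in range(cycles):
        os.sched_setaffinity(0, even if i % 2 == 0 else odd)
        time.sleep(period_s)
    os.sched_setaffinity(0, allowed)       # restore the original affinity
```

    Whether Turbo keys off module or core temperature is exactly the open question; this only shows how the alternation could be driven from user space.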

  14. #264
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    103
    Here are some 2CU/2C tests with a 4100, using only affinity adjustments. It seems the approach is effective enough (+9-11% compared to 1CU/2C) even without disabling the cores in the BIOS!

    Btw, regarding the search for the cause of CMT often not scaling as well as it "should": part of the effect could be the so-called L1D invalidation issue, which has been mentioned in another thread here (in case someone missed it). AMD claims it accounts for a slowdown of only up to 3%, but what if they're underestimating it? Anyway, a patch is underway that will prevent it from happening, so at least some of the performance could be regained then without disabling cores, also benefitting programs with 8 threads (for which the 4CU/4C mode wasn't a solution).
    Last edited by dess; 10-20-2011 at 03:08 PM.

  15. #265
    Xtreme Member
    Join Date
    Aug 2008
    Location
    Freedom PA
    Posts
    143
    What motherboards have the option to disable cores? My Gigabyte 990XA doesn't appear to have the option.

  16. #266
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    103
    Quote Originally Posted by Baam View Post
    What motherboards have the option to disable cores? My Gigabyte 990XA doesn't appear to have the option.
    Look at the post one above yours.
    You can do that with a little program that was linked here a few days back.
    Last edited by dess; 10-20-2011 at 11:47 PM.

  17. #267
    Brilliant Idiot
    Join Date
    Jan 2005
    Location
    Hell on Earth
    Posts
    11,021
    Quote Originally Posted by Spectrobozo View Post
    how about a 2500K and a cheaper MB? You set HT off anyway..
    how about someone who has a faster VGA (2x or 3x 580), or will buy a new VGA next year? Or plays more games? Or lowers the details for a higher framerate (one example: 120fps for 120Hz; is it possible with these CPUs, at lower settings, to achieve a minimum of 120)?
    if you want to test at those settings, OK, but you could also lower the resolution and anti-aliasing to see what happens; it's a valid and easy thing to do. You could also lower the i7's clock to analyze the impact, and test the FX with the 8 cores enabled... why not?
    BD is running with 4 clusters off too....... The 2600K is more effective than the 2500K and faster in games, which is why I bought it.

    HT off is really not pertinent here, since this game loads 4 cores at about 75%.

    Feel free to buy a 2500K and mobo for me just for this testing that you want to see. I already had a 2500K and sold it after doing a similar comparison.

    Like I said, I already know what happens if I drop the res, and it's a moot point since I won't actually game like that......

    I'm in no way saying that BD will beat the 2600K, but it is an alternative.......... So is Thuban or Deneb. I have in no way told people they should sell off their current Sandy, Thuban, or Deneb rig because BD is the greatest........ I have merely pointed out that it's an alternative.

    As far as other comparisons with cards go: I have three 580s, but these systems are 24/7 rigs, not bench rigs, and I'm not changing my config just for a comparison. Plus, all my 580s have tri-slot coolers, so 3-way is not possible.

    2x 6990 would not be hard, but I've already run that config in my Sandy rig; it leaves a lot to be desired and is pointless.
    heatware chew*
    I've got no strings to hold me down.
    To make me fret, or make me frown.
    I had strings but now I'm free.
    There are no strings on me

