I really hope after this fail you finally learn a lesson: don't overhype AMD products before launch anymore. Barcelona wasn't 20-30% faster than Intel clock-for-clock, and Zambezi doesn't do 11 pts in Cinebench 11.5, but AMD never said that. :up:
gimme a current intel chip with no cold bug and i'll be glad to buy it.
....not unless i live in bizarro world.
http://www.guru3d.com/article/amd-fx...ssor-review/14
I wonder how BD would do on Intel's FAB process? Just curious...:)
I will admit, AMD really fooled me big time. I can't tell you how excited I was for this launch. For the last couple of months I couldn't wait to check XS news for any bulldozer updates.
I always pull for the underdog and I'm very much an AMD fanboy, but I almost feel betrayed. I know it's foolish to think this way but it's true. Honest to god I wish they would've been upfront with the details and maybe released benchmarks here and there.
AMD I'm not sure how much more disappointment I can take.
I have owned AMD stock for a long time and have a lot invested, more than a year's salary anyway, and have therefore always felt obligated to support them by always buying AMD. I have thought long and hard about this, and I have decided that I will be using Intel CPUs in my next round of builds for the first time since 2001. Like JAWS said, I also feel betrayed, as wrong as that is. It just makes me sick. It was hard enough sticking with them during the Barcelona era.
1) I didn't miss anything of what you're going on about.
2) We had a crapload of rumors flying on both sides of the spectrum for a long time. I wouldn't call any of the stuff we had up until recently "clear evidence."
lol you two, please. They aren't going to disappear :rolleyes: Yeah, BD kinda sucks. As much of a letdown that is, this isn't the first time we've heard ridiculous doomsday "the end of AMD!!!1" predictions. Besides, if AMD disappeared, what next? I'm sure some people might think they want AMD gone, but such a desire would be totally retarded.
OEMs used a buttload of Netburst-based Xeons years ago, even though they sucked compared to the Opterons. Although, maybe it had to do with the shady business dealings that were going on at that time as well.
Oh please :shakes:
It's decent/good at some things, definitely not the all-round performer everyone was hoping for though. It'll probably improve a bit over time as applications and OSs are recompiled and/or optimized, but it seems like it would take some sort of miracle to close the gap in the places where it's really falling behind...probably not gonna happen. Guess the Thuban in my gaming rig will be around a bit longer.
--Matt
right, you proved my point. 2600k@3.8GHz = 8150@4.2GHz with far less power consumption/heat. It seems that most of the BD chips aren't ~5GHz stable on decent air yet, whereas we know the 2600k will reach those speeds with just a 212+. So unless your "bizarro" world doesn't believe in overclocking Intel chips, it would make far more sense to get the 2600k across the board
This chip is god awful; I thought Fermi was bad. At least when Fermi dropped it had the performance to back up its TDP. This just looks hopeless. Maybe if AMD squeezed 16 cores onto this chip it would be EPIC, or even 12. This uArch is way too ambitious at this node, and how was this ever planned for 45nm? The FX "WAS" supposed to be an 8-core chip with the size and power draw of 4 cores, NOT the power draw of 8 with the speed of 4 cores.
Big question on how this will pan out in the mobile laptop market.
Hell, AMD should have invested everything into Llano from the start. Llano looks to be the best thing AMD has going for them as of now. AMD needs to just give up on high-end desktop BD (it may have hope in servers) and focus on making a killer Llano that fits into about 90% of all PC purchases. Just one blazing fast, flawless APU with extremely low power would steal a lot of Intel's thunder. The people who will save AMD from bankruptcy are the everyday users, not "us" the enthusiasts.
Isn't FUSION the future? Where the GPU does all the heavy work with OpenCL. DON'T WASTE TIME AND MONEY IN A DEAD-END MARKET.
$279 at newegg hahaha
i thought it was supposed to be $245?
I have to admit that I laughed out loud when I saw the bit-tech power consumption page. With an overclock it was 586W. I didn't even see that with 6950 crossfire and an i5 760 at 4GHz when running three threads of LinX and FurMark at the same time.
The X4 looks pretty interesting...I wonder how it compares to current tech. High clocks, probably pretty good performance. I found a review that benched the FX-4170 (4.2GHz quad core), and it did pretty well. I want to see how the FX-4100 and the FX-6100 do. It'd be a fairer competition vs. current AMD 4- and 6-core chips.
Anybody remember the "multithreading" tests that seemed to only live on certain sites around the time that dual cores came around? Running several big apps at the same time, timing which finished first? The A64's were strong with it, and AMD eventually started marketing the A64 for multitasking performance--since you couldn't slow those things down by doing too much. I wonder how a test like this would work out with tech from today...
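For fun, a crude version of that old multitasking test is easy to script today: kick off several heavy jobs at once and record when each one finishes. A minimal Python sketch, where the `burn` workload is just a hypothetical stand-in for the "big apps" those sites actually ran:

```python
import time
from concurrent.futures import ProcessPoolExecutor, as_completed

def burn(n):
    # Stand-in for a heavy app: a CPU-bound loop of n iterations.
    total = 0
    for i in range(n):
        total += i * i
    return total

def multitask_test(workloads):
    """Launch all workloads simultaneously; record when each finishes."""
    start = time.perf_counter()
    finish_times = []
    with ProcessPoolExecutor() as pool:
        futures = {pool.submit(burn, n): n for n in workloads}
        for fut in as_completed(futures):  # yields futures in finish order
            fut.result()
            finish_times.append((futures[fut], time.perf_counter() - start))
    return finish_times

if __name__ == "__main__":
    for size, t in multitask_test([2_000_000, 4_000_000, 8_000_000]):
        print(f"workload {size}: finished at {t:.2f}s")
```

On a chip with lots of real cores all three finish close together; on one that chokes under concurrent load, the finish times spread out, which is exactly the effect those old multitasking reviews were measuring.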
so far the best review i've seen is in the AMD section, go check it out, mainly because he actually tried something the others hadn't.
the thread name is "single thread...."
To me IPC, or clock-for-clock performance, wasn't a letdown, and neither was single-threaded performance. The letdown for me was overall clockspeed/overclocks and, most importantly, power usage. If power usage were at SB levels, the overclocks reviewers got with Bulldozer would be okay with me. If overclocks on Bulldozer were 5.2GHz on air, the power usage would be bearable.
the power consumption part i find to be a little odd
power consumption increases as temps get higher, even at the same voltage and clocks (this is expected)
BD seems to be able to survive stable at conditions 20-30C higher than Thuban could
combine those 2 things with reviewers OCing on the stock heatsink, and i bet the overclocked power consumption is beyond what we expect
i had a hard time finding OC results on a strong heatsink where they didn't just max out voltages as if they were going for SPi world records.
give me results at 1.375-1.425v on a heatsink that didn't come out of a cereal box, then let's see if BD is so horrible. sacrifice 100MHz, and i bet you shave off 30% of the power consumption with it too.
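To put a rough number on that voltage-vs-clock trade: dynamic CPU power scales roughly with V²·f (the standard CMOS approximation, not AMD-specific data; the voltage and clock figures below are hypothetical, not measured BD numbers):

```python
def relative_dynamic_power(v1, f1, v2, f2):
    """Ratio P2/P1 under the P ~ C * V^2 * f approximation.

    Leakage (which also grows with voltage and temperature) is ignored,
    so real-world savings tend to be larger than this estimate.
    """
    return (v2 ** 2 * f2) / (v1 ** 2 * f1)

# Hypothetical example: back off from 1.5V @ 4.7GHz to 1.4V @ 4.6GHz.
ratio = relative_dynamic_power(1.5, 4.7, 1.4, 4.6)
print(f"estimated power vs. the high-voltage OC: {ratio:.0%}")  # prints 85%
```

So even before counting leakage, giving up 100MHz while dropping a voltage step cuts an estimated ~15% of dynamic power; add the leakage reduction from lower voltage and lower temps on a proper heatsink, and the 30% figure isn't far-fetched.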
It's not exactly that. It was designed to be a high-clocked part, and clock speeds in general were expected to be several hundred MHz higher at stock, with turbo, and overclocked. Not really angry, just more surprised.
That said, once my paycheck arrives I'm getting a BD to put in my main system to play around with and tweak under water cooling :D.
I have been waiting for XS to get back online just so i can post this. Found it on hardocp:
http://www.youtube.com/watch?v=SArxc...layer_embedded
I can't believe what a steaming pile of crap AMD had the balls to release. How in the hell can it be slower clock for clock in single- and multi-threaded programs than Thuban, even with 2 more cores, and be more power hungry? They should just hang it up and fold, since there's no way anybody in their right mind is going to buy these inefficient slugs. No wonder they made AM3+ boards available so early. They're going to have to slash prices right away if they want to sell anything. I still just can't believe they went ahead and launched these turds; it totally blows my mind how poorly they perform. Everyone at AMD should be ashamed of themselves. Intel must be laughing their asses off at your incompetence and ignorance.
Damn, this is almost a bigger flop than the original Phenom and the 2900XT combined :down:
Well, Ket, me and quite a few other members of the XtremeSystems forums kept warning AMD fans not to whip themselves up into a FRENZY and to MANAGE THEIR EXPECTATIONS. This is EXACTLY what I expected.
For some reason, if you turn back the clock ~9 years, the roles were reversed. Intel had Netburst and the rather hot Prescott; AMD had the efficient and rather fast Athlon 64.
Core 2 Duo was the turning point for AMD's and Intel's roles in the marketplace and amongst enthusiasts.
I see no reason to purchase a Faildozer... sorry Bulldozer. i7 2600k is looking more and more fruitful, however I am still waiting for Sandy Bridge-E before I upgrade from my SKT-775 platform.
I wish AMD would not hype so much before launch and that AMD Fans would not whip themselves up into a Frenzy. WHY? Because if appropriately priced this could be a very decent multi threaded crunching platform (especially on a multi-socket board).
In my opinion Fusion is the future, especially for low-profile HTPCs. For gaming and any serious CPU power, Intel completely dominates.
John
Bulldozer has one big issue. GloFo's 32nm node.
Not high enough base and turbo clocks due to epic leakage.
The scariest thing about this architecture is that it seems to have no long-term viability. High clocks have not been the way to go for performance ever since laptops started outselling desktops.
If you're putting Bulldozer in a laptop, you're going to get both bad performance and bad battery life. Even if they fix the manufacturing process, they will need to boost speeds well above 3.5GHz, and that will be a battery killer. The key to getting long battery life, speed, and thus mobility is high performance at low clocks. Months ago, before all these performance numbers were released, I already thought Bulldozer's clocks were too high for it to get competitive performance.
This thing is in many ways worse than Fermi. Fermi at least took the crown and was much faster than its predecessor. It also appears to have had an easier fix, as components that were already in the architecture were disabled on earlier models and re-enabled in forthcoming models once the bugs were worked out.
Bulldozer is fully enabled, and adding more clocks is a band-aid solution, because they are going to run into a wall where adding more clocks adds too much heat and power consumption.
AMD used to be about more performance per hertz. I miss those times. Remember when an Athlon 1800+ performed like a 2GHz Intel even though it ran at something like 1.5GHz? What happened to those days? AMD has lost focus so much. I hope AMD has a new design in the works, but judging from their 10-15 percent performance increase over the years, it looks like we are stuck with BD.
The biggest reason for disappointment, for me, is that BD was supposed to be, and had to be, the foundation on which AMD builds the rest of their company. With BD being so bad from the get-go, I have a feeling AMD may concede the high-end market altogether (they must hate selling such a big chip with so much cache for so little) and basically milk their APU line dry, which will dry up sooner rather than later (next year) when it doesn't bring enough CPU performance to the table. I could see AMD not being around in 4 years.
My updated signature describes the problem perfectly...
It is all the software that is to blame, it is actually a product that will beat Intel's flagship for 2015, really.
No, Fusion is doing extremely well and if AMD continue to make strides in this sector then I can see it becoming even more fruitful for AMD so AMD are not dead..... yet.. or even in 4 years time.
Also, let's not forget that they are not doing too badly with their graphics cards either.
John
Heheh, thanks M.Beier for a bit of light relief in this thread
Yeah, I mean, why try to dig the hole deeper.... It is a failure, AMD will rise and shine again, you win some and you lose some; this is not the deathstroke for AMD in any way whatsoever, and in particular in SFF they are kicking @ss on Intel.... But pretending this is really what we expected, and that this is not a fail, I mean, come on...
As many others before me have expressed, it does hurt that they, AMD, chose to abuse the FX legacy.
Loved the Hitler video, laughed hard when I watched it late last night, just as good as the Fermi video. My girlfriend didn't understand what was so funny, I didn't care to even try to explain.
Somehow the only site with an FX-6100 review. It's a Croatian site, but it is easy to see one more fail from AMD; only a much lower price will help...
http://www.pcekspert.com/articles/905.html
Legion Hardware and Guru3D reviewed FX-6100
AMD FX 8150 - 8120 - 6100 and 4100 performance review
http://www.guru3d.com/article/amd-fx...rmance-review/
AMD FX-8150 at 4.7GHz. Does it stand tall?
http://hexus.net/tech/reviews/cpu/32...es-stand-tall/
Wow, the FX-6100 sucks so bad even the 1055T is a better option.
Yesterday, after talking a lot about BD with a few colleagues, a simple conclusion was reached. But when I came to post it, the forum was in a 1hr trance that did not seem to end :P
What was concluded was that, since this is a totally new arch, there were bound to be issues. 1st-gen BD was never expected to be an Intel-beater by Intel staff. You see, it all boils down to how much experience you have dealing with that arch, and Intel has been working on the same base arch ("derivative of C2D") for quite some time. All these Nehalems, SNBs, etc. carry a lot less risk than a fully-fledged different arch like the new FX vs. the old Phenom II.
You all might recall the problems the P4 faced with efficiency; coupled with HT it was even worse. Well, the engineers concluded that longer pipelines were off the table until a small enough process is adopted that can take advantage of them, and we will see longer pipelines in future Intel processors for sure. Now, before people start saying Haswell is completely new, well, there have been some changes to the original plan.
What everyone agrees is wrong with BD is not the lack of performance but the electricity used at full load. An A8 CPU seems a better alternative to a two-module BD with 4 cores. If sufficient improvement is not made with Piledriver, we may see results similar to the A8's in lightly threaded applications.
The other small bit is that I heard AMD was about to use the phrase "4+4 core" instead of the "8 core" moniker, which in my mind would have been much better. Also, the water cooler was supposed to be bundled with a faster processor, but due to yield problems a sufficient speed bin could not be reached; we will surely see a cheap FX 8200 (or something) with a bundled water cooler.
The worst part of the HFR review:
http://tof.canardpc.com/view/55d9fbb...0dcad33ec8.jpg
http://www.hardware.fr/articles/842-...ergetique.html
Even a Q6600 is more efficient than the other AMD processors reviewed :shocked:
I don't have the time to read through all the reviews, but is one of those reviews done with a Gigabyte 990FX board? I just checked whether there was a BIOS update for my MB, and for real, there is.
Only an updated CPU AGESA code though, which could mean improved BD performance? (That's why I'm posting here :D)
A quick google brought nothing up.
Link to the download page: http://www.gigabyte.de/products/prod...?pid=3891#bios
I wasn't really surprised by the performance of Bulldozer. But AMD still managed to surprise me.
BD has 2x more transistors and a 30% larger die than SB (and this is without taking SB's IGP & SA out of the formula)? Come on, what are all these transistors used for? Maybe BD has some secret functions we have no clue about?
It's definitely not AVX, because BD's AVX implementation, and this is another surprise from AMD, seems even slower than SSE.
http://www.lostcircuits.com/mambo//i...&limitstart=13
Well, at 4.7GHz performance doesn't look that ba.... holy balls, that power consumption. :shocked:
I don't want to know the power consumption in those 8GHz LN2/LHe bench sessions...
There's a lot of discussion about that; many say one of the reasons is the heavy use of automated design tools, with next to no hand optimization happening. Look at what we got, and then look at what Cliff Maier said back in 2010.
And look back at how furious AMD fans were at this, calling him a disgruntled ex-employee who only wants to put AMD in a bad light...
Quote:
I don't know. It happened before I left, and there was very little cross-engineering going on. What did happen is that management decided there SHOULD BE such cross-engineering ,which meant we had to stop hand-crafting our CPU designs and switch to an SoC design style. This results in giving up a lot of performance, chip area, and efficiency. The reason DEC Alphas were always much faster than anything else is they designed each transistor by hand. Intel and AMD had always done so at least for the critical parts of the chip. That changed before I left - they started to rely on synthesis tools, automatic place and route tools, etc. I had been in charge of our design flow in the years before I left, and I had tested these tools by asking the companies who sold them to design blocks (adders, multipliers, etc.) using their tools. I let them take as long as they wanted. They always came back to me with designs that were 20% bigger, and 20% slower than our hand-crafted designs, and which suffered from electromigration and other problems.
I'm already seeing a whole lot of motherboards being RMA'ed due to burnt-up VRM areas.....
Bloody hell Newegg has the FX8150 for $280 wtf?
http://www.newegg.com/Product/Produc...82E16819103960
While I am not very happy with the FX8150, the FX8120, if priced around $200-$210, will be a good deal, because the Intel 2400 costs around $190 on Newegg.
Well.. extremely limited supply and an automated pricing system... :p:
And damn.. double posts everywhere again... I thought the downtime was there to fix that.
Well, when the first-generation Pentium 4 came out, it was very expensive and clock for clock slower than the Pentium III, but after a few revisions it stood up to the top-model Athlon XP 3200+. Then later Nehalem was released, inheriting Hyper-Threading, which was developed thanks to the Pentium 4.
There is no reason why tweaking cannot get Bulldozer on track; remember, it has excellent multithreading performance, it's just the IPC that needs to be tweaked.
Also AMD engineers are speaking out that they are underfunded by the new management which is why they don't have the resources to create a good chip.
Later models of the P4 fought against the A64... and got clobbered, regardless of how high you OC'd them... hell, people even bought Pentium-Ms, OC'd them, and beat the crap out of P4s that were OC'd much higher...
Willamette came out at 1.3GHz, I believe, and Tualatin did hit 1.4 in the end. Willies started doing better at 1.8-ish, but still lagged behind.
It took them 5 months to release a P4 that was faster than any of the P3s (the 1.7GHz Willamette was the first one able to outperform a P3 for good); the initial release of the 1.4/1.5GHz parts was awful... goddammit, there even was a 1.3GHz part that sucked so much they quietly removed it from their lineup...
Canceled my order for bulldozer! Sticking with sandy till ivybridge now!
Put 16 cores / 16 threads into the equation...
and you get your answer vs. Xeon... Sadly, AMD looks to have made this architecture only for servers and workstations...
Piledriver is meant to increase IPC and per-core performance... now we know what AMD has in mind: look at Bulldozer as the baseline of the architecture, the performance will come next.
Lol, Windows 7 isn't even capable of taking advantage of this architecture yet... it can't recognise the modules and schedule threads across them properly yet.
lol.... i mean, that's on you. I pointed out performance, and you have twice made excuses for why it would make more sense to get BD over Sandy Bridge. Maybe in your situation specifically, but as a whole, no, it really doesn't at all. Especially considering your Thuban is quite a bit faster in most tests not named video rendering
the more reviews you read, the funnier it gets
Oh man, the Hitler video was really funny and disturbing at the same time.
It's clear to me that AMD just decided screw it, we have to get this processor out regardless of performance, wattage or otherwise. Release it and let the cards fall where they may. It's painfully obvious that AMD (and bigger companies too, like IBM, Samsung, TI etc.) will not catch up to INTC anytime soon when it comes to manufacturing. That's what a monopoly can do for your business, just like MSFT (why these two tech-giant monopolies are being allowed to stand, I don't know). Or at least not until we get to the point where the node shrinks stop, somewhere around ~9nm. When the node shrinks slow to a crawl or stop, that will allow the other players to catch up.
AMD clearly is the loser in all of this. Those wattage numbers are appalling. That truly is the worst thing about this whole mess. It's more than clear their engineering expertise is way behind INTC's, in all facets, across the board. It's really surprising to me the stock hasn't moved more to the downside, but I guess it's already so low there's just not much more downside left in it.
As I've said before, my criteria for success or failure are the ads in PC World from IBUYPOWER and CYBERPOWER. If the FX line makes those advertisements, it's a success; if not, it's a fail. I'm really looking forward to the next PC World in the mail. My guess is it won't be there.
I was going to upgrade to one just to run on my chiller, but sadly I will be with my 955 for a while longer.
RussC
Both of you are wrong. Look at M.Beier's links and you'll see the P4 Willamette debuted at only two speeds: 1.4 & 1.5GHz. I know because I had a P4 1.4GHz Willamette with RDRAM PC-800 that was purchased when those were the only choices available (SDRAM-equipped P4 systems hadn't arrived yet). Yes, I learned my lesson from that mistake.:(:mad:
I don't agree; maybe I'm alone as someone who has really read most of the reviews from around the world (about 25 reviews so far). If you use SuperPi and wPrime, yes, you can stay with Thuban. But in real applications the FX is better than Thuban: Photoshop, rendering, compression/decompression, video encoding, Microsoft Office, and it's similar in gaming (in BF3 the FX is better!). So what more proof do you need? No, the FX really is better than Thuban. You should do more testing with your own hands instead of just listening to some people here. Without experience there is no knowledge ;-).
@FlanK3r:
Agree with you . :D
//I'm using an AMD X6 1090T & still looking for an FX-8120. Look at this article --> Some real-world apps such as 3ds Max rendering show me that an FX8150 is 8% slower than an i7 2600K, with decent headroom for overclocking. Not too bad.
Attachment 121222
any boinc numbers ?
i remember when there were many people on madonion.com that were bragging about getting these 1.3ghz parts, only to be shocked at how the lower end p3 would outperform them.... it was much like the lead up to this launch. it really made you shake your head wondering what happened. however when they reached 1.8ghz, i bought a p4/rdram system as well... it replaced my coppermine system with 550e running at 770mhz. the 1.3's were dropped very fast.
here's a link for ya: http://ark.intel.com/products/27421/...he-400-MHz-FSB
maybe they should unload the forceware drivers :D
http://www.tweaktown.com/articles/43...ad/index2.html
indeed, look how great Bulldozer is for a DOS game
http://img651.imageshack.us/img651/6...sboxscore1.png
lol, they did so many articles in such limited time, things must have gotten mixed up, smh, copy-paste errors :D
should have shown whether there was any difference from increasing clocks (simply by showing a 4.5GHz setting along with the high overclocks, so we can see what is or isn't CPU-limited at that point)
also, why only 4.76GHz on BD? 5.2GHz for the 2600k sounds like it might be reaching the point of degrading, but an H100 that can't get BD anywhere near 5GHz sounds fishy.
also, why buy $1000 in GPUs and use a cheap 1080p monitor? would have rather seen some Eyefinity, since they were way past the point of diminishing returns for cost and performance.
and failure on the power draw test: should have shown an average or peak, not just grabbed a randomly selected number during a benchmark
the one good point was that the BD setup was $155 cheaper,
so 4 bad things and 1 good thing i have to say about that review...
i'd like to see this re-ran with one cluster on per module
Personally I'm disgusted by the power draw when overclocked, but at the same time I want to get into writing reviews instead of just recommending hardware to those I know. I would still get this in order to test its real-world performance over a period of, say, 3 weeks. I'm interested in Bulldozer even though it has its quirks. It's a first try, shall we say, and I'm interested in Piledriver as well. My wall for getting a Bulldozer is that I don't have the funds.
Why? Looking at the power consumption tests on other sites... an 8150 at that speed needs 100W more than at stock speed.. and a 2600 at nearly the same speed consumes 140W less..
http://hexus.net/tech/reviews/cpu/32...d-tall/?page=5
http://www.computerbase.de/artikel/p...t_overclocking
http://www.hardwareluxx.de/index.php....html?start=10
and pretty much every site that tests power consumption while OCing shows the same results (and also the same OC limits). It seems people who hope they will get a nice OC out of these are in for a bad surprise... if you look at the hardwareluxx article, they said that to achieve 5GHz they needed the water cooler and had a 407W system load under 2D (stock load was 215W)... Every site reached between 4.6 and 4.8GHz, but with the cooler on full blast...
Because they weren't? :D
Quote:
The only other thing we've changed here is that we've dropped 1680 x 1050 testing; instead we've just stuck to 1920 x 1200 and probably the more realistic, 2560 x 1600 which is the kind of resolution you'd be opting for with a three card setup.
It would be interesting to see a power consumption chart compared to clock rate (with the minimum voltage needed to achieve stable conditions). Looking at the reviews, it seems that once you go only minimally higher than stock voltage, power consumption explodes...
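Such a chart could be roughed out from review data points using the standard V²·f dynamic-power scaling; the (clock, minimum stable voltage) pairs below are purely hypothetical, just to show the shape of the curve:

```python
# Hypothetical (clock in GHz, minimum stable voltage) pairs, illustration only.
points = [(3.6, 1.20), (4.2, 1.30), (4.6, 1.40), (5.0, 1.55)]

base_f, base_v = points[0]
base = base_v ** 2 * base_f  # stock reference, using P ~ V^2 * f

for f, v in points:
    rel = (v ** 2 * f) / base  # dynamic power relative to stock
    print(f"{f:.1f} GHz @ {v:.2f} V -> {rel:.2f}x stock dynamic power")
```

Because the required voltage climbs with clock, power grows super-linearly: in this made-up curve the last 400MHz adds far more power than the first 600MHz did, which matches what the reviews are showing.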
AMD’s FX-8150 vs. Core i7 & Phenom II – Bulldozer Arrives!
http://alienbabeltech.com/main/amds-...ldozer-arrives
Bulldozer FX-8150: Twenty Game Benchmarks with CrossFire vs. Phenom II and Core i7
http://alienbabeltech.com/main/twent...ii-and-core-i7
These gaming benchmarks are so useless.. in the single-GPU configuration every test is GPU-bound.. and it's funny how he tries to blame CF scaling issues when he clearly sees it scales on the Intel rig and even on the P2 rig, which just reveals that BD has poor gaming performance...
[AMD Blogs] Our Take on AMD FX
http://blogs.amd.com/play/2011/10/13...ake-on-amd-fx/
Quote:
This week we launched the highly anticipated AMD FX series of desktop processors. Based on initial technical reviews, there are some in our community who feel the product performance did not meet their expectations of the AMD FX and the “Bulldozer” architecture. Over the past two days we’ve been listening to you and wanted to help you make sense of the new processors. As you begin to play with the AMD FX CPU processor, I foresee a few things will register:
In our design considerations, AMD focused on applications and environments that we believe our customers use – and which we expect them to use in the future. The architecture focuses on high-frequency and resource sharing to achieve optimal throughput and speed in next generation applications and high-resolution gaming.
Here’s some example scenarios where the AMD FX processor shines:
Playing the Latest Games
A perfect example is Battlefield 3. Take a look at how our test of AMD FX CPU compared to the Core i7 2600K and AMD Phenom™ II X6 1100T processors at full settings:
Looks like the biggest failure wasn't Bulldozer; it's that AMD's marketing team hyped it too much, and they've left a real mess for the R&D guys to clear up
Alice aka Bulldozer girl recommends MSI 990FXA-GD80 and AMD Bulldozer! :D
http://www.hwbox.gr/news-motherboard...amd-990fx.html
http://www.hwbox.gr/images/imagehost...8602d92aca.jpg
http://www.hwbox.gr/images/imagehost...8602df1e3e.jpg
Ex-AMD Engineer Explains Bulldozer Fiasco: Lack of Fine Tuning
It's all starting to make sense now...
Quote:
The performance that Advanced Micro Devices' eight-core processor demonstrated in real-world applications is far from impressive, as the chip barely outperforms competing quad-core central processing units from Intel. The reason the performance of the long-awaited Bulldozer was below expectations is not only that it was late, but that AMD had adopted design techniques that did not allow it to tweak performance, according to an ex-AMD engineer.
According to Cliff A. Maier, an AMD engineer who left the company several years ago, the chip designer decided to abandon the practice of hand-crafting various performance-critical parts of its chips and rely completely on automatic tools. While the usage of tools that automatically implement certain technologies into silicon speeds up the design process, they cannot ensure maximum performance and efficiency.
I don't much like the new FX....
BUT. This is all definitely hysteria. That is absolutely clear to me. Take power draw, for example.
The FX-8150 draws less at idle than the 1100T and practically the same under load:
Attachment 121251
It seems there is no point in worrying about power draw, but there is not a single cool-headed word on it, only whining.
Another sample:
Attachment 121252
Overclocked consumption?
yeah, a whopping 489 watts at 4.6GHz...
Right. System total.
An i7 2600K @ 5.2 takes 403 watts. I would say that's not too far off, whatever you tell me.
Attachment 121253
Performance-wise, I agree: it is not as fast as expected. But for now it is definitely the fastest AMD chip.
Did anybody notice what testbeds were used? Only a few used 1866 RAM; most were 1600 and 1333(!).
I'm still comparing reviews and will post later, but you can do it now yourself.
So, expect better reviews in a week or two...
P.S. I'm still not impressed by it and not going to get one for myself. I just hate moaners...
Dont you love good hardware all around ;)
http://www.pcinpact.com/articles/amd-fx-8150/420-5.htm Linux performance seems a bit better than Windows, if these results are accurate
To be honest, Bulldozer in the desktop segment wouldn't look half as bad if GloFo's 32nm process were working well. Now the same model numbers have clocks at least two steps lower than in earlier leaks, and the magical +1GHz Turbo clocks are nowhere to be seen.