Thanks for the suggestions demonkevy666.
It's not looking good for me testing anything for a while as the chiller developed a leak, so I've been cleaning up a lot of windshield washer fluid today. What a mess.
When will the 8-core Bulldozer be in stock?
Well, I'm sorry, I didn't know the reviewer was actually a user among us. Actually, it was the best review of BD I've seen so far.
But you should know that the NB and RAM work as a pair. So why haven't you done any RAM and NB scaling together?
2200 NB - 1600 - 1866 - 2133
2500 NB - 1600 - 1866 - 2133
2800 NB - 1600 - 1866 - 2133
You did the NB scaling test and that was nice, but obviously performance stopped scaling at some point, since you only used DDR3-1600 throughout. That's fine, because it lets you find the best NB speed for taking advantage of DDR3-1600. From it we could determine that no more than 1600 MHz RAM is needed for a 2200 MHz NB, which gives a ratio of about 1.4. By the same ratio, a 2800 MHz NB should keep seeing improvement up to DDR3-2000. (And if you used DDR3-1600 C8, you should stick with C8 at 2000 MHz too, otherwise the comparison loses all meaning.)
Why not!?
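To make that ratio arithmetic explicit, here is a quick back-of-the-envelope sketch in Python. The 1.4 ratio is only what the numbers above imply, not an official figure:

```python
# Rough sketch of the NB-to-RAM ratio argument above.
# Assumption: RAM speed stops paying off once it exceeds NB / ~1.4
# (the ratio inferred from 2200 MHz NB being enough for DDR3-1600).
NB_TO_RAM_RATIO = 2200 / 1600  # ~1.375, rounded to 1.4 above

for nb_mhz in (2200, 2500, 2800):
    ram_sweet_spot = nb_mhz / NB_TO_RAM_RATIO
    print(f"NB {nb_mhz} MHz -> RAM useful up to ~DDR3-{ram_sweet_spot:.0f}")
```

For a 2800 MHz NB that prints roughly DDR3-2036, which is where the DDR3-2000 prediction above comes from.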
The NB doesn't OC that well. Might be able to do it on some chips.
So, BF3 is out now... any tests with BD yet? :)
I'm looking for the benchmark app; all I can do is run FRAPS for now.
Yeah, I've been running my NB @ 2500+ with my RAM @ 2100+. The NB at 2700 or higher just doesn't work well for me; it doesn't seem stable. I never increased voltages on it, though. I only increased vcore and RAM voltage for my OC.
http://img248.imageshack.us/img248/5012/46gv.jpg
http://valid.canardpc.com/cache/banner/2066753.png
http://valid.canardpc.com/show_oc.php?id=2066753
I'm on air so I won't be going any higher than 4.6 with this particular chip.
With the BF3 beta it was only this; GPU is an HD 6970.
Attachment 121722
Well, BF3 final runs smoother for me than the beta. I don't have my BD yet. Hopefully they optimised it for BD.
The reason there's so little talk about BD is due to stock shortages.
Nice thing to note is that the min fps is better on the BD :)
Sweclockers.com has tested BF3 with the i7 975, 2500K, 2600K, FX-8150, X6 1100T, X4 980, and X2 560. For those who don't read Swedish, here are the results.
Attachment 121724Attachment 121723
Green is avg fps and red is min fps
I guess that's with default clocks on the 8150.
It does not look so bad in gaming performance.
I have canceled my order on the 8150 because it is still not available in the Netherlands; I have been waiting almost a month now...
Here is my 8120; nothing too spectacular judging by the IHS. But we'll see :)
http://i119.photobucket.com/albums/o...cis/bd8120.jpg
In the Swedish test the GPU is not running on its tiptoes... it still benefits from extra CPU power. The AMD platform is very good when everything is maxed out, which is why it is pretty decent for gaming...
Testing tonight with a GTX 480 and SLI cards...
I'll be doing some tests myself next week with an FX-8120 and two Radeon 6970s. That might interest the crossfire crowd if it wasn't SLI specifically that was drawing attention.
Also worth noting is that I plan to do testing under dry ice. It takes a few days to properly coat a motherboard in polyurethane to help protect it under cold conditions, but it should increase survivability in the event of an insulation error or failure. Now if only I'd get myself to actually order that aerogel cryogenic blanket.
Sweclockers are very pro-Intel. Check the GPU: the HD 6990 runs best with Intel.
Not long ago they had an Intel banner in the middle of the forum buyer section.
Here is another test:
http://gamegpu.ru/Action-/-FPS-/-TPS...-test-GPU.html
http://gamegpu.ru/images/stories/Tes...n/BF3_proz.png
Double post.
Aha! The answer is here... or is it?
http://techreport.com/articles.x/21865
This really has to be the ultimate comment... c'mon, so they test with a high-end GPU instead of a 6970 and then they are pro-Intel? What I make of that comment is: if you go high-end GPU and not extreme resolutions, go Intel; if you go mainstream GPU, go AMD? Or is the real-world scenario this: you need to cover up the less-than-stellar performance by limiting GPU power and using extreme resolutions to make it sort of perform alike in games?
In my opinion it's a decent CPU; it has its strengths, but far too many weaknesses. The initial design was great; surely when it all started, Intel engineers peed in their pants at one moment. What's a normal cycle to bring out a new CPU? Three years? There's the first issue: too long. PR understood it all wrong and overhyped the product: issue two. And to make it all worse, GlobalFoundries provided the last blow, as power consumption once pushed (and this CPU needs to be OC'ed to be 2011-performance-worthy) is over the top... and the FX-8150 price is still way too high...
They tested Battlefield 3 with two other GPU configurations that got higher FPS; GeForce GTX 580 SLI got the highest FPS numbers. Why not use that?
Just look at the graphs; it is easy to see that they have selected one area in the game that shows Intel at its best. The Phenom 980 really looks bad. I can promise you that other sites that test will get better numbers for AMD compared to Sweclockers; they always get better numbers for Intel compared to other review sites.
Sweclockers is a profitable company; customers need to get what they want.
lol! I only care about real life situations.
Low resolution may not be considered "real life", but everyone needs to stop bashing low-res gaming benchmarks used to show off a CPU's power. The point is to not be bottlenecked by anything but the CPU. Running at 640x480 shows how much the CPU can feed the GPU and is a completely valid way to demonstrate this in a gaming situation. The alternative is 3DMark, which is even less of a real-life situation.
Well, I understand that a high-end review must have:
1 - Realistic settings; anything under 1680 x 1050 is too low (yes, it can be too high as well).
2 - A graph that starts from ZERO; anything else is just misleading.
Edit: Well, it seems to start from zero, but it's still misleading.
Are you for real? Since when is 640 x 480 useful in a high-end review?
This discussion is for those who understand
Maybe the patch will come with the 8170? It should be able to use the all-module turbo, which will be the same as or higher than the half-core turbo on the 8150 (4 threads running @ 4.2 GHz with "full cores turbo" on the 8170 via 4 modules vs 4 threads running @ 4.2 GHz in "half cores turbo" mode on the 8150 via 2 modules). In that case power efficiency may end up higher on the 8170, since it will do more work per watt than the 8150.
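Just to illustrate why spreading the threads out could help, here is a toy throughput model. The ~20% shared-module penalty is purely my own guess for illustration, not an AMD figure:

```python
# Toy model of the two turbo scenarios described above.
# Assumption (hypothetical): two threads sharing one module each lose
# ~20% to the shared front end / FPU.
CLOCK_GHZ = 4.2
SHARED_MODULE_PENALTY = 0.20

# FX-8150 half-core turbo: 4 threads packed onto 2 modules
fx8150_style = 4 * CLOCK_GHZ * (1 - SHARED_MODULE_PENALTY)
# Hypothetical FX-8170 all-module turbo: 4 threads spread over 4 modules
fx8170_style = 4 * CLOCK_GHZ

print(f"8150-style turbo: {fx8150_style:.2f} effective GHz-threads")
print(f"8170-style turbo: {fx8170_style:.2f} effective GHz-threads")
```

Same clock either way; the difference is only how much of the module each thread keeps for itself.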
If it doesn't improve current CPUs' performance then it's not a patch, right? Just a newer revision.
But yeah, I hope they fix this somehow, together with lower power consumption.
Well, the tech report article looks promising. Would be great if AMD releases an app that does something similar.
My 8120 should be here next week sometime. Can't wait. Busy installing BF3 :)
I'm really pleased with how the polyurethane coating worked out on this 990FXA-UD3. While the Gigabyte board looked great in its original matte black finish, it looks absolutely spectacular now that it is glossy. It's not like regular PCB shiny, either, this is a whole new level. :D Best of all of course is that it did indeed form an excellent moisture barrier. I can't even get a reading on my meter when I go poking metal pieces with both of my pointy micro probes (lightly). I should have taped up the whole board and done it all instead of just the CPU socket area. heh
Dry ice will commence this weekend.
When will Bulldozer be in stock? Where?
Some improvement in Windows 8?
http://www.pcstats.com/articleview.cfm?articleID=2622
http://www.pcstats.com/articleimages...0_Win8task.jpg
I haven't seen any improvement in Windows 8 myself, but I must not have the same build as the one you've pictured. My task manager is still old style. Cinebench R11.5 scores actually went down by a few tenths on average versus Windows 7.
On Newegg the AMD FX-8120 has been listed as sold out for a while, but tonight it is back in stock.
I'm tempted to order one, but I'm going to wait for this FX-8120 (FD8120WMW8KGU, 3.1 GHz, 8C, 95 W, rev. B2G, AM3+) that is listed on the ASUS Sabertooth 990FX CPU support page. Just hope I don't have to wait too long.
^^^^^^
Sold out again now
I am in dire need of the Giga 990FXA-UD7 beta BIOS G2. Someone please share. Any BIOS that can disable Bulldozer modules.
Yeah, TigerDirect and Newegg have the 8120 off and on. It's on as of now, lol.
Edit: too late, lol, it's definitely off.
Really need that BIOS badly. I have 40 L here on standby and I'm stuck at 6.98 GHz with the FX-4100:
http://valid.canardpc.com/cache/screenshot/2069655.png
@tiborrr: did you try deactivating one module? I was stuck at ~6.65 GHz with the FX-6100 and three modules, but with two modules deactivated I reached 7.652 GHz ;)
// Eh, sorry, I didn't see your first post on this page... so yeah, it's really bad :(
Anyone with a Sabertooth under LN2? On the Sabertooth I can break 7 GHz with an FX (and 2 active modules).
http://www.ocaholic.ch/xoops/html/mo...id=537&page=10
I was wondering why Street Fighter IV looks so good on Bulldozer here, then I found this:
http://www.pcgameshardware.com/aid,6...n-detail/News/
Good review of the FX-8150 with the patches at TechPowerUp.
Attachment 124098
http://www.techpowerup.com/reviews/AMD/FX8150/
Wow, those power consumption numbers are shocking. I don't remember them being quite that bad. Oh my my...
RussC
>Good review of FX-8150 with patchs at Techpowerup
I'd been saying this since release. To get the same performance with BD that you get with a Phenom II, you have to use more power, and it's a node smaller. Essentially, I was told to shut up and the blame was put on GloFo. Taking system power consumption minus ~70 W for the other components, you're at about 170-175 W of CPU power consumption for the Bulldozer CPU and 90-95 W for the i7 2600K.
In essence, the BD CPU consumes 90% more power than a 2600K and does not win any tests except for:
POV-Ray 3.7b by 2.9%
7-Zip 32M by 6.5% (loses to the 2120 in real-world compression time)
WinRAR benchmark by 22%, but the 2600K's Hyper-Threading is not working because of the core parking bug that was also seen on BD when MS released the patch the first time (it only matches in real-world compression time)
TrueCrypt by a healthy 25%.
The rest of the benchmarks are losses of between 5 and 50% against the 2500K/2600K while drawing 90% more power :shrug: ... Too many people mistake TDP for power consumption. They are not the same thing: TDP is a measure of the heat a heatsink needs to be able to dissipate, and actual power consumption is never listed.
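For clarity, here is the arithmetic behind that 90% figure as a tiny Python sketch. The wall readings are assumed values, chosen only to land in the 170-175 W and 90-95 W ranges above:

```python
# Back-of-the-envelope CPU power from wall measurements.
# Assumption: ~70 W of the measured system load is non-CPU components.
PLATFORM_W = 70
system_load_w = {"FX-8150": 242, "i7-2600K": 162}  # assumed readings

cpu_w = {name: w - PLATFORM_W for name, w in system_load_w.items()}
extra_pct = (cpu_w["FX-8150"] / cpu_w["i7-2600K"] - 1) * 100
print(cpu_w)                               # {'FX-8150': 172, 'i7-2600K': 92}
print(f"FX draws ~{extra_pct:.0f}% more")  # ~87%, i.e. roughly 90%
```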
Their CPU is at 4.7 GHz and 1.41 V: at idle, it uses almost as much power as a stock 1100T at full load.
http://tpucdn.com/reviews/AMD/FX8150...er_idle_oc.gif
http://tpucdn.com/reviews/AMD/FX8150...er_load_oc.gif
But who knows, maybe Bulldozer is only "good" at 16 cores and under 3 GHz, where it competes in the server market against 2 GHz 8-core Intel CPUs that do more work with less power anyway.
There is a problem with 32 nm and GF, because I tested 3 FXs and the spread between the worst and the best is about 60 W!
Architecture and power usage are different things.
We need to remember to keep discussing those topics separately :)
So GloFo's 32 nm SOI is up to 90% worse than Intel's 32 nm bulk and around 33% worse than GloFo's 45 nm SOI? (~50% shrink, same power consumption between the FX-8150 and 1100T, and single-core turbo broken on the 1100T, making FX single-thread/gaming performance look better than it should.)
That certainly can't all be true.
I mean, you could compare the Llano 3850/3870K to the FX-4100, but there are several problems with that...
1. Llano can be undervolted to hell and back; most chips do around 1.1 V at about 3 GHz and post spectacular stock performance-per-watt vs the stock Phenom II X4.
2. The FX series can barely run stock speeds at stock voltage (think 3.6 GHz @ 1.25 V).
3. Llano has the GPU on die, uses a different socket and different (weaker/less efficient) VRM components on cheaper boards, and doesn't turbo or overclock worth bullcrap.
4. Llano is missing the L3 cache, which would have added ~5% performance, yet it performs similarly to same-clocked Phenom II counterparts (assuming you take advantage of Llano's IMC).
5. FX is more voltage tolerant: feed it a stupid 1.6 V+ on water and it will scale to oblivion as long as it doesn't trip your power breaker or catch your motherboard on fire. Llano just stops around 1.45-1.5 V.
Why can't it be true? That is the problem, isn't it?
Did you read the news that AMD is ditching SOI and sold off its remaining stake in GF?
Does that not tell you something?
Do you suppose it gets cheaper, easier, and faster to produce CPUs at each lower-nm node?
Okay, first of all AMD did not "ditch SOI".
AMD is running Trinity's successor on 28 nm bulk, and AMD changed its agreement with GlobalFoundries from "pay per die" to a flat, non-variable price per wafer. Furthermore, the 28 nm CPUs will still be made at GlobalFoundries. IMO the whole 28 nm s*** is just a stopgap for the low-power oriented products (read: APUs) against Intel's upcoming 22 nm.
Where the hell do you think AMD will make their 22 nm CPUs, then? ...on TSMC 20 nm bulk? SOI offers better power efficiency, fewer manufacturing steps, and higher performance than bulk, and by the time we get to ~16 nm there will be no choice but to run everything on SOI.
Intel has done well with bulk, but they have a lot more R&D money too. You can't blame 90% worse efficiency at the same node with better technology on the fab alone, though.
OK, so did you read the Alpha article? Point me to the article saying everything will be on SOI at 16 nm. Intel and TSMC will turn to SOI?
Are you sure that Zambezi and Llano are manufactured the same way? For me Llano doesn't clock well. My old Phenom II on 45 nm clocks better, by at least 500 MHz.
Due to the lower voltage tolerance, I suspect Llano is manufactured in a different way, like a "cold treatment". Brazos doesn't clock high either, but it is suitable as a low-watt CPU. I don't think it depends on the built-in GPU, IMHO. At 32 nm, Llano is supposed to clock better than Phenom II.
Leaving out the L3 also makes the CPUs easier to produce. Producing high-density caches has never been AMD/GF's strong point compared to Intel's, and caches mean a lot more transistors.
That's why I'm excited to see Trinity; it will be interesting!
We are simply reaching a point in design where, soon, copper interconnects can no longer be used; hence the development of photonics, or "optical transistors".
Intel has successfully demonstrated photonics in their labs using SOI layers. Of course, there may well be other advances and changes, but from what I've seen they like the overall properties.
There is something in the insulating layers that prevents "leakage" and allows for the use of different wavelengths of light.
As for the Alpha article, it seemed to me that TS responded to the question on SOI by saying it was "on board" for all products at 28 nm, and that the roadmap had not changed.
I might be reading things differently than others, but it seems some folks are parsing his statements into things that just are not there.
1. Parasitic capacitance becomes a larger problem as nodes shrink. Maybe you didn't know, but a big reason Intel is using their "3D gate" is to combat it, meaning they can stick with bulk silicon instead of having to move to the more expensive SOI just yet. Of course, they would be at an even greater advantage if they moved to SOI at 22 nm. Parasitic capacitance is directly related to parasitic delay, which in turn I believe is closely related to gate delay (see the delay sketch after this list).
Unless more semiconductor companies move to similar 3D tri-gates, or at the very least FinFETs, I assure you that they will be using SOI by 16 nm...
2. Yeah, you're right. However, Llano has a GPU on the die, weaker motherboards and VRMs, and is also missing 200 pins. (Not sure how they pulled that off.)
3. I suspect the same thing; however, Llano is not the same core as Phenom II, and that is the reason it has a 10% IPC increase in some applications. Still, there is a lot more to look at here than GlobalFoundries' side.
4. Yes, but in this case it was left out because of cost/die size. I'm not sure what percentage of power consumption is used purely by the L3, but Llano could easily have matched a Phenom II X4 running 200 MHz higher if it had L3, IMO.
5. I'm excited to see Trinity as well. Hopefully Piledriver brings substantial improvements, as right now it looks like it won't "run cold".
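Here is the delay sketch promised in point 1: a first-order t ≈ C·V/I model with made-up numbers, just to show how extra parasitic capacitance translates directly into gate delay:

```python
# First-order gate delay model: t ~= C_load * V_dd / I_drive.
# All values below are invented purely to show the proportionality.
def gate_delay_ps(c_load_ff: float, v_dd: float, i_drive_ua: float) -> float:
    # fF * V / uA comes out in nanoseconds; scale by 1000 for picoseconds
    return c_load_ff * v_dd / i_drive_ua * 1000

clean = gate_delay_ps(c_load_ff=2.0, v_dd=1.0, i_drive_ua=100.0)
parasitic = gate_delay_ps(c_load_ff=2.6, v_dd=1.0, i_drive_ua=100.0)
print(f"{clean:.0f} ps -> {parasitic:.0f} ps "
      f"(+{(parasitic / clean - 1) * 100:.0f}% from extra parasitic C)")
```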
Stepping C0 FX ES
http://setiathome.berkeley.edu/sah/s...hostid=5747954
Since I was the one who found the BOINC report I want to add a bit to this discussion ;)
My original suggestion was the C0 stepping. In the thread you translated (original here), one user assumed it to be a Vishera ES, which prompted the corrective response by another user, the one you quoted. So indeed, the C0 stepping is no Vishera ES, since that would have a model number in the 20h-2Fh range.
Model 2 puts it clearly into the BDver1 range (Orochi, i.e. the OR-A0... steppings). As model 0 stepping x translated to the OR-Ax steppings, and model 1 stepping y to the OR-By steppings, I suppose model 2 stepping 0 simply means C0. The only thing missing is an official public document supporting this.
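Spelled out as a throwaway lookup (the model-2 entry is only my inference; no public AMD document confirms it):

```python
# CPUID (model, stepping) -> Orochi revision, as inferred above.
# model 2 -> "C" is speculation until an official document appears.
def orochi_revision(model: int, stepping: int) -> str:
    letters = {0: "A", 1: "B", 2: "C"}
    if model not in letters:
        return "not BDver1 (a Vishera ES would be model 20h-2Fh)"
    return f"OR-{letters[model]}{stepping}"

print(orochi_revision(1, 2))  # OR-B2, the shipping FX-8150 stepping
print(orochi_revision(2, 0))  # OR-C0, the BOINC-reported ES (inferred)
```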
And how does it compare to older steppings? Any visible difference?
None. BOINC scores are not exact enough to spot anything. C0 would not come with the changes that were planned for Piledriver, Steamroller, and Excavator years ago (otherwise they wouldn't be able to deliver such updates on a one-year cycle).
But it might have lower power consumption or reach higher clocks.
Something new about Vishera?
http://www.newegg.com/Product/Produc...82E16819106011
Water-cooled kits are finally showing up... so much for the at-launch water-cooled kits.
I thought only Japan or somewhere had LC kits at launch...
Regardless, none of these are going to sell. I don't know what AMD is thinking.
Would be kewl if they sold the water-cooling unit separately; they revised the mounting bracket to be much better (than the stock Antec Kühler 920).
I just wonder if there was any binning on the chips for these kits (to match the water cooling)...
It'll be interesting to see some results.
Provided anyone on here buys them OFC... :p:
Dave: I have one set up at home... but I need a new CPU chip :(...
Why are they showing 3.9 GHz? Are these different?
"AMD FX-8150 Zambezi 3.9GHz (4.2GHz Turbo) Socket AM3+ 125W Eight-Core Desktop Processor with Liquid Cooling Kit FD8150FRGUWOX"
Edit: Missed your post, Beep...
"AMD FX-8150 Price Drops Like a Rock"
Source: Techpowerup
Anybody removed the IHS on Bulldozer yet? Or is it soldered on?
Must be soldered! I have 2 dead CPUs without warranty, so I can try to remove the IHS to see how it is :p
IHS removal has been dangerous for some years... Thubans died from this, all Intel chips too, of course. And FX will be no different, I'm thinking.
The chips die because of wrong removal methods, improper cooling, or physical damage to the die or package; the missing IHS has nothing to do with it directly.
Even if you manage to remove the IHS without any physical damage and then cool it down properly, the chip might still die within 1-xxx days without any obvious reason. In that case the actual "cause of death" lies in the botched removal method:
Thermal shock - The chip was heated or cooled down too quickly during the IHS removal process, which caused damage to the silicon or the silicon-substrate interconnects/bonding.
Mechanical stress - Excessive force was used on the IHS before the "TBC" had reached its melting point.
This caused damage to the silicon-substrate interconnects/bonding.
The usual TBC (48% indium, 52% tin) used to bond the core to the IHS has a liquefying point of ~120°C.
That is nowhere near the temperature the chip can temporarily withstand in a non-operating state.
According to Intel, the maximum short-term (>72 h) storage temperature for their chips is 125°C.
So if you know exactly what you are doing, removing the IHS is quite safe.
Is it worth all the work and trouble of nursing the fragile core and making custom cooler mountings for the now non-standard-height chip?
:D
I know what he said.
Let's assume the contact between the thermal interfaces is 100% in each case:
Heat transfer path:
CPU with IHS: Die -> IHS -> Cooling element
CPU without IHS: Die -> Cooling element
Without the IHS there is one thermal interface fewer in the path, which means heat transfer is quicker and, given real-world manufacturing tolerances (IHS surfaces, cooling element surface), more efficient.
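As a tiny series-resistance sketch of that argument (all K/W values are invented; only the comparison matters):

```python
# Series thermal resistances along the two heat transfer paths above.
# Assumed values, for illustration only.
R_SOLDER = 0.05  # die-to-IHS solder ("TBC") interface, K/W
R_IHS    = 0.05  # conduction through the copper heat spreader, K/W
R_PASTE  = 0.10  # paste interface to the cooling element, K/W

with_ihs    = R_SOLDER + R_IHS + R_PASTE  # Die -> IHS -> Cooling element
without_ihs = R_PASTE                     # Die -> Cooling element

heat_w = 150  # assumed CPU heat load in watts
print(f"with IHS:    dT = {with_ihs * heat_w:.1f} K")
print(f"without IHS: dT = {without_ihs * heat_w:.1f} K")
```

Fewer interfaces in series means a smaller temperature drop between the die and the cooler for the same heat load.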
I have removed "soldered" heat spreaders successfully from countless chips since the "soldering" method was introduced (IIRC around 2003, with the P4 Gallatin). I never lost a single CPU to IHS removal and I bet they all still work, unless they have been scrapped. I might still have some of those P4s, C2Ds and Phenom IIs in my archives.
I was just asking if you had any explanation as to why he found what he did...
"Thermal shock - The chip was heated or cooled down too quickly during the IHS removal process, which caused damage to the silicon or silicon-substrate interconnects / bonding."
Or
"Mechanical stress - Excessive force was used on the IHS before the "TBC" had reached the melting point.
This caused damage to silicon-substrate interconnects / bonding."
Or
"Hotspotting" due bad contact between the die and the cooling element.
This is highly unlikely because silicon has quite good thermal conductivity properties.
Chip getting damaged this way would require VERY poor contact.
Also I am quite certain chew can install the cooling properly, so this is not the case.
Do you have a spare FX for IHS removal?
FX-8170??? True or not?
Such a significant difference between the official price and the actual price may indicate that AMD has quietly reduced the price of the FX-8150 in light of Intel's latest product launch, as well as to clear the road for the new FX-8170, which should become available in Q2 2012.
http://www.xbitlabs.com/news/cpu/dis...hip_Chips.html
If true, I would be really pissed, because I just recently bought the FX-8150 and all this time I was waiting for the FX-8170!!