Great?? :f
http://www.3dmark.com/3dm11/2416290
5% faster compared to a similarly clocked FX-4100
Quote:
Great?? :f
http://www.3dmark.com/3dm11/2416290
5% faster compared to a similarly clocked FX-4100
Are you serious? :rofl:
:banana::banana::banana::banana:ty performance + 5% = still :banana::banana::banana::banana:ty performance
IB over SB in this particular test is about 8%
http://www.xbitlabs.com/images/cpu/c...e/3dmark-2.png
Yeah, the 3.9GHz IB (constant Turbo is engaged on IB in almost ALL workloads across ALL cores) vs the 3.5GHz SB that can rarely Turbo to the full 3.9GHz on all cores. There goes your 8% IPC out of the window.
Again, if you have no clue about what you are posting, simply don't post. Thanks.
Oh, and ":banana::banana::banana::banana:ty" performance? Sure, you buy IB and shell out $200-330+ in order to brag about your uber Physics score. The 5800K will cost around $130. Notice the price disparity?
Listen AMD fanboy.
I was only comparing top Trinity with its closest predecessor.
Oh, name calling now, nice. I don't care about Intel fanboys coming here and posting "slooow crap product" comments. Trinity is obviously a faster and more efficient product than the FX4xxx. It is priced according to its performance. If you cannot grasp this simple fact then nobody can help you.
The 5% figure is clock vs. clock and comes with no power consumption measurement.
The Piledriver FX-8350 is shown to have a clock speed of 4.0GHz and a Turbo of 4.2GHz in most leaks.
Given that the FX-8150 has a clock speed of 3.6GHz, the "leaked" FX-8350 at the same TDP (125 watts) with ~10% higher clocks means
an overall increase of around 15%.
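That ~15% only works out if the clock gain and the (rumored) IPC gain multiply together. A quick sanity check, where the ~5% Piledriver IPC gain is a rumor-based assumption, not a confirmed spec:

```python
# Hypothetical back-of-envelope: performance gains from clock and IPC
# multiply rather than add. The figures below are the leaked/rumored
# ones from this thread, not confirmed specs.

def combined_speedup(clock_gain, ipc_gain):
    """Overall speedup when clock and IPC improve independently."""
    return (1 + clock_gain) * (1 + ipc_gain) - 1

clock_gain = 4.0 / 3.6 - 1   # FX-8350 (leaked 4.0GHz) vs FX-8150 (3.6GHz), ~11%
ipc_gain = 0.05              # assumed ~5% Piledriver IPC gain

overall = combined_speedup(clock_gain, ipc_gain)
print(f"overall gain: {overall:.1%}")   # lands in the mid-to-high teens
```

So with a round 10% clock bump and 5% IPC you get 15.5%, and with the actual 3.6→4.0GHz ratio closer to 16-17%, which is why "about 15%" is a fair summary.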
The L3 is important for the Physics score in 3DMark 11, so 5% is good for Trinity... And FX is not bad in the overall score in 3DMark 11...
All hail informal, great AMD fangod
Thank you for the great contribution to the topic Beep.
What's up with the default gpu clock?
It's 800MHz, but there is a bug in GPU-Z I guess; it's showing the power-saving clocks. Effectively superkames achieved a 46% overclock over stock on air cooling. AMD touts 1+GHz OCing in their recent newsletter about the A10 (“Break the 1GHz barrier on the GPU!”).
I don't see it that way; the processor is a much better CPU than the current TIM/IHS system allows. There are 25+ examples in the forum just under this one of what the CPU can do with what I'll call industry standard cooling systems, i.e. a soldered IHS.
That's a better analogy. Or run FX with a max 250rpm fan, that would be equivalent also.
Since this is off topic, you can have the last word if u like.
RussC
Only 5% is bad because the FX 4100 is bad. AMD made a trash CPU, so it should have worked harder with Trinity/Piledriver.
IB is awesome because SB is awesome, and because on LN2 it made a big jump from 6GHz to 7GHz.
IB also has an overall 5-7% performance improvement, depending on the application.
Personally I think an APU only makes sense on mobile; on desktop, even if you spend $50 more, an i3 + dedicated graphics just makes more sense.
As an AMD fanboy you shouldn't be so happy until the full picture is clear, until the reviews are out. AMD is now a master of bad surprises.
FX 8350 performance seems good, but it will be ruined if there is another 10-15% more power consumption.
A 125W TDP doesn't translate cleanly into the same power consumption. It could be more.
And AMD clearly will not ruin its new product with a "140W TDP" label.
Let the results speak for themselves, and then do the praising of AMD.
Wow, did I just read your whole post? Yeah I did. Can't get those 30 seconds back now :P.
Anyway, the NDA is going to be over soon so we will know how good the A10 is. As for Vishera, there is no mystery. It has an 11% higher clock and 0-10% higher IPC (so 5% average). Power draw should be on par with the FX8150. Pretty clear picture.
SUPERKAMES's 3DMark11 score (if it is real of course ;)) would be heavily held back by the DRAM bandwidth.
The saturation of the DRAM bandwidth is around 164%, while the optimal would be 100% or less.
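For anyone wondering what "164% saturation" means here: it's GPU bandwidth demand expressed as a fraction of what the memory can actually deliver. A back-of-envelope sketch, where the demand number is purely hypothetical (chosen only to reproduce the 164% figure):

```python
# Rough sketch of DRAM bandwidth saturation: how much bandwidth the iGPU
# would like vs. what dual-channel DDR3 actually provides. The demand
# figure below is a hypothetical placeholder, not a measured number.

def ddr3_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    """Peak theoretical bandwidth of a DDR3 setup in GB/s."""
    return mt_per_s * channels * bus_bytes / 1000

def saturation(demand_gbs, available_gbs):
    """Demand as a percentage of available bandwidth; >100% = bottleneck."""
    return demand_gbs / available_gbs * 100

available = ddr3_bandwidth_gbs(2133)   # ~34.1 GB/s at DDR3-2133, dual channel
demand = 56.0                          # hypothetical GPU demand, GB/s

print(f"saturation: {saturation(demand, available):.0f}%")  # ~164%
```

Anything over 100% just means the GPU is stalling on memory, which is why faster DIMMs scale so well on these APUs.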
Held back by what? The memory was at 2133, which is more than the usual 1600 situation. :)
And the power draw of the FX 8150 looks good to you? It should have been some 5-10% under the FX 8150 to be a really good product. Quote:
As for Vishera, there is no mystery. It has an 11% higher clock and 0-10% higher IPC (so 5% average). Power draw should be on par with the FX8150. Pretty clear picture.
Anyway, if the power consumption is higher than the FX 8150's, then EPIC FAIL.
With AMD there have been many "should have beens", but they weren't.
And next year, with Haswell, AMD's APUs will have big competition.
I'm trying to just watch this thread for info on PD, but I gotta jump in. Firstly, TDP stands for Thermal Design Power; for AMD it refers to the maximum heat created, and thus what's needed from the cooling system. It isn't the power draw. Secondly, saying an APU doesn't make sense is just ludicrous. You think it's better to go back to the days when you had to have 5 different chips on a board? Five separate cooling systems, increased cost for board design, the cost of having to synchronize and make work more separate components from a variety of vendors, and increased power loads? There is a reason the entire industry has moved to put more of the computer system into one chip. Even Intel is doing this, and has done so successfully. You can ALWAYS add a dedicated card to a system (if designing the system or building it yourself); arguing it doesn't make sense to include one is, as I said, ludicrous.
Furthermore I don't see anyone arguing that Intel's chips aren't overall superior to AMDs so acting like or calling someone a "fanboy" when discussing what may or may not be improvement to AMD's designs is just plain trolling.
Can we please stop talking about Intel in this thread with exception to data extrapolation.
First of all, TDP does have a correlation with power draw. They are related, because the heat is produced by the electrical power passing through the transistors.
That's why a 125W TDP CPU consumes more power than a 95W CPU.
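The heat-from-power relation being described is roughly the classic dynamic switching-power approximation, P ≈ C·V²·f. A minimal sketch with made-up numbers (real chips also have static leakage, which this ignores, and the capacitance value is purely illustrative):

```python
# Sketchy illustration of why TDP tracks power draw: dynamic CPU power
# scales roughly with effective capacitance * voltage^2 * frequency.
# All numbers here are illustrative, not real chip parameters.

def dynamic_power(c_eff, volts, freq_hz):
    """Classic switching-power approximation: P = C * V^2 * f."""
    return c_eff * volts ** 2 * freq_hz

base = dynamic_power(1e-9, 1.20, 3.6e9)     # hypothetical 3.6GHz part
boosted = dynamic_power(1e-9, 1.30, 4.0e9)  # higher clock usually needs more V

print(f"relative power: {boosted / base:.2f}x")  # ~1.30x
```

Note the V² term: a ~10% clock bump that also needs ~8% more voltage costs around 30% more dynamic power, which is why same-TDP clock increases are hard.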
Quote:
So 5% is bad now? Do you know what it takes in terms of CPU design to get 5% IPC increase? IB didn't get this much over SB but it's awesome!1!1!1!
Can we please stop talking about Intel? Quote:
Yeah, the 3.9GHz IB (constant Turbo is engaged on IB in almost ALL workloads across ALL cores) vs the 3.5GHz SB that can rarely Turbo to the full 3.9GHz on all cores. There goes your 8% IPC out of the window.
Again, if you have no clue about what you are posting, simply don't post. Thanks.
Oh, and "ty" performance? Sure, you buy IB and shell out $200-330+ in order to brag about your uber Physics score
Yes, please tell Informal not to speak about Intel anymore; AMD is his business, yet he keeps saying Intel this / SB that / IB this...
You go way beyond what I said... One dedicated VGA doesn't hinder your system space. Furthermore, an AMD APU requires stronger VRMs than a motherboard with an Intel i3, IB especially. So you cut in one place and put something in another. Quote:
You think its better to go back to the days when you have to have 5 different chips on a board?
Power draw should be on par with FX8150. Quote:
exception to data extrapolation.
So much for data, when there are already rumours that it's higher. Just keep silent and hope it's not true.
His friend and he mentioned IB and SB first, and now the others are to blame. How typical of a certain kind :rolleyes:
If you don't care about this product and you think it's not good then just skip this topic. Go and post somewhere else.
Ok informal, I'm sorry, but the way you approach discussion is kind of rude; the same goes for the Intel fanboys.
For my sake, and I'm sure others', I'm gonna kindly ask if you can argue in PM...
I don't believe anyone can keep up with your bickering, and I can't believe that anyone would want to either...
nice to see trinity gpu can be tweaked past 1ghz
can the NB freq be tweaked as well for A10's?
cant wait to see memory OC with these, if i were getting one i'd definitely get 2400MHz+ kits
It surely makes more sense investing $50 more to get an inferior dedicated GPU, not overclockable because of the PCI-e sub-75W limit, plus a non-overclockable CPU. :rofl:
And that "dedicated gpu" will probably be the most basic, noisy, cheapo-cooling HD6570 + inferior DDR3 1600. :clap:
On the APU side you only invest $130 for a nice CPU (+OC) and a nice GPU that when OC'd can stay toe to toe with the HD6750 and be way faster than the new HD7750 DDR3. Sure, it doesn't make sense :shrug:
It also doesn't make sense when you see Intel heavily investing in reducing power consumption and offering the "best gpu" they can for the mobile space. Sure, APUs don't make sense.
informal
We don't really need the NDA lift for Trinity, we already know from Tom's previews. The only interesting things will be OC capabilities on the GPU + power consumption vs Llano.
The boxes, so nice.
I already planned my system:
A10-5800K + Biostar HiFi A85X + G.Skill Ripjaws Z DDR3 2400
Who needs a dedicated GPU? I'll settle for 1600x900 on a 1080p monitor.
Any "inferior" VGA is overclockable 10-15%. Quote:
It surely makes more sense investing $50 more to get an inferior dedicated GPU, not overclockable because of the PCI-e sub-75W limit, plus a non-overclockable CPU.
And that "dedicated gpu" will probably be the most basic, noisy, cheapo-cooling HD6570 + inferior DDR3 1600.
On the APU side you only invest $130 for a nice CPU (+OC) and a nice GPU that when OC'd can stay toe to toe with the HD6750 and be way faster than the new HD7750 DDR3. Sure, it doesn't make sense
There are silent versions.
I don't think the overclock (4.5-4.7GHz) will be possible (CPU + IGP) without a custom cooler, which doesn't sound good for an HTPC.
Anyway, where is the upgrade path? What do you do when you need more CPU power?
AMD should not have split them into two sockets.
Poor AMD. AMD fans can now "Cry me a river". Quote:
On average, Trinity's high-end 384-core GPU manages to be around 16% faster than the fastest Llano GPU, while consuming around 7% more power when active.
Also: AMD attempts to shape review content with staged release of info
Good job AMD, both with how you handled these benchmarking websites, and congratulations on releasing a fine chip within its intended price/performance range.
^^Haters gona hate.
If history repeats itself tbone (which it usually does), it's a good sign when Intel fans and biased websites start knocking a new chip in AMD threads right before it's released...
If anything, all this banter is giving me more hope that Piledriver cores will be a decent upgrade over Bulldozer. :D
We don't have long to wait!
Time will tell OFC, but the more negative comments I see from Intel folks only raises my hopes... ;)
People who buy only one brand have no lesson to give to other people. Fanboyism just sucks, blue side or green side, that's the same $hit.
How is this bad? APU is more like a two core CPU and it's keeping up just fine with a true quad core part. And before you start on the "IPC SUX!!!" thing, I'm only going to let you whine about IPC if you apply it to GPUs as well and you say some NV GPUs are better or worse than some AMDs because they don't perform the same at the same clock speeds.
If it does indeed come in at $130 and below, I think it'll succeed marvelously. There's a huge difference between people who buy $200 i5 3xxx CPUs + a $200 GPU and people who buy an AMD APU for a third the price. As long as the CPU performance is "good" (better than the i3 3xxx in multithread, within 8-12 percentage points in single-thread) I can't see AMD not doing well with it. Given the clockspeed and the throughput from the previews/rumours though, I wouldn't be surprised if release prices were closer to the $150-160 mark for the top end model. At the very least it should succeed as much as the Phenom IIs did.
I've got time to wait though, I won't have money until the internship pays off, so I'll be stuck on my "ancient" Phenom II 940 for another half a year at minimum =/.
AMD Trinity - only iGPU part reviews
http://www.techpowerup.com/reviews/A...Preview/1.html
http://www.anandtech.com/show/6332/a...-review-part-1
http://www.tomshardware.com/reviews/...400k,3224.html
http://www.legitreviews.com/article/2043/1/
http://www.xbitlabs.com/articles/gra...-graphics.html (tests many games and the DDR3 clock/FPS performance ratio)
http://hothardware.com/Reviews/AMD-A...ce-and-Gaming/
http://www.pcper.com/reviews/Process...rinity-Desktop
http://vr-zone.com/articles/amd-trin...#ixzz27eY1Tr76
You're posting old news from March of this year?
@ FlanK3r
Thanks for the list of previews. Xbitlabs has an interesting page containing SYSmark numbers. The 5800K is pretty close to the FX6200 and faster than the FX4170. Pretty good results, making it around 12% faster than BD at the same core count and clock speed.
Source: http://www.legitreviews.com/article/2043/11/Quote:
Legit Bottom Line: The AMD A10-5800K APU looks impressive in the game benchmarks and easily beat the Intel Core i7-3770K, which is a much more expensive processor!
:eek:
Why are you surprised that 5800Ks iGPU is much faster than HD4000 in 3770K? Even Llano is faster than 3770K when you look at iGPU performance...
Take it with irony. LOL
I don't think the focus of OCing an APU is the CPU side, but the GPU side. If you want pure CPU performance you would probably get one of the PD FX processors and OC that (or an Intel system). For example, from Xbit: Quote:
Graphics core performance proved amazingly scalable as the memory frequency and bandwidth increased. By simply raising the memory frequency by 266 MHz, we could boost the fps rate by 10-15%. Of course, as the memory frequency increases, this dependence becomes less prominent, but nevertheless, if you are building a Trinity based system and intend to use its graphics core for 3D applications, you must pay special attention to finding high-speed DDR3 SDRAM. This is excellent news for overclocker memory makers, because it creates a potentially larger market for them. It is a very convincing argument that you can easily boost the gaming performance of your AMD A10-5800K processor by as much as 15-20% by simply replacing the common DDR3-1600 with DDR3-2400 in your Socket FM2 system.
Also if you saw one of the slides below they showed FM2 will be around for another generation, which means if you need more CPU power the Steamroller APU should get you some (no idea how much).
I also question the "staged release of info", but I was thinking: since it is likely the marketing department controlling this release, it does make sense to focus on the GPU of Trinity before the CPU. Not just because you don't want people talking about the not-as-good CPU performance, but because otherwise you'd have everyone talking about PD vs BD. It makes sense to release the CPU info for desktop Trinity at the same time as Vishera. That way you get a week discussing one of your strengths before the online conversation changes to PD vs BD (as it will).
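The memory scaling Xbit describes above can be put into a toy model: each DDR3 speed step adds bandwidth, but the fps gain per step shrinks as the GPU becomes less bandwidth-bound. The per-step fps gains below are invented for illustration, picked only to match the rough "10-15%, then tapering" pattern from the quote:

```python
# Toy model of diminishing returns from faster DRAM on an iGPU.
# The fps-gain-per-step values are hypothetical, not measurements.

ddr3_steps = [1600, 1866, 2133, 2400]     # MT/s speed grades
fps_gain_per_step = [0.10, 0.05, 0.03]    # assumed gain at each step up

total = 1.0
for gain in fps_gain_per_step:
    total *= 1 + gain   # successive gains compound multiplicatively

print(f"DDR3-1600 -> DDR3-2400: {total - 1:.0%} total fps gain")
```

With those assumed per-step numbers the total lands around 19%, consistent with the 15-20% range the review quotes for going from DDR3-1600 to DDR3-2400.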
The comparison is just an epic fail.
They should have said "the IGP gaming performance", because from "the Intel Core i7-3770K, which is a much more expensive processor" you could understand that even if you buy a dedicated VGA, the 5800K is better for gaming.
Also, putting the i7 3770k into this is just stupid.
To compare the HD 4000 with Trinity you have the new IB i3 3225 with HD 4000 at $144, right in the Trinity 5800K's target price range.
Bringing the i7 3770k, which is another class of CPU, into that review was biased from the beginning.
The people who buy the i7 3770k don't buy it for IGP performance, they buy it for CPU performance.
Yeah, but it's the best Intel CPU? Why not?
The best Intel CPU is the i7-3960X. That one has no IGP, so what do we compare with?
The HD 4000, which was to be compared with the HD 7660D, is also found in much cheaper CPUs than the i7 3770k.
Why not? Because we'd be comparing apples with oranges, that's why.
Maybe they just did not have other Intel CPUs at hand. Pretty pointless to start arguing about "omg they used this and that with the same iGPU".
Then if that was the case, you don't say this in the conclusion.
In this way anybody on this forum could work as a reviewer. Quote:
Legit Bottom Line: The AMD A10-5800K APU looks impressive in the game benchmarks and easily beat the Intel Core i7-3770K, which is a much more expensive processor!
They should have said something like: the Trinity APU is much better in gaming performance than Intel HD 4000 graphics, so the new i3 3225, which also has the HD 4000 included (but we don't have one to test), will have a hard time against the A10 5800K.
I just think they had no other CPU with the HD4000; it's not a conspiracy to make the A10 "faster" than the 3770K. Even AMD positions the A10 as an i3 competitor in terms of CPU performance. It may even compete with an i5 in a smaller number of workloads. But they never touted it as an i7 competitor. What Legitreviews wrote was just an iGPU comment that some people misunderstood.
Really tired of these crappy reviews.
Price-wise, this is what the A10-5800K should be competing against: http://www.newegg.com/Product/Produc...82E16819116774
A dual core with hyperthreading. Any review that compares products not in the same price range and suggests one is better than the other is a complete failure. I blame this on AMD marketing fail once again. They should have said that the i3 3225 is the A10-5800K's direct competitor and you should benchmark against that. It would have been in AMD's interest to even provide the i3 3225. The 2-module design is much stronger than a 2-core with hyperthreading, and the iGPU is far stronger than the HD 4000.
AMD's problem lately with CPUs isn't that they're bad, it's that review sites compare $140 AMDs to $200 Intels and say AMD sucks. AMD is targeting a specific market, no one is going to put a $200 CPU in their HTPC when the CPU isn't even that important there anyways. They're going to go with something smaller like the i3 3225.
Sorry if old news already, haven't been here in awhile.
AMD sticks with Socket AM3+ for Steamroller
just give us the new socket already
The same source. Quote:
We have Intel Core i3-3220T and Core i3-3225 processors on order for a better comparison in the weeks ahead!
XBit's preview did it with i3-3225 ( http://www.xbitlabs.com/articles/gra...raphics_7.html )
Why? Don't the benchmarks in the review also use the CPU processing power of that particular processor? Quote:
Bringing the i7 3770k, which is another class of CPU, into that review was biased from the beginning.
The people who buy the i7 3770k don't buy it for IGP performance, they buy it for CPU performance.
As a budget PC build, Trinity/FM2 systems are stronger than any Intel alternative even if it uses one of the top CPUs. This is my opinion and I respect others opinions too.
I'm confused. AMD lifts the GPU-side-only NDA for Trinity so the CPU results don't overshadow the GPU, but Toms has had CPU results for months now. What has changed? I know older docs said stuff along the lines of Trinity gets 1xh Piledriver cores and Vishera gets 2xh; then all the stuff happened with OBR and the AMD 5.0GHz demo showing the same stepping, and I'm not sure what to believe anymore. Desktop Trinity also got delayed long enough that maybe they got 2xh in there? My thought process was (if the 1xh/2xh cores are true) that they were going to release that with Trinity 2.0; again, an older article, and nothing has been said since.
Anyone got some info that will confuse me less?
Also, someone mentioned earlier that Llano got sub-1000 scores in 3DMark. With no overclock, I got a 1146 score. I'm sure I'm not the only one to get over 1100 with a stock Llano. I can't wait for a BIOS multi so I don't have to baseclock my RAM to 2400.
Yeroon (waiting impatiently)
umm, this is a bit off topic, but i thought Steamroller was FM2 socket and Vishera was the last CPU for AM3+, but now ppl are saying Steamroller will also be AM3+?
waiting for overclocking....
5ghz cpu/1ghz+ gpu @ 2400mhz memory speeds on air cooling
fingers crossed!
If I had to guess, I'd say that Steamroller CPUs will be like AM3 processors, in that they'll have a DDR3 and a DDR4 memory controller inside. AMD would be foolish to release a new socket for Steamroller, only to have to replace it quickly afterwards with another one to accommodate DDR4.
Anyone who went through the incredibly short lived S939 will know what I'm talking about. From my perspective, they should have either focused on AM2 and skipped S939, or have put a DDR1 and a DDR2 controller in AM2 CPUs.
well with the fm1's short life and amd promising fm2 life to be longer, i have a feeling they will do the same for am3+
i guess it all depends on how the server side is progressing
I heard OC is a bit worse than on the classic FX, because power for the CPU comes from only a "few pins" and the others are for the iGPU part of the APU.
Ok, I used Maxforces' file to test:
First I should say that the rendering times are not always the same with this file/renderer; I tried ~5 times at 4.5GHz.
- Lowest ~ 2'21"
- Highest ~ 2'35"
Attachment 130338
So I decided to test in both 800*600 & 1200*900 resolutions so that anyone who needs/wants to can compare with a higher res:
Default:
Attachment 130339
@4.5GHz:
Attachment 130340
And I also rendered the file cpu.c4d from the Cinebench R11.5 root folder:
Attachment 130341
Hope you like it. Goodbye everyone. :D
It's about 10% faster than Bulli and a bit faster than SB clock-for-clock.
Gaming benches using Discrete Graphics
A10-5800k Trinity (4.2GHz Turbo) vs i5-3470 (3.6GHz Turbo)
http://vr-zone.com/articles/amd-trin...e/17272-1.html
Seems to do alright in some games, but then there were those few games where it was like... what the hell just happened? Interesting to see how Vishera might shape up.
I find it interesting how much AMD is pushing their clock frequency of this design. Higher than BD, but it looks like clock margins are no higher.
It reminds me of A64 X2 days when they upped clock speed to the point where you could only realistically overclock 200 MHz.
4.2 GHz Turbo on chips that reach 4.4-4.6 OCed vs intel pushing out chips with mid-upper 3 GHz turbo and CPUs that also do 4.6+ OCed (Of course, not on locked chips, but there is potential).
I feel like they pushed clocks too much on FX-8150 for what it was, many CPUs having instability at stock with vdroop / what should be mild OC.
AMD A10-5800K & A8-5600K Review:
- http://www.xbitlabs.com/articles/cpu..._10.html#sect0
- http://www.anandtech.com/show/6347/a...sktop-part-2/5
- http://www.bjorn3d.com/2012/10/amd-v.../#.UGqToy_zo_k
- http://www.guru3d.com/articles_pages...ew_apu,15.html
- http://hexus.net/tech/reviews/cpu/46...ty-apu/?page=7
- http://hothardware.com/Reviews/AMD-A...rmance/?page=7
- http://www.legitreviews.com/article/2047/18/
- http://www.pureoverclock.com/Review-...-5800k-review/
- http://www.tomshardware.com/reviews/...ency,3315.html
- .................
Where are the SATA results for the A85X?
^Storage controller performance shouldn't be affected much vs. A75 since A85 is the same chip just with all the SATA ports enabled and what not.
I know we shouldn't talk about you know who, but when he makes fun of members here it's a new low by him and takes the cake on being the douchebag of the year!
He has no integrity and is just a little :banana::banana::banana::banana::banana: looking for website hits
http://www.obr-hardware.com/2012/10/...0k-by.html?m=1
Sorry I had to say something
XD, flank3r is famous now. The only one that is pathetic (I don't even know if he uses that word correctly) is the blogger himself :)
And still not a single measurement of the higher power consumption he boasted about a month ago...
Yeah, it's somewhat ridiculous that that badly made little site with one single contact somewhere in Asia attains that much attention... the self-proclaimed Robin Hood of true gaming performance :)
So now that piledriver cores have found their way into laptops and low end desktops, is it safe to assume that the full thing is near? Anyone know when the NDA lifts?
The Vishera launch is around the 23rd to 28th of October :)
msrp better be good!
http://techreport.com/review/23663/a...rices-stagnate
then watch it crash!
http://techreport.com/review/23662/a...us-reviewed/11
piledriver looks good against fx-4170
A10-5800K = FX-8150 performance in Crysis 2... interesting.
edit: hmm, not sure about the drivers; it says he used 12.3 for FX and 12.8 for A10? Or is it 12.3 for both since it's using a discrete GPU (7950)? I can't tell anyway.
looking forward for fx-8350 performance and OC
I don't know either, but I think I read somewhere that AMD stated it will be during October.
The 23rd is a Tuesday, so maybe that's it??
Why did you call them execution cores ?
I like that "execution cores", but well, my sense of IPC is completely different from what reviewers claim IPC is. I'd much rather call it executions per clock from a core (silly, sure).
The instruction cache is called instruction cache for a reason, right? To my knowledge it also has a clock rate, from what I've read.
I'm not sure what you are trying to say. Yes, L1 instruction cache has a clock rate just like every other transistor in a CPU. It runs at core clock, same as L1D cache and L2 cache. But what does it have to do with IPC?
IPC, much like TDP, is something people talk about too much without understanding it. Instructions per clock can refer to maximum possible number of instructions retired by one core in one clock cycle (which is 100% theoretical number, based on a scenario that does not occur in real world usage); or average number of instructions retired by one core per clock when running some procedure in real life software (which is a more meaningful number, but is dependent on software quality).
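The "average instructions retired per clock" meaning above is the one you'd actually compute from hardware counters (e.g. what a perf-style tool reports). A minimal sketch with made-up counter values, just to show the arithmetic:

```python
# How measured "IPC" is usually derived: instructions retired divided by
# core cycles, both read from hardware performance counters. The counter
# values below are invented examples, not real measurements.

def ipc(instructions_retired, core_cycles):
    """Average instructions retired per clock cycle over a run."""
    return instructions_retired / core_cycles

# Hypothetical counters for the SAME workload on two different chips:
print(ipc(8_400_000_000, 6_000_000_000))   # 1.4 IPC
print(ipc(8_400_000_000, 7_500_000_000))   # 1.12 IPC: same work, more cycles
```

This is why measured IPC is workload-dependent: change the software and both counter values move, so the ratio does too.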
Make yourself clear :)
http://www.youtube.com/watch?feature...&v=o5hxOsrukPY
An AMD guy talking about Vishera...
"15% more efficient, 7% higher IPC, 7% more clock speed"
we will see :)
It's soo cute to watch a hot girl try and speak geek.
:) she can build me a piledriver system anytime ;)
Thanks to FlanK3r's friendly site we can read now about:
Vishera prices (in Czech Republic), specifications and Turbo Core 3.0 feature
AMD FX-8350 (Piledriver) details
I like the last sentence: "We will say more in a few days in the review."
Doing some math, Vishera prices in the USA should go like this:
FX-8350: $235, FX-8320: $186, FX-6300: $150, FX-4300: $122.
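A conversion like that typically strips the local VAT from the Czech retail price before applying the exchange rate. A rough sketch, where both the 20% VAT and the 19.5 CZK/USD rate are my assumptions, not necessarily the figures used above:

```python
# Hypothetical sketch of converting a Czech retail price to a US MSRP:
# remove the local VAT, then apply an exchange rate. The VAT rate and
# the CZK/USD rate below are assumptions for illustration only.

def czk_to_usd_msrp(czk_price, vat=0.20, czk_per_usd=19.5):
    """Strip VAT from a CZK retail price and convert to USD."""
    return czk_price / (1 + vat) / czk_per_usd

# e.g. a hypothetical ~5500 CZK listing for the top part:
print(f"${czk_to_usd_msrp(5500):.0f}")   # ~$235
```

Under those assumptions a ~5500 CZK listing works out to about $235, in line with the FX-8350 estimate.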
these need to be cheaper if they want any good reviews....no igpu to cover for its low IPC and high power consumption under load
and they need to make micro-atx am3+ 9 series mobos this time around.
heck or even a mini-itx , like the baller Asus P8Z77 http://www.newegg.com/Product/Produc...82E16813131840
that would be killer!
sadly they wont :(
Can someone help with the pricing conversion? It looks like the 8320 is going to be around $185 US?
It's not much; the FX-8320 will be a bit better than the FX-8150 on average (I think).