Core clock at 1250 MHz for an integrated GPU?
You say that like it's nothing, while the HD 58xx series tops out at an 850 MHz core clock (1250 MHz would be almost 50% higher).
This is AMD-lingo for "products sometime in Q1 2011", consistent with their recent Analyst Day slides, which put all Llano products in early 2011.
http://phx.corporate-ir.net/phoenix....rol-analystday
See Rick Bergman's presentation, pages labeled 27 & 28.
As Intel mentioned on their conference call, the industry does not care for big Q4 product intros; it much prefers Q1. I have no doubt AMD will align to this and launch in January if they are ready, later if they have 32 nm hiccups.
What I'm really looking forward to with Fusion is the hope that AMD can merge the GPU into the CPU so it acts/behaves the way the FPU does.
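Roughly the contrast I mean, as a minimal illustrative sketch in C (my own example, nothing AMD has announced):

#include <stdio.h>

int main(void)
{
    double a = 1.5, b = 2.5;

    /* The compiler lowers this line to FPU/SSE instructions on its own;
       no driver stack, no explicit offload, the unit is simply there. */
    double c = a * b + 0.5;

    /* Today's GPUs, by contrast, need an explicit round trip through an
       API such as OpenCL or DirectCompute even for simple data-parallel
       math. The hope is that Fusion eventually makes the GPU as
       transparent to ordinary code as the FPU multiply above. */
    printf("c = %f\n", c);
    return 0;
}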
Different process (32 nm SOI with high-k metal gate vs. 40 nm bulk), different manufacturer (GloFo vs. TSMC), so why can't it happen the way Nedjo expected? The clock might be on the upper range, and might not be all that feasible from a TDP standpoint, but I'm quite optimistic about this new baby.
When can we expect this in an ultra-thin notebook? I'm itching to pull the trigger on a Dell Vostro V13.
Anyone have any TDP numbers?
I would imagine it would have to be around 25-35 W...
Also, I thought it was either 240 or 480 SPs.
Someone care to chime in? *wink* *wink* *nudge* *nudge*
It's 480 SPs, and the TDP will most likely be around 55 W for the whole chip.
Well, a 55 W TDP quad-core Fusion chip specced at a 2.4 GHz CPU and an 800 MHz GPU WILL sound very nice to my ears, especially if the chip can stay within its TDP even when loaded simultaneously with Prime95 & FurMark. :D
It's an interesting question. No ATI/AMD GPU has ever been manufactured on SOI or HKMG technology. Theoretically, SOI (+HKMG) is capable of higher frequencies than plain bulk (though it's a more expensive process technology); just compare present-day CPU and GPU frequencies.
I know that roadmap very well, but recently AMD likes to play the "ahead of schedule" game. :)
Intel is a different company. For example, the first Phenom CPUs were launched on November 19, 2007.
That's kinda weird, considering Q4 is when the holidays are; that's when people spend the most. So why would they aim to launch when the holidays are over?
If they launched it in Q4, it would only reach store shelves in Q1 of the next year. Only by launching at the end of Q3 or the beginning of Q4 might they get some products on store shelves in time for the holiday shopping season.
Usually what's launched in Q1 ends up in the laptops sold during the back-to-school season. You will see products featuring the new chips earlier than that, but volume is lower then and there are still plenty of previous-generation chips in laptops at that time.
I can't completely deny this :p
I was only talking about notebooks here. It's very common in notebooks that whatever gets launched in the Q2/Q3 time frame does not end up in the back-to-school lineup. That's just an observation I've made, though, nothing factual, so I could very well be wrong.
Since we're already on the topic of early silicon, can someone tell me what the die shot on the left is?
http://i48.tinypic.com/2hezu5v.jpg
The others are Deneb, Fusion, and GPUs, but I can't figure out the first one.
It's a K8 single-core CPU.
http://gecko54000.free.fr/documentat...er_anatomy.jpg
Edit: generics_user was faster :D
There's that pesky 8-core CPU again.
That's not a quad-core Barcelona, is it? I thought it looked like the die from that other thread. Aren't the quad-core dies square, not rectangular? It's pretty small, but it sure looks like that other die shot that was apparently a Photoshop job.
Edit:
Ahh, the stretched version. OK.
I need Fusion... now! lol, ugh, I hate waiting for things.
Like this? :)
http://prohardver.hu/dl/upc/2010-01/...600photo_p.jpg
It's a Radeon E4690 GPU (128-bit, 512 MB GDDR3).
I remember when you had to add L2 cache daughterboards to your motherboard...
I wouldn't mind doing that again for extra GPU performance... say it comes stock with GDDR3 and you drop GDDR5 in... lol ;)