Kind of filling a strange gap in my opinion, but what the hey. At least the die shrink will bring down the TDP.
http://pc.watch.impress.co.jp/docs/2.../kaigai04l.gif
Source: http://pc.watch.impress.co.jp/docs/2.../kaigai428.htm
triple channel ddr3?
socket 1366 is for servers
Beckton Ftw!!!
This image tells a different story - http://pc.watch.impress.co.jp/docs/2.../kaigai03l.gif
Seems they might be increasing the size of the L2 cache on each core by 512KB.
we want FAST dual cores :banana::banana::banana::banana::banana:es :yepp: :D
The dual-core cycle will finish when Nehalem comes. Multithreading is here to stay for good. The standard will soon be quad cores; I cannot see myself with a dual core by then. If this article is correct, I will get a Gainestown for a change.
Metroid.
i want a Beckton!
just remember that a Beckton CPU will likely be something like 4x as expensive as a similarly binned Gainestown, and its die size limits it to a different socket, so it will end up in multi-socket platforms.
I want one too :p:
yes, just like Bloomfield/Gainestown. And LGA1366 is for both DP workstations and high-end desktops.
naw, they say the same thing, the one in your picture has total on die L2, not per core
hax!!
It has started: many programs use 4 cores, but not all. That is why I said dual cores are still very much alive, but not for much longer. Probably by the middle of 2009 we will have good quad-core software support. It is not that easy to implement, as you can see in this article.
We have to write code that avoids deadlocks, meaning keeping the cores consistent. Think of the Pentium: it used 1 core, and after Hyper-Threading was implemented it presented 2 cores, though not real ones. So software makers had to add support for another core to help with the tasks; that means multithreading. It is a step ahead, but many of them do not have the time. It depends on demand, meaning how much the software is actually used. 3DMark is an example of quad-core support because it is small; games will be really hard, being too complex, since a failed task could result in a BSOD. So which games do support quad cores?
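For what it's worth, the deadlock problem mentioned above can be sketched in a few lines. This is a toy example (the shared state and lock names are made up), showing the usual fix: every thread acquires the locks in one agreed-upon global order.

```python
import threading

# Hypothetical shared game state: two resources touched by two threads.
world_lock = threading.Lock()
render_lock = threading.Lock()
frames = []

def physics_thread():
    # Always take locks in the same global order (world, then render).
    # If another thread took them in the reverse order, each could end up
    # holding one lock while waiting on the other: a deadlock.
    for step in range(100):
        with world_lock:
            with render_lock:
                frames.append(("physics", step))

def ai_thread():
    # Same order here, so the two threads can never deadlock.
    for step in range(100):
        with world_lock:
            with render_lock:
                frames.append(("ai", step))

t1 = threading.Thread(target=physics_thread)
t2 = threading.Thread(target=ai_thread)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(frames))  # 200: every update completed, no deadlock
```

Reverse the lock order in just one of the two functions and the program can hang forever, which is exactly the kind of bug that is hard to reproduce and ship-stopping for a game.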
Yes, not many. So there would not be much benefit from a quad core, which is why I still have a dual core. After Nehalem's launch it will be a different era, the quad-core era; for now we still have many single-threaded applications and some dual-threaded ones.
Games usually support only a single thread, occasionally two, and not many of them support four. I still hope that support will be added by then. I reckon there is not enough time to add support to less demanding applications, so many applications will remain single-threaded.
I can wait. I am not going anywhere far from earth soon :P:
Metroid.
When does Beckton come out? 2009?
Who gives a crap about dedicated raytracing architectures and GPUs on CPUs? If Beckton is what it looks to be, it'll do raytracing in real time anyway!
Integrated Memory Controllers on them too :D
Well.. Kinda excited and not too hyped about it =)
Hmm, so Intel finally has the solution to run Crysis properly ? :P
...and by the time we have the power to run Crysis @1920x1080 on very high, we still won't have what it takes to run Far Cry 2 even on medium. :ROTF: :rofl:
Maybe I'll switch to Intel when these are out and IF I have the money to do so.
I run at 1920x1200. I don't have much of a problem with Crysis?
Mobile will still have dual core, but it will be 4 thread capable. When the lowest rung of the ladder has quad core, more programming time will be spent on making programs >2 capable.
Do remember now that Vista, and to some degree XP, will direct secondary programs to their own threads where possible. Video drivers are multi-cpu aware, for example, but basically drivers for a variety of subsystems are capable of going to their own thread/cpu, leaving the thread/cpu running the game more free to do that job than it has been in years past.
So we are seeing "some" benefit now without rewriting. There will be more as more physics and AI in games can be thrown into their own thread.
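The physics/AI point can be sketched with a toy example (the task functions and numbers here are made up; Python just for illustration): hand each subsystem to a worker thread so the main render thread stays free, and the OS can schedule the workers onto spare cores.

```python
from concurrent.futures import ThreadPoolExecutor

def physics_step(positions, dt):
    # Hypothetical physics task: advance each (x, vx) pair by its velocity.
    return [(x + vx * dt, vx) for x, vx in positions]

def ai_step(agents):
    # Hypothetical AI task: pick the lowest-cost target for each agent.
    return [min(a) for a in agents]

# Submit physics and AI as independent tasks; the main thread is free
# to keep rendering until it needs the results.
with ThreadPoolExecutor(max_workers=2) as pool:
    physics = pool.submit(physics_step, [(0.0, 1.0), (2.0, -1.0)], 0.5)
    ai = pool.submit(ai_step, [[3, 1, 2], [9, 7]])
    new_positions = physics.result()
    targets = ai.result()

print(new_positions)  # [(0.5, 1.0), (1.5, -1.0)]
print(targets)        # [1, 7]
```

Because the two tasks don't share mutable state, they need no locks at all, which is why physics and AI are such natural candidates for their own threads.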
Probably folks aren't recalling how slowly dual cores came to show a gain. If you remember that, you'll have a decent idea of how the current timeline is likely to go.
Last thing to remember is that we aren't going to see a ton of speed increases. 4-4.5GHz now is likely to still be that way in 2009 too. The power will come from more efficient cores (a tiny bit) and more cores on silicon (the bigger power add). If software wants to show it's actually worthy of upgrade investment dollars, it's going to have to either become more efficient or use more cores to get the job done faster.
$.02
Anemone wrote:
Quote:
Mobile will still have dual core, but it will be 4 thread capable. When the lowest rung of the ladder has quad core, more programming time will be spent on making programs >2 capable.
Do remember now that Vista, and to some degree XP, will direct secondary programs to their own threads where possible. Video drivers are multi-cpu aware, for example, but basically drivers for a variety of subsystems are capable of going to their own thread/cpu, leaving the thread/cpu running the game more free to do that job than it has been in years past.
So we are seeing "some" benefit now without rewriting. There will be more as more physics and AI in games can be thrown into their own thread.
Probably folks aren't recalling how slowly dual cores came to show a gain. If you remember that, you'll have a decent idea of how the current timeline is likely to go.
Last thing to remember is that we aren't going to see a ton of speed increases. 4-4.5GHz now is likely to still be that way in 2009 too. The power will come from more efficient cores (a tiny bit) and more cores on silicon (the bigger power add). If software wants to show it's actually worthy of upgrade investment dollars, it's going to have to either become more efficient or use more cores to get the job done faster.
$.02
+1 on software needing to get more efficient. As my main uses for a rig include gaming, surfing, and not much more, I don't think my fps would rise drastically if I bought a new Intel proc + mobo + RAM but not a new GPU.
But it wouldn't matter much; even if there are no gains to be had, it's still nice to have new hardware to play with.
Far Cry 2 is supposed to be a lot more optimized, no? I thought it was supposed to be very playable.
Ooooohhh... sounds nice! The Oct-core will only be for Xeons, IIRC. Not even Crysis would benefit from an Oct-core any more than it would with a highly clocked quad-core.
Right now, I wish there were tri-cores out there at a sweet price point of say $200 or $250---on a 45nm process, Intel could sell them in droves for a profit. I think I can see Intel doing it for their Celeron series at least, after AMD releases tri-core K10's.
Man, 32nm is hard to imagine. Honestly, I thought we would never ever come to see chips being made on 32nm process, due to some kind of a "hitting-the-wall" limitation of silicon. Remember the severe problems associated with 65nm?
I think AMD will make a comeback someday real soon. They did an early release of R680s on 55nm. Now they are getting ready for 45nm with both K10s and R700s. But Intel doing 32nm is almost unthinkable to me! 22nm, anyone? When will we hit the wall on silicon? Even improved 193nm lithography can only do so much.
Hmmm so that G did look funny, since it was a C.
I definitely meant TSMC's 40nm and that ATi/AMD might have 40nm GPUs.
My bad...
http://www.digitimes.com/bits_chips/a20080324PR205.html
As usual, the first production is not GPUs or anything like that, but embedded parts.
40nm GPUs will come much later. TSMC starts by making the very simple things first, then the more advanced ones 6-9 months later. As I see it, any GPU would be more around the Christmas timeframe.
Quote:
A full range of mixed-signal and RF options accompany the 40G and 40LP processes, along with embedded DRAM, to match the breadth of applications that can take advantage of the new node's size and performance combination.
I want to see/hear more about the dual-processor platforms. To me, quad-processor is too much (motherboard real estate taken up by 4 sockets & all the extra RAM), and dual-processor with 16 cores will be excellent. Beckton DP is what I really want--hopefully they'll have processors at under $500 to do exactly that. Extend the gaming life of the computer and make it awesome for WCG and similar tasks.
Question is, how many years will it take to release these?
At the current pace we'll be lucky to get the other Penryns by next year.
Quite basically--I'll be due for an upgrade in the 2-3 years it'll take to get Beckton launched and likely on its second revision. I want something like today's Xeon E5420, but equivalent for Nehalem. BTW--how do you say "Nehalem"? I think it's Nay-hell-um. :p:
:eek: Give me a Beckton. Now.
Paul Otellini of Intel pronounces it Na-hay-lum.
http://www.metacafe.com/watch/830178/i_am_nehalem/
AH--thanks. I don't like Metacafe though... oh well. Anyways--I'm really looking forward to Beckton DP... they say possible by the end of this year... meaning that I may pick up the 32nm version of Beckton or beyond. Anything for WCG :p:
Ok--just trying to stay fairly consistent. So Beckton will use DDR2 FB-DIMMs? I'm a bit curious about that, since FB-DIMMs have pretty high latency, made up for by the massive capacities required in servers.
Yes, FB on Beckton. The buffering cuts down on errors that come from having long path lengths between large amounts of memory and the MCH. I believe there will be some fun coming in this area to help reduce that hit, but in general the systems this is used in don't care about memory latency and care far more about data integrity.
I'm like 95% sure there will be no such thing as a Beckton DP. And even if there were, it would be like 4x the price of Nehalem due to the 2x die size: a little more than twice the production cost from the number of dies you can fit on a wafer, and twice the chance of a die-killing defect on each one.
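That cost argument is just area-and-yield arithmetic, and a quick sketch makes the "more than 2x per good die" part concrete. All the numbers below are assumptions (die sizes, defect density, wafer size are invented for illustration), using a simple Poisson yield model where the chance of a defect-free die falls exponentially with area:

```python
import math

def die_yield(die_area_mm2, defects_per_mm2=0.002):
    # Poisson yield model: probability of zero defects on a die of this area.
    return math.exp(-defects_per_mm2 * die_area_mm2)

def cost_per_good_die(die_area_mm2, wafer_area_mm2=math.pi * 150**2):
    # Wafer cost is roughly fixed, so cost per *good* die scales with
    # die area and inversely with yield (edge losses ignored).
    good_dies = (wafer_area_mm2 / die_area_mm2) * die_yield(die_area_mm2)
    return 1.0 / good_dies

small, big = 250.0, 500.0  # hypothetical single vs. double-size die, in mm^2
ratio = cost_per_good_die(big) / cost_per_good_die(small)
print(round(ratio, 2))  # about 3.3 with these made-up numbers
```

Doubling the die halves the dies per wafer and also lowers yield, so the cost per good die more than doubles, which matches the intuition in the post above.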
Ne-ha-lem (being "lem" the strongest :p:)