Mucho x86 :rofl:
http://img8.imageshack.us/img8/5158/lrb2.jpg
32 cores :)
I will put this on my t-shirt. And it will make me irresistible to all nerdy girls :D
the core setup is rather interesting:
1|2|2|3|1|1
4|4|4
1|2|2|4|1
Can't wait to see some game benchmarks.
how long till it's out and about!!
Hopefully Nvidia will figure out a way to stay in the market after people start getting incentives to build AMD (AMD/ATI) only and Intel (Core/Larrabee) only systems.
Perkam
Late this year, early next year. So sometime in the winter months.
It's gonna hurt not only nVidia but also AMD.
However, it can't be fun to be nVidia. First you lose all chipsets/IGPs, then someone comes and grabs a good portion of your discrete card segment... ouch.
Yeah, I bet Larrabee will hurt something. My bananas. Until I see REAL proof, i.e. benchmarks, I think this is about as exciting as the i740 Intel graphics deaccelerator.
I just don't see all developers jumping on the Larrabee bandwagon unless Intel's graphics drivers are superb. Frankly, I've never had any luck with the garbage drivers Intel releases for their gimped 3D deaccelerators.
larrabee has much more to offer than just graphics power....
i predict amd going down this route as well... as of right now, a route nvidia won't be able to follow.
mmhhh you might wanna think twice if thats really what you want... :D
and it'll also attract male nerds who will "try" to socialize with you :D
rather messy, to put it in other words :D
id say cebit time, same time as gt300 :D
it depends... drivers, pricing... :D
if larrabee is really as fast as intel claims, intel will be eating into nvidias market share from the top and ati from the bottom... might be interesting...
and thank god for no more nvidia chipsets... although it IS a shame... they could have come up with an x58 competitor that has lower qpi multipliers to allow higher bclocks... 250 bclock on air would have been quite nice...
from the data i saw on the wiki from intel, 25 cores were needed to keep FEAR playable at 1680x1050 with 4xAA, and that was at 1GHz too. so i wonder, if they produce this at the actual 2.4GHz with 32 cores, it could be quite close to ati's and nvidia's top offerings. however i also saw the chip is supposedly 49mm x 49mm, just massive in die size. so i guess it's really gonna be interesting to see theoretical costs and benchmarks.
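just to put rough numbers on that, here's a back-of-the-envelope sketch, purely illustrative and assuming performance scales linearly with cores x clock (a big assumption, since memory bandwidth and efficiency won't scale along):

# rough scaling estimate, assuming performance scales with cores * clock
# (big assumption -- memory bandwidth and driver efficiency won't scale along)
baseline_cores, baseline_clock_ghz = 25, 1.0   # cores cited for FEAR @ 1680x1050 4xAA
rumored_cores, rumored_clock_ghz = 32, 2.4     # rumored shipping config
ratio = (rumored_cores * rumored_clock_ghz) / (baseline_cores * baseline_clock_ghz)
print(round(ratio, 2))  # ~3.07x the simulated baseline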
i heard ~600mm², probably more
Intel will not release regular driver updates. That is not a software development model that they will be able to integrate into their business. Imagine what kind of performance you will get out of year old video drivers.
Will it really hurt amd and nvidia? GT300 and RV870 will hurt CPU-based high-performance computing. CPUs will become less important, so Intel really needs a product against them. Intel will get market share, but the market should grow fast with OpenCL. I don't think Larrabee will be a good choice for gamers, but it will be strong in GPGPU.
Intel eating nVidia's market share from the top??? I've never heard Intel claim this. All the comments I've heard are that it's a performance chip, so first it has to beat RV870.
Can someone please enlighten me. Is this larrabee supposed to be a high-end GPU or an "improved" integrated GPU in a motherboard?
It's a GPGPU, but it will also be used as an IGP in a CPU later on. Think something like 4-8 Larrabee cores in the IGP and 8-32 cores in a discrete GFX card.
LOL! Yeah, and CUDA and CTM already made the CPU so obsolete. They are both useless.
RV870/GT300/Larrabee is still an utter joke for general computing. Larrabee would be by a large margin the fastest of the 3 in those terms, and we're talking about Atom-level performance.
I don't think RVxxx/GTxxx can do even 5% of what a CPU can. RVxxx/GTxxx are very fast at a very, very limited set of things. They are DSP-type chips. Even Cell can do much more than those two.
I don't know. Some new games run fine on my x3100 and some old games run like crap. OpenGL performance is piss poor from what I've experienced.
Technically the hardware sucks, of course.
I really have no high expectations from the first Larrabee generation.
Display drivers from AMD and nVidia both have about the same amount of code as Windows XP. That's a lot. Considering Intel doesn't have any good drivers with decent game support at this moment, I don't think they will manage to make one within the first year after the Larrabee release.
It's a good point, but Larrabee is a high-stakes gamble for Intel and I'm sure they're going to do whatever they can to bring as much initial performance as possible. I think Intel is slowly but steadily taking over every performance segment of general computing with their hardware/implementation: IPC, RAM bandwidth (1st-generation IMC implementation, mind you), SSDs in the I/O department, and now GPGPU in graphics processing.
i think this was really a way for intel to offer an IGP that's not so sucky, so when they sell laptops they make a lot more of the profit on their own. i think this is a great idea for them when it comes to the next step in expanding their reach. how it will do against high end chips, i really don't care.
lol, the die layout is messy? wow, we sure have upped the criteria we use to evaluate hardware :ROTF:
I'd love to see Larrabee come in and smash all the competition. Just means a better graphics card for me.
Can't really see it though, I've seen very few products be truly quality on attempt #1. Intel could have something up their sleeve though, so who knows.
Yeah. I still remember the buzz generated when Intel released its first graphics card, and even the first reviews, where the journalists tried to be as "polite" as possible. This is a bad move on their part. If they really wanted to capture the graphics market they should have moved in silence. This way Nvidia and AMD are just holding on and preparing their big cards for the time Larrabee will be released. The true winner will hold on to his big card for the second wave. That will be the true battle: the second version of Larrabee against Nvidia's chips and AMD's offering...
From a gamer's perspective, i'm not all that excited. But if it does compete for the top spot, interesting things could happen.
The layout does not make instant sense. Nor does it make proper layout sense even several passes later.
I really don't think Intel will "smash the competition" - not in the gaming market; ATI and Nvidia have too much experience there.
But as a GPGPU (GPCPU? whatever lol) and server processor it is very likely to be a beast.
yeah but you're not playing games, are you? :D their drivers aren't even made by their own engineers but by an external company afaik... their drivers are great for 2d, but 3d... not really... and check the matrox review on 3dcenter where they throw in the g45 as well. there is almost no scaling in most games between different resolutions, which points to very poor or no driver optimizations.
heheheh, yeah its quite silent these days... but nvidia will open another "can of whop4ss" marketing wise as soon as they have something new out, you can rely on that :D
thats what nvidia and ati want us to believe... but who is actually using gpgpu and for what?
...
...
...
exactly! :D
nvidia is barely selling any gpgpu stuff and not making much money on it, and amd even less... gpgpu hasnt taken off yet, and im not so sure it ever will take off big time... its another feature hardware makers are trying to tell their customers is great, its not like customers wanted a gpgpu cause its so useful, its hardware makers that thought they could make extra money with their chips if they add some more instructions and change the design a bit...
i doubt intel would launch larrabee unless they can come out on top... they want to enter the market, and they want to enter at the top, thats the best strategy for them actually, so i agree on that. they really want it, they got loads of cash, very good chip fabs, very good engineers... if they want it, theyll get it... its just a matter of time and money
what intel will probably not be good at is price perf... but thats not their goal afaik... thats why amd will actually be less impacted by larrabee than nvidia i think...
latest game is out, you get it, perf sucks, or there are artifacts... oh, but no problem, you just have to wait a FEW WEEKS for a new driver...
now i'm sure you get the point he made about driver updates not being frequent enough, at least atm...
it looks messy, doesnt mean it is messy :D
im sure they had a reason to arrange the cores like that...
keeping something like larrabee secret is not possible... period... nvidia and ati would know about it either way, so it would be public, and once it's public you might wanna use pr to get the launch going as well as possible. and intel did an... ok job on that... there are quite some people who want to buy larrabee and think it will be the best thing since sliced bread without actually knowing anything about it, let alone its competing products :lol:
Ya, just look at them. I can use CUDA to speed up my MPEG-2 work, brute-force WEP and WPA keys, and help solve the RC5-72 competition.... an order of magnitude faster than a Q6600.
You should try to stay away from absolute statements Shintai. It makes you look stupid.
RC5 is what... an utter waste. MPEG-2 with worse quality than CPU-based? The current CTM/CUDA encoders aren't exactly quality-minded. Is that all we got after 3-4 years?
Plus, what actually useful work can you do? There is so much PR, but nothing we really use yet. PhysX would be the closest. And how is that going?
So if that's the best you can come up with... look again :rofl:
It's a no-brainer that Larrabee is an instant killer of CTM and CUDA. Better performance and without all the limitations.
Irrelevant.
You said CUDA was useless. I say it isn't. "Nothing you really use?" Maybe you should actually get a CUDA-capable card and buy/download a CUDA app, like others who use CUDA in research on behalf of their own companies.
CUDA is highly effective in some niche areas != CUDA is useless.
Only a troll would say CUDA is useless.
CoreAVC uses CUDA. The best HD decoder having strong support for Nvidia is pretty significant.
hmmm, well for the average user i think cuda is useless... there aren't really any notable advantages that a cuda vs no-cuda vga brings, or are there?
i mean sure, if you happen to encode videos a lot and this badaboom tool or cs4 or whatever works for you... but even then the boost is limited... idk, i'm sceptical about all those attempts to make their cards' value look higher than it actually is for average users...
people go all starry-eyed about what they can do with their new nvidia card, but they never actually use the features... the ones that do usually find out that it's kinda pointless... i remember when i bought my g4mx card it was advertised as having tv out, but the quality sucked; it had tv in, but was too weak to encode properly; it supported 3d glasses, but driver support was a nightmare; it was marketed as a gaming card, but was very slow in games...
isn't it the same with cuda?
Yeah just like most advanced technology.
saaya, instead of just blindly talking about stuff you can try doing some research at http://www.hpcwire.com if you really wanted to know who was using GPGPU.
looking forward to it!
yup! I think some of their drivers are made by a 3rd party. But who in the hell uses Intel integrated graphics for games? The games did run for me, just slow as hell; I didn't have driver problems. I upgrade systems for folks as a hobby, not a business, and my way of getting them to upgrade is showing how slow Intel's IGP is. Hell, games don't crash more than maybe one in twenty times I run them. The biggest things these folks want to see run are The Sims and WOW, not badly or poorly coded games like Crysis. In fact, I've found that even noobs know it's useless trying to upgrade to run it LOL! :rofl: Some are showing more common sense than so-called tech-savvy geeks. For the record, I own both Crysis and Warhead.
i never did say they would need to use all 32 cores in a north bridge IGP; the point was they can actually compete with other IGPs. to me this is an excellent way for intel to have their little laptops actually run hd movies. selling these as drop-in cards should not be their main objective, since i'm betting they are truly not ready to run with the big boys. i expect to see crappy performance per $ and performance per watt, but that won't matter if you consider the total cost of the computer (preferably a laptop) and how intel is making as much profit as possible by having almost every part in there made by them. they should start small, with good IGPs, then later go to true GPUs.
Larrabee will have a tough run if the die is anywhere near 600mm². People will need to accept Larrabee as a graphics-capable chip, but that's not what it's being designed for. The die is too big, and it's being built for GPGPU purposes to fight off AMD/Nvidia. All this graphics talk around Larrabee is marketing and a great big smoke screen.
That's doubtful. Intel has a completely separate HPC team that is working on CPUs for that market. They aren't going to pit two of their own products against each other. Word is that there was actually some angst from the CPU group about Larrabee trying to muscle in. So don't expect to see it going up against Xeon anytime soon.
Project Offset (a game engine) is supposed to be built around Larrabee, and this video is supposedly made with Larrabee:
http://www.projectoffset.com/media/Video/Meteor.wmv
Maybe someone has said this before, but I honestly can't see how a bunch of x86 CPUs stuck together is supposed to be so great for graphics :shrug:
cool vid, and i think that should have been the nature test of 3DmarkV.
Too bad the Engine was created a few years ago and the game was already started before Intel purchased them.
With Intel's backing, money and manpower, I'm sure they could decently rework the engine to optimize for Larrabee but it was not originally created or built around Larrabee.
x86 for the compute power; Larrabee will still have some fixed-function units.
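To give a rough, purely illustrative idea of what "graphics on x86 cores" means in practice: rasterization and shading become ordinary software loops spread across cores (and, on the real chip, across wide vector lanes), while things like texture sampling stay in fixed-function hardware. A toy sketch in Python, not Intel's actual pipeline:

# toy software renderer: each 16x16 tile of the framebuffer could be
# handed to a different core; the inner loop stands in for a pixel shader
def shade_tile(tile_x, tile_y, framebuffer, width):
    for y in range(tile_y, tile_y + 16):
        for x in range(tile_x, tile_x + 16):
            r = x % 256              # made-up per-pixel math, standing in
            g = y % 256              # for whatever the shader computes
            b = (x ^ y) % 256
            framebuffer[y * width + x] = (r, g, b)

width, height = 64, 64
fb = [(0, 0, 0)] * (width * height)
for ty in range(0, height, 16):      # on Larrabee-style hardware these tile
    for tx in range(0, width, 16):   # jobs would be scheduled across cores
        shade_tile(tx, ty, fb, width)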
i think we're happy they built it first. from what i read on their site, it's all about having an engine that reduces development time and gives user-friendly tools to develop with. if intel had built it, i bet it would take 2x as many people to put a game together. i hope the engine gets used by a few games.
It might be a bit irrelevant, but I think it would be smarter for Intel to make a product that brings something new in terms of gaming instead of GPGPU or computing applications. They could buy Caustic Graphics and make a GPU with real-time ray tracing, for example; that is much more interesting for an average user than GPU computing anyway, imo. I hope the performance is decent at least, otherwise there's not much point in moving from Nvidia or ATI products at all, since those companies have a lot more experience in this area anyway.
Heh, it's the same for GPGPU. However, I think it's a better feature with regard to gaming, and it could become popular more easily with Intel pushing it. Every single 3D game would benefit from it (if the game supported the technology, that is), but most applications don't need GPGPU processing power. And gaming is what most people buy top graphics cards for. Besides, this could make Larrabee unique in some way, which isn't really true feature-wise atm.
Larrabee will most likely face the same trials that Radeon and Geforce experienced in their infancy i.e. it will see R600 and Geforce FX type fiascos as well.
Perkam
GPU+CPU in one chip has happened before; hopefully the upcoming ones will make users happier than that chip maker managed before. I mean, they're all fighting it out expectantly this time too.
Here is some speculation from techreport:
http://techreport.com/discussions.x/16920
I suspect Larrabee is aimed more at DX11 than backwards compatibility. It certainly won't match Nvidia or ATI for any performance crowns in DX9/10 games; it is aimed at being competitive in the DX11 marketplace.
Anyone who thinks otherwise will be very disappointed.