Who cares if this thing can run games or not.. Crrrrrunchiiiiiiiing!!! :D
Intel said Friday that its Larrabee graphics processor is delayed and that the chip will initially appear as a software development platform only.
This is a blow to the world's largest chipmaker, which was looking to launch its first discrete (standalone) graphics chip in more than a decade.
"Larrabee silicon and software development are behind where we hoped to be at this point in the project," Intel spokesperson Nick Knupffer said Friday. "As a result, our first Larrabee product will not be launched as a standalone discrete graphics product," he said.
"Rather, it will be used as a software development platform for internal and external use," he added. Intel is not discussing what other versions may appear after the initial software development platform product, or "kit," is launched next year.
Larrabee, a chronically delayed chip, was originally expected to appear in 2008. It was slated to compete with discrete graphics chips from Nvidia and Advanced Micro Devices' ATI graphics unit.
Intel would not give a projected date for the Larrabee software development platform and is only saying "next year."
Intel says its plans to deliver the first chip with graphics integrated onto the CPU this month are unchanged. This new Atom processor is referred to as "Pineview" (the platform is called "Pine Trail") and will be targeted at Netbooks.
http://news.cnet.com/8301-13924_3-10409715-64.html
R.I.P. Larrabeeeeeeeeee :P
Delayed? Dead to consumers is more like it. Dunno why they even did that demo, TBH.
Why have they killed the GPU variant? Performance seemed like it would be quite good...
The thread title should go more like this: "Intel Larrabee finally hits 1 TFLOPS - 2.7x faster than nVidia GT200, gets canceled!!!" :(
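For what it's worth, that 2.7x figure checks out against the SC09 SGEMM demo numbers. Here's a quick sanity check; the ~370 GFLOPS GT200 SGEMM throughput is an assumption on my part (the commonly cited Tesla C1060 result), not something from this thread:

```python
# Rough sanity check on the "2.7x faster than GT200" claim.
# The GT200 figure below is an assumed value (~370 GFLOPS SGEMM,
# as commonly reported for Tesla C1060) -- not from the article.
larrabee_sgemm_gflops = 1000   # ~1 TFLOPS shown in the SC09 SGEMM demo
gt200_sgemm_gflops = 370       # assumed GT200 SGEMM throughput

speedup = larrabee_sgemm_gflops / gt200_sgemm_gflops
print(f"Speedup: {speedup:.1f}x")  # -> 2.7x
```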
The workloads we'd expect are pretty much top-end, and it's that compute-intensive stuff where Larrabee is most attractive. If you were doing CGI, for example, and wanted to keep a platform long-term (which isn't easy when you keep changing uArch--x86 to Sun to nVidia to ATI to x86 to ...--you get my point), something x86-based like Larrabee would be a top choice in my opinion. So long as Larrabee remains stable and "cost effective", it'll see a great amount of use.

Remember, a decent 650W power supply (ie: Corsair 650TX) could run a pair of Larrabees at 150W max each plus a full system, and that power supply costs about $100. The alternative is two or three times as many computers WITHOUT Larrabee--making the (probably incorrect) assumption that Larrabee will perform as well as a regular CPU. Modular co-processors are expensive, but the benefits are quite real. Wouldn't you love to run virtual machines on a GPGPU? I know I would.
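To put rough numbers on that power-budget point, here's a minimal sketch. The 150 W per-card figure is from the post above; the ~200 W system draw is purely my assumption for a typical crunching box:

```python
# Minimal power-budget sketch for a dual-Larrabee crunching box.
# 150 W per card comes from the post above; the system-draw
# estimate is my assumption, not a measured figure.
PSU_WATTS = 650        # e.g. Corsair 650TX, ~$100
CARD_WATTS = 150       # stated max draw per Larrabee card
NUM_CARDS = 2
SYSTEM_WATTS = 200     # assumed CPU + board + drives under load

total_draw = NUM_CARDS * CARD_WATTS + SYSTEM_WATTS   # 500 W
headroom = PSU_WATTS - total_draw                    # 150 W spare
print(f"Total draw: {total_draw} W, headroom: {headroom} W")
```

On those assumptions the pair of cards plus a full system stays comfortably inside the 650 W budget, which is the whole point of the one-box-versus-several-boxes comparison.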
Yeah, maybe. However, Intel hasn't been the best at keeping promises lately, i.e. execution has been very poor--everything from SSDs to the promises of LRB to its inability to work with Hyper-V--so I'll wait until I see real hardware from Intel before I pass judgment on it. Then again, I don't crunch, so obviously I'm looking at it from an entirely different viewpoint. :D
Didn't you know Anandtech is a shameless "Intel bumper", so everything on that site is blatant lies... according to a certain group... :p
Pretty sure 99% of people couldn't care less about ray tracing. I mean, most of the games we play are still DX9. What's gonna sell: something nobody is going to use, or something that plays what's already out there? I still feel sorry for all the suckers who bought PhysX cards... wait, no I don't.