So in 2011 or 2012 Intel may finally have a decent onboard GPU? Great!
One thing is for sure - Intel NEEDS Larrabee, and it will be released even if it can't compete against AMD and Nvidia. Larrabee is pretty much crucial to the survival of their company on the chip side.
Maybe it can't compete when it comes to desktop solutions, but more and more people are going for portable solutions now (even for home personal use), so I believe that is what they'll aim for. You guys seem to miss one point: top products from ATI and nVidia - high-end cards like the ATI HD 5870 or nVidia GTX 395 - are not the ones that will bring in the huge profit... Same as with the other generations, the big money comes from the low-end, mainstream and integrated solutions. So yeah, for the hardcore gamer/enthusiast Larrabee might be a total fiasco, but for general users it will do just fine, which means Intel will profit from this, and that's the only thing that matters - for them at least.
If Intel doesn't release Larrabee, a CGPU, Nvidia's and ATI's GPUs will become more general-purpose in the future and eat into Intel's CPU market share.
Intel is perfectly fine today, but not tomorrow without Larrabee. It's a matter of surviving in the chip market, which is exactly why AMD acquired ATI - AMD knew back then it had to get a GPU architecture - and Intel most likely had begun their Larrabee project well before the AMD/ATI acquisition.
GPGPU is evolving to become actually useful, and a merger of the CPU and GPU is only a matter of time; Intel cannot afford to be left out of that. It's the same for Nvidia, which doesn't have a CPU architecture except one licensed from ARM - they too need to come up with something for the future to survive as a company.
Depending on what share, I don't foresee a massive switch to GPGPUs for consumers nor for embedded devices in the near future. HPC maybe, but there are more specialised CPUs there already that cover some of the bases the GPGPUs also cover.
CPUs and GPUs will in the end bump into the same wall at some point (parallelization), but one earlier than the other.
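That wall has a textbook formula, Amdahl's law: once only a fraction of the work can be parallelized, adding more cores (or shader units) stops helping. A minimal sketch, with made-up illustrative fractions rather than measurements:

```c
#include <stdio.h>

/* Amdahl's law: speedup from n parallel units when only a fraction p
   of the work can be parallelized at all. */
static double amdahl(double p, double n)
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    /* Illustrative numbers only: even with 95% parallel work, speedup
       saturates at 1/(1-0.95) = 20x no matter how many cores (CPU)
       or shader units (GPU) you throw at it. */
    double units[] = { 4, 16, 64, 256, 1024 };
    for (int i = 0; i < 5; i++)
        printf("n = %4.0f  ->  speedup %.1fx\n",
               units[i], amdahl(0.95, units[i]));
    return 0;
}
```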
fail post.
If Intel says Larrabee will compete with Radeon/GeForce, that means it will span low power all the way up to high end. Intel designs all of their architectures to be scalable, from Atom to Nehalem-EX, so I don't see why they wouldn't do this for their graphics line-up.
Not exactly. Atom and Nehalem are different architectures. Atom wouldn't scale up to Nehalem level very well and Nehalem wouldn't scale down to Atom level very well.
With Larrabee it will be a little easier though, since they can just add or remove processor cores to achieve different performance levels, as the sketch below illustrates.
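Here is a toy sketch of why a software renderer scales by core count: cut the screen into tiles and hand each core an independent slice. All names and sizes are invented for illustration; Larrabee's actual binning renderer was far more elaborate.

```c
#include <stdio.h>

/* Toy tile partitioning for a software renderer: each core shades an
   independent subset of screen tiles, so performance scales by simply
   adding or removing cores. Tile size and screen size are assumptions. */
#define SCREEN_W 1920
#define SCREEN_H 1080
#define TILE     64   /* assumed tile edge in pixels */

static void shade_tile(int tx, int ty) { (void)tx; (void)ty; /* ... */ }

static void render_slice(int core, int ncores)
{
    int tiles_x = (SCREEN_W + TILE - 1) / TILE;
    int tiles_y = (SCREEN_H + TILE - 1) / TILE;
    int total   = tiles_x * tiles_y;

    /* Static round-robin split: core i takes tiles i, i+n, i+2n, ... */
    for (int t = core; t < total; t += ncores)
        shade_tile(t % tiles_x, t / tiles_x);
}

int main(void)
{
    int ncores = 32;                 /* scale the part by changing this */
    for (int c = 0; c < ncores; c++) /* each slice would run on its own core */
        render_slice(c, ncores);
    return 0;
}
```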
Didn't the last rumors state that the 64-core LRB at 1GHz performed similarly to the 4890/285? At worst like the 275.
What if Intel releases this chip at 2GHz, which is doable with their own fabs?
Lots of questions and ifs surrounding LRB.
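Taking those rumors at face value, the arithmetic at least scales simply: peak throughput grows linearly with clock, so doubling the clock doubles the theoretical numbers. A back-of-envelope sketch, assuming the publicly described 16-lane vector unit with one multiply-add per lane per clock (for reference, a GTX 285's theoretical peak was on the order of 1 TFLOPS):

```c
#include <stdio.h>

int main(void)
{
    /* Rumored config: 64 cores, 16-lane VPU, multiply-add = 2 flops
       per lane per clock. Pure peak math; this says nothing about
       whether the software renderer can sustain it in real games. */
    const double flops_per_core_per_clock = 16 * 2;
    const int    cores = 64;

    printf("64 cores @ 1 GHz: %4.0f GFLOPS peak\n",
           cores * flops_per_core_per_clock * 1.0);
    printf("64 cores @ 2 GHz: %4.0f GFLOPS peak\n",
           cores * flops_per_core_per_clock * 2.0);
    return 0;
}
```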
Even if it does only perform at 275 level, I'd still buy one just to use it as a GPGPU. Games and software will definitely find something to use it for.
I doubt Intel will release a product that is not competitive with AMD's and NV's cards in 2010 or 2011 (depending on the launch schedule of Larrabee). What may be a problem for LRB is the power draw or perf/watt ratio, and one more very important thing called drivers (game support and compatibility).
Based on the rumors, at 2GHz it should be able to perform like a 5850.
I hope it comes out without snags around X-mas.
I don't know the tech specs of it, but it still strikes me as a rather inefficient thing to take a bunch of x86 CPUs and have them do graphics. I mean, that's why the graphics card was created: because it was designed specifically for graphics and did a much better job than a CPU.
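For a sense of what "doing graphics on a CPU" costs, even the bare coverage test of a triangle rasterizer burns several multiplies and adds per pixel - work a GPU's fixed-function hardware does in dedicated silicon. A minimal, unoptimized half-space rasterizer sketch (the vertex values are arbitrary illustration data):

```c
#include <stdio.h>

/* Minimal half-space triangle rasterizer: per pixel, three edge
   functions (a multiply-add each) decide coverage. On a CPU every one
   of these operations is an explicit instruction. */
typedef struct { float x, y; } vec2;

static float edge(vec2 a, vec2 b, float px, float py)
{
    return (px - a.x) * (b.y - a.y) - (py - a.y) * (b.x - a.x);
}

static void rasterize(vec2 v0, vec2 v1, vec2 v2, int w, int h)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            float px = x + 0.5f, py = y + 0.5f;
            /* Inside test: all three edge functions non-negative
               (assumes this particular vertex winding order). */
            if (edge(v0, v1, px, py) >= 0 &&
                edge(v1, v2, px, py) >= 0 &&
                edge(v2, v0, px, py) >= 0)
                putchar('#');   /* "shade" the covered pixel */
            else
                putchar('.');
        }
        putchar('\n');
    }
}

int main(void)
{
    vec2 a = { 2, 1 }, b = { 8, 14 }, c = { 27, 6 };
    rasterize(a, b, c, 30, 16);   /* tiny 30x16 "framebuffer" */
    return 0;
}
```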
Since it's based on software, performance is likely to vary a lot more than on conventional GPUs. It is still my belief that LRB was always doomed to failure in the GPU segment. The GPGPU segment is where it has the potential to shine, but both ATI and especially Nvidia are working hard to improve in this area.
Soundcards were created for the same purpose, and now every mobo has a chip with the sound codecs on board and the CPU takes over the decoding. It's just a matter of calculation power.
If your CPU were fast enough, it wouldn't matter that x86 is inefficient. The problem right now is that they aren't, and that's why specialised hardware is faster (and always will be faster).
Also, Intel is eager to bring x86 to the embedded market, so they aren't focusing only on the high-performance market but also on a mass market (set-top boxes, VDRs etc.).
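The onboard-audio point is exactly this: once the CPU is fast enough, audio processing is just an ordinary loop. A toy sketch of software mixing, roughly the kind of work that moved from dedicated soundcard DSPs onto the host CPU (all names here are invented):

```c
#include <stdint.h>
#include <stddef.h>

/* Clamp a 32-bit sum back into the 16-bit PCM sample range. */
static int16_t sat16(int32_t v)
{
    if (v >  32767) return  32767;
    if (v < -32768) return -32768;
    return (int16_t)v;
}

/* Toy software mixer: sum two 16-bit PCM streams with saturation.
   Cheap per sample, but it is still host-CPU work that a dedicated
   soundcard DSP used to do. */
static void mix(const int16_t *a, const int16_t *b,
                int16_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = sat16((int32_t)a[i] + (int32_t)b[i]);
}

int main(void)
{
    int16_t a[4] = { 1000, 30000, -30000, 5 };
    int16_t b[4] = { 2000, 10000, -10000, 7 };
    int16_t o[4];
    mix(a, b, o, 4);   /* o = {3000, 32767, -32768, 12} */
    return 0;
}
```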
Intel's Senior Graphics Software Architect Manager...
please, what do you expect him to say?
oh well, now that you ask... LRB is actually quite a weak design and doesn't perform well at all
What what WHAT???? OK... now I'm surprised... Intel doesn't seem to take LRB half as seriously as I thought they would... "and possibly leading a small team of engineers through any of the above."
It's about perf per transistor... Intel won't be able to match that of a GPU for rasterization, so they need more transistors, which means a bigger chip... and considering how people crack jokes about GT300 being huge and a failure because TSMC's 40nm process will probably never have good yields with such a massive chip... well, think about a chip that is 20-50% bigger to reach the same performance... at a higher TDP... and now you might see why Intel might not be able to capture the high-end market...
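There is real math behind the die-size worry: under a first-order Poisson defect model, yield falls off exponentially with die area, so a 20-50% bigger chip is disproportionately more expensive to make. A sketch with an illustrative defect density (not a real TSMC or Intel figure):

```c
#include <stdio.h>
#include <math.h>

/* First-order Poisson yield model: Y = exp(-A * D0), where A is die
   area and D0 the defect density. D0 below is an illustrative guess,
   not a real figure for any specific process. */
static double yield(double area_mm2, double defects_per_mm2)
{
    return exp(-area_mm2 * defects_per_mm2);
}

int main(void)
{
    const double d0 = 0.002;             /* assumed defects per mm^2 */
    double areas[] = { 350, 420, 525 };  /* base die, +20%, +50% */
    for (int i = 0; i < 3; i++)
        printf("%4.0f mm^2 die -> %.0f%% yield\n",
               areas[i], 100.0 * yield(areas[i], d0));
    return 0;   /* roughly 50%, 43%, 35% with these toy numbers */
}
```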
Hahah, wow... Intel must be praying that there are enough naive customers like you who buy a product because they are convinced it must be good at something, even if it sucks at what it's targeted at.
Think of PhysX... Ageia taught everybody a lesson in how naive many customers are nowadays and how much money they spend on a product that doesn't even have any real use.
Same as this Killer NIC card that supposedly speeds up your games but you can't measure it, LOL.
I think they will wait for 32nm, at least for the high-end version, and it'll be 64 cores or even more... and there is no measured perf so far, it's all smart guesses from Intel at this point AFAIK...
Heh, yeah... and not only that... I think Intel is really taking this too light-heartedly and too arrogantly...
They want LRB to be:
cost competitive
tdp competitive
rasterization competitive
x86 competitive
anything else?
Seriously... how arrogant do you have to be to think you can not only create a product that beats fixed-function logic in its home territory, but also delivers outstanding general-purpose performance, and all this on the same or a worse manufacturing process, within the same TDP envelope and at the same price?
It's like Boeing announcing they will launch a new plane that can transport more people than an A380, fly faster than a Concorde, and all that at the same price and fuel consumption as an average A330 passenger jet.
Oh, and that not being enough, they make a - using their own words - SMALL design team work on this... most of them have never worked with each other before and each comes from a different background, probably resulting in different views on things and conflicts...
I think Intel's top guys don't realize what an important strategic value LRB has for the future of their company... a small team...
OK, let's look at soundcards... while they heavily use the CPU to get stuff done, is there any mainboard, at all, that uses the CPU ONLY?
Is there any mainboard that does audio completely in software on a CPU?
That's the same reason why many people doubt doing graphics entirely on x86 cores - it makes no sense...
How in God's name can Intel claim Larrabee is faster than anything when:
1. There is no Larrabee
2. They have no idea what Nvidia and ATI will have by the time there is a Larrabee
http://www.xtremesystems.org/forums/...4&postcount=47
I think "blows the socks off anything else" is pretty explicit.