Page 3 of 8
Results 51 to 75 of 188

Thread: Larrabee: A fiasco, or the future?

  1. #51
    Banned
    Join Date
    May 2006
    Location
    Skopje, Macedonia
    Posts
    1,716
    So in 2011 or 2012 Intel may finally have a decent onboard GPU? Great!

  2. #52
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,034
    One thing is for sure - Intel NEEDS Larrabee, and it will be released even if it can't compete against AMD and Nvidia. Larrabee pretty much is crucial for the survival of their company on the chip side.
    SweClockers.com

    CPU: Phenom II X4 955BE
    Clock: 4200MHz 1.4375v
    Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
    Motherboard: ASUS Crosshair IV Formula
    GPU: HD 5770

  3. #53
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    Maybe it can't compete when it comes to desktop solutions, but more and more people are going for portable solutions these days (even for personal home use), and I believe that is what they'll aim at. You guys seem to miss one point: top products from ATi and nVidia - High End cards like the ATI HD 5870 or nVidia GTX 395 - are not the ones that will bring the huge profit... Same as with the other generations, the big money comes from the Low End, Mainstream class and integrated solutions. So yeah, for the Hardcore gamer/Enthusiast Larrabee might be a total fiasco, but for general users it will do just fine, which means Intel will profit from this, and that's the only thing that matters - for them at least.

  4. #54
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Smartidiot89 View Post
    One thing is for sure - Intel NEEDS Larrabee, and it will be released even if it can't compete against AMD and Nvidia. Larrabee pretty much is crucial for the survival of their company on the chip side.

    Intel is perfectly fine without LB...
    If LB turns out good Intel will just capture yet another chunk of PC market, otherwise they will enjoy dominating as is...
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  5. #55
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    228
    If Intel doesn't release Larrabee, a CGPU, Nvidia's and ATI's GPUs will become more general-purpose in the future and eat into Intel's CPU market share.

  6. #56
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by XSAlliN View Post
    Maybe it can't compete when it comes to desktop solutions, but more and more people are going for portable solutions these days (even for personal home use), and I believe that is what they'll aim at. You guys seem to miss one point: top products from ATi and nVidia - High End cards like the ATI HD 5870 or nVidia GTX 395 - are not the ones that will bring the huge profit... Same as with the other generations, the big money comes from the Low End, Mainstream class and integrated solutions. So yeah, for the Hardcore gamer/Enthusiast Larrabee might be a total fiasco, but for general users it will do just fine, which means Intel will profit from this, and that's the only thing that matters - for them at least.
    Larrabee is designed to compete against GeForce and Radeon, but its real advantage is GPGPU. There is very little fixed-function logic, and there is cache coherency to allow the processors to communicate efficiently. It will also be easier to program for.
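    The cache-coherency point above can be sketched in code (a hypothetical illustration only, not Intel's actual programming model): on coherent shared-memory x86 cores, a "kernel" is just a function run by ordinary threads over one shared buffer, with no explicit host-to-device copies as on a discrete GPU.

    ```python
    import threading

    def square_slice(buf, lo, hi):
        # Each "core" works directly on the shared, cache-coherent buffer.
        for i in range(lo, hi):
            buf[i] = buf[i] * buf[i]

    def run_kernel(buf, n_threads=4):
        # Split the buffer into chunks, one thread ("core") per chunk.
        chunk = (len(buf) + n_threads - 1) // n_threads
        threads = [threading.Thread(target=square_slice,
                                    args=(buf, t * chunk, min((t + 1) * chunk, len(buf))))
                   for t in range(n_threads)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return buf

    print(run_kernel(list(range(8))))  # [0, 1, 4, 9, 16, 25, 36, 49]
    ```

    On a discrete GPU the same operation would need the buffer copied to device memory, a kernel launch, and a copy back; here the threads just read and write the one shared array.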

  7. #57
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,034
    Quote Originally Posted by zalbard View Post

    Intel is perfectly fine without LB...
    If LB turns out good Intel will just capture yet another chunk of PC market, otherwise they will enjoy dominating as is...
    Intel is perfectly fine today, but not tomorrow without Larrabee. It's a matter of surviving in the chip market, which is exactly why AMD acquired ATI - AMD knew back then that it had to get a GPU architecture - and Intel most likely had begun their Larrabee project well before the AMD/ATI acquisition.

    GPGPU is evolving to become actually useful, and a merger of the CPU and GPU is only a matter of time; Intel cannot afford to be left out of that. It's the same for Nvidia, which doesn't have a CPU architecture except one licensed from ARM - they also need to come up with something for the future to survive as a company.
    SweClockers.com

    CPU: Phenom II X4 955BE
    Clock: 4200MHz 1.4375v
    Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
    Motherboard: ASUS Crosshair IV Formula
    GPU: HD 5770

  8. #58
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Vozer View Post
    If Intel doesn't release Larrabee, a CGPU, Nvidia's and ATI's GPUs will become more general-purpose in the future and eat into Intel's CPU market share.
    Depending on what share - I don't foresee a massive switch to GPGPUs for consumers nor for embedded devices in the near future. HPC maybe, but there are already more specialised CPUs there that cover some of the bases the GPGPUs also cover.

    CPUs and GPUs will in the end bump into the same wall at some point (parallelization), just one earlier than the other.

  9. #59
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    Quote Originally Posted by Chumbucket843 View Post
    Larrabee is designed to compete against GeForce and Radeon, but its real advantage is GPGPU. There is very little fixed-function logic, and there is cache coherency to allow the processors to communicate efficiently. It will also be easier to program for.
    Didn't know that only High End products from ATi/nVidia can be called GeForce/Radeon...

  10. #60
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by XSAlliN View Post
    Didn't know that only High End products from ATi/nVidia can be called GeForce/Radeon...
    fail post.

    if intel says larrabee will compete with radeon/geforce, that means it will go from low power all the way up to high end. intel designs all of their architectures to be scalable, from atom to nehalem-ex, so i don't see why they wouldn't do this for their graphics lineup.

  11. #61
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by Chumbucket843 View Post
    intel designs all of their architectures to be scalable, from atom to nehalem-ex, so i don't see why they wouldn't do this for their graphics lineup.
    Not exactly. Atom and Nehalem are different architectures. Atom wouldn't scale up to Nehalem's level very well, and Nehalem wouldn't scale down to Atom's level very well.

    With Larrabee it will be a little easier, though, since they can just add or remove processor cores to achieve different performance levels.

  12. #62
    Xtreme Mentor
    Join Date
    Apr 2005
    Posts
    2,550
    Quote Originally Posted by Sh1tyMcGee View Post
    Project Offset, its a nice looking game.
    really? as far as I'm aware Project Offset is still in the project phase, and there's no game based on this project that has been announced or shipped!
    Adobe is working on Flash Player support for 64-bit platforms as part of our ongoing commitment to the cross-platform compatibility of Flash Player. We expect to provide native support for 64-bit platforms in an upcoming release of Flash Player following the release of Flash Player 10.1.

  13. #63
    Xtreme Member
    Join Date
    Nov 2003
    Posts
    450
    Didn't the last rumors state that the 64-core LRB at 1GHz performed similar to the 4890/285? At worst like the 275.

    What if Intel releases this chip at 2GHz, which is doable with their own fabs?

    Lots of questions and IFs surrounding LRB.

    Even if it only performs at 275 level, I'd still buy one just to use it as a GPGPU. Games and software will definitely support it for anything.
    Intel 2600K @ 4.8ghz 1.31v on Water.
    ASROCK Z68 Ex4 Gen 3, 16GB G.skill pc1600
    MSI GTX 680 1200/6800mhz
    2x Vertex LE 60GB Raid 0

  14. #64
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
    I doubt Intel will release a product that is not competitive with AMD's and NV's cards in 2010 or 2011 (depending on the launch schedule of Larrabee). What may be a problem for LRB is the power draw or perf/watt ratio, and one more very important thing called drivers (game support and compatibility).

  15. #65
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    Quote Originally Posted by Unoid View Post
    Didn't the last rumors state that the 64-core LRB at 1GHz performed similar to the 4890/285? At worst like the 275.

    What if Intel releases this chip at 2GHz, which is doable with their own fabs?

    Lots of questions and IFs surrounding LRB.

    Even if it only performs at 275 level, I'd still buy one just to use it as a GPGPU. Games and software will definitely support it for anything.
    they tested it at 1GHz, but rumours say it'll release at 2GHz.

    and IIRC it's 48 cores, not 64.

  16. #66
    Xtreme Member
    Join Date
    Nov 2003
    Posts
    450
    Based on the rumors, at 2GHz it should be able to perform like a 5850.

    I hope it comes out without snags around X-mas.
    Intel 2600K @ 4.8ghz 1.31v on Water.
    ASROCK Z68 Ex4 Gen 3, 16GB G.skill pc1600
    MSI GTX 680 1200/6800mhz
    2x Vertex LE 60GB Raid 0

  17. #67
    Xtremely High Voltage Sparky's Avatar
    Join Date
    Mar 2006
    Location
    Ohio, USA
    Posts
    16,040
    I don't know the tech specs of it, but it still strikes me as rather inefficient to take a bunch of x86 CPUs and have them do graphics. I mean, that's why the graphics card was created: it was designed specifically for graphics and did a much better job than a CPU.
    The Cardboard Master
    Crunch with us, the XS WCG team
    Intel Core i7 2600k @ 4.5GHz, 16GB DDR3-1600, Radeon 7950 @ 1000/1250, Win 10 Pro x64

  18. #68
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Thessaloniki, Greece
    Posts
    1,307
    Quote Originally Posted by Unoid View Post
    Based on the rumors, at 2GHz it should be able to perform like a 5850.

    I hope it comes out without snags around X-mas.
    Since it's software-based, performance is likely to vary a lot more than on conventional GPUs. It is still my belief that LRB was always doomed to failure in the GPU segment. The GPGPU segment is where it has the potential to shine, but both ATI and especially Nvidia are working hard to improve there too.
    Seems we made our greatest error when we named it at the start
    for though we called it "Human Nature" - it was cancer of the heart
    CPU: AMD X3 720BE@ 3,4Ghz
    Cooler: Xigmatek S1283(Terrible mounting system for AM2/3)
    Motherboard: Gigabyte 790FXT-UD5P(F4) RAM: 2x 2GB OCZ DDR3 1600Mhz Gold 8-8-8-24
    GPU:HD5850 1GB
    PSU: Seasonic M12D 750W Case: Coolermaster HAF932(aka Dusty )

  19. #69
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by SparkyJJO View Post
    I don't know the tech specs of it, but it still strikes me as rather inefficient to take a bunch of x86 CPUs and have them do graphics. I mean, that's why the graphics card was created: it was designed specifically for graphics and did a much better job than a CPU.
    Sound cards were created for the same purpose, and now every mobo has a chip with the sound codecs on board and the CPU takes over the decoding. It's just a matter of computing power.

    If your CPU were fast enough, it wouldn't matter that x86 is inefficient. The problem right now is that CPUs aren't fast enough, and that's why specialised hardware is faster (and always will be).

    Also, Intel is eager to bring x86 to the embedded market, so they aren't focusing only on the high-performance market but also on the mass market (set-top boxes, VDRs etc.).

  20. #70
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Vozer View Post
    I think we don't need to worry about LRB's raw power.
    Intel's Senior Graphics Software Architect Manager...
    please, what do you expect him to say?
    oh well, now that you ask... lrb is actually quite a weak design and doesn't perform well at all

    and possibly leading a small team of engineers through any of the above.
    what what WHAT???? ok... now I'm surprised... intel doesn't seem to take lrb half as seriously as i thought they would...

    Quote Originally Posted by Chumbucket843 View Post
    fail post.

    if intel says larrabee will compete with radeon/geforce, that means it will go from low power all the way up to high end. intel designs all of their architectures to be scalable, from atom to nehalem-ex, so i don't see why they wouldn't do this for their graphics lineup.
    it's about perf per transistor... intel won't be able to match that of a gpu for rasterization, so they need more transistors, which means a bigger gpu... and considering how people crack jokes about gt300 being huge and a failure because tsmc's 40nm process will probably never have good yields with such a massive chip... well, think about a chip that is 20-50% bigger to reach the same performance... at a higher tdp... and now you might see why intel might not be able to capture the highend market...
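    The yield argument above can be made concrete with the classic Poisson die-yield model, Y = exp(-A * D0). The defect density and die sizes below are hypothetical, chosen only to show the shape of the problem: a chip that needs 35% more area for the same performance doesn't lose 35% of its yield, it loses roughly half.

    ```python
    import math

    def poisson_yield(area_cm2, defects_per_cm2):
        # Classic Poisson die-yield model: Y = exp(-A * D0)
        return math.exp(-area_cm2 * defects_per_cm2)

    d0 = 0.5                             # hypothetical defect density (per cm^2)
    gpu = poisson_yield(4.0, d0)         # hypothetical 400 mm^2 fixed-function GPU
    lrb = poisson_yield(4.0 * 1.35, d0)  # same performance at +35% area
    print(round(gpu, 3), round(lrb, 3), round(lrb / gpu, 2))  # 0.135 0.067 0.5
    ```

    Because yield falls exponentially with area, the area penalty compounds: the bigger die costs more per wafer slot and loses a disproportionate share of dies to defects.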

    Quote Originally Posted by Unoid View Post
    Even if it does only perform at 275 level, I'd still buy one just to be a GPGPU. Games and software will definitely support it for anything.
    hahah, wow... intel must be praying that there are enough naive customers like you who buy a product because they are convinced it must be good at something, even if it sucks at what it's targeted at

    think of PhysX... Ageia taught everybody a lesson in how naive many customers are nowadays and how much money they will spend on a product that doesn't even have any real use

    same as that Killer NIC card that speeds up your games but you can't measure it LOL

    Quote Originally Posted by Helloworld_98 View Post
    they tested it at 1GHz but rumours say it'll release at 2GHz.

    and iirc it's 48 cores not 64.
    i think they will wait for 32nm, at least for the highend version, and it'll be 64 cores or even more... and there is no measured perf so far, it's all smart guesses from intel at this point afaik...


    Quote Originally Posted by SparkyJJO View Post
    I don't know the tech specs of it, but it still strikes me as rather inefficient to take a bunch of x86 CPUs and have them do graphics. I mean, that's why the graphics card was created: it was designed specifically for graphics and did a much better job than a CPU.
    heh yeah... and not only that... i think intel is really taking this too light-heartedly and too arrogantly...

    they want lrb to be:
    cost competitive
    tdp competitive
    rasterization competitive
    x86 competitive

    anything else?
    seriously... how arrogant do you have to be to think you can not only create a product that beats fixed-function logic in its home territory, but also delivers outstanding general-purpose performance, and all this on the same or a worse mfg process, within the same tdp envelope and at the same price?

    it's like Boeing announcing they will launch a new plane that can transport more people than an A380, fly faster than a Concorde, and all that at the same price and fuel consumption as an average A330 passenger jet

    oh, and as if that weren't enough, they make a, using their own words, SMALL design team work on this... most of them have never worked with each other before and each comes from a different background, probably resulting in different views on things and conflicts...

    i think intel's top guys don't realize what an important strategic value lrb has for the future of their company... small team...

    Quote Originally Posted by Hornet331 View Post
    Soundcards where created for the same purpose, and now every mobo has a chip with the sound codecs on board and the cpu takes over the decoding. Its just a matter of calculation power.

    If your cpu would be fast enough, it wouldn't matter if x86 is inefficent. The problem, right now is, they arn't and thats why specialised hardware is faster (and always will be faster)

    Also, intel also is eager to bring x86 to the embeded market. So they arn't focusing only on the high performance market but also on a mass market (set top boxes, VDR etc.).
    ok, let's look at soundcards... while they heavily use the cpu to get stuff done, is there any mainboard, at all, that uses the cpu ONLY?
    is there any mainboard that does audio completely in software on a cpu?

    that's the same reason why many people doubt doing graphics entirely on x86 cores... it makes no sense...
    Last edited by saaya; 09-20-2009 at 07:57 PM.

  21. #71
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by saaya View Post
    ok, let's look at soundcards... while they heavily use the cpu to get stuff done, is there any mainboard, at all, that uses the cpu ONLY?
    is there any mainboard that does audio completely in software on a cpu?
    Every mobo with an audio codec...?
    The "audio codec" chip on the mobo is nothing more than an A/D-D/A converter.

    The real decoding is done in software, i.e. on the CPU.

  22. #72
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Hornet331 View Post
    Every mobo with an audio codec...?
    The "audio codec" chip on the mobo is nothing more than an A/D-D/A converter.

    The real decoding is done in software, i.e. on the CPU.
    you're right, bad example...

  23. #73
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    How in God's name can Intel claim Larrabee is faster than anything when

    1. There is no Larrabee
    2. They have no idea what Nvidia and ATI will have by the time there is a Larrabee

  24. #74
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,366
    Quote Originally Posted by trinibwoy View Post
    How in God's name can Intel claim Larrabee is faster than anything when

    1. There is no Larrabee
    2. They have no idea what Nvidia and ATI will have by the time there is a Larrabee
    Which Intel claim exactly are you pointing at?

  25. #75
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by kl0012 View Post
    What exactly Intel's claim you point at?
    http://www.xtremesystems.org/forums/...4&postcount=47

    I think "blows the socks off anything else" is pretty explicit.

