Results 1 to 25 of 188

Thread: Larrabee: A fiasco, or the future?

  1. #1
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Posts
    516

    Larrabee: A fiasco, or the future?

    Rumormongers seem to paint very different pictures:

    http://vr-zone.com/forums/485017/pat...e-fiasco-.html

    http://www.semiaccurate.com./2009/09...out-weeks-ago/

    What do we actually know? Is there any solid info out there? Any published roadmaps, specs, or anything similar?

  2. #2
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Larrabee = R600/G92 levels at best.

    Perkam

  3. #3
    Xtreme Enthusiast
    Join Date
    Apr 2008
    Posts
    912
    BSN is full of it, and Theo Valich is someone who posts fiction along with news, way more so than Charlie.

    I'd have a 'wait and see' attitude to this. It's not the first time Pat Gelsinger has quit Intel anyway.

  4. #4
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by bowman View Post
    I'd have a 'wait and see' attitude to this.
    You're in for a long wait.

  5. #5
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    278
    Larrabee is kind of like Lucid Hydra: blink and it's gone in the wind!

  6. #6
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    In terms of good design: fiasco.

    In terms of feeding off your old successes: Intel's future.
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern RAM makes an old overclocker miss BH-5 and the fun it was

  7. #7
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,366
    Future, I think. One way or another, all GPUs will become more general-purpose one day.

  8. #8
    Xtreme 3D Team
    Join Date
    Jan 2009
    Location
    Ohio
    Posts
    8,499
    Fiasco. I can't imagine an Intel graphics card; it may turn out bad... ATI + Nvidia is enough, imo.

  9. #9
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by perkam View Post
    Larrabee = R600/G92 levels at best.
    Perkam
    And you base that on what, exactly?

    The first silicon was broken, so what? Did anybody honestly expect Intel's first CPU-GPU merger to work right from the start?

  10. #10
    Registered User
    Join Date
    Sep 2009
    Posts
    20
    It's only going to be the "future" because Intel is attempting to push the market in that direction.

  11. #11
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    My bet is Larrabee goes the way of Duke Nukem...
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  12. #12
    Xtreme Enthusiast
    Join Date
    Nov 2005
    Posts
    844
    It's only going to be the "future" because it ain't here today. =P

    I admit I don't know much about Larrabee, but raytracing looks pretty promising.
    -Cpu:Opteron 170 LCBQE 0722RPBW(2.87ghz @ 1.300v)
    (retired)Opteron 146 (939) CAB2E 0540
    -Heatsink: Thermalright XP-90
    -Fan:120mm Yate Loon 1650 RPM @ 12V, 70.5 CFM, 33dB
    -Motherboard: DFI Lanparty nF4 UT Ultra-D
    -Ram: Mushkin High Performance blue, 2gigs(2X1gig kit) PC3200 991434
    -Hard drive: Seagate 400GB Barracuda SATA HD 7200.10(AS noisey model)
    -Video card: evga 6800GS @520/1170
    -Case: P180
    -PSU:Enermax 535Watt EG565P-VE FMA (24P)

  13. #13
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Dimitriman View Post
    My bet is Larrabee goes the way of Duke Nukem...
    No way, they will release it even if it sucks...

    If they don't manage to bring graphics back onto the CPU and to x86, then it means GPUs will manage to get CPU functionality merged in... and that means, long term, bye-bye x86 and bye-bye Intel... Intel NEEDS Larrabee...

  14. #14
    Xtreme Member
    Join Date
    Dec 2006
    Location
    Edmonton,Alberta
    Posts
    182
    You would have to sneak into DreamWorks and Pixar to find out how well Larrabee is working.

    Think render farm more than gaming.

    From the Motley Fool, Sept. 12:
    And if today's action wasn't hot enough for you, keep an eye out for Intel's Larrabee GPU chips next year. Intel hasn't been a force in high-performance graphics hardware since the early 1990s, but that should change with Larrabee. DreamWorks Animation (Nasdaq: DWA) CEO Jeffrey Katzenberg waxes poetic over the product, saying that "Larrabee raises the bar of what we can do not just by 2X or 3X but by 20X."
    http://www.fool.com/investing/genera...-mountain.aspx

  15. #15
    Mr. Boardburner
    Join Date
    Jun 2005
    Location
    the Netherlands
    Posts
    5,340
    I think Larrabee will be a lot like the Intel 740 graphics chip... lots of talk during development, great performance on paper at the time, but outdated by the time it's released...
    Main rig:
    CPU: I7 920C0 @ 3.6Ghz (180*20)
    Mobo: DFI UT X58 T3eH8
    RAM: 12GB OCZ DDR3-1600 Platinum
    GPU/LCD: GeForce GTX280 + GeForce 8600GTS (Quad LCDs)
    Intel X25-M G2 80GB, 12TB storage
    PSU/Case: Corsair AX850, Silverstone TJ07

  16. #16
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Vancouver
    Posts
    1,073

    Intel can do anything.

    It's not a matter of if it will pan out, but rather when. How much money it takes and how much ROI it returns will dictate whether the product comes to fruition. You can't tell me a company with 10 times the market cap of its top two competitors combined can't afford to hire engineers and developers to come up with something better than what is out there... it's just a matter of resources.
    " Business is Binary, your either a 1 or a 0, alive or dead." - Gary Winston ^^



    Asus rampage III formula,i7 980xm, H70, Silverstone Ft02, Gigabyte Windforce 580 GTX SLI, Corsair AX1200, intel x-25m 160gb, 2 x OCZ vertex 2 180gb, hp zr30w, 12gb corsair vengeance

    Rig 2
    i7 980x ,h70, Antec Lanboy Air, Samsung md230x3 ,Saphhire 6970 Xfired, Antec ax1200w, x-25m 160gb, 2 x OCZ vertex 2 180gb,12gb Corsair Vengence MSI Big Bang Xpower

  17. #17
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    Quote Originally Posted by villa1n View Post
    It's not a matter of if it will pan out, but rather when. How much money it takes and how much ROI it returns will dictate whether the product comes to fruition. You can't tell me a company with 10 times the market cap of its top two competitors combined can't afford to hire engineers and developers to come up with something better than what is out there... it's just a matter of resources.
    Well, first I have to say that I have basically no knowledge about ICs whatsoever. But I don't think you can "easily" start from scratch and release a GPU as fast as those of your competitors, who have been in the business for several years. Somehow I doubt it's that easy, even with a f*ckload of resources.
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  18. #18
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Intel does not adapt to the market. The market adapts to Intel.

    That said, no matter how Larrabee turns out, Intel will make sure the market follows its lead.

  19. #19
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    if it's aimed at the GPGPU market, it'll do well

    if ray-traced games are around when it comes out, it'll do well

    otherwise, I don't think it'll do that well.

  20. #20
    Registered User
    Join Date
    Mar 2006
    Posts
    69
    Would Intel release a low-performing chip that is outdated by today's standards? I don't think so...

  21. #21
    Xtreme Addict
    Join Date
    Dec 2005
    Location
    UK
    Posts
    1,713
    Quote Originally Posted by DelaMaris View Post
    Would Intel release a low-performing chip that is outdated by today's standards? I don't think so...
    Yeah, it's not like they've never done that before
    TAMGc5: PhII X4 945, Gigabyte GA-MA790X-UD3P, 2x Kingston PC2-6400 HyperX CL4 2GB, 2x ASUS HD 5770 CUcore Xfire, Razer Barracuda AC1, Win8 Pro x64 (Current)

    TAMGc6: AMD FX, Gigabyte GA-xxxx-UDx, 8GB/16GB DDR3, Nvidia 680 GTX, ASUS Xonar, 2x 120/160GB SSD, 1x WD Caviar Black 1TB SATA 6Gb/s, Win8 Pro x64 (Planned)

  22. #22
    Registered User
    Join Date
    Aug 2009
    Location
    Australia
    Posts
    37
    I think Larrabee, without getting too political about it all, is Intel saying "GTFO, GPGPU", since the more you're offloading to the GPU, the less important big expensive processors become.

    Intel wants to ensure that their product is the main part that determines performance, and at the moment it looks like they will be taking a back seat in the next couple of years if developers start moving towards largely multi-threaded and GPGPU applications.

    Nvidia -> Processing >> Graphics Core
    Intel -> Graphics >> Processing Core

    A tug-of-war, it seems.

  23. #23
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,366
    Quote Originally Posted by Helloworld_98 View Post
    if it's aimed at the GPGPU market, it'll do well

    if ray-traced games are around when it comes out, it'll do well

    otherwise, I don't think it'll do that well.
    With a properly implemented driver, the software rasterizer may yield very good results on Larrabee. Here is an interesting article:
    http://software.intel.com/en-us/arti...n-on-larrabee/
    And with that, we conclude our lightning tour of the Larrabee rasterization approach, and our examination of how vector programming can be applied to a semi-parallel task. As I mentioned earlier, software rasterization will never match dedicated hardware peak performance and power efficiency for a given area of silicon, but so far it's proven to be efficient enough. It also has a significant advantage, which is that because it uses general purpose cores, the same resources that are used for rasterization can be used for other purposes at other times, and vice-versa.

    As Tom Forsyth puts it, because the whole chip is programmable, we can effectively bring more square millimeters to bear on any specific task as needed - up to and including the whole chip; in other words, the pipeline can dynamically reconfigure its processing resources as the rendering workload changes. If we get a heavy rasterization load, we can have all the cores working on it, if necessary; it wouldn't be the most efficient rasterizer per square millimeter, but it would be one heck of a lot of square millimeters of rasterizer, all doing what was most important at that moment, in contrast to a traditional graphics chip with a hardware rasterizer, where most of the circuitry would be idle when there was a heavy rasterization load.

    A little while later, when the load switches to shading, the whole Larrabee chip can become a shader if necessary. Software simply brings a whole different set of strengths and weaknesses to the table.
    More about Larrabee here:
    http://software.intel.com/en-us/articles/larrabee/
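
    The quoted article is largely about vectorizing the triangle edge-function test across a block of pixels at once. As a rough illustration of that idea only (plain scalar C++ that a compiler can auto-vectorize, not Intel's code, and with made-up names throughout), the core of such a software rasterizer looks like this:

        #include <cstdio>

        // Signed edge function: positive when (px,py) lies to the left of the
        // directed edge (ax,ay)->(bx,by), so a consistently wound triangle
        // yields three positive values for every interior pixel.
        static inline float edge(float ax, float ay, float bx, float by,
                                 float px, float py) {
            return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
        }

        // Rasterize one triangle into a coverage mask over a 16x16 pixel tile.
        // On a Larrabee-style 16-wide vector unit, the inner x loop would be a
        // single vector evaluation per row, producing a 16-bit pixel mask.
        void rasterize_tile(float x0, float y0, float x1, float y1,
                            float x2, float y2,
                            float tile_x, float tile_y, bool mask[16][16]) {
            for (int y = 0; y < 16; ++y) {
                float py = tile_y + y + 0.5f;            // sample at pixel centers
                for (int x = 0; x < 16; ++x) {           // the vectorizable lane loop
                    float px = tile_x + x + 0.5f;
                    mask[y][x] = edge(x0, y0, x1, y1, px, py) >= 0.0f
                              && edge(x1, y1, x2, y2, px, py) >= 0.0f
                              && edge(x2, y2, x0, y0, px, py) >= 0.0f;
                }
            }
        }

        int main() {
            bool mask[16][16];
            rasterize_tile(1, 1, 14, 2, 3, 13, 0, 0, mask);   // a small test triangle
            for (int y = 0; y < 16; ++y, puts(""))
                for (int x = 0; x < 16; ++x)
                    putchar(mask[y][x] ? '#' : '.');
        }

    Because this is ordinary code running on general-purpose cores, the same ALUs doing these edge tests can be handed a shading kernel a millisecond later, which is exactly the reconfigurability argument the quote makes.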

  24. #24
    Xtreme Mentor
    Join Date
    Jul 2008
    Location
    Shimla , India
    Posts
    2,631
    I think future, but only if Intel puts lots of cash behind developers to come up with special raytracing games... Raytracing is the only real selling point of Larrabee, and Intel has the money to set up something like an RGR *Realistic Game Rendering* program to promote Larrabee, like Nvidia did with their TWIMTBP program.

    The bottom line is that raytracing could be a big, big thing, but it is not so in the present. If Intel backs re-releases of some epic games and some new games using raytracing, people will flock to get the card. That does not mean it should suck at rasterization, but let's be honest, rasterization is ATI's/Nvidia's thing, and x86 muscle doesn't get you far there: "Emulation is emulation."

    Intel's card also has to up its double- and single-precision performance; AMD's 5800 and Nvidia's GT300 are supposed to exceed the initial speculative figures for Intel's card on both counts. But the x86 architecture does scale very well: just add some cores and up the speed and you are done :>
    Coming Soon
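
    For readers who haven't seen what "raytracing on general-purpose x86 cores" amounts to in practice: each primary ray is an independent little computation, which is why the workload spreads so naturally across many simple in-order cores. Below is a toy sketch of the core operation, ray-sphere intersection; this is purely illustrative C++, not Intel or Larrabee code, and every name in it is made up.

        #include <cmath>
        #include <cstdio>

        struct Vec { float x, y, z; };
        static Vec   sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
        static float dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

        // Distance t along the ray to the nearest hit, or -1 on a miss.
        // Solves |o + t*d - c|^2 = r^2, a quadratic in t (d must be normalized).
        float hit_sphere(Vec o, Vec d, Vec c, float r) {
            Vec oc = sub(o, c);
            float b    = dot(oc, d);
            float disc = b * b - (dot(oc, oc) - r * r);
            if (disc < 0) return -1.0f;              // ray misses the sphere
            float t = -b - std::sqrt(disc);          // nearer of the two roots
            return t > 0 ? t : -1.0f;
        }

        int main() {
            Vec cam{0, 0, 0}, sphere{0, 0, -3};
            for (int y = 0; y < 24; ++y, puts("")) {
                for (int x = 0; x < 48; ++x) {
                    // One primary ray per character "pixel" of a 48x24 screen.
                    Vec d{(x - 24) / 24.0f, (12 - y) / 12.0f, -1.0f};
                    float len = std::sqrt(dot(d, d));
                    d = {d.x / len, d.y / len, d.z / len};
                    putchar(hit_sphere(cam, d, sphere, 1.0f) > 0 ? '#' : '.');
                }
            }
        }

    Every character "pixel" traces one independent ray, so a many-core chip can hand each core a block of pixels with no fixed-function hardware in the loop; that independence is the whole argument for doing raytracing on an x86 many-core part.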

  25. #25
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Helloworld_98 View Post
    if it's aimed at the GPGPU market, it'll do well

    if ray-traced games are around when it comes out, it'll do well

    otherwise, I don't think it'll do that well.
    Yup, think the same. The first iteration will mainly be marketed to the industry and not the consumers.

    Btw, the VR-Zone article comes not from VR-Zone, but from BSN... *cough* Theo Valich *cough*
    Last edited by Hornet331; 09-19-2009 at 02:52 AM.
