
Thread: Larrabee: A fiasco, or the future?

  1. #26
    Xtreme Addict
    Join Date
    Nov 2006
    Posts
    1,402
    Larrabee is the future, but it can't compete with current 3D systems. Rasterization is king, but it's an old way of doing things.

    Larrabee could be far faster at vector work and ray tracing.

    Ray tracing is real 3D; rasterization is fake 3D. Rasterization is fast, but it has shown us its limits.

  2. #27
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by FischOderAal View Post
    Well, first I have to say that I have basically no knowledge about ICs whatsoever. But I don't think you can "easily" start from scratch and release a GPU that is as fast as those of your competitors, who have been in the business for several years. Somehow I doubt it's that easy, even with a f*ckload of resources.
    Why would an engineering team be able to build a sweet GPU at Nvidia and ATI but not at Intel? Is there something special in the air or water in their labs?

    Intel hired quite a few Nvidia engineers; if all they did was rebuild G200 on Intel's 45nm process, that would probably be enough for a mainstream product already. The thing is, it's not about building a fast GPU. Intel could do that, but it would accelerate them losing their grip as x86 fades away. What they want is to build a CPU that can act as a GPU, a software-emulated GPU, and one that is at least as fast as a proper GPU. THAT is still possible and a matter of resources. BUT Intel wants it to be financially feasible, and probably even wants to make money with it...

    THAT might be something that is actually impossible... it'll either be slower or cost more than a conventional GPU. You can't beat a fixed-function processor at its own game with a general-purpose processor at the same transistor budget...
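    To make the fixed-function point concrete, here is a minimal, purely illustrative C sketch (every name in it is hypothetical) of the per-pixel inner loop a software renderer has to execute just to fill one triangle: coverage test, depth test, write-out. A conventional GPU does this in dedicated rasterizer/ROP silicon instead of spending instructions on it:

        /* Illustrative sketch: the per-pixel work of plain triangle fill.
           Fixed-function GPU hardware does this without executing a single
           instruction; a general-purpose core pays for it in instructions,
           and therefore in transistors and power. */
        #include <stdint.h>

        typedef struct { float a, b, c; } Edge;   /* edge function a*x + b*y + c */

        static float edge_eval(Edge e, float x, float y) {
            return e.a * x + e.b * y + e.c;
        }

        void fill_triangle(uint32_t *color, float *depth, int width,
                           int x0, int y0, int x1, int y1,   /* bounding box   */
                           Edge e0, Edge e1, Edge e2,        /* triangle edges */
                           float za, float zb, float zc,     /* depth plane    */
                           uint32_t rgba)
        {
            for (int y = y0; y <= y1; ++y) {
                for (int x = x0; x <= x1; ++x) {
                    float px = x + 0.5f, py = y + 0.5f;
                    /* coverage: pixel is inside if all three edge functions agree */
                    if (edge_eval(e0, px, py) < 0.0f ||
                        edge_eval(e1, px, py) < 0.0f ||
                        edge_eval(e2, px, py) < 0.0f)
                        continue;
                    float z = za * px + zb * py + zc;   /* interpolated depth */
                    int idx = y * width + x;
                    if (z < depth[idx]) {               /* z-test */
                        depth[idx] = z;
                        color[idx] = rgba;              /* no blending or texturing shown */
                    }
                }
            }
        }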

    It's like expecting an SUV, which can drive everywhere and is sort of a general-purpose car, to beat an F1 car, which can only drive on very flat roads but does so at insane speeds. Either you build an insane SUV mutant at insane cost that can drive as fast as an F1 car on a race track but can ALSO drive everywhere else, or you lose the race on an F1 track... obviously.

    I'm confused how Intel thought this was going to work. With their manufacturing advantage up their sleeve it might work out, but apparently they want to use the cheap old 45nm fabs for LRB once CPUs have moved to 32nm? o_O

  3. #28
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Sweden
    Posts
    450
    Quote Originally Posted by saaya View Post

    It's like expecting an SUV, which can drive everywhere and is sort of a general-purpose car, to beat an F1 car, which can only drive on very flat roads but does so at insane speeds. Either you build an insane SUV mutant at insane cost that can drive as fast as an F1 car on a race track but can ALSO drive everywhere else, or you lose the race on an F1 track... obviously.
    Great car analogy!

  4. #29
    Xtreme Member
    Join Date
    Jan 2009
    Posts
    169
    Quote Originally Posted by marten_larsson View Post
    Great car analogy!
    Indeed. Unfortunately, F1 seems insipid these days.

    XmX

  5. #30
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Larrabee needs to be compatible with existing software in some way, and if you develop for Larrabee, that software needs to run on other hardware at a reasonable speed.

    Intel would need to spend loads of money developing software for Larrabee if it differs too much from existing hardware. As a programmer, you don't start developing advanced software if there isn't a market.

    If DX12 has ray tracing and it works on AMD and/or Nvidia GPUs, that could be Larrabee's entry into the market.

  6. #31
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Ray tracing in HD... in your dreams!

  7. #32
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    They might put something out eventually; some Intel fanboys will love it and say how awesome it is because it has feature X, even though it isn't beneficial for gaming. Everyone else will buy ATI or Nvidia because they will likely be cheaper. Intel's investment in it will be a major loss. Then they'll try to buy Nvidia to make up for their losses lol :P
    i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
    MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
    Corsair HX 620, CM 690, Win 7 Ultimate 64bit.

  8. #33
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Quote Originally Posted by saaya View Post
    Why would an engineering team be able to build a sweet GPU at Nvidia and ATI but not at Intel? Is there something special in the air or water in their labs?

    Intel hired quite a few Nvidia engineers; if all they did was rebuild G200 on Intel's 45nm process, that would probably be enough for a mainstream product already. The thing is, it's not about building a fast GPU. Intel could do that, but it would accelerate them losing their grip as x86 fades away. What they want is to build a CPU that can act as a GPU, a software-emulated GPU, and one that is at least as fast as a proper GPU. THAT is still possible and a matter of resources. BUT Intel wants it to be financially feasible, and probably even wants to make money with it...

    THAT might be something that is actually impossible... it'll either be slower or cost more than a conventional GPU. You can't beat a fixed-function processor at its own game with a general-purpose processor at the same transistor budget...

    It's like expecting an SUV, which can drive everywhere and is sort of a general-purpose car, to beat an F1 car, which can only drive on very flat roads but does so at insane speeds. Either you build an insane SUV mutant at insane cost that can drive as fast as an F1 car on a race track but can ALSO drive everywhere else, or you lose the race on an F1 track... obviously.

    I'm confused how Intel thought this was going to work. With their manufacturing advantage up their sleeve it might work out, but apparently they want to use the cheap old 45nm fabs for LRB once CPUs have moved to 32nm? o_O
    Well, the short answer is that Intel artificially handicapped themselves by demanding x86 compatibility. ATI and Nvidia are completely free to do whatever they want with their underlying architectures, and can modify them at a minute's notice to improve performance, while Intel is stuck supporting every bad idea introduced into x86 since the 8086.
    It is a long-observed historical pattern that legacy ISAs have lower performance per transistor than completely new ISAs, which often have the benefit of learning from the mistakes of previous ISAs.
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  9. #34
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    The most critical thing that will determine whether Larrabee is a success or not is how well it performs relative to its competition in the most popular games and applications.

    The next most critical thing is what developers do between now and then. If programmers start writing for OpenCL and DX11 Compute Shaders, then it doesn't matter that Larrabee supports x86, because such programs would work on all three manufacturers' GPUs. If programmers are lazy and don't bother learning OpenCL and DX11 CS, then Larrabee will be easier to develop for, since it runs ordinary x86 code. On the game side of things, if Intel can convince enough game developers to make games with a Larrabee-exclusive feature (say, ray tracing), then it could have an advantage. But if no killer feature captures game developers' and the public's imagination, then it's going to come back to raw performance.
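    For what it's worth, "writing for OpenCL" here means writing vendor-neutral kernels along the lines of the hypothetical one below (all names are made up). The same kernel source is handed to the vendor's own OpenCL driver and compiled at run time, so the code itself is not tied to Intel, AMD, or Nvidia hardware:

        /* Hypothetical OpenCL C kernel: one multiply-add per work-item.
           The host passes this source to clCreateProgramWithSource() and the
           installed driver (AMD, Nvidia, or Intel) compiles it for its own GPU. */
        __kernel void saxpy(const float a,
                            __global const float *x,
                            __global float *y)
        {
            size_t i = get_global_id(0);   /* one work-item per array element */
            y[i] = a * x[i] + y[i];
        }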

  10. #35
    Xtreme Member
    Join Date
    Jun 2008
    Posts
    197
    Quote Originally Posted by Solus Corvus View Post
    On the game side of things, if Intel can convince enough game developers to make games with a Larrabee-exclusive feature (say, ray tracing), then it could have an advantage. But if no killer feature captures game developers' and the public's imagination, then it's going to come back to raw performance.
    You do have a point there. If Nvidia can convince game developers to use PissX, Intel won't have too much trouble getting them to work on some LB-exclusive features either...

    I also remember some rambling Charlie did last year about being convinced that the PS4 was going to use LB. If this pans out, it will be a HUGE win for Intel...

  11. #36
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by JohnJohn View Post
    You do have a point there. If Nvidia can convince game developers to use PissX, Intel won't have too much trouble getting them to work on some LB-exclusive features either...
    Remember that there are a lot of computers out there with Nvidia cards, and that means there is a market.

  12. #37
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by saaya View Post
    Why would an engineering team be able to build a sweet GPU at Nvidia and ATI but not at Intel? Is there something special in the air or water in their labs?
    You already said it yourself: both ATI and Nvidia have had design wins and losses, and the pedigree of their technology stretches back many, many years. And yet, even though they are both in the GPU business, they produce different designs.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  13. #38
    Xtreme Mentor
    Join Date
    Jul 2008
    Location
    Shimla , India
    Posts
    2,631
    Quote Originally Posted by cegras View Post
    You already said it yourself: both ATI and Nvidia have had design wins and losses, and the pedigree of their technology stretches back many, many years. And yet, even though they are both in the GPU business, they produce different designs.
    +1

    It would have been easier for Intel to just buy Nvidia and go for the x86 GPU at a later stage. The question remains: can Intel still buy Nvidia, or can't they?


    Initially the x86 GPU will be back-breaking, but in the long term not so much; Intel has already shown what the thing can do with more cores. The base just needs to be stable so that the future is secure. The future seems more like a Crysis 5 Holographic Edition than a Crysis 5 with ray tracing. 3D holograms need lots of processing power, something an x86 GPU would be very, very good at... that is, if it gets launched.
    Coming Soon

  14. #39
    Xtreme Member
    Join Date
    Nov 2003
    Posts
    450
    Intel bought Havok in '07, and it also purchased the game studio that is making Project Offset.

    I can see Intel releasing that game with advanced Larrabee support, with kick-ass physics all done on Larrabee.

    http://www.projectoffset.com/index.p...d=58&Itemid=12
    Intel 2600K @ 4.8ghz 1.31v on Water.
    ASROCK Z68 Ex4 Gen 3, 16GB G.skill pc1600
    MSI GTX 680 1200/6800mhz
    2x Vertex LE 60GB Raid 0

  15. #40
    Xtreme Mentor
    Join Date
    Apr 2005
    Posts
    2,550
    Quote Originally Posted by JohnJohn View Post

    I also remember some rambling Charlie did last year about being convinced that the PS4 was going to use LB. If this pans out, it will be a HUGE win for Intel...
    A possible console that's out in 2015 can hardly have an effect on LB's success, since LB should be out in 2011.
    Adobe is working on Flash Player support for 64-bit platforms as part of our ongoing commitment to the cross-platform compatibility of Flash Player. We expect to provide native support for 64-bit platforms in an upcoming release of Flash Player following the release of Flash Player 10.1.

  16. #41
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    278
    Project Offset is a nice-looking game.

  17. #42
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    What can I say, two years ago Larrabee was on most people's lips as the GPU messiah, walking all over ATI's and Nvidia's hopes of survival. Now it looks exactly the opposite of how it looked back then...

    I'm sure you'll see it as a portable solution at some IT convention, playing an HD movie with a CEO bragging about how well it plays HD movies, as if they had reinvented high definition.

  18. #43
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Toon
    Posts
    1,570
    Quote Originally Posted by XSAlliN View Post
    I'm sure you'll see it as a portable solution
    I'm inclined to agree, but I think it will be integrated into the CPU package or die rather than being a discrete processor or add-on card.
    Intel i7 920 C0 @ 3.67GHz
    ASUS 6T Deluxe
    Powercolor 7970 @ 1050/1475
    12GB GSkill Ripjaws
    Antec 850W TruePower Quattro
    50" Full HD PDP
    Red Cosmos 1000

  19. #44
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Fiasco, IMO. By the time it finally launches, Nvidia and ATI will be way ahead, too far ahead for anyone to even consider buying a Larrabee... I could be wrong, though.
    ░█▀▀ ░█▀█ ░█ ░█▀▀ ░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░█▀▀ ░█▀▀ ░█ ░█ ░░░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░▀▀▀ ░▀ ░░░▀ ░▀▀▀ ░░▀ ░░░▀░▀ ░▀ ░▀▀▀ ░

  20. #45
    Xtreme Addict
    Join Date
    Dec 2004
    Location
    Flying through Space, with armoire, Armoire of INVINCIBILATAAAAY!
    Posts
    1,939
    Quote Originally Posted by madcho View Post
    Larrabee is the future, but it can't compete with current 3D systems. Rasterization is king, but it's an old way of doing things.

    Larrabee could be far faster at vector work and ray tracing.

    Ray tracing is real 3D; rasterization is fake 3D. Rasterization is fast, but it has shown us its limits.
    Whichever method you pick, you're in for a lot of floating-point calculations. A processing resource that's good at one is just as good at the other.
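    As a rough, purely illustrative C sketch of that claim (names hypothetical): the core arithmetic of rasterization (transforming vertices by a matrix) and of ray tracing (intersecting rays with geometry) both reduce to the same dot-product and multiply-add pattern, so hardware that is good at one kind of float work is good at the other:

        /* Illustrative only: both pipelines spend their time on the same kind
           of floating-point multiply-add work. */
        #include <math.h>

        typedef struct { float x, y, z; } Vec3;

        static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

        /* Rasterization: transforming a vertex by a 4x4 matrix is 16 multiplies
           and 12 adds (only the x component is shown for brevity). */
        static float transform_x(const float m[16], Vec3 v) {
            return m[0]*v.x + m[1]*v.y + m[2]*v.z + m[3];
        }

        /* Ray tracing: a ray-sphere intersection is a handful of dot products
           and a square root. Returns the hit distance, or -1 on a miss. */
        static float ray_sphere(Vec3 orig, Vec3 dir, Vec3 center, float radius) {
            Vec3 oc = { orig.x - center.x, orig.y - center.y, orig.z - center.z };
            float b = dot(oc, dir);               /* assumes dir is normalized */
            float c = dot(oc, oc) - radius * radius;
            float disc = b * b - c;
            if (disc < 0.0f) return -1.0f;        /* ray misses the sphere */
            return -b - sqrtf(disc);              /* nearest intersection */
        }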
    Sigs are obnoxious.

  21. #46
    Xtreme Enthusiast
    Join Date
    Jun 2005
    Posts
    525
    I think that if they want it to be competitive, they would have to release it by the end of this year... maybe early next year.

    Performance will be based on how well it clocks and how many cores they can add without it getting too power-hungry... which is the real question...

  22. #47
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    228
    Gamasutra: How do you see Larrabee fulfilling those promises of flexibility and freedom?

    Mike Burrows (Intel's Senior Graphics Software Architect Manager): Just in terms of raw computing power and development flexibility, it blows the socks off anything else I'm aware of. I don't think I can say any more than that without tripping over NDA issues.
    I think we don't need to worry about LRB's raw power.

  23. #48
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Los Angeles, CA
    Posts
    528
    IMO, Larrabee will come and will only be the start of things, regardless of whether or not it can compete with current GPUs, or whether or not it will talk to certain software.

    Larrabee is part of the move to SoC. Once the entire system is on the chip, it won't even matter what else you hook up to it; you're still going to pay for the whole system.
    Chipsets, or at least a major part of them, are already moving on-chip with the P55/P57 platform and the 1156 socket, which boxes out quite a bit of competition in the chipset market, and graphics will be next. Enthusiasts and gamers will still be able to use a dedicated GPU, but the mainstream won't need to, and corporate will be happy too.

    Good idea, but I foresee poorer competition in the future: fewer of the bursts that spur development, and fewer of the price wars that the end user usually benefits from.

  24. #49
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    228
    Intel is looking for a LRB3 uArch - RTL Engineer in Santa Clara, CA

    Company: Intel
    JobID: 563020 - LRB3 uArch - RTL Engineer
    Category: Hardware & Electronics Design/Engineering - Junior (Mid-level, Experienced)
    Location: Santa Clara, CA

    Description

    Position: LRB3 uArch - RTL Engineer

    Responsibilities and Details
    Description
    As a member of the micro-architecture team in Intel's Enterprise Microprocessor Group (EMG), you will be responsible for defining and implementing portions of processors for graphics computing applications. Designs typically target high frequencies in a power-constrained, schedule-driven and product-cost-conscious environment. Your responsibilities could lie in a processor's execution core, caches, or system interface, and would include proposing innovative solutions to micro-architecture design challenges, specifying or co-developing performance models, specifying or running performance simulations, drawing actionable conclusions from the simulation results, cooperating with circuit designers to complete physical feasibility studies, authoring micro-architecture design specifications, developing high-level or Register-Transfer-Level (RTL) modeling methodology, coding high-level or RTL models, authoring validation plans, contributing to functional and performance validation, supporting physical design (e.g. to meet timing goals), participating in silicon debug, and possibly leading a small team of engineers through any of the above. You will work in a team-oriented environment and will have to interact with engineers from other design disciplines and other groups.

    Qualifications
    You should possess a relevant educational qualification.

    Additional qualifications include:
    - Strong understanding of advanced computer architecture and micro-architecture design concepts spanning both core and system interconnect design
    - Some processor design or chipset design experience would be an added advantage
    - Knowledge and background in graphics hardware design, high speed industry standard I/O design and a working knowledge of physical design (for example, typical implementation styles, wire delay constraints) would be an added advantage

    Job Category Engineering
    Location USA-California, Santa Clara
    Full/Part Time Full Time
    Job Type Recent College Graduate
    Regular/Temporary Regular
    http://prod.itzbig.com/jobs/santa_cl...tel/69764.html

  25. #50
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by Unoid View Post
    Intel bought Havok in '07, and it also purchased the game studio that is making Project Offset.

    I can see Intel releasing that game with advanced Larrabee support, with kick-ass physics all done on Larrabee.

    http://www.projectoffset.com/index.p...d=58&Itemid=12


    Yep. We discussed this a while back, and I also believe that Seth & Co.'s Project Offset will be Larrabee's golden egg.

    Glad more people see this coming!

    PS: Whoever said "they'll need to learn to write new software for it (Larrabee)" doesn't understand that Larrabee is x86...
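    As a purely illustrative sketch of that point (names hypothetical): plain, portable C like the loop below compiles unchanged for any x86 target with an ordinary compiler, which is what recompiling existing code for Larrabee would look like in the simplest case; squeezing full speed out of its wide vector units and many cores would still be the compiler's or programmer's problem:

        /* Illustrative sketch only: ordinary, portable C that any x86 compiler
           accepts unchanged. Whether it runs *fast* on a particular x86 part
           (vector width, core count) is a separate question from whether it
           runs at all. */
        #include <stddef.h>

        void scale_and_add(float *dst, const float *src, float scale, size_t n)
        {
            for (size_t i = 0; i < n; ++i)
                dst[i] += scale * src[i];   /* one multiply-add per element */
        }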
    Last edited by Xoulz; 09-19-2009 at 11:41 PM.
