Larrabee is the future, but it can't compete with today's 3D systems. Rasterization is king, but it's an old way of working.
Larrabee could be far faster at vector work & ray tracing.
Ray tracing is real 3D; rasterization is fake 3D. Raster is fast, but it has shown us its limits.
Why would an engineering team be able to build a sweet GPU at Nvidia and ATI but not at Intel? Is there something special in the air or water in their labs? :D
Intel hired quite a few Nvidia engineers; if all they did was rebuild the G200 on Intel's 45nm process, that would probably already be enough for a mainstream product... The thing is, it's not about building a fast GPU. Intel could do that, but it would just accelerate them losing their grip as x86 fades away. What they want is to build a CPU that can act as a GPU, a software-emulated GPU, and one that is at least as fast as a proper GPU. THAT is all still possible and a matter of resources... BUT Intel wants it to be financially feasible and possibly even wants to make money with it...
THAT might be something that is actually impossible... it'll either be slower or cost more than a conventional GPU. You can't beat a fixed-function processor at its own game with a general-purpose processor on the same transistor budget...
It's like expecting an SUV, which can drive anywhere and is sort of a general-purpose car, to beat an F1 car, which can only drive on very flat roads but does so at insane speeds. Now you either build an insane SUV mutant at insane cost that can drive as fast as an F1 car on a racetrack but can ALSO drive everywhere else, or you lose the race on an F1 track... obviously...
I'm confused how Intel thought this was going to work... With their manufacturing advantage up their sleeve it might work out, but apparently they want to use the cheap old 45nm fabs for LRB once CPUs have moved to 32nm? 0_o
Larrabee needs to be compatible with software in some way, and if you develop for Larrabee, that software needs to run on other hardware at reasonable speed.
Intel would need to spend loads of money developing software for Larrabee if it differs too much from existing hardware. As a programmer you don't start developing advanced software if there isn't a market.
If DX12 has ray tracing, and that works on AMD and/or Nvidia GPUs, this could maybe be the entry point for Larrabee into the market.
Raytracing in HD... In your dreams! :ROTF:
They might put something out eventually; some Intel fanbois will love it and say how awesome it is because it has X feature, even though that isn't beneficial for gaming. Everyone else will buy ATI or Nvidia because it will likely be cheaper. Intel's investment in it will be a major loss. Then they'll try and buy Nvidia to make up for their losses lol :P
Well, the short answer is that Intel artificially handicapped themselves by demanding x86 compatibility. ATI and Nvidia are completely free to do whatever they desire with the underlying architectures and modify them at a minute's notice to improve performance, while Intel is stuck supporting every single bad idea introduced into x86 since the 8086.
It is a long-known historical fact that legacy ISAs have lower performance per transistor than completely new ISAs, which get the benefit of learning from the mistakes of previous ISAs.
The most critical thing determining whether Larrabee is a success is how well it performs relative to its competition in the most popular games and applications.
The next most critical thing is what developers do between now and then. If programmers start writing for OpenCL and DX11 Compute Shaders, then it doesn't matter that Larrabee supports x86, because such programs would work on all three manufacturers' GPUs. If programmers are lazy and don't bother learning OpenCL and DX11 CS, then Larrabee's x86 support makes it the easier target to develop for. On the game side of things, if Intel can convince enough game developers to make games with a Larrabee-exclusive feature (say, ray tracing), then it could have an advantage. But if no killer feature captures game devs' and the public's imagination, then it's going to come back to raw performance.
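A minimal sketch (mine, not from the thread) of the kind of vendor-agnostic code being described: the same OpenCL C source gets built at runtime for whichever GPU is present, whether the installed platform reports itself as Intel, AMD, or Nvidia. The kernel name and the host-side enumeration below are just illustration, and assume an OpenCL SDK with the standard CL/cl.h header is installed.
Code:
/* Hypothetical sketch, not from the thread: the same OpenCL C source can be
 * built at runtime for whichever GPU is installed (Intel, AMD, or NVIDIA),
 * which is the portability point made above. Assumes an OpenCL SDK is present. */
#include <stdio.h>
#include <CL/cl.h>

/* A trivial vendor-agnostic kernel: scale a buffer of floats in place. */
static const char *kSource =
    "__kernel void scale(__global float *data, float factor) {\n"
    "    size_t i = get_global_id(0);\n"
    "    data[i] *= factor;\n"
    "}\n";

int main(void) {
    cl_platform_id platforms[8];
    cl_uint count = 0;

    /* Enumerate whatever OpenCL implementations are installed. */
    if (clGetPlatformIDs(8, platforms, &count) != CL_SUCCESS || count == 0) {
        fprintf(stderr, "No OpenCL platforms found\n");
        return 1;
    }
    for (cl_uint i = 0; i < count; ++i) {
        char name[256] = {0};
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof(name), name, NULL);
        /* The same kSource string would be handed to clCreateProgramWithSource()
         * and clBuildProgram() for a device on any of these platforms. */
        printf("Platform %u: %s\n", i, name);
    }
    (void)kSource;
    return 0;
}
Nothing in that source is tied to one vendor's hardware; the same would hold for a compute shader written against the D3D11 API.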
You do have a point there; if Nvidia can convince game developers to use PissX, Intel won't have too much trouble either getting them to work on some LRB-exclusive features...
I also remember some rambling Charlie did last year about being convinced that the PS4 was going to use LRB. If that pans out it will be a HUGE win for Intel...
+1
It would have been easier for Intel to just buy Nvidia and go for the x86 GPU at a later stage. The question remains: can Intel still buy Nvidia, or can't they? :shrug:
Initially the x86 GPU will be back-breaking, but in the long term not so much; Intel has already shown how the thing scales with more cores. The base just needs to be stable so that the future is secure. The future seems more like a Crysis 5 Holographic Edition than anything like Crysis 5 now with ray tracing. 3D holograms need lots of processing power, something an x86 GPU will be very, very good at! :yepp: That is, if it gets launched. :shakes:
Intel bought Havok in '07, as well as purchasing a game studio that is making Project Offset.
I can see Intel releasing the game with advanced Larrabee support, with kickass physics all done on Larrabee.
http://www.projectoffset.com/index.p...d=58&Itemid=12
Project Offset, it's a nice-looking game.
What can I say? Two years ago Larrabee was on most people's lips, minds and fingers as the GPU messiah, walking over ATI's and Nvidia's hopes of survival. Now it looks exactly the opposite of how it looked back then...
I'm sure you'll see it as a portable solution at some IT convention, playing an HD movie, with a CEO bragging about it, proud of how well it can play HD movies, as if they had reinvented High Definition.
Fiasco imo; by the time it finally launches, Nvidia and ATI will be way too far ahead for anyone to even consider buying a Larrabee... I could be wrong though.
I think that if they want it to be competitive, they will have to release it by the end of this year... maybe early next year.
Performance will be based on how well it clocks and how many cores they can add without it getting too power hungry... which is the real question...
I think we don't need to worry about LRB's raw power.
Quote:
Gamasutra: How do you see Larrabee fulfilling those promises of flexibility and freedom?
Mike Burrows (Intel's Senior Graphics Software Architect Manager): Just in terms of raw computing power and development flexibility, it blows the socks off anything else I'm aware of. I don't think I can say any more than that without tripping over NDA issues.
imo
Larrabee will come, and it will only be the start of things, regardless of whether it can compete with current GPUs or whether it will talk to certain software.
Larrabee is part of SoC. Once the entire system is on chip, it won't even matter what else you hook up to it; you're still going to pay for the whole system.
Chipsets, or at least a major part of them, are already moving on-chip with the P55/P57 platform and the 1156 socket, which boxes out quite a bit of competition in the chipset market, and graphics will be next. The enthusiast and gamer will still be able to use a dedicated GPU, but the mainstream won't need to, and corporate will be happy too.
Good idea, but I foresee it hurting future competition: the bursts that spur development, and the price wars the end user usually benefits from.
http://prod.itzbig.com/jobs/santa_cl...tel/69764.html
Quote:
Intel is looking for a LRB3 uArch - RTL Engineer in Santa Clara, CA
Company: Intel
JobID: 563020 - LRB3 uArch - RTL Engineer
Category: Hardware & Electronics Design/Engineering - Junior (Mid-level, Experienced)
Location: Santa Clara, CA
Description
Position: LRB3 uArch - RTL Engineer
Responsibilities and Details
Description
As a member of the micro-architecture team in Intel's Enterprise Microprocessor Group (EMG), you will be responsible for defining and implementing portions of processors for graphics computing applications. Designs typically target high frequencies in a power-constrained, schedule-driven and product cost-conscious environment. Your responsibilities could lie in a processor's execution core, caches, or system interface, and would include proposing innovative solutions to micro-architecture design challenges, specifying or co-developing performance models, specifying or running performance simulations, drawing actionable conclusions from the simulation results, cooperating with circuit designers to complete physical feasibility studies, authoring micro-architecture design specifications, developing high-level or Register-Transfer-Level (RTL) modeling methodology, coding high-level or RTL models, authoring validation plans, contributing to functional and performance validation, supporting physical design (e.g. to meet timing goals), participating in silicon debug, and possibly leading a small team of engineers through any of the above. You will work in a team-oriented environment and would have to interact with engineers from other design disciplines and other groups.
Qualifications
You should possess a relevant educational qualification.
Additional qualifications include:
- Strong understanding of advanced computer architecture and micro-architecture design concepts spanning both core and system interconnect design
- Some processor design or chipset design experience would be an added advantage
- Knowledge and background in graphics hardware design, high speed industry standard I/O design and a working knowledge of physical design (for example, typical implementation styles, wire delay constraints) would be an added advantage
Job Category Engineering
Location USA-California, Santa Clara
Full/Part Time Full Time
Job Type Recent College Graduate
Regular/Temporary Regular
Yep. We discussed this a while back and I also believe that Seth & Co's Project Offset will be Larrabee's golden egg.
Glad more people see this coming! :up:
PS: whoever said "they'll need to learn to write new software for it (Larrabee)" doesn't understand that Larrabee is x86...
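For contrast with the OpenCL sketch earlier, here is a hypothetical illustration of that x86 point (mine, not from the thread): ordinary C written for an x86 target needs no GPU-specific language, so the claim is that code like this would only need a recompile for Larrabee's x86 cores. Whether a plain scalar loop would actually run fast there, versus having to use Larrabee's vector units, is an open assumption.
Code:
/* Hypothetical illustration, not from the thread: ordinary C written for an
 * x86 target needs no GPU-specific language; the claim above is that such code
 * would only need a recompile for Larrabee's x86 cores. Whether this scalar
 * loop would be fast there without using Larrabee's vector units is left open. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Alpha-blend two 8-bit channels: out = (a*alpha + b*(255-alpha)) / 255 */
static void blend(const uint8_t *a, const uint8_t *b, uint8_t *out,
                  size_t n, unsigned alpha) {
    for (size_t i = 0; i < n; ++i)
        out[i] = (uint8_t)((a[i] * alpha + b[i] * (255u - alpha)) / 255u);
}

int main(void) {
    uint8_t a[4] = {0, 64, 128, 255};
    uint8_t b[4] = {255, 255, 255, 255};
    uint8_t out[4];
    blend(a, b, out, 4, 128);
    for (size_t i = 0; i < 4; ++i)
        printf("%u ", out[i]);
    printf("\n");
    return 0;
}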