Page 2 of 3 FirstFirst 123 LastLast
Results 26 to 50 of 68

Thread: Larrabee - lets get naked!

  1. #26
    Xtreme Member
    Join Date
    Sep 2006
    Location
    Slovenia
    Posts
    209
    Quote Originally Posted by Shintai View Post
    Its a GPGPU. But it will also be used as IGP in a CPU later on. Think something like 4-8 cores in IGP. 8-32 Cores in discrete GFX card. (Larrabee cores).
    K thnx for clearing it up.
    |ASUS Sabertooth Z77|Intel Core I7-2700K|32GB G.Skill TridentX F3-2400C10Q-32GTX|Corsair AX1200W|
    |ASUS GTX TITAN + Zotac GTX680 for PhysX|Samsung S27A950D|Corsair Obsidian 800D|Corsair Hydro Series H100|

    |ASUS Crosshair IV Formula|AMD Phenom II X6 1090T BE|16GB G.Skill RipjawsX F3-12800CL7D-8GBXH|
    |2X AMD Radeon HD6970 Crossfire|TT Kandalf - MOD|Corsair HX1000W|Corsair Hydro Series H70|

  2. #27
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
i think this was really a way for intel to offer an IGP that's not so sucky, so when they sell laptops they make a lot more of the profit on their own. i think this is a great idea for them when it comes to the next step in expanding their reach. how it will do against high-end chips, i really don't care.

  3. #28
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    lol, the die layout is messy? wow, we sure have upped the criteria we use to evaluate hardware

  4. #29
    Xtreme Mentor
    Join Date
    Sep 2006
    Posts
    2,834
    Quote Originally Posted by trinibwoy View Post
    lol, the die layout is messy? wow, we sure have upped the criteria we use to evaluate hardware

    For my part I know nothing with any certainty, but the sight of the stars makes me dream.

    ..

  5. #30
    Banned
    Join Date
    Jun 2008
    Posts
    763
    Quote Originally Posted by Manicdan View Post
i think this was really a way for intel to offer an IGP that's not so sucky, so when they sell laptops they make a lot more of the profit on their own. i think this is a great idea for them when it comes to the next step in expanding their reach. how it will do against high-end chips, i really don't care.
    Yeah, the only problem is that it's a discrete GPU not integrated. That and the fact that we don't really need 32 processing cores on the northbridge...

  6. #31
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    I'd love to see Larrabee come in and smash all the competition. Just means a better graphics card for me.

    Can't really see it though, I've seen very few products be truly quality on attempt #1. Intel could have something up their sleeve though, so who knows.

  7. #32
    Banned
    Join Date
    Jun 2008
    Posts
    763
    Quote Originally Posted by Sly Fox View Post
    I'd love to see Larrabee come in and smash all the competition. Just means a better graphics card for me.

    Can't really see it though, I've seen very few products be truly quality on attempt #1. Intel could have something up their sleeve though, so who knows.
Yeah. I still remember the buzz generated when Intel released its first graphics card. Even in the first reviews the journalists tried to be as "polite" as possible. This is a bad move on their part. If they really wanted to capture the graphics market they should have moved in silence. This way Nvidia and AMD are just holding on and preparing their big cards for the time Larrabee will be released. The true winner will hold on to his big card for the second wave. That will be the true battle: the second version of Larrabee against Nvidia's chips and AMD's offering...

  8. #33
    Xtreme Addict
    Join Date
    Jul 2008
    Location
    US
    Posts
    1,379
    Quote Originally Posted by Katanai View Post
Yeah. I still remember the buzz generated when Intel released its first graphics card. Even in the first reviews the journalists tried to be as "polite" as possible. This is a bad move on their part. If they really wanted to capture the graphics market they should have moved in silence. This way Nvidia and AMD are just holding on and preparing their big cards for the time Larrabee will be released. The true winner will hold on to his big card for the second wave. That will be the true battle: the second version of Larrabee against Nvidia's chips and AMD's offering...
    i740? yeah, what a flop that was! Lots of hype followed shortly after by lots of disappointment.

    --Matt
    My Rig :
    Core i5 4570S - ASUS Z87I-DELUXE - 16GB Mushkin Blackline DDR3-2400 - 256GB Plextor M5 Pro Xtreme

  9. #34
    Moderator
    Join Date
    Mar 2006
    Posts
    8,556
    Quote Originally Posted by Shintai View Post
    LOL! Ye and CUDA and CTM already made the CPU so obsolete. They are both useless.
    I disagree.

  10. #35
    Xtreme Enthusiast
    Join Date
    Aug 2005
    Location
    Melbourne, Australia
    Posts
    942
    From a gamer's perspective, i'm not all that excited. But if it does compete for the top spot, interesting things could happen.
    Q9550 || DFI P45 Jr || 4x 2G generic ram || 4870X2 || Aerocool M40 case || 3TB storage


  11. #36
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by YukonTrooper View Post
Yep, not surprising though. We don't have much else to go on.

  12. #37
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    The layout does not make instant sense. Nor does it make proper layout sense even several passes later.
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  13. #38
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,034
I really don't think Intel will "smash the competition" - not in the gaming market, where ATI and Nvidia have too much experience.

But as a GPGPU (GPCPU? whatever lol) and server processor it is very likely it will be a beast.
    SweClockers.com

    CPU: Phenom II X4 955BE
    Clock: 4200MHz 1.4375v
    Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
    Motherboard: ASUS Crosshair IV Formula
    GPU: HD 5770

  14. #39
    Xtreme Member
    Join Date
    Apr 2008
    Location
    England
    Posts
    209
    Quote Originally Posted by [XC] riptide View Post
    I disagree.
    I think he's being sarcastic

  15. #40
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by acidpython View Post
    I think he's being sarcastic
Not really. Just look at the CUDA/CTM applications... or lack of same.
    Crunching for Comrades and the Common good of the People.

  16. #41
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Donnie27 View Post
    It sucking had NOTHING to do with drivers, it's weak Hardware. I've never had any problems with Intel's IGP Drivers. Or even the integrated ATI rage drivers that Intel shipped, I can't say the same for nVidia, SIS and or VIA
yeah but you're not playing games, are you? their drivers aren't even made by their own engineers but by an external company afaik... their drivers are great for 2d, but 3d... not really... and check the matrox review on 3dcenter where they throw in g45 as well. there is almost no scaling in most games between different resolutions, which looks like very poor to no driver optimizations.

    Quote Originally Posted by Helmore View Post
    Does AMD(ATI) do marketing? I barely knew .
heheheh, yeah it's quite silent these days... but nvidia will open another "can of whop4ss" marketing-wise as soon as they have something new out, you can rely on that

    Quote Originally Posted by AffenJack View Post
    Will it really hurt amd and nvidia? Gt300 and rv870 will hurt high-performance computing with the cpu. Cpus will become less important, so intel really needs a product against it. Intel will get market share, but the market should grow fast with opencl. I don't think Larrabee will be a good choice for gamers, but it will be strong in gpgpu.
that's what nvidia and ati want us to believe... but who is actually using gpgpu, and for what?
    ...
    ...
    ...
    exactly!
nvidia is barely selling any gpgpu stuff and not making much money on it, and amd even less... gpgpu hasn't taken off yet, and i'm not so sure it ever will take off big time... it's another feature hardware makers are trying to tell their customers is great. it's not like customers wanted gpgpu because it's so useful; it's hardware makers that thought they could make extra money with their chips if they added some more instructions and changed the design a bit...

    Quote Originally Posted by AffenJack View Post
    Intel eating nvs market share from the top??? Never heard intel claiming this. All comments i heard are that it's a performance chip, so first it has to beat rv870.
i doubt intel would launch larrabee unless they can come out on top... they want to enter the market, and they want to enter at the top. that's the best strategy for them actually, so i agree on that. they really want it, they've got loads of cash, very good chip fabs, very good engineers... if they want it, they'll get it... it's just a matter of time and money

what intel will probably not be good at is price/perf... but that's not their goal afaik... that's why amd will actually be less impacted by larrabee than nvidia, i think...

    Quote Originally Posted by .Tret View Post
    Where is the proof to that? Intel releases new drivers for their integrated vga about every couple weeks.
the latest game is out, you get it, perf sucks, or there are artifacts... oh, but no problem, you just have to wait a FEW WEEKS for a new driver...
now i'm sure you get the point he made about driver updates not being frequent enough, at least atm...

    Quote Originally Posted by trinibwoy View Post
    lol, the die layout is messy? wow, we sure have upped the criteria we use to evaluate hardware
it looks messy, but that doesn't mean it is messy
i'm sure they had a reason to arrange the cores like that...

    Quote Originally Posted by Katanai View Post
    Yeah. I still remember the buzz generated when Intel released it's first graphics card. Even the first rewievs where the journalists tried to be as "pollite' as possible. This is a bad move coming from their part. If they really wanted to capture the video graphics market they should have moved in silence. This way Nvidia and AMD are just holding on and preparing their big cards for the time Larabee will be released. The true winner will hold on to his big card for the second wave. That will be the true battle: the second version of Larabee against Nvidias chips and AMD's offering...
keeping something like larrabee secret is not possible... period... nvidia and ati would know about it either way, so it would be public, and once it's public you might wanna use pr to get the launch going as well as possible. and intel did an... ok job on that... there are quite some people who want to buy larrabee and think it will be the best thing since sliced bread without actually knowing anything about it, let alone its competing products

  17. #42
    Moderator
    Join Date
    Mar 2006
    Posts
    8,556
    Quote Originally Posted by Shintai View Post
Not really. Just look at the CUDA/CTM applications... or lack of same.
Ya, just look at them. I can use CUDA to speed up my MPEG-2 work, brute-force WEP and WPA keys, and help solve the RC5-72 competition... an order of magnitude faster than a Q6600.

You should try to stay away from absolute statements, Shintai. It makes you look stupid.

  18. #43
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by [XC] riptide View Post
Ya, just look at them. I can use CUDA to speed up my MPEG-2 work, brute-force WEP and WPA keys, and help solve the RC5-72 competition... an order of magnitude faster than a Q6600.

You should try to stay away from absolute statements, Shintai. It makes you look stupid.
RC5 is what... an utter waste. MPEG-2 with worse quality than CPU-based? The current CTM/CUDA encoders ain't exactly quality-minded. Is that all we got after 3-4 years?

Plus what actual useful work can you do? There is so much PR, but yet nothing we really use. PhysX would be the closest. And how is that going?

So if that's the best you can come up with... look again.

It's a no-brainer that Larrabee is an instant killer of CTM and CUDA. Better performance and without all the limitations.
    Crunching for Comrades and the Common good of the People.

  19. #44
    Moderator
    Join Date
    Mar 2006
    Posts
    8,556
    Quote Originally Posted by Shintai View Post
RC5 is what... an utter waste. MPEG-2 with worse quality than CPU-based? The current CTM/CUDA encoders ain't exactly quality-minded. Is that all we got after 3-4 years?

Plus what actual useful work can you do? There is so much PR, but yet nothing we really use. PhysX would be the closest. And how is that going?

So if that's the best you can come up with... look again.

It's a no-brainer that Larrabee is an instant killer of CTM and CUDA. Better performance and without all the limitations.
Irrelevant.

You said CUDA was useless. I say it isn't. "Nothing we really use?" Maybe you should actually get a CUDA-capable card and buy/download a CUDA app, like others who use CUDA in research on behalf of their own companies.

CUDA is highly effective in some niche areas != CUDA is useless.

Only a troll would say CUDA is useless.

  20. #45
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by [XC] riptide View Post
    CUDA is highly effective in some niche areas.
So it's useless for 99.9% of all people?

  21. #46
    Xtreme Member
    Join Date
    May 2005
    Posts
    196
    CoreAVC uses CUDA. The best HD decoder having strong support for Nvidia is pretty significant.
    i5 750 @ 4.2ghz
    EVGA P55 FTW
    8gig G.Skill Ripjaw @ 1055mhz
    Gigabyte 6950 modded
    Seasonic X-650
    Antec P180 modded and watercooled
    Thermochill PA160
    Apogee XT
    MCP350

  22. #47
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
hmmm, well for the average user i think cuda is useless... there aren't really any notable advantages that a cuda or no-cuda vga brings, or are there?
i mean sure, if you happen to encode videos a lot and this badaboom tool or cs4 or whatever works for you... but even then the boost is limited... idk, i'm sceptical about all those attempts to make their cards' value look higher than it actually is for average users...

people go all starry-eyed about what they can do with their new nvidia card, but they never actually use the features... the ones that do usually find out that it's kinda pointless... i remember when i bought my g4mx card it was advertised as having tv out, but the quality sucked; it had tv in, but was too weak to encode properly; it supported 3d glasses, but driver support was a nightmare; it was marketed as a gaming card, but was very slow in games...

isn't it the same with cuda?

  23. #48
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Hornet331 View Post
    So its useless for 99.9% of all people?
    Yeah just like most advanced technology.

    saaya, instead of just blindly talking about stuff you can try doing some research at http://www.hpcwire.com if you really wanted to know who was using GPGPU.

  24. #49
    Banned
    Join Date
    Feb 2009
    Posts
    172
    looking forward to it!

  25. #50
    I am Xtreme
    Join Date
    Jul 2004
    Location
    Little Rock
    Posts
    7,204
    Quote Originally Posted by saaya View Post
yeah but you're not playing games, are you? their drivers aren't even made by their own engineers but by an external company afaik... their drivers are great for 2d, but 3d... not really... and check the matrox review on 3dcenter where they throw in g45 as well. there is almost no scaling in most games between different resolutions, which looks like very poor to no driver optimizations.
yup! I think some of their drivers are made by a 3rd party. But who in the hell uses Intel integrated graphics for games? These games did run for me, but slow as hell. I didn't have driver problems. I upgrade systems for folks as a hobby, not a business. My way of getting them to upgrade is showing them how slow Intel's IGP is. Hell, it would be nice if games did crash more than the one in twenty times I run them. The biggest things these folks want to see run are The Sims and WOW! Not bad or poorly coded games like Crysis. In fact, I've found that even noobs know it's useless trying to upgrade to run it, LOL! Some are showing more common sense than so-called tech-savvy geeks. For the record, I own both Crysis and Warhead.
    Quote Originally Posted by Movieman
    With the two approaches to "how" to design a processor WE are the lucky ones as we get to choose what is important to us as individuals.
    For that we should thank BOTH (AMD and Intel) companies!


    Posted by duploxxx
    I am sure JF is relaxed and smiling these days with there intended launch schedule. SNB Xeon servers on the other hand....
    Posted by gallag
    there yo go bringing intel into a amd thread again lol, if that was someone droping a dig at amd you would be crying like a girl.
    qft!

