Page 1 of 10
Results 1 to 25 of 247

Thread: Lucid Hydra 200: Vendor Agnostic Multi-GPU, Available in 30 Days

  1. #1
    Xtreme Enthusiast
    Join Date
    Jun 2006
    Location
    stuck in Bloomington, IN right now
    Posts
    800

    Lucid Hydra 200: Vendor Agnostic Multi-GPU, Available in 30 Days

    http://www.anandtech.com/video/showdoc.aspx?i=3646

    A year ago Lucid announced the Hydra 100: a physical chip that could enable hardware multi-GPU without any pesky SLI/Crossfire software, game profiles or anything like that.

    At a high level, what Lucid's technology does is intercept OpenGL/DirectX commands from the CPU to the GPU and load balance them across any number of GPUs. The final buffers are read back by the Lucid chip and sent to the primary GPU for display.

    The technology sounds flawless. You don't need to worry about game profiles or driver support, you just add more GPUs and they should be perfectly load balanced. Even more impressive is Lucid's claim that you can mix and match GPUs of different performance levels. For example you could put a GeForce GTX 285 and a GeForce 9800 GTX in parallel and the two would be perfectly load balanced by Lucid's hardware; you'd get a real speedup. Eventually, Lucid will also enable multi-GPU configurations from different vendors (e.g. one NVIDIA GPU + one AMD GPU).
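    The mixed-GPU balancing described above can be sketched in a few lines. This is a purely illustrative greedy scheduler, not Lucid's actual algorithm: intercepted draw batches go to whichever GPU would finish them soonest, weighted by a relative-speed figure. All names and numbers here are assumptions for the sake of the example.

    ```python
    def balance(batches, gpus):
        """batches: list of (batch_id, cost); gpus: dict of name -> relative speed."""
        queued = {name: 0.0 for name in gpus}       # estimated time already queued per GPU
        assignment = {name: [] for name in gpus}
        for batch_id, cost in batches:
            # pick the GPU that would finish this batch soonest
            best = min(gpus, key=lambda n: queued[n] + cost / gpus[n])
            assignment[best].append(batch_id)
            queued[best] += cost / gpus[best]
        return assignment

    # e.g. a GTX 285 roughly twice as fast as a 9800 GTX (illustrative ratio)
    work = balance([(i, 1.0) for i in range(30)], {"GTX 285": 2.0, "9800 GTX": 1.0})
    ```

    With a 2:1 speed ratio, the faster card ends up with about two thirds of the batches, which is the "real speedup" claim in miniature.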
    DistroWatch - find your flavor of Linux
    Petra's Tech Shop - for all your watercooling needs

  2. #2
    Xtreme Member
    Join Date
    Oct 2006
    Location
    Redding, CA
    Posts
    232
    Wow, it's been ages since the first press release came out. I figured this thing would never materialize.

  3. #3
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    the rebirth of the coprocessor. you gotta love it.

    $36 a chip, are you for rizzle?

  4. #4
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Birmingham AL.
    Posts
    1,079
    Been saving this quote for quite a while. I'm gonna have to break it out now, just as he said to do.

    Quote Originally Posted by GAR View Post
    You can QUOTE me on this, THIS WILL NEVER BE A SUCCESS. As a matter of fact this will never even be mass produced. I say it will dissapear from the face of this world very shortly.
    Particle's First Rule of Online Technical Discussion:
    As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.

    Rule 1A:
    Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.

    Rule 2:
    When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.

    Rule 2A:
    When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.

    Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!

  5. #5
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Texas
    Posts
    1,663
    Quote Originally Posted by G0ldBr1ck View Post
    been saving this quote for quite a while. Im gonna have to break it out now just as he said to do.
    It seems GAR underestimated the power of Intel.

    Let me be the first to add the obligatory "Now pair this with two AMD 5870 GPUs" remark. Lucid + AMD + Intel = ménage à trois of PWNAGE.
    Core i7 2600K@4.6Ghz| 16GB G.Skill@2133Mhz 9-11-10-28-38 1.65v| ASUS P8Z77-V PRO | Corsair 750i PSU | ASUS GTX 980 OC | Xonar DSX | Samsung 840 Pro 128GB |A bunch of HDDs and terabytes | Oculus Rift w/ touch | ASUS 24" 144Hz G-sync monitor

    Quote Originally Posted by phelan1777 View Post
    Hail fellow warrior albeit a surat Mercenary. I Hail to you from the Clans, Ghost Bear that is (Yes freebirth we still do and shall always view mercenaries with great disdain!) I have long been an honorable warrior of the mighty Warden Clan Ghost Bear the honorable Bekker surname. I salute your tenacity to show your freebirth sibkin their ignorance!

  6. #6
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Finally..!

    Been waiting on more info about MSI's Big Bang and Lucid's Hydra chip...

    What more needs to be said, no more SLI...

    From the moment I heard about this technology and company (15 months ago), I've been a close follower. Just looking at the people who are on the board @ Lucid gave me hope this Hydra chip was for real.

    I can't wait for the bench reviews, but this looks pleasing..

  7. #7
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    this is multi-gpu blasphemy!!

  8. #8
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    I wonder if there will be any appreciable latency with an added middleman chip?
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  9. #9
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Bayamon,PR
    Posts
    257
    Mmmm, no love for AMD boards eh? T_T

  10. #10
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Texas
    Posts
    1,663
    Quote Originally Posted by LC_Nab View Post
    Mmmm , No love for AMD boards eh ? T_T
    +1

    Lucid has to get these on AMD boards or we'll cry "bloody murder!" Intel has their fingers in Lucid, but I hope they wouldn't deprive AMD users of this man-made miracle.

    I don't think computing life will ever be the same after this. Today has been a momentous day in the world of hardware between AMD's 5870, Lucid's Hydra, and Intel's IDF. I will sleep with a smile on my face.

  11. #11
    Xtreme Addict
    Join Date
    Nov 2005
    Location
    Where the Cheese Heads Reside
    Posts
    2,173
    Why would they need 2 VGA connectors hooked up?

    BTW, thumbs up. Glad they moved forward with this, and I'm sure Intel is helping push this all the way as well.
    -=The Gamer=-
    MSI Z68A-GD65 (G3) | i5 2500k @ 4.5Ghz | 1.3875V | 28C Idle / 65C Load (LinX)
    8Gig G.Skill Ripjaw PC3-12800 9-9-9-24 @ 1600Mhz w/ 1.5V | TR Ultra eXtreme 120 w/ 2 Fans
    Sapphire 7950 VaporX 1150/1500 w/ 1.2V/1.5V | 32C Idle / 64C Load | 2x 128Gig Crucial M4 SSD's
    BitFenix Shinobi Window Case | SilverStone DA750 | Dell 2405FPW 24" Screen
    -=The Server=-
    Synology DS1511+ | Dual Core 1.8Ghz CPU | 30C Idle / 38C Load
    3 Gig PC2-6400 | 3x Samsung F4 2TB Raid5 | 2x Samsung F4 2TB
    Heat

  12. #12
    Xtreme Guru
    Join Date
    Aug 2009
    Location
    Wichita, Ks
    Posts
    3,887
    As I understand it, Lucid is now owned by a subsidiary of Intel, or at least funded by them indirectly somehow, which is great, because this is the best thing since the GPU!!!!
    My next rig...
    sandy bridge
    sata 6gb/s ssd
    AND Lucid Hydra, doesn't really matter which GPU now!
    "Lurking" Since 1977


    Jesus Saves, God Backs-Up
    *I come to the news section to ban people, not read complaints.*-[XC]Gomeler
    Don't believe Squish, his hardware does control him!

  13. #13
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Vancouver
    Posts
    1,073
    Quote Originally Posted by Xoulz View Post
    Finally..!

    Been waiting on more info about MSI's Big Bang and Lucid's Hydra chip...

    What more needs to be said, no more SLI...

    From the moment I heard about this technology and company (15 months ago), I've been a close follower. Just looking at the people who are on the board @ Lucid gave me hope this Hydra chip was for real.

    I can't wait for the bench reviews, but this looks pleasing..
    I'm confused about the 4x and 8x PCIe on a P55... since the CPU only has 16 PCIe lanes dedicated to graphics... but the Hydra is issuing instructions via 32 lanes..?
    " Business is Binary, your either a 1 or a 0, alive or dead." - Gary Winston ^^



    Asus rampage III formula,i7 980xm, H70, Silverstone Ft02, Gigabyte Windforce 580 GTX SLI, Corsair AX1200, intel x-25m 160gb, 2 x OCZ vertex 2 180gb, hp zr30w, 12gb corsair vengeance

    Rig 2
    i7 980x ,h70, Antec Lanboy Air, Samsung md230x3 ,Saphhire 6970 Xfired, Antec ax1200w, x-25m 160gb, 2 x OCZ vertex 2 180gb,12gb Corsair Vengence MSI Big Bang Xpower

  14. #14
    Xtreme Addict
    Join Date
    Jul 2005
    Location
    ATX
    Posts
    1,004
    Quote Originally Posted by villa1n View Post
    I m confused on the 4X the 8x pcie on a p55... since the cpu only has 16 pci-e lanes dedicated to gfx...but the hydra is issuing instructions via 32 lanes..?
    I think this quote from the article will help:

    And the highest end solution, the one being used on the MSI board, has a x16 to the CPU and then a configurable pair of x16s to GPUs. You can operate this controller in 4 x8 mode, 1 x16 + 2 x8 or 2 x16. It's all auto sensing and auto-configurable.
    The way I read it is that the chip communicates with the CPU over a single x16 link; it then communicates with the GPUs in a 4 x8, 1 x16 + 2 x8, or 2 x16 configuration.

    DAMN THIS IS COOL
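    A quick back-of-the-envelope check of the quoted figures: all three downstream modes add up to the same 32 lanes, while the upstream link to the CPU is a single x16. Assuming PCIe 2.0 at roughly 500 MB/s per lane per direction (an approximation, not a number from the article):

    ```python
    MB_PER_LANE = 500  # approx. PCIe 2.0 per-lane bandwidth, each direction, MB/s

    configs = {
        "4 x8":         [8, 8, 8, 8],
        "1 x16 + 2 x8": [16, 8, 8],
        "2 x16":        [16, 16],
    }
    upstream_lanes = 16  # the single x16 link back to the CPU

    downstream = {name: sum(lanes) for name, lanes in configs.items()}
    for name, total in downstream.items():
        print(f"{name}: {total} lanes, {total * MB_PER_LANE / 1000:.0f} GB/s to GPUs")
    print(f"upstream: {upstream_lanes * MB_PER_LANE / 1000:.0f} GB/s to the CPU")
    ```

    So the chip really does offer more aggregate downstream lanes than it has upstream, which is exactly the asymmetry the question below is about.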

  15. #15
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Vancouver
    Posts
    1,073
    Quote Originally Posted by m0da View Post
    I think this quote from the article will help:



    The way I read it is that the chip communicates with CPU with a single x16 lane; It subsequently communicates with the GPUs with a 4x8, 2x16, or 1x16 interface.

    DAMN THIS IS COOL
    I read that, but that's what I'm asking: if it communicates with the CPU at PCIe x16... doesn't that necessarily limit the amount of instructions it can send to the GPUs to 16 lanes' worth, i.e. 1 x16, 2 x8, or 4 x4?
    " Business is Binary, your either a 1 or a 0, alive or dead." - Gary Winston ^^



    Asus rampage III formula,i7 980xm, H70, Silverstone Ft02, Gigabyte Windforce 580 GTX SLI, Corsair AX1200, intel x-25m 160gb, 2 x OCZ vertex 2 180gb, hp zr30w, 12gb corsair vengeance

    Rig 2
    i7 980x ,h70, Antec Lanboy Air, Samsung md230x3 ,Saphhire 6970 Xfired, Antec ax1200w, x-25m 160gb, 2 x OCZ vertex 2 180gb,12gb Corsair Vengence MSI Big Bang Xpower

  16. #16
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Chumbucket843 View Post
    the rebirth of the coprocessor. you gotta love it.
    that's not really what this is... it's a smart PLX chip, sort of... but I would not call it a co-processor...

  17. #17
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Personally I don't think this will do any good for using similar cards compared to existing SLI and Crossfire. Using Radeon and GeForce cards at the same time... hmm, dunno, probably not worth it...
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  18. #18
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Nvidia will disable support for this in their drivers, sooner or later.

  19. #19
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by villa1n View Post
    I read that, but thats what i m asking, if it communicates to the cpu at pci-e 16... doesnt that necessarily limit the amount of instructions it can send to the gpus at 16lanes.. ie 1x16, 2x8, or 4x4?
    You are not going to saturate a PCIe x16 connection with a Hydra chip.

    Its own technology talks to as many GPUs as you want given those lanes... how the PCIe bus works doesn't change... that's the function of the Hydra chip. Seamless!
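    The "you won't saturate x16" claim passes a rough sanity check. The numbers below are illustrative estimates, not measurements: a PCIe 2.0 x16 link moves roughly 8 GB/s in one direction, and at 60 fps that is a per-frame budget far above typical command-stream plus dynamic-upload traffic (textures normally stay resident in VRAM).

    ```python
    x16_gbps = 8.0   # approx. PCIe 2.0 x16 bandwidth, one direction, GB/s
    fps = 60
    budget_mb = x16_gbps * 1000 / fps   # per-frame transfer budget, ~133 MB

    # generous guess at per-frame command streams plus dynamic
    # vertex/constant uploads -- an assumption for illustration only
    est_mb = 20

    print(f"budget {budget_mb:.0f} MB/frame, estimate {est_mb} MB/frame "
          f"(~{100 * est_mb / budget_mb:.0f}% of the link)")
    ```

    Even a generous traffic estimate lands well under the upstream budget, which is why splitting that stream across 32 downstream lanes is not a bottleneck in practice.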

  20. #20
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Texas
    Posts
    1,663
    Quote Originally Posted by Calmatory View Post
    Nvidia will disable the ability for this in their drivers, sooner or later.
    You're right. Sad thing is, if Nvidia does this, they will undermine the adoption of their own tech. It would be a horrible business decision on their part.

  21. #21
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by Calmatory View Post
    Nvidia will disable the ability for this in their drivers, sooner or later.
    Really bro.. watch a few videos, or read some stuff. It's been mentioned that the way it works is agnostic to the OS and drivers.

    Here (video): http://www.xtremesystems.org/forums/...d.php?t=235138

  22. #22
    Xtreme Addict
    Join Date
    Jul 2002
    Location
    [M] - Belgium
    Posts
    1,744
    Quote Originally Posted by Xoulz View Post
    is agnostic to the OS and drivers.
    that's what I thought too; there is a software side of things, though, which might be used to limit it


    Belgium's #1 Hardware Review Site and OC-Team!

  23. #23
    Xtreme Addict
    Join Date
    Jan 2007
    Location
    Michigan
    Posts
    1,785
    Just like the disabling of PhysX when an ATI card is present in the system... But I hope they don't, and if Nvidia does, I hope they're unsuccessful.
    Current: AMD Threadripper 1950X @ 4.2GHz / EK Supremacy/ 360 EK Rad, EK-DBAY D5 PWM, 32GB G.Skill 3000MHz DDR4, AMD Vega 64 Wave, Samsung nVME SSDs
    Prior Build: Core i7 7700K @ 4.9GHz / Apogee XT/120.2 Magicool rad, 16GB G.Skill 3000MHz DDR4, AMD Saphire rx580 8GB, Samsung 850 Pro SSD

    Intel 4.5GHz LinX Stable Club

    Crunch with us, the XS WCG team

  24. #24
    Xtreme Member
    Join Date
    Jun 2008
    Posts
    208
    Quote Originally Posted by Mechromancer View Post
    You're right. Sad thing is if Nvidia does this, they will undermine the adoption of their own tech. It will be a horrible business decision on their part.
    They already did this with crappy SLI chipsets on the 6XX and 7XX mobos. X58 is the best thing that ever happened to Nvidia, honestly.

    I hope it is a winner; however, I buy my graphics cards in pairs. It will be interesting to see if there is any performance hit when using it with two of the same cards.

  25. #25
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Guys, as has already been stated, the Hydra operates independently from the OS and thus is not affected by drivers, or at least not much. As such, Nvidia/ATI cannot block it on the software side of things. They may be able to limit it, however. At least this is how it works, to my understanding.
    Last edited by SKYMTL; 09-23-2009 at 08:14 AM.

