
Thread: Anandtech: Lucid Hydra 100 - Enabling SLI/CrossFire on Any Platform

  1. #1
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658

    Anandtech: Lucid Hydra 100 - Enabling SLI/CrossFire on Any Platform

    http://anandtech.com/tradeshows/showdoc.aspx?i=3379

    Wow, a new type of multi-GPU interface to make CF/SLI obsolete?

    Basically it uses a more efficient way of load balancing between GPUs, which enables it to achieve near-linear scaling with additional GPUs, or so they claim.

    Lucid, with their Hydra Engine and the Hydra 100 chip, are going in a different direction. With a background in large data set analysis, these guys are capable of intercepting the DirectX or OpenGL command stream before it hits the GPU, analyzing the data, and dividing up the scene at an object level. Rather than rendering alternating frames, or screens split on the horizontal, this part is capable of load balancing things like groups of triangles that are associated with a single group of textures and sending these tasks to whatever GPU it makes the most sense to render on. The scene is composited after all GPUs finish rendering their parts and send the data back to the Lucid chip.
    Thoughts? On paper, at least, it looks very promising!

    EDIT - New article @ Anandtech, a more detailed look at Hydra 100: http://anandtech.com/video/showdoc.aspx?i=3385
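
    A rough way to picture the object-level load balancing described above (purely my own hypothetical sketch, not Lucid's actual algorithm; DrawGroup, Gpu and dispatch_frame are made-up names):

    Code:
    # Hypothetical sketch of object-level load balancing, NOT Lucid's real code.
    # Assumes the intercepted command stream has already been grouped into
    # batches of triangles that share a texture set, as the article describes.
    from dataclasses import dataclass, field

    @dataclass
    class DrawGroup:
        triangles: int        # rough cost proxy for this batch
        textures: frozenset   # texture set the batch depends on

    @dataclass
    class Gpu:
        name: str
        queued_cost: int = 0
        work: list = field(default_factory=list)

    def dispatch_frame(groups, gpus):
        """Greedily hand each draw group to the least-loaded GPU."""
        for group in sorted(groups, key=lambda g: g.triangles, reverse=True):
            target = min(gpus, key=lambda g: g.queued_cost)
            target.work.append(group)
            target.queued_cost += group.triangles
        return gpus  # each GPU renders its share; the chip composites the results

    # Example: three batches spread across two GPUs
    gpus = dispatch_frame(
        [DrawGroup(9000, frozenset({"rock"})),
         DrawGroup(4000, frozenset({"water"})),
         DrawGroup(5000, frozenset({"sky"}))],
        [Gpu("GPU0"), Gpu("GPU1")])
    for gpu in gpus:
        print(gpu.name, gpu.queued_cost)  # GPU0 9000, GPU1 9000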
    Last edited by Epsilon84; 08-22-2008 at 08:59 PM.

  2. #2
    Banned
    Join Date
    Nov 2005
    Location
    Acreageville, Alberta
    Posts
    1,411
    Bring it on! I am sick of NV holding SLI to ransom so they can peddle their crappy wares. I won't even go SLI on X58 if there is an NV bridge chip on it. If the PCI-E is going through an NV chip I won't buy it at all.

  3. #3

  4. #4
    Xtreme Addict
    Join Date
    Mar 2008
    Location
    川崎市
    Posts
    2,076
    1: I'll believe it when it's available in retail.
    2: I'll believe it when it's available in retail.


    Seriously, it's one of those things that sounds too good to be true.

  5. #5
    Assistant Administrator systemviper's Avatar
    Join Date
    Nov 2007
    Location
    Newtown, CT
    Posts
    2,875
    I love it; it would make motherboards so much better (fingers crossed)
    HWbot - Team: XtremeSystems
    XS cruncher - Team: XtremeSystems
    OCN Feedback
    HEAT


    *** Being kind is sometimes better than being right.

  6. #6
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    Damn, I must have missed that thread.

    Well, more discussion is welcome in this thread, if the mods allow.

  7. #7
    Xtreme Legend
    Join Date
    Sep 2002
    Location
    Finland
    Posts
    1,707
    I'll go see it live today, let's see...
    Favourite game: 3DMark
    Work: Muropaketti.com - Finnish hardware site
    Views and opinions about IT industry: Twitter: sampsa_kurri

  8. #8
    Xtreme Enthusiast
    Join Date
    Jan 2005
    Location
    Frederick, MD
    Posts
    513
    will this thing microstutter lol?
    Core i5 750 3.8ghz, TRUE 120 w/Panaflo M1A 7v
    ASRock P55 Deluxe
    XFX 5870
    2x2GB GSkill Ripjaw DDR3-1600
    Samsung 2233RZ - Pioneer PDP-5020FD - Hyundai L90D+
    Raptor WD1500ADFD - WD Caviar Green 1.5TB
    X-FI XtremeMusic w/ LN4962
    Seasonic S12-500
    Antec P182

  9. #9
    Registered User
    Join Date
    Jan 2007
    Posts
    24
    As sweet as this sounds, we're never going to see it, because one of two things will happen:
    A: performance is much worse than with the respective SLI/CF chipset
    B: it is the same or better and NVIDIA/ATI sues them into the ground.
    Q6600(w/ nautilus 500)@3.2ghz
    P35 DS3r rev.1(w/ jing ting on NB)
    Buffalo Firestix PC2-6400 CL5
    PCS 4870 512MB
    X-FI xtrememusic
    150GB raptor
    BFG 800w
    Samsung SH-S183L

  10. #10
    Coat It with GOOOO
    Join Date
    Aug 2006
    Location
    Portland, OR
    Posts
    1,608
    I believe this may be Intel's way of enabling dual GPU solutions for Larrabee, but this is my own personal opinion and not coming from anything I've seen internally.
    Main-- i7-980x @ 4.5GHZ | Asus P6X58D-E | HD5850 @ 950core 1250mem | 2x160GB intel x25-m G2's |
    Wife-- i7-860 @ 3.5GHz | Gigabyte P55M-UD4 | HD5770 | 80GB Intel x25-m |
    HTPC1-- Q9450 | Asus P5E-VM | HD3450 | 1TB storage
    HTPC2-- QX9750 | Asus P5E-VM | 1TB storage |
    Car-- T7400 | Kontron mini-ITX board | 80GB Intel x25-m | Azunetech X-meridian for sound |


  11. #11
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    I don't think people understand the magnitude of this chip. If Intel is able to harness this technology, it will not need to buy the Nvidia chip to allow SLI on its motherboards.

    Ideally, the universality of the project means Larrabee, Rx00 and GTx00 will be supported.

    Perkam

  12. #12
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Calgary, AB
    Posts
    2,219
    Whoever thought of this is going to be very rich if it works.
    MB Reviewer for HWC
    Team OCX Bench Team

  13. #13
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by Eldonko View Post
    Whoever thought of this is going to be very rich if it works.
    Or get sued...and then purchased by...nVidia!
    or Intel
    Stop looking at the walls, look out the window

  14. #14
    Xtreme Enthusiast
    Join Date
    Jun 2005
    Posts
    525
    I would imagine that it will be blocked through drivers, much in the same way that Nvidia blocks SLI... if there is no Nvidia SLI chip, then disable video card 1...

    I hope there is no way to block this new chip, and we can use whatever we want on any motherboard... wonder if you could multi-GPU different manufacturers?... one Nvidia and one Larrabee... and maybe a third ATI card to "mix" things up a bit...

  15. #15
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Silicon Valley, California
    Posts
    1,261
    Saw it live at the IDF exhibit last night.

    Basically the current SLI method is like a 2-4 stage pipeline with very frequent stalling and flushing overhead, so the scalability is far from linear. Hydra does it by distributing tasks per frame to different GPUs instead of different frames to different GPUs. How they divide and arbitrate the tasks is, of course, proprietary.

    The chip itself sits between the NB and the GPUs, taking one PCI Express x16 link.

    One of my biggest concerns, technically, is choking of the PCI Express bandwidth for those ultra-powerful GPUs. The chip takes in 16 lanes and divides them up to either 2 GPUs x 8 lanes each or 4 GPUs x 4 lanes each, if I remember correctly.

    The other disappointment is that the chip is designed to distribute graphical tasks using better algorithms, so its application to GPU data crunching (CUDA, Larrabee, etc.) is limited.

    The chip runs at only 5W, so that's a big positive compared to those incompetent Nvidia dumb overclocking phucks. Wasn't hard to see Nvidia imploding when they went down the path of the P4...

    All in all, they're sampling to a bunch of mobo vendors and reviewers right now, with the product expected to be on the market in 1H 2009.
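
    To put rough numbers on the bandwidth concern (my own back-of-envelope math assuming PCI Express 2.0 signalling, i.e. 5 GT/s per lane with 8b/10b encoding, about 500 MB/s per lane per direction; not figures from Lucid):

    Code:
    # Back-of-envelope per-GPU bandwidth for the lane splits described above.
    # Assumes PCIe 2.0: ~500 MB/s per lane per direction after 8b/10b encoding.
    PER_LANE_GB_S = 0.5

    for gpus, lanes_each in [(1, 16), (2, 8), (4, 4)]:
        bw = lanes_each * PER_LANE_GB_S
        print(f"{gpus} GPU(s) at x{lanes_each}: {bw:.1f} GB/s per GPU, each direction")
    # 1 GPU at x16: 8.0 GB/s; 2 GPUs at x8: 4.0 GB/s each; 4 GPUs at x4: 2.0 GB/s each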
    Athlon 64 3200+ | ASUS M2A-VM 0202 | Corsair XMS2 TWIN2X2048-6400 | 3ware 9650SE 4LPML | Seasonic SS-380HB | Antec Solo
    Core 2 Quad Q6600 @ 3.0GHz | ASUS P5WDG2-WS Pro 1001 | Gigabyte 4850HD Silent | G.Skill F2-6400PHU2-2GBHZ | Samsung MCCOE64G5MPP-0VA SLC SSD | Seasonic M12 650 | Antec P180
    Core i7-2600K @ 4.3 GHz @ 1.30V | ASUS P8P67 Pro | Sparkle GTX 560 Ti | G.Skill Ripjaw X F3-12800CL8 4x4GB @ 933MHz 9-10-9-24 2T | Crucial C300 128GB | Seasonic X750 Gold | Antec P183


    Quote Originally Posted by Shintai View Post
    DRAM production lines are simple and extremely cheap in an ultra low profit market.

  16. #16
    the jedi master
    Join Date
    Jun 2002
    Location
    Manchester uk/Sunnyvale CA
    Posts
    3,884
    Quote Originally Posted by Blauhung View Post
    I believe this may be Intel's way of enabling dual GPU solutions for Larrabee, but this is my own personal opinion and not coming from anything I've seen internally.
    Wouldn't that just add latency when you don't need it, if they opened up the drivers?

    Intel won't need a "bridging chip" for its own GPU solution... will they?
    Got a problem with your OCZ product....?
    Have a look over here
    Tony AKA BigToe


    Tuning PC's for speed... Run what's fast, not what you think is fast

  17. #17
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    City of Lights, The Netherlands
    Posts
    2,381
    Quote Originally Posted by Blauhung View Post
    I believe this may be Intel's way of enabling dual GPU solutions for Larrabee, but this is my own personal opinion and not coming from anything I've seen internally.
    Larrabee is already a lot better suited to multi-GPU than ATI's or NVIDIA's cards are, and that's mainly because of the way it renders. It still does need an interconnect between the cards, but nothing too outlandish, and a board that has an X58 chipset on it should do the trick. That's just my guess from what I've read though, and I'm no expert on these things.
    "When in doubt, C-4!" -- Jamie Hyneman

    Silverstone TJ-09 Case | Seasonic X-750 PSU | Intel Core i5 750 CPU | ASUS P7P55D PRO Mobo | OCZ 4GB DDR3 RAM | ATI Radeon 5850 GPU | Intel X-25M 80GB SSD | WD 2TB HDD | Windows 7 x64 | NEC EA23WMi 23" Monitor |Auzentech X-Fi Forte Soundcard | Creative T3 2.1 Speakers | AudioTechnica AD900 Headphone |

  18. #18
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326
    Quote Originally Posted by vitaminc View Post
    One of my biggest concerns, technically, is choking of the PCI express bandwidth for those ultra powerful GPUs. The chip takes in 16 lanes and divide up the task to either 2 GPUs x 8 lanes each or 4 GPUs x 4 lanes each if I remembered correctly.
    PCI Express in CrossFire/SLI is also used for data exchange between the GPUs. With this architecture you don't need that, because apparently the work is distributed between the GPUs beforehand.
    Since the chip is coming embedded on motherboards (and high-end ones of course), it will only use PCI Express 2.0+, and that's 16 GB/s of bandwidth. There's no way a 3D game would use more than 16 GB/s of bandwidth for the graphics card, not in the next few years. That would be like transferring the whole game to the graphics cards every second... if your hard drive / optical media ever allowed it.


    Quote Originally Posted by vitaminc View Post
    The other disappointment is that the chip is designed to distribute graphical tasks using better algorithms so its application to GPU data crunching (CUDA, Larabee, etc) is limited.
    Well, they do market the chip for gaming and professional rendering... I don't really see the point in using it for data crunching.




    And how about merging the two threads about Lucid?

  19. #19
    Xtreme Enthusiast
    Join Date
    Jun 2005
    Posts
    601
    Quote Originally Posted by naokaji View Post
    1: I'll believe it when it's available in retail.
    2: I'll believe it when it's available in retail.


    Seriously, it's one of those things that sounds too good to be true.
    1: +1
    2: +1
    Completely right, too good to be true.
    2600K working at 4.8 GHz so far

    2600k @ 4600 MHz, 1.42v (under water)
    Asus B2 mobo
    4 GB DDR 2400 MHz
    GTX 570 @ 1.063v (910/1820/2001 MHz) (under water)
    1020 W PSU

    Hobie's : Overclocking .... Overclocking .... Overclocking

  20. #20
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063

    Old news..

    Yeah, some of us were talking about this a month ago. I'm with Perkam in my thinking. There was some speculation about whether or not Intel was going to let Nvidia make chipsets for the new platform. I think Intel did this on purpose, so that later, when Nvidia loses its SLI chipsets to the open multi-GPU support this Lucid chip will provide, they can't be seen as exclusionists or as monopolistic.

    This is exactly what will eliminate microstutter and the other problems inherent to multi-GPU.

  21. #21
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    Quote Originally Posted by JohnZS View Post
    Or get sued...and then purchased by...nVidia!
    or Intel
    Get sued for what?

    Besides, Lucid is apparently largely funded by Intel, so whoever sues them is practically suing Intel.

  22. #22
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by JohnZS View Post
    Or get sued...and then purchased by...nVidia!
    or Intel
    Please understand what you're talking about before you make such *lacking in knowledge* statements. Lucid holds 50 patents on this Hydra chip!




  23. #23
    Xtreme Addict
    Join Date
    Aug 2004
    Location
    Minneapolis, MN
    Posts
    2,187
    Sounds amazing and would definitely be worthy of an upgrade. I wonder if, with Larrabee and this, you could really put a lot of GPUs in one machine.

  24. #24
    Banned
    Join Date
    Feb 2008
    Posts
    696
    I give this about a -0.5% chance of actually working, at least with ATI/NV GPUs. If anyone knows their products, it's the GPU vendors. If they can't get more than the scaling they get now, there's no way someone who is abstracted from the hardware/drivers by one level will be able to do any better. Not to mention there are always particulars in each vendor's implementation which can't just seamlessly be made to work together. Then what happens in games where elements rendered in one frame are read in the next? The same thing that happens now... no scaling.

    The only hacky workaround they could use would add a ton of latency to the process, and no one wants to sacrifice responsiveness for a higher framerate. This also sounds like a recipe for microstuttering... mismatched cards with different amounts of processing power, spanning multiple vendors and generations? One frame renders in 5 ms while the next takes 25 ms on an older card?
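
    To illustrate the microstutter point with the 5 ms / 25 ms figures above (numbers are purely illustrative):

    Code:
    # Illustrative only: an average fps figure hides uneven frame delivery.
    frame_times_ms = [5, 25] * 10   # fast card / slow card alternating
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    print(f"average: {avg_fps:.0f} fps")   # ~67 fps, looks fine on paper
    print(frame_times_ms[:4])              # but frames land 5, 25, 5, 25 ms apart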
    Last edited by Sr7; 08-20-2008 at 09:25 PM.

  25. #25
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Philippines
    Posts
    793
    Hope this one really works.


    Rig Specs
    Intel Core 2 Extreme QX9650 4.0ghz 1.37v - DFI Lanparty UT P35 TR2 - 4x1GB Team Xtreem DDR2-1066 - Palit 8800GT Sonic 512MB GDDR3 256-bit
    160GB Seagate Barracuda 7200RPM SATA II 8MB Cache - 320GB Western Digital Caviar 7200RPM SATA II 16MB Cache - Liteon 18X DVD-Writer /w LS
    640GB Western Digital SE16 7200RPM SATA II 16MB Cache - Corsair HX 620W Modular PSU - Cooler Master Stacker 832
    Auzen 7.1 X-Plosion - Zalman ZM-DS4F - Sennheiser HD212 Pro - Edifier M2600



    Custom Water Cooling
    Dtek Fusion Extreme CPU Block - Swiftech MCR-220 - Swiftech MCP655-B - Swiftech MCRES-MICRO Reservoir - 7/16" ID x 5/8" OD Tubing
    Dual Thermaltake A2018s 120mm Blue LED Smart fans.


    www.mni-photography.site88.net
