lol i'd go with "it's broken" ;)
trinibwoy if you desperately need to find my source just shoot me a PM :)
So, because their tech isn't disseminated, it's "nonsense"..? One would think Intel's involvement and 33 patents are enough... but now you're saying MSI are retards :rolleyes:
It's quite obvious Lucidlogix doesn't want anyone to know how it works... yet!
Lastly, the delay could easily be something as simple as availability..!
or a total lack of interest among mobo makers in going into full production beyond the quantities they got on a super-sweet deal!?
Xoulz, I can see you're into this whole Hydra thing, but honestly, with market reality in mind, do you really see the point of this tech for desktop users?
Mobo makers know very well how many SLI and CFX mobos they sell, how big this multi-card market is, and how much of it would go for mixed red-green systems, especially when there are no comparable (DX11) green cards, and when only a handful of hardcore fanboys are getting a second NV card to play their three PhysX games!
See... if mobo makers need to choose between putting USB 3.0 or a Hydra chip on the board, and raising the price, they'll go for USB 3.0!
Hydra is DOA, and only chance for it is in GPU cloud farms - market that's yet to be created.
My friend, there is no longer any denying it: HYDRA is here. It's agnostic; you don't need SLI or Crossfire. You'll be able to plop down a Larrabee next to your Radeon 5970 and have everything you want. It makes things simple and modular. How can that not be a boon for desktop users..?
Secondly, MSI has exclusive rights to the Hydra chip... for how long, we don't know.
The H200 chip uses about 3 watts and, as the name suggests, is scalable. (Exactly matching video cards are no longer a requirement: you can have a 9800, grab another, then two months later find a great buy on a GTX 275 and have all three in your rig.) Or, later... mix and match across vendors.
How is this poo poo technology relegated to just fanboys or non-desktop users..? :shrug:
Well fellas, this here is much ado about NOTHING. There is nothing there. When they (finally) release it, then there will be something to talk about. Votes for Vapourware of the Year?
Do you have a point? Maybe, to an extent. But as stated so many times before: 33 patents, Intel funding, 4+ years of development. I very much doubt that that is even remotely the issue.
Your logic is sound except for one issue. If you were right, SLI would have been vaporware, Xfire would be too, along with many other enthusiast high-end products on the market. Not everything is aimed at mainstream desktop users, my friend; there is, and always will be, an enthusiast market, and people to push that market, especially for a money-robbing solution like this. If this turns out to be what it's said to be, believe me, money will flow like the waters that surround our continents.
Your logic has the most holes of all the theories of this supposedly failing, and my post is exactly what I was talking about. It's like you guys just want it to die, for no reason, like you think Lucid is a bad thing or something. Who wouldn't want this??!!
Thank god the people who drive the foundries of technology and invent new and amazing things don't have attitudes like this, or we would still be in horse-drawn wagons right now.
I don't think there's any point in arguing about how it is going to work as of now. There isn't any more news except for whatever claims MSI makes about the BIG BANG P55. There isn't a single test, or even a shot/screenie, of this thing actually working. That's ridiculous, and unfortunately, it does sound like it's "too good to be true."
If it actually does work, I don't think this technology will surface in 30 DAYS. More like a couple more months, let alone the time needed for this new technology to mature. Don't get me wrong, I would seriously love for this project to become real, but I don't think it will happen anytime soon.
Tests or it never happened.
It is now up to Lucid to prove and live up to its promise.
Xoulz, all your huffing and puffing and putting words in my mouth changes nothing. There is no product and no info. Lucid doesn't want people to know it works? Lol, so they want to surprise us with the fact that it works at all...great marketing gimmick.
They might not want to release anything because NVIDIA might try to block it some kind of way.:eek:
Hope Hydra can deliver, because combining the power of my GTX 275 and a 5870 would be gr8, though both are unequal; Hydra will have to find a way to get them to work together. But here's the thing: will Hydra first run its own benchmark on each card to see how much power it has and then offload the graphics in proportion, or will it have that knowledge beforehand, like "GTX 275 = power level 6, 5870 = power level 10, so offload 40% to the GTX 275 and the rest to the 5870"??
If the second approach is used, how will it support new cards? Also, if two high-end cards with similar power levels are used, will Hydra make them work at 50% each?? If so, that would not be very productive, since 2 × 5870 @ 50% load eat more power than a single 5870 @ 100% :shrug:
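For what it's worth, the first approach (benchmark, then split proportionally) is easy to sketch. This is purely a toy illustration with made-up "power level" numbers; nothing here reflects Lucid's actual algorithm:

```python
# Toy sketch of a Hydra-style proportional load split between unequal GPUs.
# The "power levels" are invented for illustration; a real balancer would
# derive them from benchmarks and adjust per frame.

def split_load(power_levels):
    """Return each card's share of the workload, proportional to its power level."""
    total = sum(power_levels.values())
    return {card: level / total for card, level in power_levels.items()}

shares = split_load({"GTX 275": 6, "HD 5870": 10})
print(shares)  # GTX 275 gets 6/16 = 37.5%, the HD 5870 the remaining 62.5%
```

So on those numbers the split would actually be 37.5/62.5 rather than a flat 40/60, and two identical cards would indeed come out at 50% each.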
If you are talking about the amp thing, MSI made it themselves and it was not that big a deal. It was a sound card with a simple PRL caps amplifier; it was not bad, but not a breakthrough either.
No, no, no!! :)
Different hardware has different gamma settings and slight differences in precision at various points in the pipeline. You're going to get lighting and geometry differences between cards and the artifacts are going to be very apparent if you try to mix them. There's a reason why mixing hardware from multiple generations from the same vendor isn't allowed.
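To make that concrete, here's a toy illustration (the gamma exponents are invented, not real vendor values): the same linear lighting intensity pushed through two slightly different gamma curves lands on different 8-bit output values, so adjacent frames or tiles rendered by two such cards would visibly disagree.

```python
# Toy illustration of why mixed gamma curves cause artifacts.
# The exponents 2.2 and 2.35 are made up for demonstration only.

def encode(linear, gamma):
    """Encode a linear [0,1] intensity to an 8-bit value with a power-law gamma."""
    return round(255 * linear ** (1.0 / gamma))

linear = 0.5
card_a = encode(linear, 2.2)   # one card's transfer curve
card_b = encode(linear, 2.35)  # a slightly different curve
print(card_a, card_b)          # the two cards disagree on the same pixel
```

A few counts of difference per channel is well within what the eye picks up as banding or shimmer when the sources alternate.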
If they allowed us to pick the primary GPU used for display on each title, then the other GPU could crunch without conflicting with how things get displayed. :shocked:
Exactly.
I for one don't care about mixing and matching Nvidia and ATI chips together, or even same-branded chips from different generations. I just want to see Lucid prove its "near linear" GPU scaling claim. Two HD 5870s with near 100% scaling is what's catching my interest at the moment.
If the mixing and matching of Nvidia and ATI chips is what's causing the delay of Hydra, then frankly they should forget about it and move on.
So was it just me, or did everyone say this MSI Big Bang board was supposed to have Hydra? I just read that it doesn't, so what gives?
http://www.fudzilla.com/content/view/16189/1/
My friend, I have not put words in your mouth. :rolleyes:
Secondly, there is a chip... we've seen it. Just because it's not RELEASED yet doesn't mean it doesn't exist.
Thirdly, yes... that is why there was a COUNTDOWN to the release of this technology... that was delayed. So it's quite apparent they are trying to generate excitement and mystery about this new platform. What I don't understand is your apparent need to dismiss or suggest it's not real... :ROTF:
It is; we just don't know its actual performance and internal workings. But I'm sure it's safe to say that Hydra will work as intended. That doesn't mean it won't have bugs or hiccups, but you'll definitely be able to use 1~5 video cards with very little loss of scaling.
That alone makes Xfire and SLI moot!
We could hash this back and forth forever and still get no farther than the typical ATI vs Nvidia type conversations until more is divulged about it. The way I see it, there must be at least some degree of real potential for Intel to invest in it as much as they have. I also see its potential for desktop use as quite good if it works as claimed, especially concerning easy upgrade paths without wasting prior GPU purchases and/or bottlenecking the system.
One thing I don't see getting mentioned regarding Hydra, though, is whether or not games have to be written for it like SLI and Crossfire. Seems to me it doesn't, since they've demoed it on older games like FEAR 2 and Bioshock. If games don't have to be written for it, that's a huge advantage in itself, let alone the claimed better scaling. Needed support at the development end has shot down other techs, like In2Games' cross-platform motion-sensing "Fusion" controller.
It's certainly a chat-worthy subject, but as usual when little is known about a new tech, it often degrades into more of a personal opinion and speculation shouting match than a discussion based on actual benchmarks and real-world feedback. Personally, I'm psyched about the idea of a better mousetrap for multi-GPU scaling, IF in fact it is one. I don't see SLI and Crossfire having improved much over the years. They still use fixed-profile AFR scaling, dev support is needed, and performance is hit or miss depending on title and GPU, especially dual-GPU cards.
At the very least, I think a standalone tech that needs no dev support (IF that's what this is) would eliminate a lot of the hype in games whereby the devs side with one camp over another. Then again, this could be just a third cog in the wheel development-wise, and if it is, they'll most certainly lose out to ATI and Nvidia support-wise in game titles. If it's not standalone, I almost wonder if Intel took interest in it not just as an alternative to an Nvidia partnership concerning SLI, but to eventually buy LucidLogix and perfect the chip themselves.