I am sure that if Hydra really works, it will be VERY popular. If Nvidia tries to block it, that will be a financial failure for them, so I don't think they will.
I think the speculation about Nvidia possibly trying to block Hydra is a bit overblown. Legit Reviews ran an article on Sep 30 about Hydra, at the end of which they separately interviewed Lucid, ATI and Nvidia. Nowhere in those interviews did Nvidia sound intent on blocking Hydra. The only negative thing I recall from Nvidia was Jen-Hsun Huang saying he thought running Nvidia and ATI cards simultaneously is not a good idea, mainly due to driver issues. I think it's safe to say, and I'm sure they're thinking the same, that the vast majority of Hydra users won't be mixing green with red anyway. http://www.legitreviews.com/article/1093/6/
For you guys all afraid of artifacts and different gamma in the rendering process between cards of different generations and manufacturers: you just don't understand. Yes, you are completely right, this stuff does happen from card to card in each chip's rendering process.
What you don't understand is that it's minuscule at best. Some will notice the difference when mixing cards, but that's only really going to be apparent based on the age/tech of the cards.
There have been a trillion reviews since the dawn of time of people trying to visually compare the quality differences between cards (especially between ATI and Nvidia), and believe me, to see the difference between the cards and get all this artifacting you're talking about, you would have to be 20 times smaller, have a 20x magnifying glass, and be standing on a monitor that's laid flat on a table.
You guys are making it out like you're going to see a screen that looks like a birthday cake that exploded at a party, and that's just not the case. It's pretty much like micro stutter, only not nearly as catchable by the eye.
Why would mixing cards even be an option if you really think this is going to happen? A product isn't going to sell like that, and I'm sure testing was done on this before Lucid even considered the feature in the chip.
Look up some ATI vs. Nvidia quality tests. The screenshots look almost identical until you zoom in 50 times. Sometimes there are very slight differences in gamma, but how do you know there isn't a process in the chip to fight this as well?
I'm sure it becomes a problem if the cards are too far apart in age, though, as these differences increase with age.
bout time.
MSI's Trinergy shows up in retail
This is nf200, no Hydra ofc. But I love the design personally.
The price hurts, though. Why bother with Hydra to get decent bandwidth for your gfx cards on P55 if X58 is cheaper? Total waste of money.
Wow you really want to believe don't you? Comparing two different cards running in isolation is vastly different from having those two cards co-operating on the same frame. Obviously you won't see differences between ATi and Nvidia because any quirks will be uniform across each screenshot. Your eyes aren't sensitive at all to instantaneous artifacts, but they're sensitive as hell to regular noise like that which would appear if various portions of the screen change as you pan around (like we see with crawling textures).
Entirely true, but I'm betting it's still minuscule at best in most situations. Again I say, a product is not going to sell like that, and it's one of their biggest selling points. I very much doubt someone able to create a chip such as this would sell it, and market that feature, with issues like that. It's an absurd claim, no offense.
I'm not saying it will not be a problem, I am just saying it will not be as problematic as everyone seems to make it out to be.
How can you even argue this point when 1. you have no idea how the chip works and 2. it's one of the company's biggest selling points?
Seriously, are they going to say "yeah, you can mix cards" and then later reveal "haha, got you, you can, but it looks like you barfed on the screen"? Yeah, I don't think so.
It's not about believing or not believing, by the way. Lucid isn't the Yeti, or God. That's a comment for two years ago.
I'll admit there isn't enough data yet to remove all doubt. But based on Lucid's recent actions, it's working, and it's on its way.
See, I'm just basing my opinions on the few facts that we have now. You're basing your "not believing" on what, again?
Wait so the basis of your argument is that Lucid wouldn't claim something that's not true? Lol, have you been following technology long?
We have no "facts" about Lucid. If you have some please share them. What we have are facts about how current graphics hardware operates. Yet you seem to want to ignore those and put all your faith in make-believe promises.
Man, neither one of us has hard facts to base anything on either way. What I am saying is that a company with a possible massive breakthrough in the industry, funded by one of the richest companies in the hardware business, isn't going to have a marketing strategy that's complete bullcrap. If you believe so, then whatever. The point is, there isn't proof either way about this supposed quality/artifact problem; that in itself is just as make-believe as you're saying the product is.
Here are the first benchmarks from PcPerspective, and I have to say, it looks pretty impressive!
http://pcper.com/article.php?aid=815
Some quotes from the Pc Perspective article:
Again, no sign of a retail product. Quote:
We had expected to have final, retail ready hardware in our hands by now but obviously that hasn’t happened.
Something is fishy in Lucid land. Quote:
MSI stated to us that some driver issues on Lucid’s side of things were holding back the release while Lucid clearly told us that the driver was ready to go and that they didn’t know what the hold was.
So while it works and is an improvement over SLI/CF, scaling with different cards is not as good as the hype made it out to be. Quote:
Starting with the 3DMark Vantage results, you will see that the HYDRA scaling method with the pair of GTX 260+ cards pushed performance up by 83% - definitely a competitive solution to SLI! Considering the fact that this motherboard, in its theoretical construction, didn’t have to pay for any type of SLI licensing, I would say that THIS is the reason NVIDIA might have put pressure on MSI to delay the Big Bang motherboard.
In Call of Juarez the scaling going from a single card to dual cards was around 80% while on Far Cry 2 it was just over 51%. Both results are again pretty impressive.
When we take a look at the GTX 260+ combined with the GTX 285, the results are not what I initially expected. As most of you, I would assume that a configuration with a single GTX 260+ and a much more powerful GPU (the GTX 285) would produce a higher combined framerate than the pair of GTX 260+ graphics cards but that was not the case. Instead, the performance of the 260/285 combination happened to be nearly identical to the 260/260 results. Why is this?
Lucid tells us that the software based algorithms for separating workloads across identical GPUs differs greatly from the one required for load balancing with non-identical GPUs and thus scaling will in fact differ. There is some software overhead and load balancing overhead that has to take place and that costs us some performance. We had to assume this would be the case but it just kind of goes against standard logic: 2 + 2 = 4 but also 2 + 3 = 4 in this case.
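That "2 + 2 = 4 but also 2 + 3 = 4" remark can be put into a toy model. To be clear, this is my own guess for illustration, not Lucid's actual load balancer; the efficiency figures are rough numbers eyeballed from the benchmarks above:

```python
def hydra_fps_estimate(fps_a, fps_b, identical_eff=0.83, mixed_eff=0.70):
    """Toy estimate of combined FPS under Hydra.

    Identical GPUs split work cleanly (~83% scaling efficiency, per the
    3DMark result above); mismatched GPUs pay extra load-balancing
    overhead (~70%), which eats the faster card's headroom. The function
    itself is a guess, not Lucid's real algorithm.
    """
    eff = identical_eff if fps_a == fps_b else mixed_eff
    return (fps_a + fps_b) * eff

# Two identical 30 FPS cards vs. a mixed 30 + 40 FPS pair:
print(round(hydra_fps_estimate(30, 30), 1))  # 49.8
print(round(hydra_fps_estimate(30, 40), 1))  # 49.0
```

Under these assumed efficiencies the mixed pair ends up slightly slower than the identical pair despite more raw throughput, which is exactly the "2 + 3 = 4" result PC Perspective saw.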
you missed another MAJOR piece
And overall the performance isn't that impressive yet: 80% scaling for identical GPUs, and it looks to be using only about 70% of each GPU in other configurations. So if I had a 4850 and added a 5850, the difference between them might be too much, resulting in only a 10-20% gain (the 285 got 32 and 52 FPS in their benchmarks; adding in a 250, they go up to 38 and 65 FPS). So it looks like if you have one card and plan to get another, it might offer better performance to just get an identical one. Quote:
As of this writing, dual GPU graphics cards like the GeForce GTX 295 are not going to work with Lucid's HYDRA technology in the way you want them to. Essentially, because the two GPUs behind the PCI Express bridge chips on the graphics boards (both NVIDIA and ATI) are "hidden" to the system, the HYDRA driver will only be able to access one of them. This is a bit of a letdown for us, as the idea of having a single GTX 295 and then throwing in an HD 5870 sounded very appealing; it doesn't appear Lucid will be able to get that working.
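The 10-20% figure above is easy to sanity-check from the quoted FPS numbers; the function here is just percentage arithmetic, nothing Hydra-specific:

```python
def scaling_gain(single_fps, combined_fps):
    """Percent FPS gain from adding the second card."""
    return (combined_fps / single_fps - 1.0) * 100.0

# 285 alone vs. the 285 + 250 pairing mentioned above
for single, combined in [(32, 38), (52, 65)]:
    print(f"{single} -> {combined} FPS: {scaling_gain(single, combined):.1f}% gain")
```

That works out to gains of about 19% and 25%, roughly in line with the 10-20% ballpark and nowhere near the ~80% scaling of an identical pair.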
I would like to see more tests where it's compared with Xfire, since they can mix and match cards better than SLI allows.
It was clear from the beginning that performance wouldn't be far from SLI/CF. AFR is already a very efficient approach from a performance perspective, although not so good in terms of gaming experience.
The point about Lucid's solution is whether the same performance improvements can be accomplished without the input lag and microstuttering current multiple gpu solutions bear. And that remains to be seen.
If Nvidia will stop being :banana::banana::banana::banana::banana::banana::banana:s, maybe we can get on with this technology. It apparently works great in this early form. Let's get on with it.
My loins are aching: http://www.pcper.com/article.php?aid=815
Hmm, seems to have some potential :)
Edit: However, didn't they say that they wouldn't need driver updates for new games etc?
Quote:
the Hydra will require continual driver support in order to maintain compatibility with future games
Meh, same crappy scaling. The only positive thing is that you can use Nvidia and ATI together.