For you guys all afraid about artifacts and different gamma in the rendering process between cards of different generations and manufacturers: you just don't understand. Yes, you are completely right, this stuff does happen from card to card in the chip's rendering process.
What you don't understand is that it's minuscule at best. Some people will notice a difference when mixing cards, but that's only really going to be apparent depending on the age and tech of the cards involved.
There have been about a trillion reviews since the dawn of time trying to visually compare the quality differences between cards (especially between ATI and Nvidia), and believe me, to see the difference between the cards and all this artifacting you're talking about, you would have to be 20 times smaller, have a 20x magnifying glass, and be standing on a monitor that's laid flat on a table.
You guys are making it out like you're going to see a screen that looks like a birthday cake that exploded at a party, and that's just not the case. It's pretty much the same as micro stutter, only not nearly as catchable by the eye.
Why would mixing cards even be an option if this were really going to happen? A product isn't going to sell like that, and I'm sure testing was done on this before Lucid even considered putting the feature in the chip.
Look up some ATI vs Nvidia quality tests. The screenshots look almost identical until you zoom in by 50 times. Sometimes there are very slight differences in gamma, but how do you know there isn't a process in the chip to compensate for that as well?
I'm sure it does become a problem if the cards are too far apart in age, though, since these differences get bigger the further apart the generations are.
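
Just to show what I mean by a correction process: here's a rough sketch in Python of how output from one card could be remapped to match another card's gamma curve with a simple lookup table. The gamma values (2.3 vs 2.2), the function names, and the numbers are all made-up examples of mine, not anything from Lucid's actual chip, just to illustrate how small the shift would be.

# Sketch of a per-channel gamma match between two cards.
# Gamma values here are hypothetical, not measured from any real GPU.
import numpy as np

def build_gamma_lut(source_gamma: float, target_gamma: float) -> np.ndarray:
    """Build a 256-entry LUT remapping 8-bit values rendered at
    source_gamma so they display as if rendered at target_gamma."""
    x = np.arange(256) / 255.0                  # normalized pixel values
    linear = x ** source_gamma                  # undo the source card's curve
    remapped = linear ** (1.0 / target_gamma)   # re-encode with the target curve
    return np.clip(np.round(remapped * 255.0), 0, 255).astype(np.uint8)

def match_frame(frame: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply the LUT to an 8-bit RGB frame from the 'other' card."""
    return lut[frame]

if __name__ == "__main__":
    lut = build_gamma_lut(source_gamma=2.3, target_gamma=2.2)
    frame = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)
    corrected = match_frame(frame, lut)
    # The per-pixel shift between the two curves is a few 8-bit steps at most.
    print("max shift per channel:",
          int(np.abs(corrected.astype(int) - frame.astype(int)).max()))

Run it and the max shift comes out to only a handful of levels out of 255, which is kind of my whole point about how small these differences are.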




