Hahaha, that's what I call an AA-technique without sacrificing performance. :rofl:
The rest of the thread's discussion is so zZZzzZZzzZZz.... Move on people.
It's not like NVIDIA hasn't been guilty of similar shady things in the past. What's the problem?
It's a reference renderer which defines what exactly the scene should look like?
Software rasterization, by its very nature, uses the same optimizations as hardware rasterization does, depending on the method used of course. The only difference between the two is that the CPU does exactly what the ROPs and TMUs do in fixed-function hardware, while at the same time emulating the shaders.
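To make that concrete, here is a toy sketch (purely illustrative, not any real driver's or rasterizer's code) of the kind of work a TMU performs in fixed-function hardware and a software rasterizer has to do explicitly on the CPU: bilinear texture filtering of a tiny texture.

```python
def bilinear_sample(texture, u, v):
    """Sample a texture (list of rows of floats) at normalized (u, v)
    with bilinear filtering, clamping at the edges.

    Simplified: texel centers sit at integer coordinates here, which
    glosses over real half-texel-center conventions.
    """
    h = len(texture)
    w = len(texture[0])
    # Map normalized coords into texel space.
    x = u * (w - 1)
    y = v * (h - 1)
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, w - 1)
    y1 = min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    # Blend the four neighboring texels, exactly what a TMU does in silicon.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bot = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]
print(bilinear_sample(tex, 0.5, 0.5))  # 0.5, the blend of all four texels
```

A hardware TMU does this (plus mipmap selection and anisotropic sampling) for billions of texels per second; a software rasterizer pays for every one of these multiplies on the CPU, which is exactly where the optimization shortcuts in this debate come from.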
So you think that after they went through the process of developing, testing and implementing their own version in ForceWare, and of course formally asking reviewers to kindly test with it on, NVIDIA will disable it (losing roughly 10% of free performance in selected titles) for end users, who have no visible indication of this optimization's status, neither in the ForceWare control panel nor in the games themselves? And we should believe this because they said so?
Actually, an article testing the last 2 or 3 drivers from NVIDIA with the switching utility would be nice, to see what's what. Any reviewers up for it? :)
If I'm proven wrong, shame on me. On the other hand if they are pulling a "Hey look!" pointing in the other direction while doing the same, then shame on them.
Call me what you want, but I'm skeptical... the situation kind of reminds me of the good ol' 3DMark03 times.
I was a bit surprised when I replaced my Radeon HD 5870 with a GTX 580. Surprised at the difference in image quality. I went from a GeForce 7900 GTX (I think) --> Radeon 9800 --> 4870 --> 4870X2 --> 5870 and then a GTX 580.
I switched to a GTX 580 due to the "lack" of general performance increase in the 6000 series and also due to the debate about image quality.
So when I first fired up the GTX 580 to play some COD: Black Ops, I was really surprised at what I saw. I didn't expect any major difference going in, but boy was I surprised. The image is just so much sharper and the colors so much fuller. Maybe there is a better technical explanation, but I'm just describing what I see on the screen.

Thinking that it could just be COD: Black Ops, I tried another game... Left 4 Dead 2. I play L4D2 a lot, and since it's not a TWIMTBP title (and Steam complains about my graphics adapter being unknown) I didn't expect much difference to show. But boy was I wrong AGAIN!!! The same thing goes... a much, much sharper picture, more vibrant colors, etc., and I'm talking about a game I have logged hundreds and hundreds of hours in, playing on high-end Radeon hardware. No matter what game I try, it's the same story: much better image quality and more vibrant colors.

You could argue that it's just a matter of adjusting settings in the control panel, and it could be, but since I just got the GTX 580 I haven't changed any settings in the control panel and it's at its defaults.
I'm a bit surprised about this and also a bit disappointed, because since the GeForce 7xxx days and the GeForce FX + 3DMark 2003 days, ATI has always been a guarantee of image quality for me. I hope that AMD/ATI will find their way onto the right path again and not play these cat-and-mouse games where they do AF correctly in an AF tester (when it's detected) but differently in games, etc. Picture quality is more important to me than FPS, and I hope AMD will realize that too. So what if they lose 10% performance by rendering things the right way; it's a lot better than the bad press they get over this.
That's just absolute nonsense and based on a placebo effect at best. There is no difference in colours and sharpness whatsoever.
About the AF issue: hardly anyone would notice a difference in a blind test that does not pick particular scenes. However, I do agree that AMD has to improve the AF quality.
You are incorrect.
The default settings in the respective control panels for NVIDIA and AMD cause a difference in the overall color saturation, sharpness and gamma. AMD's settings tend to be on the cooler side of the spectrum while NVIDIA's are slightly warmer.
You wouldn't see this if you went from one AMD product to another or one NVIDIA card to another. However, it is quite evident when going from one company to another.
From my experience based on switching out cards on a regular basis, the statement you quoted is bang on.
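For what it's worth, even a small difference in the default gamma curve shifts mid-tones enough to be noticed side by side. A toy sketch (the gamma values here are hypothetical examples, not the actual defaults of either vendor's driver):

```python
def apply_gamma(value, gamma):
    """Apply a display gamma curve to a normalized [0, 1] channel value."""
    return value ** (1.0 / gamma)

# The same mid-gray pixel through two slightly different default curves:
mid_gray = 0.5
print(round(apply_gamma(mid_gray, 2.2), 3))  # ~0.730
print(round(apply_gamma(mid_gray, 2.4), 3))  # ~0.749
```

A shift of a couple of percent in the mid-tones across the whole screen is easily perceived as "fuller colors" or a "darker picture" even though the rendered pixel data is identical, which is why default-profile differences get mistaken for rendering-quality differences.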
As mentioned in the other topic:
If you think there is some truth to this, XS has a section at Off-Topic -> Tech Talk... but don't trash the News section with politics aimed at bashing each other by trash-talking a competitor's products.
:rofl::rofl: So please tell me, what lack of performance increase are you referring to? Do you mean the update of the 57xx series, which is now the 68xx series? To me it looks like you just didn't read any decent review and didn't search enough threads here to know that you should have waited a bit longer to purchase a card, certainly a 580, unless you are a real NV fanboy... within 2 weeks it will be $150 cheaper :D ;) Do these 2 smileys also look green and blue on your new NV 580, or do they have much brighter colors now?
Now to the point: yes, you could see a difference, but that has more to do with profile settings, drivers and cards than anything else, just like monitors have a big influence...
This discussion here is about anisotropic filtering quality. So it is just ridiculous to state in this topic that IHVx delivers "much sharper" images with "colors so much fuller" than IHVy.
Yeah, there are some differences in the default (uncalibrated) video signal profiles. SO WHAT?!
Argh... I really should stop wasting my time here!
Compared to my old HD 5870, I don't think the 6870 was that big a deal, and yes, I did read reviews, lots of them. You can mock me all you want, but calling me an NV fanboy after all the ATI cards I have owned is just plain stupid on your part. And why should I wait? If I have the money to burn, why shouldn't I buy a new graphics card whenever I want to?
You can turn and twist it all you want; there is a difference, and in some cases it's HUGE, and judging by your comments you didn't read what I wrote.
Fact of the matter is, I didn't write what I did to taunt or mock anyone, but simply to report what I observed going from a long line of Radeon cards to an NVIDIA card.
That's absolute nonsense. There is a difference in the default settings on an NV card vs. an ATI card. You should try using a Radeon card for a few weeks and then switch to an NV card, and you would see for yourself. I'm not saying the ATI card can't be made to look like the NVIDIA card, but it doesn't by default, far from it. And when ATI starts to detect certain AF test programs and then uses a better AF method than the driver normally does, that's when things start to get out of control.
All right. Then let us determine this in a controlled test. :cool:
Which screenshot has been taken by a RV870 and which by a GF110?
http://img94.imageshack.us/img94/6471/dirt2a.jpg
http://img220.imageshack.us/img220/48/dirt2b.jpg
http://img819.imageshack.us/img819/8263/metaf.jpg
http://img254.imageshack.us/img254/4315/metb.jpg
http://img38.imageshack.us/img38/2775/mwabz.jpg
http://img202.imageshack.us/img202/7807/mwby.jpg
http://img841.imageshack.us/img841/7946/vana.jpg
http://img534.imageshack.us/img534/1826/vanbi.jpg
Oh and please don't forget to show us where exactly "The image is just so much sharper and the colors so much fuller." based on these comparisons! :rofl:
PS: Note that the image quality settings used here are equal to Radeon 5870 default and GeForce 580 default, as you stated in your comparison.
First DiRT 2 shot is the 870, second a GTX 460; comparison not possible because the 460 lacks tessellation
second game - don't see a difference
MW - no clue (gamma difference?); can't compare IQ due to different brightness settings
vantage: both screenshots show banding in certain areas...
The comment I replied to was you pointing out that there is no added value going from a 5870 to a 6870 ---> of course not, that is the whole point: the 6870 is in a new price range in the market, it is not intended to replace the 58xx series. You still don't get it... and if you already own a 5870, I can't think of any reason to spend another $500+ on a new card without waiting 2-3 more weeks; that is what I call a :down: consumer.
One complaint about ATI I've had for a while is that their games are substantially darker for the same in-game Source gamma setting. It made some dark, gritty HL2 mods hard to play.