Hahaha, that's what I call an AA-technique without sacrificing performance. :rofl:
The rest of the thread's discussion is so zZZzzZZzzZZz.... Move on people.
It's not like nvidia hasn't been guilty of similar shady things in the past. What's the problem?
It's a reference renderer which defines what exactly the scene should look like?
Software rasterization by its very nature uses the same optimizations as hardware rasterization does - depending on the method used, of course. The only difference between the two is that the CPU does exactly what the ROPs and TMUs do hardwired, and at the same time the CPU emulates the shaders.
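To make that concrete, here's a minimal sketch (my own illustration, not any real renderer's code) of the fixed-function part: the per-pixel output stage of a software rasterizer does the same depth test and blend that a hardware ROP does, just on the CPU.

```cpp
// Purely illustrative structures; not taken from any actual rasterizer.
struct Pixel { float r, g, b, a; };

// Software "ROP": depth test, then a standard source-over alpha blend,
// exactly the fixed-function work a hardware ROP performs per fragment.
void ropWrite(Pixel& dst, float& dstDepth, const Pixel& src, float srcDepth) {
    if (srcDepth >= dstDepth) return;   // depth test: closer fragment wins
    dstDepth = srcDepth;
    float ia = 1.0f - src.a;            // remaining destination weight
    dst = { src.r * src.a + dst.r * ia,
            src.g * src.a + dst.g * ia,
            src.b * src.a + dst.b * ia,
            1.0f };
}
```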
So you think that after they went through the process of developing, testing and implementing their own version in ForceWare - and of course formally asking reviewers to kindly test with it on - nVIDIA will disable it (losing about ~10% free performance in selected titles) for end users, who have no visible indication of this optimization's status either in the ForceWare CP or in games? We should believe this because they said so?
Actually, an article would be nice testing the last 2 or 3 drivers from nVIDIA with the switching utility to see what's what. Any reviewers up for it? :)
If I'm proven wrong, shame on me. On the other hand if they are pulling a "Hey look!" pointing in the other direction while doing the same, then shame on them.
Call me what you want, but I'm skeptical... situation kind of reminds me of the good ol' 3DMark03 times.
I was a bit surprised when I replaced my Radeon HD 5870 with a GTX 580. Surprised at the difference in image quality. I went from a 7900 GTX (I think) --> Radeon 9800 --> 4870 --> 4870x2 --> 5870 and then a GTX 580.
I switched to a GTX 580 due to the "lack" of general performance increase in the 6000 series and also due to the debate about image quality.
So when I first fired up the GTX 580 to play some COD Black Ops, I was really surprised at what I saw. I didn't expect any major difference going in, but boy was I surprised. The image is just so much sharper and the colors so much fuller. Maybe there is a better technical explanation, but I'm just describing what I see on the screen. Thinking that it could just be COD Black Ops, I tried another game... Left 4 Dead 2. I play L4D2 a lot, and since it's not a TWIMTBP title (and Steam complains about my graphics adapter being unknown) I didn't expect much difference to show. But boy was I wrong AGAIN!!! Same thing: a much, much sharper picture, more vibrant colors etc., and I'm talking about a game I have logged hundreds and hundreds of hours playing on high-end Radeon hardware. No matter what game I try it's the same story: much better image quality and more vibrant colors. You could argue that it's just a matter of adjusting settings in the control panel, and it could be, but since I just got the GTX 580 I haven't changed anything in the control panel and it's at default settings.
I'm a bit surprised about this and also a bit disappointed, because since the GeForce 7xxx days and the GeForce FX + 3DMark 2003 days, ATI has always been a guarantee of image quality for me. I hope that AMD/ATI will find their way onto the right path again and not play these cat-and-mouse games where they do AF correctly in an AF tester (when it's detected) but differently in games, etc. Picture quality is more important to me than FPS, and I hope AMD will realize that too. So what if they lose 10% performance by rendering things the right way; it's a lot better than the bad press they get over this.
That's just absolute nonsense and based on a placebo effect at best. There is no difference in colours and sharpness whatsoever.
About the AF issue: hardly anyone would notice a difference in a blind test that doesn't pick particular scenes. However, I do agree that AMD has to improve the AF quality.
You are incorrect.
The default settings in the respective control panels for NVIDIA and AMD cause a difference in the overall color saturation, sharpness and gamma. AMD's settings tend to be on the cooler side of the spectrum while NVIDIA's are slightly warmer.
You wouldn't see this if you went from one AMD product to another or one NVIDIA card to another. However, it is quite evident when going from one company to another.
From my experience based on switching out cards on a regular basis, the statement you quoted is bang on.
As mentioned in the other topic:
If you think there is some truth to this - XS has a section at Off-Topic -> Tech Talk ... but don't trash the news section with politics intended for bashing each other by trash-talking products from an "adversary" (as in competition).
:rofl::rofl: So please tell me what lack of performance increase you are referring to? You mean the update of the 57xx series, which is now the 68xx series... To me it looks like you just didn't read any decent review and didn't search enough threads here to know you should wait a bit longer to buy a card, certainly a 580, unless you are a real NV fanboy..... Within 2 weeks it will be $150 cheaper :D ;) Do these 2 smileys also look green and blue on your new NV 580, or do they have much brighter colors now.......
Now to the point: yes, you could see a difference, but that has more to do with profile settings, drivers and cards than anything else, just like monitors have a big influence...
This discussion here is about anisotropic filtering quality. So it is just ridiculous to state in this topic that IHVx delivers "much sharper" images with "colors so much fuller" than IHVy.
Yeah, there are some differences in the default (uncalibrated) video signal profiles. SO WHAT?!
Argh... I really should stop wasting my time here!
Compared to my old HD5870 I don't think the 6870 was that big a deal and yes I did read reviews and lots of them. You can mock me all you want, but saying I'm a NV fanboy after all the ATI cards I have owned is just plain stupid on your part. And why should I wait ? If I have the money to burn why shouldn't I buy a new graphics card whenever I want to ?
You can turn and twist it all you want, there is a difference and in some cases it's HUGE and judging by your comments you didn't read what I wrote.
Fact of the matter is, I didn't write what I did to taunt or mock anyone, but simply to report what I observed going from a long line of Radeon cards to an Nvidia card.
That's absolute nonsense. There is a difference in the default settings on an NV card vs an ATI card. You should try using a Radeon card for some weeks and then switch to an NV card, and you would see for yourself. I'm not saying the ATI card can't be made to look like the nVidia card, but it doesn't by default, far from it. And when ATI starts to detect certain AF test programs and then uses a better AF method than the driver normally does, that's when things start to get out of control.
All right. Then let us determine this in a controlled test. :cool:
Which screenshot has been taken by a RV870 and which by a GF110?
http://img94.imageshack.us/img94/6471/dirt2a.jpg
http://img220.imageshack.us/img220/48/dirt2b.jpg
http://img819.imageshack.us/img819/8263/metaf.jpg
http://img254.imageshack.us/img254/4315/metb.jpg
http://img38.imageshack.us/img38/2775/mwabz.jpg
http://img202.imageshack.us/img202/7807/mwby.jpg
http://img841.imageshack.us/img841/7946/vana.jpg
http://img534.imageshack.us/img534/1826/vanbi.jpg
Oh and please don't forget to show us where exactly "The image is just so much sharper and the colors so much fuller." based on these comparisons! :rofl:
PS: Note that the image quality settings used here are equal to the Radeon 5870 default and GeForce 580 default, as you stated in your comparison.
First Dirt 2 is the 870, second the GTX 460; comparison not possible because the 460 lacks tessellation
second game - don't see a difference
MW - no clue (gamma difference?); can't compare IQ due to different brightness settings
vantage: both screenshots show banding in certain areas...
The comment I made was about you claiming there is no added value going from a 5870 to a 6870 ---> of course not, that is the whole point: the 6870 is in a new price range in the market; it is not intended to replace the 58xx series. You still don't get it.... And if you already own a 5870, I can't think of any reason to spend another $500+ on a new card without waiting 2-3 more weeks; that is what I call a :down: consumer.
One complaint about ATI I've had for a while is that their games are substantially darker for the same ingame Source gamma setting. Made some dark, gritty HL2 mods hard to play.
Sheesh, one person posts his own views and experiences when swapping a card and gets absolutely shot to pieces. What the hell happened to this forum seriously. It's getting worse and worse.
Hmmm seems to me it's always been like that and I've been here a while.
I'm with ya Toysoldier :up: After over 2 years of using Nvidia GPUs, 8400GS > 7900GTX > 9800GT > GTS 250, it was easy for me to see the differences between the two chipsets/drivers. One theory I have is that it has something to do with the way AA is applied. Nvidia gives you more options such as enhance, force and disable. I always had it set to enhance and 16xQ ;)
Most noticeable was Need for Speed Shift. I had to play around with CCC a fair bit to get it to run the game more smoothly. I think AI was mainly to blame there.
NEVER be afraid to speak your mind. Those who feel they need to make an example of you for doing so are all too common and should be ignored. :cool:
pEACe:D
You don't have to create a game profile.
You can change global colour settings from CCC once and for all.
Honestly, the two cards produce very different looking images on screen.
It's personal preference. I like the detailed, sharper, better contrast (by default) of ATI. I miss it sometimes on my current 470, but it's not a huge deal to be honest. With a little adjustment, both can be made to look similar.
Some will hate this as always. In fact, bad monitors are to blame as well in some cases.
Personally, when I tried my friend's GTX 460, the IQ sucked compared to my HD 4870. So nVidia might have over-optimised its driver, according to my personal experience, right? That HAS to be right!
:ROTF:
So much hysteria, so much soap opera. :down:
IMO, the bottom line is that lowering quality, even just default quality, isn't cool - no matter what company does it.
AMD puts image quality debate to bed
There's been a lot of talk about GPUs and image quality lately, and as the party on the receiving end of some of the accusations, AMD felt the need to set the record straight. That's why we were invited to talk to Senior Manager of Software Engineering Andy Pomianowski and Technical Marketing Manager Dave Nalasco about image quality and the ruckus that NVIDIA kicked off last week.
The settings, they are a-changin'
Dave explained to us that there had been some changes to the Catalyst drivers to coincide with the release of the HD 6000-series GPUs, and that image quality had been a big part of that. At the heart of all this is Catalyst AI, which controls a whole host of different settings via a single slider.
Responding to feedback, this single slider was divided into a number of different settings in the latest release, giving users a bit more control. One of the new additions was a slider to control texture filtering with settings for 'High Quality', 'Quality' and 'Performance'.
High Quality turns off all optimisations and lets the software run exactly as it was originally intended to. Quality - which is now the default setting - applies some optimisations that the team at AMD believes - after some serious testing, benchmarking and image comparisons - will maintain the integrity of the image while increasing the application performance. Lastly, the Performance setting applies even more of these optimisations to squeeze out a few more frames, but risks degrading the image quality just a bit.
What do you see?
Dave acknowledged that some sources had observed visual anomalies when running a few games and benchmarks. He explained that the algorithms that the drivers run - notably anisotropic filtering - are very complex and that despite their best efforts, the image wasn't going to be perfect 100 per cent of the time, even on default settings.
What he stressed was that, in the opinions of the whole driver development team, the default settings and optimisations still offered the best performance with no noticeable drop in quality for the vast majority of users the vast majority of the time. And for those who were experiencing any problems, High Quality mode would always be there to allow a picture perfect image. This, he made clear, wasn't going to change any time soon.
And then something strange happened - Andy asked us what we thought. These guys seemed genuinely concerned about what we felt were the best settings to use, whether we'd experienced any problems, and what we would change if we were designing the Catalyst tools. They're clearly committed to delivering the best product that they can, and that means listening to feedback and taking on board what the press, as well as average gamers, think.
Hopefully, this whole image quality debate can now be put to bed. At least for the time being.
Source:
http://www.hexus.net/content/item.php?item=27786
So basically they admit that they have turned it down a tad too much, but in the team's opinion it's still great for the majority of people? Hmm, not sure I fully agree with that....
The color settings in CCC have been a big problem on CF systems since...maybe...10.6 or so. Move any of those sliders on a multi gpu system (58xx at least) and you get a nice pink screen. Or, that's my experience at least (as well as the experience of others I've spoken with).
--Matt
I can honestly say that I'm noticing less texture shimmering on a GTX 580 with texture quality set to High Quality compared to my previous HD 5870 with Catalyst AI set to high quality (10.10e hotfix). I'd still like to see how the 5800s compare to the 6800s once they both have access to these newer Catalyst AI options.
I like AMD's response. Thanks to all the complaints, we now have more tools to play with in CCC.
But the question remains: should reviewers use the HQ setting in their articles?
I won't venture my opinion just yet since I want to hear what you guys have to say.
High Quality only; that shows who has the best IQ, and the performance at that IQ level. The other settings are for IQ testing only.
omg why does this matter so much? this thread has gone on for six pages...about quality settings?
:yepp:
You can't make a judgment based on benchmarks alone.
Quote:
Did they check the texture sharpness in the comparison? When I follow the Rage3D test regarding the 6800 series and AF, I notice that the shimmering disappears (on my 5870) when I increase the LOD bias to a positive value.
I haven't tried this myself, but here is the link to that discussion.
Quote:
If you bump LoD to +0.65 the shimmer disappears. On NV cards nudge LoD to -0.65 and it appears.
I also saw this posted today.
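For anyone curious what nudging the LoD by ±0.65 actually does, here's a rough sketch of textbook mip selection (a simplification of my own, not either vendor's driver code): the bias shifts the computed mip level, so a positive value reaches blurrier mips sooner (less shimmer in motion) and a negative one holds on to sharper mips longer (more shimmer).

```cpp
#include <algorithm>
#include <cmath>

// rho is the screen-space texel footprint of the pixel; the sampled mip
// level is log2(rho) plus the user/driver LOD bias, clamped to the chain.
// A bias of +/-0.65 shifts the selection by about two thirds of a level.
float mipLevel(float rho, float lodBias, float maxLevel) {
    float lambda = std::log2(rho) + lodBias;
    return std::clamp(lambda, 0.0f, maxLevel);
}
```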
And what else is the point of looking for a better video card? If we didn't care about image quality, we'd still be on ancient hardware. But in the end, the whole reason we even care about benchmarks is because we want to know "how smooth, and how good-looking, can I make my games?"
Have you seen any visual difference in games that you play between ATI and Nvidia?
Can you control AF quality on Nvidia cards?
Are you sure the AF quality is the same on Nvidia and ATI?
Do you know that AF HQ quality on AMD disables ANY driver optimization made (even the ones without visual difference)?
That way reviewers should use comparable settings for both vendors, and not crank up the settings on the one brand (Radeon) that allows control of the AF filtering, while the other gets away with optimizations enabled (GeForce).
I wouldn't base my opinion on one article where one editor has skipped other recent games which would prove otherwise.
Believe me, both AMD and Nvidia have optimizations in their drivers, and it mostly depends on what games and scenes you choose to show.
But now we have a situation where AMD gives users control of the filtering, while Nvidia can optimize and tweak their drivers until someone screams bloody murder again. Of course, that is if they (key word coming) *notice*. :shrug:
Edit: To make it more clear, reviewers should ask for a clear statement on which AMD vs NV modes to use when comparing performance; otherwise, in the near future AMD could simply remove the AF quality settings from their drivers to have the same level of control over applications (and performance) as Nvidia, where it is currently hidden away from the end user.
http://www.guru3d.com/article/explor...optimizations/
By: Hilbert Hagedoorn | Edited by Editor | Published: December 3, 2010
What is this? "Exploring", yet it shows only one game's image quality? Sorry Hilbert, but :down:
Quote:
Exploring ATI Image Quality Optimizations
OK, interesting article.
I've been staring at these ME2 screenshots to no avail.
My verdict is:
This optimisation is not a big deal... 5% performance traded for an almost-impossible-to-see difference in picture quality (at least in actual gaming scenarios). So I may as well leave this option on for a tiny bit of extra FPS, since I will never notice the difference anyway (as long as I do not stay still and stare at the screen for ages instead of actually playing the game).
On a side note, Nvidia's implementation still looks better... The gradient transition is very smooth, while for AMD it goes from sharp to blurry with quite a noticeable edge.
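For reference, the kind of trick that produces exactly such an edge is usually called "brilinear" filtering. Here's a minimal sketch (purely illustrative, not either vendor's actual code) of the idea: narrow the trilinear blend band so most pixels fetch from a single mip level, trading the smooth gradient for fewer texture reads.

```cpp
#include <algorithm>
#include <cmath>

// band = 1.0 gives full trilinear; smaller values (must stay > 0) snap
// more of the range to a single mip, creating a visible sharp-to-blurry
// edge where the blend band was cut away.
float trilinearWeight(float lambda, float band) {
    float f = lambda - std::floor(lambda);   // position between the two mips
    float lo = 0.5f - 0.5f * band;
    float hi = 0.5f + 0.5f * band;
    // Outside [lo, hi]: clamp to the nearer mip (one fetch). Inside: blend.
    return std::clamp((f - lo) / (hi - lo), 0.0f, 1.0f);
}
```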
No no, the only example he showed is ME, and there's no difference in quality...
Nvidia on the left, and AMD + Catalyst with optimisation on the right.... (set to Quality, not HQ) (AA is not applied the same, which can be distracting, but we're talking about the AF optimisation, not AA)
Global scene:
http://img535.imageshack.us/img535/6253/global1.png
Then I zoomed in on the background and on specific zones (to see whether the AF optimisation applies far from the first scene):
http://img259.imageshack.us/img259/5493/zoom1.png
http://img130.imageshack.us/img130/3381/zoom4.png
http://img32.imageshack.us/img32/3180/zoom3.png
So what is the proof in his article? The Trackmania screenshot from 3DCenter?
It's more like 10% across the board, and the question isn't whether people should leave it on, but what the reviewers should do.
Stick to defaults? That means the Quality setting. A lot of people will use the default setting, plus it is nearly impossible to spot the difference in image quality...
So, the difference in quality isn't noticeable, for a performance gain that isn't noticeable. They get a few reviews showing better numbers, while they get some fuss about lower quality from others.
In the end it all seems so unnecessary. There is really no gain for AMD; it's all just bad publicity in return for better numbers which people don't trust.
It would also be unfair to have AMD use High Quality in reviews, as this disables optimisations altogether, and as Shadov said, Nvidia would still be able to use their optimisations.
I lose quite a bit of fps in some games by disabling Catalyst AI on my 4890, but who would ever disable Catalyst AI? Even reviewers wouldn't. That is the equivalent of what is happening here. But I guess the one difference is that AMD has added extra optimisations on the 68xx compared to the 4xxx and 5xxx series, which is another unfair part.
There should be an open standard of compulsory (undetectable but fps-gaining) optimisations :mad:
Edit: Wait, how would I go about adjusting the LOD bias on my card? I want standard+ quality; I don't care about the pathetic % of fps loss.
It is in some games, but at the same time the 3DCenter article misleads a lot: it uses old games like Trackmania (a game they use to show there was already a problem with ATI's algorithm three years ago), mixes in FP16 demotion, which is only enabled in some games (DoW, Oblivion...), and puts all of this together as proof that AMD has greatly decreased image quality in the latest driver, which is a bit strange. Why never use BC2, Dirt 2... the games that are actually being tested in reviews now?
The same goes for the fps test: you need to use an old driver (10.9) at its standard setting, then use the latest driver set to Quality, and see how big the difference is. If you just test by moving the slider on the latest driver, you can't claim AMD has changed anything...
If you disable all optimisations in the Nvidia driver and bench a game, you will get the same performance loss...
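For context, the "FP16 demotion" mentioned above means the driver substitutes a cheaper render-target format for a 64-bit FP16 one to save bandwidth. A hypothetical sketch of the idea (format names and whitelist logic are my own, not AMD's actual driver code):

```cpp
// R11G11B10F keeps a similar dynamic range at half the bandwidth but drops
// alpha and precision, which is why such a swap would only be safe for
// whitelisted titles (DoW, Oblivion per the post above).
enum class Format { RGBA16F /* 64 bpp */, R11G11B10F /* 32 bpp, no alpha */ };

Format chooseRenderTarget(Format requested, bool onDemotionWhitelist) {
    if (requested == Format::RGBA16F && onDemotionWhitelist)
        return Format::R11G11B10F;
    return requested;
}
```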
This has been going on for about a year now and for the longest time I couldn't see much difference in quality.
That was until I got the idea that, instead of using my main monitor (an Acer 22"), I'd try my secondary monitor (a Samsung 22"); then I could start seeing some of the differences.
For me gaming on my Acer I probably wouldn't notice, unless something was totally FUBAR.
What I'm saying is that the quality differences show up differently across monitors: what you see on a 30" Dell I might not see on my old 22" Acer.
@SKYMTL:
once these settings become available on the 5xxx series, you have to do a comparison between the old driver with Catalyst AI on and the new ones at defaults + HQ
I bet that HQ is going to perform worse than the old AI on, and that default is going to perform better than AI on ;)
Texture shimmering does not appear in screenshots. It requires movement to see.
I don't really care who has better IQ, so long as there is enough quality. But I strongly believe that benchmarks should be run with similar IQ settings. Otherwise, what's the point of benching cards against each other?
NVIDIA Focus Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the Members.
Tsk, tsk - only 75 posts so far Amorphous. Does that mean no new, free card for Xmas? :rofl:
Though seriously, the whole topic is like the pot calling the kettle black, and various optimizations can be proven on both sides. As long as we don't need a magnifying glass to see the difference, or objects in game don't disappear (hello 3DMark dragon), then it's all fine, and guess what: those optimizations give us better performance in scenes where the FPS counter is close to 30. :shrug:
I find it funny how many of the people complaining probably don't even know or care whether they are using a TN panel ;)
True, and I just love NV users moaning about the need to "control AF slider settings" on AMD cards, while GeForce cards don't even have the same control option enabled.
Anyway, that's all that can be said on this topic, I guess.
STEvil, on a side note, I'm considering a new S-IPS monitor. Anything you would recommend?
IPS or TN, it doesn't matter. It's not related to texture filtering, and I can have just as sharp textures on a 4-year-old TN as on a brand new IPS screen. It's all just about viewing angles and response times.
As for the image, I'm currently using High Quality mode with surface format optimization disabled. 16x AF used at all times. The textures are nice and sharp, no shimmering effect due to optimizations, nothing. Just secks textures :D
Sooo.... all this hoopla over something you're only going to see if your face is literally an inch or two away from the screen. Wow..... Nvidia really will whine about anything.
Dunno, i've not looked into getting a new monitor for a while. I'm happy with my LG W3000H 30" and Toshiba XV648 46"
That's not the point. The TN will have larger banding effects, which are far more annoying and noticeable than texture filtering algorithms, and the banding could make the texture filtering differences more or less visible.
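A quick toy example of the banding point (my own sketch, assuming a typical 6-bit TN panel versus an 8-bit IPS): quantizing the same gradient to fewer levels makes every visible step four times wider.

```cpp
#include <cmath>
#include <cstdio>

// Map a 0..1 intensity to the nearest representable panel level.
int quantize(float v, int bits) {
    int levels = (1 << bits) - 1;
    return (int)std::lround(v * levels);
}

int main() {
    // 8 bits give 256 levels, 6 bits only 64, so a smooth gradient
    // breaks into bands four times wider on the 6-bit panel.
    for (float v = 0.0f; v <= 1.0f; v += 0.25f)
        std::printf("v=%.2f -> 8-bit %3d, 6-bit %2d\n",
                    v, quantize(v, 8), quantize(v, 6));
    return 0;
}
```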
In conclusion; Nvidia should just lower default IQ to compensate for performance disadvantage and be done with it.
@ Guru3D review of GTX570
(http://www.guru3d.com/article/geforce-gtx-570-review/21)
Quote:
Speaking of AMD, the ATI graphics team at default driver settings applies an image quality optimization which can be seen, though very slightly and in certain conditions. It gives their cards ~8% extra performance. NVIDIA does not apply such a tweak and opts for better image quality. We hope to see that move from AMD/ATI soon as well.
Guru3D is one of the most biased websites ever made:rolleyes: