you could you know change the gamma settings? :ROTF:
Sheesh, one person posts his own views and experiences when swapping a card and gets absolutely shot to pieces. What the hell happened to this forum seriously. It's getting worse and worse.
Hmmm seems to me it's always been like that and I've been here a while.
I'm with ya Toysoldier :up: After over 2 years of using Nvidia GPUs, 8400GS > 7900GTX > 9800GT > GTS 250, it was easy for me to see the differences between the two chipsets/drivers. One theory I have is that it has something to do with the way AA is applied. Nvidia gives you more options, such as enhance, force and disable. I always had it set to enhance and 16xQ ;)
Most noticeable was Need for Speed Shift. I had to play around with CCC a fair bit to get the game running more smoothly. I think Catalyst AI was mainly to blame there.
NEVER be afraid to speak your mind. Those who feel they need to make an example of you for doing so are all too common and should be ignored. :cool:
pEACe:D
You don't have to create a game profile.
You can change global colour settings from CCC once and for all.
Honestly, the two cards produce very different looking images on screen.
It's personal preference; I like ATI's more detailed, sharper, higher-contrast image (by default). I miss it sometimes on my current 470, but it's not a huge deal, to be honest. With a little adjustment, both can be made to look similar.
Some will hate this as always. In fact, bad monitors are to blame as well in some cases.
Personally, when I tried my friend's GTX 460, the IQ sucked compared to my HD 4870. So nVidia might have over-optimised its driver, according to my personal experience, right? That HAS to be right!
:ROTF:
So much hysteria, so much soap opera. :down:
IMO, the bottom line is that lowering quality, even just default quality, isn't cool - no matter what company does it.
AMD puts image quality debate to bed
There's been a lot of talk about GPUs and image quality lately, and as the party on the receiving end of some of the accusations, AMD felt the need to set the record straight. That's why we were invited to talk to Senior Manager of Software Engineering Andy Pomianowski and Technical Marketing Manager Dave Nalasco about image quality and the ruckus that NVIDIA kicked off last week.
The settings, they are a-changin'
Dave explained to us that there had been some changes to the Catalyst drivers to coincide with the release of the HD 6000-series GPUs, and that image quality had been a big part of that. At the heart of all this is Catalyst AI, which controls a whole host of different settings via a single slider.
Responding to feedback, this single slider was divided into a number of different settings in the latest release, giving users a bit more control. One of the new additions was a slider to control texture filtering with settings for 'High Quality', 'Quality' and 'Performance'.
High Quality turns off all optimisations and lets the software run exactly as it was originally intended to. Quality - which is now the default setting - applies some optimisations that the team at AMD believes - after some serious testing, benchmarking and image comparisons - will maintain the integrity of the image while increasing the application performance. Lastly, the Performance setting applies even more of these optimisations to squeeze out a few more frames, but risks degrading the image quality just a bit.
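The three tiers described above can be pictured as progressively enabling driver-side optimisations. A minimal sketch of that idea follows; the flag names are purely illustrative (AMD doesn't document its internal driver switches), so treat this as a mental model rather than anything resembling the actual Catalyst code:

```python
# Hypothetical model of the three Catalyst texture filtering tiers.
# Flag names are made up for illustration; they are not AMD's internals.

FILTERING_TIERS = {
    # No optimisations: filter exactly as the application requested.
    "High Quality": {"trilinear_opt": False, "aniso_opt": False},
    # Default: optimisations AMD judged to be visually transparent.
    "Quality": {"trilinear_opt": True, "aniso_opt": False},
    # Aggressive: extra frames, some risk of visible artefacts.
    "Performance": {"trilinear_opt": True, "aniso_opt": True},
}

def optimisations_for(tier: str) -> dict:
    """Return the (hypothetical) optimisation flags for a slider tier."""
    return FILTERING_TIERS[tier]

if __name__ == "__main__":
    for tier, flags in FILTERING_TIERS.items():
        print(tier, flags)
```

The point of the model: each step down the slider only ever adds optimisations, which is why reviewers can treat High Quality as the "ground truth" image to compare the other tiers against.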
What do you see?
Dave acknowledged that some sources had observed visual anomalies when running a few games and benchmarks. He explained that the algorithms that the drivers run - notably anisotropic filtering - are very complex and that despite their best efforts, the image wasn't going to be perfect 100 per cent of the time, even on default settings.
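To see why anisotropic filtering in particular is a tempting target for optimisation, consider how its cost scales: at 16x anisotropy the hardware can take up to 16 filtered taps per pixel on an oblique surface, so under-sampling those surfaces saves real work at some quality risk. A deliberately simplified cost model (not AMD's actual algorithm, and the `reduction` factor is an assumption for illustration):

```python
import math

def aniso_taps(anisotropy_degree: float, cap: int = 16) -> int:
    """Taps a simplified anisotropic filter takes for a pixel whose
    texture footprint has the given anisotropy ratio. Real hardware
    is more involved; this just shows the roughly linear cost."""
    degree = max(1.0, min(anisotropy_degree, cap))
    return math.ceil(degree)

def optimised_taps(anisotropy_degree: float, cap: int = 16,
                   reduction: float = 0.5) -> int:
    """A hypothetical 'optimised' mode that under-samples oblique
    surfaces, trading filtering quality for fewer texture fetches."""
    return max(1, math.ceil(aniso_taps(anisotropy_degree, cap) * reduction))
```

Under this toy model, halving the tap count on a 16x-anisotropic pixel halves its texture bandwidth, which is exactly the kind of saving that shows up as shimmering or banding when the heuristic guesses wrong.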
What he stressed was that, in the opinions of the whole driver development team, the default settings and optimisations still offered the best performance with no noticeable drop in quality for the vast majority of users the vast majority of the time. And for those who were experiencing any problems, High Quality mode would always be there to allow a picture perfect image. This, he made clear, wasn't going to change any time soon.
And then something strange happened - Andy asked us what we thought. These guys seemed genuinely concerned about what we felt were the best settings to use, whether we'd experienced any problems, and what we would change if we were designing the Catalyst tools. They're clearly committed to delivering the best product that they can, and that means listening to feedback and taking on board what the press, as well as average gamers, think.
Hopefully, this whole image quality debate can now be put to bed. At least for the time being.
Source:
http://www.hexus.net/content/item.php?item=27786
So basically they admit that they've turned it down a tad too much, but in the team's opinion it's still great for the majority of people? Hmm, not sure I fully agree with that...
The color settings in CCC have been a big problem on CF systems since...maybe...10.6 or so. Move any of those sliders on a multi gpu system (58xx at least) and you get a nice pink screen. Or, that's my experience at least (as well as the experience of others I've spoken with).
--Matt
I can honestly say that I'm noticing less texture shimmering on a GTX 580 with texture quality set to high quality compared to my previous HD 5870 with Catalyst AI set to high quality (10.10e hotfix). I'd still like to see how the 5800s compare to the 6800s when they both have access to these newer Catalyst AI options.
I like AMD's response. Thanks to all the complaints, we now have more tools to play with in CCC.
But the question remains: should reviewers use the HQ setting in their articles?
I won't venture my opinion just yet since I want to hear what you guys have to say.
High quality only; it shows who has the best IQ and the best performance at that IQ level. The other settings are for IQ testing only.
omg why does this matter so much? this thread has gone on for six pages...about quality settings?