http://translate.google.com/translat...d_6850/s09.php
this is not about defending; this is about realising that this campaign is more related to the success of Barts, and a desperate attempt by nvidia to start a mud campaign because their products in this price range are inferior right now and they aren't happy with their christmas sales...
AMD doesn't take anything away; they give you more options than you had before, and TBH I'm happy to take the performance advantage in BFBC2 because I fail to see a difference in this game...
Trackmania is another story; in this game I'm happy about the NEW HQ setting, which completely eliminates banding (BTW: banding is present on the 8800GTS in my other pc...)
if there is one thing you can criticize them for, it's that they don't set HQ by default, but I'm sure that most users take the extra performance over the IQ because most of them can't even see a difference between mid and high settings (which is sad but true)
so the setting exists, and you just have to turn it on. sounds like such a big deal that we needed a post from nvidia with an exclamation point, after we already had a huge thread about it (and not the first one)
If I had a dollar for every "company x/y uses lower IQ to get higher performance" claim in the last ~10 yrs...
...I'd have founded Company Z
The point is that ATI's default quality is lower than Nvidia's default, and once you adjust the quality setting for ATI to match Nvidia's, you incur an FPS penalty, according to that article 10%. And after all that PR marketing from ATI that they have superior quality compared to the prior 5xxx cards and good performance, the truth suddenly becomes blurred.
That's a simple fact, no one is arguing about whether you can set it to high quality or what nvidia or ati did in the past. It's about now and misleading benchmarks, the info people get when deciding what to buy.
that's only partially an issue. most reviews test at least 4-5 different IQ settings and what you see is what you get - at the highest IQ settings plus AA, the results are as accurate as they can be.
now if you want to complain about 1280x1024 default IQ + no AA gaming, then I can't relate since I don't play at those settings anyway.
Successful troll thread is successful.
:rolleyes:
The problem is review sites. If they don't do a like for like review based on Image quality, then we're always stuck in this situation.
I think [H]ardOCP comes closest with reviews that take into consideration settings, but a review site that lowers/raises settings to give equal amounts of image quality is needed (are there any?).
As for the AMD and Nvidia fanbois in this thread, it's starting to get annoying, as none of you are giving any evidence to back up your arguments. AMD is under no obligation to set "Ultra High IQ" as default, and NVidia isn't either.
That is correct. I was convinced of that when they made a public statement to the effect of "we wanted to bring all products in line with the lower quality in order to be consistent". That took some balls, but as evidenced in this thread, they knew they could get away with it.
LOL @ AMD Droids defending AMD's shoddy IQ :rofl:
"B-b-but no one can notice it!!" :rolleyes:
Did you guys read the blog post?
Hahahaha, that's so funny. They were "taught" some hard lessons, more like it.
Quote:
For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality.
look at the motives
AMD did it for better performance.
Nvidia pointed it out for marketing gains.
AMD users either feel like they don't care about the difference, fix the settings themselves, or feel a little hurt.
Nvidia users are crying so they can justify their overpriced purchase.
that's why they include more than FPS numbers in their reviews. there is more to look for in a card than just performance or IQ.
sure there is no obligation, but is it right to lower image quality to gain performance? we would just end up in a race of degrading IQ.
Quote:
AMD are under no obligation to set "Ultra High IQ" as default, and NVidia isn't either.
insightful post.:up:
Well if you wanted to make a list of nvidia fanboys, this thread is a good place to start.
You meant AMD fanboys, right? :yepp:
People can finger point all they want. The video comparisons on the linked sites pretty much all tell the same story.
So ... the card bands in an AF testing program with an artificial texture whose use nvidia has ironically decried because it's not real?
Can we get some real games tested, please? And let's try not to only focus on Far Cry or Trackmania.
As long as this is not a problem during normal gameplay it's mostly irrelevant. Show me a comparison video of a recent game that shows this in a noticeable way and we can call it a real problem.