NVIDIA Says AMD Reduced Image Quality Settings w/ Radeon HD 6800 Series For Better Performance
Link http://www.legitreviews.com/news/9482/
Source: Legit Reviews
This is real
http://img202.imageshack.us/img202/2527/80269813.jpg
Quote:
The anisotropic filtering, unfortunately, gets a poor report card in our detailed review. On the one hand, texture filtering on the Radeon HD 6000 generation has been improved (the banding problem has largely been fixed); on the other hand, the textures now flicker more intensely. That is because AMD has lowered the default anisotropic filtering to the level of the previous generation's A.I. Advanced setting. An incomprehensible step in our view, because modern graphics cards provide more than enough performance to improve image quality instead.
While some games hardly show the difference, others suffer badly from flickering textures, which dulls the fun. At least the "High Quality" option (usually) restores the old AF quality. In other words, after manually switching, the Radeon HD 6800 still delivers the previous generation's quality, while the default quality is now worse!
Since we will not support such practices in 2010, we have decided to test every future Radeon HD 6000 card with the roughly five percent slower High Quality setting, so that the final result is roughly comparable with NVIDIA's default setting.
http://www.computerbase.de/artikel/g...on-hd-6800/15/
:down:
Why dont they just test with quality options maxed out on both cards if "full quality" is so important?
btw, I agree that stepping back on quality is bad.
The point is the amount of shimmer regardless of setting. But I agree, if you are too lazy or stupid to keep Catalyst and the NVIDIA control panel constantly at HQ you have no right to :banana::banana::banana::banana::banana:. The performance gain has always been too small to warrant the degradation in quality.
Sounds like the 6*** series has Nv worried :yepp: :D
Do we need another thread on this?
Original: http://www.xtremesystems.org/forums/...=261588&page=5
Furthermore, it has been proven that it depends mainly on the games used: AF quality can be higher on HD 6000 in newer titles and lower on NV. :shrug:
Starting with Catalyst 10.10 (and continuing in 10.11), image quality is significantly reduced from previous ATI driver releases. The IQ reduction only affects HD 5800 and up GPUs, plus the HD 6800 GPUs, and it gives the affected AMD GPUs a significant performance increase. For an apples-to-apples comparison against NVIDIA GPUs, either NVIDIA's IQ settings need to be dropped or, ideally, AMD's need to be raised. Even raised, AMD's IQ cannot seem to match NVIDIA's default IQ.
Only video can illustrate the quality difference, but it's discernible: http://www.tweakpc.de/hardware/tests...d_6850/s09.php The videos are split-frame, with the left side showing what the GPU produces at a specific setting and the right side showing what should be generated.
While NVIDIA has posted about this on their blog: http://blogs.nvidia.com/ntersect/201...e-quality.html it's not their work; it's the finding of several major European tech sites.
Amorphous
Oh for god's sake, not this :banana::banana::banana::banana: again... The reality is, if no one pointed this out, no one would even notice it, so why make a big deal out of something 99% of ppl won't even notice? If you're such a purist, you already know what CCC is and how to adjust things to turn all optimizations OFF. So do that and stop complaining. Optimizations you cannot circumvent are one thing; optimizations you can circumvent are another. In this case you can. So do that. Facepalm.
nvidia did this way before ati, so now that it hurts, IQ suddenly became important? and lowering IQ is unethical? lol
Look at the videos and tell me you wouldn't notice the difference between HD 6870 and GTX 470's IQ. It'll be extremely obvious in every title. Even cranked up, the HD 6800 doesn't compare to the GTX 400's default setting.
AMD reducing the default IQ means benchmarkers are going to need to adjust their testing procedures to generate an apples-to-apples result, or it's no longer a remotely fair comparison. Might as well benchmark with widely different AA settings.
Users can and should make their own determination about what level of IQ they desire, and adjust their settings appropriately for their desired gaming experience.
Amorphous
Ohh, Nvidia's private fanboy army is here to reveal the truth... :rolleyes:
Save our ignorant souls, so all money is spent on the one true company that never optimized their drivers. Ohh, wait?
On a more serious note - Original AF thread is here: http://www.xtremesystems.org/forums/...=261588&page=5
This one should be locked.
fail news
IQ on my 6850 with the new HQ setting is noticeably higher than on my old 5850
AMD = FAIL!
so amd cheats again
but it does not matter when it's AMD who cheats???:mad:
amd have failed in so many ways recently
they are much worse than nVidia has ever been
Why can't they just admit they also lost this round and move forward:yepp:
i fail to see how they fail; they offer their users a higher quality setting and the possibility of higher performance if you don't notice any differences, yet people claim that amd screws its customers?????
what they should do is make the HQ setting the default and offer the high performance setting as an option, but your comment is still a mountain of fail
It's always the same story... Fanboys immediately attack the other side and exaggerate :) Sensible people wouldn't really believe what they are saying, so it just feels like they need to encourage themselves
i hope this topic stirs up a lot of official debate and name calling from both companies so that both put IQ at the top of their priorities and stop worrying so much about only fps and releasing pointless technology like 3d.
well for my part i don't expect people working for nvidia to be unbiased. even unrelated people have bias, so if you work for them i think that comes with the job, no?
anyway, if the new drivers lower the default IQ then reviewers should be aware, and when cayman is out they should perform all tests with a high quality baseline.
And the puppet masters start pulling the strings. The story is contradictory and configuration dependent; the point being, as usual, that ATI's image quality at default is better than it was on the 5870, just as the 5870 improved on the 4870.
Both sides have issues with IQ depending on driver, OS, game, and API.
Nvidia may be opening a can of worms for themselves here, if anything.
When I installed Cat 10.10e, the first thing I did was move the texture quality slider to High Quality, but I kept the Surface Format Optimization feature enabled, and I play all games with 16x AF and MLAA. I can't really complain about image quality because I don't see any reason to. I don't have anything against the NVIDIA settings that I was familiar with in the past. Some optimizations exhibited a shimmering effect on textures, but other than that, if an optimization gives a significant boost and you can only notice the difference in side-by-side image comparisons, I think the optimization is well justified. But maybe both NVIDIA and AMD should release drivers with a big red button that says "TURN EVERYTHING MAXXXXXXXX!!!!!!!!11111" to make all the whiners happy.
this is not about defending; this is about realising that this campaign is more related to the success of barts, and a desperate attempt from nvidia to start a mud campaign because their products in this price range are inferior right now and they aren't happy with their christmas sales...
AMD doesn't take anything away; they give you more options than you had before, and TBH i'm happy to take the performance advantage in BFBC2 because i fail to see a difference in this game...
trackmania is another story; in this game i'm happy about the NEW HQ setting, which completely eliminates banding (BTW: banding is present on the 8800GTS in my other pc...)
if there is one thing you can criticize them for, it is that they don't set HQ by default, but i'm sure most users take the extra performance over the IQ because most of them can't even see a difference between mid and high settings (which is sad but true)
so the setting exists, and you just have to turn it on. sounds like such a big deal that we needed a post from nvidia with an exclamation point, after we already had a huge thread about it (and not the first one)
If I had a dollar for every "company x/y uses lower IQ to get higher performance" claim in the last ~10 yrs...
...I'd have founded Company Z
The point is that ATI's default quality is lower than Nvidia's default, and once you adjust the quality setting for ATI to match Nvidia's you incur an FPS penalty; according to that article, 10%. And after all that PR marketing from ATI about having superior quality compared to the prior 5xxx cards along with good performance, that suddenly becomes a blurred truth.
That's a simple fact. No one is arguing about whether you can set high quality, or what nvidia or ati did in the past. It's about now, and about misleading benchmarks, the info people get when deciding what to buy.
that's only partially an issue. most reviews test at least 4-5 different IQ settings and what you see is what you get; at the highest IQ settings with AA, the results are as accurate as they can be.
now if you want to complain about 1280x1024 default IQ + no AA gaming, then i can't relate, since i don't play at those settings anyway.
Successful troll thread is successful.
:rolleyes:
The problem is review sites. If they don't do a like for like review based on Image quality, then we're always stuck in this situation.
I think [H]ardOCP comes closest with reviews that take into consideration settings, but a review site that lowers/raises settings to give equal amounts of image quality is needed (are there any?).
As for the AMD and Nvidia fanbois in this thread: it's starting to get annoying, as none of you are giving any evidence to back up your arguments. AMD are under no obligation to set "Ultra High IQ" as default, and NVidia isn't either.
That is correct. I was convinced of that when they made a public statement to the effect of "we wanted to bring all products in line with the lower quality in order to be consistent". That took some balls but as evidenced in this thread they knew they could get away with it.
LOL @ AMD Droids defending AMD's shoddy IQ :rofl:
"B-b-but no one can notice it!!" :rolleyes:
Did you guys read the blog post?
Hahahaha, that's so funny. They were "taught" some hard lessons, more like it.
Quote:
For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality.
look at the motives
AMD did it for better performance.
Nvidia pointed it out for marketing gains.
AMD users feel like they don't care about the difference, fix the settings, or feel a little hurt.
Nvidia users are crying so they can justify their overpriced purchase.
that's why they include more than FPS numbers in their reviews. there is more to look for in a card than just performance or IQ.
sure there is no obligation, but is it right to lower image quality to gain performance? we would just end up in a race of degrading IQ.
Quote:
AMD are under no obligation to set "Ultra High IQ" as default, and NVidia isn't either.
insightful post.:up:
Well if you wanted to make a list of nvidia fanboys, this thread is a good place to start.
You meant AMD fanboys, right? :yepp:
People can finger point all they want. The video comparisons on the sites linked are pretty much universal in their interpretation.
So ... the card bands in an AF testing program with an artificial texture that nvidia has ironically decried the use of because it's not real?
Can we get some real games tested, please? And let's try not to only focus on Far Cry or Trackmania.
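For context on what those "artificial texture" AF testers actually do: the standard trick is to tint every mip level a different color, so the filter's mip transitions (and any angle-dependent shortcuts) show up as visible bands when the texture is mapped onto a tunnel. A minimal sketch of generating such a test texture, assuming numpy and Pillow are available; the tint table and sizes are arbitrary illustration values.
Code:
# Sketch: build a mip chain where every level has its own tint. A renderer
# sampling this texture shows exactly where the filter switches mip levels --
# the banding/"flower" patterns these synthetic AF testers are built to expose.
import numpy as np
from PIL import Image

TINTS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0),
         (255, 0, 255), (0, 255, 255), (255, 255, 255), (128, 128, 128)]

def make_colored_mip_chain(base_size=256):
    mips, size, level = [], base_size, 0
    while size >= 1:
        y, x = np.mgrid[0:size, 0:size]
        checker = ((x // 8 + y // 8) % 2).astype(np.float32)   # 8 px checkerboard
        tint = np.array(TINTS[level % len(TINTS)], dtype=np.float32)
        img = (0.5 + 0.5 * checker)[..., None] * tint          # tint, modulated
        mips.append(Image.fromarray(img.astype(np.uint8)))
        size //= 2
        level += 1
    return mips

if __name__ == "__main__":
    for i, mip in enumerate(make_colored_mip_chain()):
        mip.save(f"mip_{i}.png")   # feed these in as the texture's mip levels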
As long as this is not a problem during normal gameplay it's mostly irrelevant. Show me a comparison video of a recent game that shows this in a noticeable way and we can call it a real problem.
Funny, cause I have been around long enough to tell you nvidia fanboys defended their FXs and drivers to the teeth even at the pinnacle of nvidia's cheating ways, claiming just as today that it was ok for nvidia to cheat or perform worse because their drivers were just so godly and that much better.
Anyway, I think I will go for Nvidia on my next system, since I despise IQ shortcuts on high end cards. Cayman should put an end to this bull:banana::banana::banana::banana: or I'm done buying red, just as I was done buying green 4 years ago.
ps: in general nvidia fanboys are far more arrogant and insisting upon themselves. If amd fanboys behaved exactly the same, there would be a new thread on the news forum every day about how the gtx 580 still gets its ass handed to it by a 1 year old 5970 and is a power hog. But you don't see that; what you see are legions of nvidiots creating huge drama over small deals in order to excuse themselves for not being the top dog around. Threads about drivers, IQ, AF, and every millimetric advantage they can find become top news over here.
It's friggin tiresome as :banana::banana::banana::banana:, that's what it is.
amd = performance at any cost, even if IQ is at stake
nvidia = eternal looping excuses for not being the fastest.
both :banana::banana::banana::banana:ty ways to go about it.
+1 I will be going for a GTX 470 next upgrade, or possibly a GTX 560 (I can imagine the 570 being a bit out of my price range). The 6870 was appealing until now, and anyway, they overclock like :banana::banana::banana::banana: when considering overclock %. A 6850 overclocks nicely, but it really isn't worth the upgrade over my 4890.
That's partially the fault of most reviews for not analyzing IQ and partially AMD's fault for taking advantage of it. If more reviews actually bothered to bump IQ to its highest on both cards for some/all tests and provided IQ comparisons then this wouldn't be an issue.
Not that I'm happy with AMD for fiddling with the defaults. But I haven't used default settings on either brand since...ever.
did anyone do a perf comparison with the settings yet?
They are very much like cults. It seems like fanboys take on the personalities of their favorite companies, especially the CEO. Nv fanboys are kinda like Jen-Hsun, boisterous, boastful, generally aggressive. Whereas Amd fanboys are more quiet and reserved. Kinda like how we don't hear much from Amd's CEO. I don't even know who the current one is.
As was said: NVIDIA learned their lesson. Now it's AMD's turn.
Personally, I don't think this was necessarily intentional on AMD's part, since at its core angle-dependent AF is a great idea. It has just been poorly implemented, and the result is higher performance at lower image quality. :(
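To make "angle dependent" concrete: a rough Python sketch of the idea, using the standard texture-footprint derivatives from the GL/D3D specs. The off-axis penalty below is invented purely for illustration; real hardware uses fixed-function approximations that only behave roughly like this.
Code:
import math

def aniso_samples(dudx, dvdx, dudy, dvdy, max_aniso=16, angle_dependent=True):
    """Pick a sample count along the major axis of the projected pixel
    footprint (the core of anisotropic filtering). (dudx, dvdx) and
    (dudy, dvdy) are one pixel's texture-coordinate derivatives."""
    px = math.hypot(dudx, dvdx)            # footprint extent along screen x
    py = math.hypot(dudy, dvdy)            # footprint extent along screen y
    major = max(px, py)
    minor = max(min(px, py), 1e-8)
    ratio = min(major / minor, max_aniso)  # anisotropy ratio

    if angle_dependent:
        # Hypothetical shortcut: full quality only near 0/45/90 degree
        # directions, up to 50% fewer samples in between. This is what a
        # tunnel tester renders as a "flower" instead of a round circle.
        du, dv = (dudx, dvdx) if px > py else (dudy, dvdy)
        angle = math.atan2(dv, du) % (math.pi / 4)                  # fold to 0..45 deg
        off_axis = min(angle, math.pi / 4 - angle) / (math.pi / 8)  # 0..1
        ratio = max(1.0, ratio * (1.0 - 0.5 * off_axis))

    return max(1, round(ratio))            # probes taken along the major axis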
the issue with nvidia fanboys is that they are blinded by arrogance, while ati fanboys are blinded by the need to defend, at any cost, the company they still perceive to be the bullied underdog.
with the three week delay of cayman and the early launch of the gtx580, the nvidia fanboys got a massive boost to their egos, and the stench of their trolling lately has been unbearable. Just do a quick search on how many ati hate threads are still live in this forum and you'll see.
when your preferred firm does something unlawful or unethical, the best thing you can do as a fan is criticize it and demand a correction. taking them under your wing for anything they do is not only bad for them but especially bad for yourself.
fanboys aside, facts are on nvidia's side on this one.
That is pretty stupid. You can't complain that someone did something unless you've complained about everyone in the past who also did it? Where do you guys come up with this crap? Besides, he's not even complaining, so I don't know what you're defending.
Can anyone post up links to videos or screenshots displaying this lowering of IQ in games? I can't for the life of me see anything that looks particularly bad while playing Dirt 2 and BFBC2.
Another nice soap opera, me like it. Pass me a bag of popcorn, will you ! :ROTF:
The more Nvidia complains, the more free advertising for AMD. Like in Hollywood "I don't care what you say, just spell my name right!" And the people that listen to the ones that scream the loudest are the ones that lose. Nvidia just sounds like a sore loser no matter what the facts are. Their whole beef depends on the definition of default driver and the manner of testing. Without that they have no argument.
you know what is funny: even with high quality, the ati 68xx series are still the best cards to get for price/power/performance.... :) against the 460-470; their benchmarks showed that....
Xbitlabs test with High settings:
http://www.xbitlabs.com/articles/vid..._12.html#sect1
ATI Catalyst:
Anti-Aliasing: Use application settings/Standard Filter
Morphological filtering: Off
Texture Filtering Quality: High Quality
Surface Format Optimization: Off
Wait for vertical refresh: Always Off
Anti-Aliasing Mode: Quality
Other settings: default
Nvidia GeForce:
Texture filtering – Quality: High quality
Vertical sync: Force off
Antialiasing - Transparency: Multisampling
CUDA – GPUs: All
Set PhysX configuration: Auto-select
Ambient Occlusion: Off
Other settings: default
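If a reviewer wants those two profiles to be reproducible, it helps to record them as data rather than prose. A trivial sketch, with invented field names, of how a benchmark script might log the settings above:
Code:
# Hypothetical test-matrix record for the two "High" profiles listed above,
# so every benchmark run logs exactly which driver options were in force.
TEST_PROFILES = {
    "catalyst": {
        "anti_aliasing": "use application settings / standard filter",
        "morphological_filtering": "off",
        "texture_filtering_quality": "high quality",
        "surface_format_optimization": "off",
        "vsync": "always off",
        "aa_mode": "quality",
    },
    "geforce": {
        "texture_filtering_quality": "high quality",
        "vsync": "force off",
        "transparency_aa": "multisampling",
        "cuda_gpus": "all",
        "physx": "auto-select",
        "ambient_occlusion": "off",
    },
}

def describe(profile_name):
    """One-line summary for a benchmark log header."""
    opts = TEST_PROFILES[profile_name]
    return profile_name + ": " + "; ".join(f"{k}={v}" for k, v in opts.items())

if __name__ == "__main__":
    print(describe("catalyst"))
    print(describe("geforce"))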
I've got your IQ control right here.....
http://t2.gstatic.com/images?q=tbn:y...sters2.jpg&t=1
Haven't seen a jaggy line yet.:toast:
I'm sorry if I'm missing something, but where are the screenshots that show the default and "fixed" settings in action?
Since I have not heard of anyone complaining about quality, and AMD said they added a higher quality setting and made the default equal to the old high quality, this seems like a bunch of squabbling over semantics. The only times I remember complaints about image quality were NV cheating in 3DMark, NV lowering IQ when the 4890 came out to boost scores, and ATI not having alpha textures working properly, specifically in Far Cry 2. Each time that happened there was benching and there were screenshots of the problems, comparing card to card and driver to driver. But with this, no one is posting screenshots, and the attack is coming from NV, so it seems orchestrated; there would have been angry people before NV said something. All that came out was benching sites that favored NV re-benching with High Quality instead of Quality, while most reputable sites benched at defaults, and while some sites posted screenshots of the bench, that did not lead to anything. I love IQ, and I have never seen anything; even when the AF testing was going on, this did not come out, so I don't see anything to this.
Agree. It is the reviewers doing the misleading. Like back when the Intel quads first came out and you got reviews that showed how powerful the quad was... yeah, at 640 x 480, ha ha ha. They didn't show that a dual core beat it when you raised the resolution.
Nobody... uses default settings. When you start a game you see what you get at MAX settings and then tone it down if you need a higher framerate.
So the whole argument is a fabrication.
can we not just agree it is wrong to cheat,
and when they do, we protest?
this time it was amd; next time maybe it's nVidia doing it
they just should not start lowering quality in a race over who is fastest
it only affects us and gaming in a bad way.
you can keep telling this over and over again, yet i enjoy a higher IQ on my 6850 than what was possible on the 5850, which was proven over and over again. Quote:
Originally Posted by E30M3
it's also proven that the difference between HQ (better than the 5xxx series and my old 8800) and normal quality (worse quality in older games, same quality in new games) is around 5%, which still puts the 68xx series in front of the gtx460 in terms of performance and price/performance
i don't see how amd lost this round; they own a 3-4 times higher market share than nVidia in the DX11 market, they dethroned NV in overall market share, and their higher priced products make up a significantly higher proportion of their revenue than nvidia's...
sure, nvidia is still pretty competitive right now, but unless you want to buy a card above 400€ (GTX580), amd is the way to go due to output options, video playback quality, performance, price and power consumption (that is, from 50€ to 300€), if you don't have any brand preferences at all...
this might change in the next round, and it certainly was the other way round in the 8800 days, but to claim that amd lost this round is ignorance (just as ignorant as some AMD fanboys who continue to claim that AMD didn't lose against the i7; they hold on to certain price points and certain workloads, but overall they lost, just like nvidia...)
In before nvidia people cite ~70% marketshare leftover from 79xx and 88xx series.
And this forum proves once again that an objective opinion is hard to come by.
i've been saying it for a while... DirectX should include a default render mode that has to pass a series of tests, so it does NOT get tweaked/optimized...
so people can choose to use this mode or the optimized settings ati and nvidia offer that boost performance by SLIGHTLY reducing image quality...
if i'm on a mainstream gpu i'll appreciate the latter, but if i'm playing an older game i want max image quality and no optimizations at all...
and how can you sell 500$+ videocards to people if they don't render games the way they were supposed to look, but blur things to boost performance by a few percentage points?
that's just plain stupid...
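A sketch of what such a conformance test could look like, assuming you can capture the same frame from the driver under test and from a reference rasterizer as RGB arrays; the tolerance numbers are placeholders, not anything from a real spec.
Code:
import numpy as np

def frames_match(reference, actual, max_channel_delta=3, max_bad_fraction=0.001):
    """Compare a driver's output frame against a reference render.
    reference, actual: HxWx3 uint8 arrays of the same frame. Passes if
    almost every pixel is within a small per-channel delta, tolerating
    rounding differences but not IQ-reducing shortcuts."""
    diff = np.abs(reference.astype(np.int16) - actual.astype(np.int16))
    bad = (diff > max_channel_delta).any(axis=-1)   # per-pixel verdict
    return bad.mean() <= max_bad_fraction

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.integers(0, 256, (720, 1280, 3), dtype=np.uint8)
    noisy = np.clip(ref.astype(np.int16) + rng.integers(-2, 3, ref.shape),
                    0, 255).astype(np.uint8)
    print(frames_match(ref, noisy))   # True: within rounding tolerance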
btw, i don't get those videos... what's the right side of the video?
the flickering on the left side of the 6000 videos is definitely worse than in the 5000 videos.
BUT, in the nvidia videos there is a slight flicker on both left and right... the left side looks better than the 6000, but not really better than the 5000.
Not in min FPS. In most reviews I've seen, the 5970 drops way below the 580. And you know that's what makes or breaks a game. I don't care if I can get 10 fps more on the high end. If FPS fluctuate like crazy, then the game is much more unplayable. When I play any first person shooter I want stable FPS and definitely don't want them to drop into single digits. I won't post any slides; just check Anand's review. And btw, most crossfire setups suffer from that. You either agree with me that min FPS is more important than max FPS for good smooth gaming, or you're a fanboy. That's why I'm not so fond of dual gpu cards, and would never get a card like the 5970 for gaming. I would definitely pick up a 5870 instead.
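Side note on metrics, since "min FPS" keeps coming up: the minimum, average, and the steadier "1% low" are all easy to compute from a frame-time log. A small sketch with made-up frame times:
Code:
def fps_stats(frame_times_ms):
    """Summarize a benchmark run from per-frame render times (ms)."""
    fps = [1000.0 / t for t in frame_times_ms]
    avg = sum(fps) / len(fps)
    worst = min(fps)
    # "1% low": average of the slowest 1% of frames -- less noisy than a
    # single-frame minimum, and a better proxy for perceived stutter.
    slowest = sorted(fps)[:max(1, len(fps) // 100)]
    return avg, worst, sum(slowest) / len(slowest)

# Two made-up runs: one steady, one with spikes the average alone would hide.
steady = [16.7] * 200
spiky = [12.0] * 190 + [100.0] * 10
for name, run in (("steady", steady), ("spiky", spiky)):
    avg, worst, low = fps_stats(run)
    print(f"{name}: avg {avg:.0f} fps, min {worst:.0f}, 1% low {low:.0f}")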
The 5970 has a higher minimum (and avg) than the 580 in AvP (dx11), BFBC2 (dx11), and Just Cause 2 (dx10).
The 580 wins in Dirt 2 (dx11), Lost Planet (dx11) and SC2 (dx9).
In Metro (dx11) they're about even in minimum fps....
This is from Hardware Canucks (one of the few review sites I trust).
So it depends on which games you play, really.
Edit: Also looked at Anand's 580 review. They only show minimum framerates in 2 games, one of which is Crysis Warhead, which looks like it needs more than 1 GB of RAM (a normal 5970 has only 1 GB per GPU) for those settings at 2560x1600, so of course the 580 will do much better there since it has 1.5 GB (the 5970 was better than the 580 @ 1920x1200).
Edit2: Lab501, which is also an excellent site, has avg and minimum framerates.
Their 580 vs 5970:
5970 wins in SC2, CoD4, Just Cause 2, AvP, Mafia 2, Medal of Honor, and CoD: Black Ops.
580 wins in Far Cry 2, Crysis Warhead, Dirt 2, Hot Pursuit, Metro 2033, Hawx 2 (though with the same minimum), Warhammer 40,000, Battleforge, BFBC2, and Darksiders.
5970 was 1 fps ahead in Resident Evil 5, but so close I call that even. Same minimum in Hawx too, but the 580 is slightly ahead in avg.
580 is quite a bit ahead in avg in Lost Planet 2, but the 5970 still had a higher minimum.
580 slightly ahead in min in Batman: Arkham Asylum, but slightly behind in avg.
You can call anyone who doesn't agree with you a fanboy, but the black and white picture you are painting makes you look way more fanboyish than others. Both cards have their strengths and weaknesses. Looking at just the 2 reviews I checked now, I'd say the 580 draws the longest straw. Unless you only play certain games where the 5970 shines...
(and just for the record, I wouldn't buy either of them)
It does that with the new tweaked driver which supposedly gives a 10% boost? That probably has nothing to do with it, eh :rolleyes: When you look at a review where they used full IQ, the picture changes somewhat. The 6800s look pointless, as in most cases the 5800s take the cake. Even in tessellation benchmarks, which the 6800s were supposedly redesigned for, they are far from shining.
And it's not about a d** measuring contest. You can't compare dual gpu to single anyway. And Nvidia has the fastest single gpu card on the market today.
99% would never purchase a 5970 or the upcoming 595. Most will probably go Xfire or SLI first.
Oh, and I hear the 5970 is not such a smooth gaming card anyway. It has its own share of problems. That was my point. SLI is not perfect either, so I still outright dismiss anything like dual gpu, xfire, or sli as a gaming platform. For benchmarking, sure, but for gaming give me a single gpu card under 230W full load.
They both use 10.10 and 262.99 (so does Anand).
I don't understand your point; shouldn't you use the latest and best drivers?
You are not making any sense...
Don't let the 68XX naming confuse you; they are not really meant to replace the 58XX. But they still do better than the 58XX in tessellation....
If it were, I would only have shown you the numbers from the card I own.
But I don't own either of them. So, moot point...
(And by doing so I would only look un-objective and like a huge fanboy)
Now THIS is true fanboy talk :clap:
Ofc you can compare them, they are both graphics cards, right? They also happen to cost roughly the same, both are power hogs, and they are also the best card each maker has atm...
You may ofc do all that if you want, no one is forcing you to do anything.
way to blow things out of proportion; it took a month to find the differences between the two parts, and not a single person who switched from NV to AMD or from an older AMD card noticed a big difference...
Other sites with quality tests:
http://www.3dcenter.org/artikel/amds...ilterqualitaet
http://ht4u.net/reviews/2010/amd_rad...ded/index9.php
Funny how nVIDIA makes a scene about this, considering they have used FP16 demotion too since the Rel. 260 ForceWare drivers, and you can't turn that off in the control panel... though you can turn it off with a utility that is not available to the public.
Got to love marketing BS. :eh:
Nice try...
http://www.geeks3d.com/20100916/fp16...e-says-nvidia/
Quote:
If you wish to test with Need for Speed: Shift or Dawn of War 2, we have enabled support for FP16 demotion – similar to AMD – in R260 drivers for these games. By default, FP16 demotion is off, but it can be toggled on/off with the AMDDemotionHack_OFF.exe and AMDDemotionHack_ON.exe files which can be found on the Press FTP.
It is on, and was on for a couple of releases, at least for the games mentioned in the article you've linked, as far as I am aware.
What really baffles me is that I don't know why they are really doing this kind of work, since both AMD and nVIDIA hardware are capable of full-speed FP16 texture filtering. Then again, I don't know why I am "surprised" that they are at it again after missing shaders and objects in games like Crysis and Far Cry 2...
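For readers wondering what FP16 demotion actually does: the driver swaps a 64-bit-per-pixel RGBA16F render target for a 32-bit R11G11B10F one, halving bandwidth at the cost of mantissa precision and the alpha channel. A rough sketch of the precision difference; the quantizer below is a simplified stand-in for the real formats, ignoring denormals and range limits.
Code:
import numpy as np

def quantize_mantissa(x, mantissa_bits):
    """Crudely simulate a small unsigned float format by keeping only
    `mantissa_bits` fractional mantissa bits (no denormal/range handling).
    A stand-in for R11G11B10_FLOAT's 6/6/5 mantissa bits -- not exact."""
    x = np.asarray(x, dtype=np.float64)
    exp = np.floor(np.log2(np.maximum(x, 1e-30)))
    scale = 2.0 ** (exp - mantissa_bits)
    return np.round(x / scale) * scale

hdr = np.linspace(0.01, 8.0, 5)                   # sample HDR intensities
fp16 = hdr.astype(np.float16).astype(np.float64)  # real half-float storage
r11 = quantize_mantissa(hdr, 6)                   # "demoted" red/green channels
b10 = quantize_mantissa(hdr, 5)                   # "demoted" blue channel

for h, a, b, c in zip(hdr, fp16, r11, b10):
    print(f"{h:7.4f}  fp16={a:9.6f}  r11={b:9.6f}  b10={c:9.6f}")
# FP16 keeps 10 mantissa bits; R11/B10 keep 6/5 -- visibly coarser steps,
# and the 32-bit format has no alpha channel at all.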
So you obviously have proof, if you're so adamant about this, even though NV admitted putting it in and said it's off by default and can't be turned on through the CP, right?