For those who are saying ATi's AA is better, do you guys set the AA in Catalyst or in the game?
I like how this guy goes into the review assuming that ATI "isn't doing as much work" on the scene as nVidia because the interiors of objects are less blurry. Hi, AA's goal here is to make edges less obvious, not to blur the centers of objects.
Epic, that's kind of the whole point of anti-aliasing most things in a game. As for the implementation, given that ATI's images show less interior blurring, one possible explanation is that their hardware is somehow detecting texture edges and applying the most anti-aliasing there, while avoiding doing too much work inside the edges.
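To illustrate the idea (purely as a sketch of the general technique, not a description of ATI's actual hardware path): an edge-detect resolve only spends its extra filtering on pixels whose neighbourhood shows a strong gradient, leaving flat interiors untouched. The Sobel kernel and threshold below are illustrative assumptions.

```python
import math

def edge_aware_resolve(img, threshold=0.25):
    """Toy edge-detect 'AA' pass on a grayscale image (values 0..1).

    Pixels whose Sobel gradient magnitude exceeds `threshold` are
    replaced by a 3x3 box average (simulating extra filtering on
    geometry/texture edges); interior pixels are left untouched,
    so flat areas stay sharp.  Kernel and threshold are illustrative.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Sobel gradients in x and y
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            if math.hypot(gx, gy) > threshold:
                # edge pixel: average the 3x3 neighbourhood
                out[y][x] = sum(img[y+dy][x+dx]
                                for dy in (-1, 0, 1)
                                for dx in (-1, 0, 1)) / 9.0
    return out

# a tiny 'frame': a hard vertical edge surrounded by flat areas
frame = [[0.0]*3 + [1.0]*3 for _ in range(6)]
smoothed = edge_aware_resolve(frame)
print(smoothed[2])   # only the two columns around the edge get blended
```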
From what I hear, ATI pairs better with AMD CPUs and Nvidia with Intel. Makes sense since AMD makes the ATI cards now. Both brands are really good and the prices are great. I mean you can get top of the line for $3xx and a dual-GPU card for less than $500. I paid over $500 for my single 8800GTX when it was new.
5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
It seems Nvidia's default texture LOD might be to blame for the seemingly blurrier AA, since it doesn't produce as much texture shimmering as ATI's cards do, at the cost of less sharp textures (a rough sketch of the mip/LOD math is below the link).
Especially in Crysis you can notice the blurriness in Nvidia's screenshot in the last comparison.
http://www.techenclave.com/998496-post53.html
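For reference, the "softer but less shimmery" tradeoff falls straight out of the standard mip selection math: a positive LOD bias picks a smaller, blurrier mip level, while a negative bias picks a sharper level that aliases more in motion. A minimal sketch, with made-up bias values rather than any actual driver default:

```python
import math

def mip_level(texel_footprint, lod_bias=0.0, num_mips=10):
    """Standard mip selection: lambda = log2(footprint) + bias.

    `texel_footprint` is how many texels the pixel covers along its
    widest axis (>= 1 means minification).  A positive lod_bias picks
    a smaller, blurrier mip (less shimmer); a negative bias picks a
    larger, sharper mip (more shimmer).  Bias values here are
    illustrative, not actual driver defaults.
    """
    lam = math.log2(max(texel_footprint, 1e-6)) + lod_bias
    return min(max(lam, 0.0), num_mips - 1)

# A pixel covering ~3 texels: the bias shifts which mips get blended.
print(mip_level(3.0, lod_bias=0.0))   # ~1.58 -> blend of mips 1 and 2
print(mip_level(3.0, lod_bias=+0.5))  # ~2.08 -> blurrier
print(mip_level(3.0, lod_bias=-0.5))  # ~1.08 -> sharper, shimmers more
```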
I wouldn't say objective, Macadamia; he went into this suspecting ATI didn't actually use AA or AF in some titles, or that they used bad quality AA/AF.
In two situations he found out he was wrong; in one, regarding the AF, he thinks he's right and ATI's filtering is inferior.
And he praises Nvidia's MSAA without comparing it to normal AA and without mentioning the huge performance hit it causes, which makes it mostly useless.
He brings up an interesting point though: Nvidia's AF and AA both make the frame look blurry, which he says is good because it causes less flickering during gameplay. That might actually be true, but I wouldn't call this better vs. worse AA/AF, it's just different preferences. I don't think ATI CAN'T blur textures when applying AA and AF, and I don't think it's that easy... Nvidia has ALWAYS had blurrier textures than ATI, and they had much worse cases of flickering in gameplay than ATI while they were at it... so making textures slightly blurry does not automatically mean a better in-game feel and no flickering textures...
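The blur-versus-flicker tradeoff itself is easy to demonstrate with a toy example: point-sampling a fine stripe pattern while the "camera" shifts by a sub-pixel amount changes wildly from frame to frame, while a pre-blurred version of the same pattern stays stable but loses contrast. The pattern and numbers below are made up purely for illustration:

```python
import math

def sample_stripes(offset, blur=0.0, n=16):
    """Point-sample a fine stripe pattern sin(20*pi*x) at n pixels.

    `blur` blends each sample toward the pattern's mean (~0), mimicking
    a softer texture filter: 0 = razor sharp, 1 = fully smoothed.
    """
    out = []
    for i in range(n):
        x = (i + offset) / n
        sharp = math.sin(20 * math.pi * x)
        out.append((1 - blur) * sharp)
    return out

def frame_to_frame_change(blur):
    """How much the sampled row changes when the 'camera' shifts slightly."""
    a = sample_stripes(0.00, blur)
    b = sample_stripes(0.13, blur)     # sub-pixel camera movement
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

print(frame_to_frame_change(blur=0.0))  # large -> visible shimmer
print(frame_to_frame_change(blur=0.7))  # much smaller -> stable but softer
```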
Last edited by saaya; 02-11-2009 at 04:59 AM.
People usually see what they expect to see or what somebody else told them they should see. And switching always has an impact - you either like the old or the new. Same thing happened to me with my recent monitor upgrade, I got so used to my old monitor that the new one looked wrong somehow. But objectively I have no idea which has the more accurate picture. Most of the perceived IQ differences are due to contrast and brightness settings and those are completely subjective and depend on your room setup and lighting conditions and personal preference.
Look at all the clowns who think they see IQ improvements with every incremental beta release from Nvidia!
Ok, I'm sure some will complain that I'm an AMD fanboy, but I'm not sure these guys understand 3D properly.
Blurrier transparency AA on objects does not mean that the graphics card is applying MORE samples. It makes no sense at all.
The more samples you use, the sharper the image; the fewer you use, the worse the result. That's why tree leaves look more detailed on the Radeon. And so far I haven't noticed any aliasing on vegetation in any game (Half-Life 2, Far Cry and Crysis).
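That point is worth making concrete: with transparency AA, each extra sample is another estimate of how much of the pixel the leaf actually covers, so more samples converge toward the true edge value rather than smearing it. A minimal sketch, using a simple grid of sample positions as an illustrative assumption:

```python
def coverage_estimate(edge_pos, samples_per_axis):
    """Estimate how much of a 1x1 pixel lies left of a vertical
    alpha-test edge at x = edge_pos, using an NxN grid of sample points.
    More samples -> the estimate converges to the true coverage,
    i.e. a more accurate (not blurrier) edge value.
    """
    n = samples_per_axis
    hits = 0
    for i in range(n):
        for j in range(n):
            x = (i + 0.5) / n            # sample position inside the pixel
            if x < edge_pos:
                hits += 1
    return hits / (n * n)

true_coverage = 0.37
for n in (1, 2, 4, 8):                   # 1x, 4x, 16x, 64x samples
    print(n * n, coverage_estimate(true_coverage, n))
# 1 sample snaps to 0 or 1 (hard, aliased edge); 64 samples land near 0.37
```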
As for the normal maps and AF test: they say aliasing doesn't affect that part, but that's not exactly true.
You can get aliasing on normal and parallax maps just because they are represented as 3D surfaces (with depth) even though they are flat. I've seen numerous cases of bad aliasing on such objects. The only real way to solve it is supersampling, which is very, very power hungry.
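Supersampling in this sense is just rendering at a multiple of the target resolution and box-filtering down, which is why it also catches shading aliasing (normal/parallax maps) that edge-only MSAA misses, at a quadratic cost in pixels. A rough sketch of that downsample step:

```python
def downsample(hi_res, factor):
    """Box-filter a supersampled grayscale image down by `factor`.

    Rendering the scene at factor x factor the resolution and averaging
    each block is brute-force supersampling: it also tames shading
    (normal/parallax map) aliasing, at factor^2 the pixel cost.
    """
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# 4x4 'render' downsampled 2x -> 2x2 image, each pixel the average of 4 samples
hi = [[0, 1, 1, 1],
      [0, 0, 1, 1],
      [0, 0, 0, 1],
      [0, 0, 0, 0]]
print(downsample(hi, 2))   # [[0.25, 1.0], [0.0, 0.25]]
```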
The AF flower test, well, I'm not sure the test was done properly. Partially by AMD's fault, partially by the tester's.
NVIDIA is using HQ mode while AMD is using regular mode. The problem is that HQ mode is disabled by default for HD 4850 cards and the control for it is not exposed in CCC. Not sure why, but it's not there. Ray Adams said HQ mode is still valid in ATT. Can anyone ask AMD why HQ AF mode is not available on their high-end cards while it is available on older series (I know the X1950 Pro had it)?
Can someone give me a working download link for this AF tester tool? I used to have it but I can't find it anymore. I'll check if enabling HQ through tweaking boosts AF quality...
Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
Super silent cooling powered by (((Noiseblocker)))
Yeah, I remember when I switched from CRT to TFT, it took me a while to get adjusted...
And when I switched from a GeForce4 to a 9500 I didn't notice any difference in image quality, but when I tried a 6600 later on I hated the image quality and sold it quickly. I haven't compared IQ on cards recently; I'm using both Nvidia and ATI at the moment and don't notice a notable difference, just that Nvidia is blurrier with AA.
When you go into testing with preconceived ideas, it's quite easy to find exactly what you want to find, and even easier to justify your own biases. That seems to be exactly what happened with the "testing" in the linked article: the "reviewer" wanted to find that the nVidia card was faster (and why wouldn't it be, comparing two cards where one costs around $60-$100 more, depending on brand). If the testing were fair, a 4870 1GB card should have been used against his GTX 260 Core 216, referring back to the first section of the testing. I'd also think the 4870 would have done as well or better in his "image quality" testing... but who am I to talk.
Never mind, I saw the difference in the leaves. The 4850 has some kind of border around the leaves (page 4 of the quality test).
Last edited by TurboDiv; 02-11-2009 at 08:01 AM.
i9 9900K/1080 Ti
I have a 4870; in 2D mode the colors are closer to 6500K daylight, which is why HD movies and the like look more vibrant on ATI. But when browsing etc., the yellowish color tint gets irritating too. Without any tweaking of gamma settings in the control panel, ATI is better.
My take on it (and correct me if I am wrong) is that high quality AF (AKA trilinear filtering) may be tied to the MipMap setting, which is on High Quality by default. I tested this by enabling the HQ feature in CCC and then comparing it to CCC's default settings, and noticed no discernible difference. When set to Quality, I did notice that the surfaces of some textures were not as sharp as before. Whether that is just LOD going from negative to positive or something else, I am still not sure. If there is another method, I am more than willing to read about it.
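One possible explanation for the Quality-versus-HQ difference (offered here only as a generic illustration, not as ATI's documented behaviour) is a narrowed trilinear blend band, the so-called "brilinear" optimisation: most of each mip level is sampled bilinearly and only a thin band is blended, which can read as a subtle sharpness change rather than a plain LOD shift. A sketch of the blend weight:

```python
def trilinear_weight(lam, band=1.0):
    """Blend fraction between mip floor(lam) and floor(lam)+1.

    band=1.0 is full trilinear (smooth blend across the whole level).
    band<1.0 narrows the blend zone -- a common 'brilinear' style
    optimisation -- so more of each level is sampled bilinearly only.
    This is a generic illustration, not ATI's or Nvidia's actual
    Quality mode.
    """
    frac = lam - int(lam)                 # position between two mip levels
    lo = 0.5 - band / 2
    t = (frac - lo) / band                # remap into the blend band
    return min(max(t, 0.0), 1.0)

for frac in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(frac,
          trilinear_weight(2 + frac, band=1.0),   # smooth blend
          trilinear_weight(2 + frac, band=0.4))   # mostly bilinear, thin blend
```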
Last edited by Eastcoasthandle; 02-11-2009 at 11:08 AM.
This is true, but the difference between Nvidia and ATI's default config isn't subjective in my opinion.
The default settings for gamma, contrast and brightness in Nvidia's Forceware drivers are too high, which is what leads inexperienced/incompetent people into believing that Nvidia has blurry AA, washed out colors, etc.
With ATI on the other hand, you don't really need to tweak the contrast, brightness and gamma settings. Everything looks good right off the bat.
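The "washed out" complaint is easy to see numerically: pushing gamma and brightness up lifts dark tones far more than midtones, so blacks turn grey. The settings below are made-up examples, not the actual Forceware or Catalyst defaults:

```python
def adjust(value, gamma=1.0, brightness=0.0, contrast=1.0):
    """Apply a simple display curve to a 0..1 channel value:
    gamma first, then contrast around mid-grey, then a brightness offset.
    Settings here are made-up examples, not any driver's defaults.
    """
    v = value ** (1.0 / gamma)
    v = (v - 0.5) * contrast + 0.5 + brightness
    return min(max(v, 0.0), 1.0)

shadow, midtone = 0.10, 0.50
# neutral settings vs. gamma/brightness pushed up a notch
print(adjust(shadow), adjust(midtone))
print(adjust(shadow, gamma=1.3, brightness=0.05),
      adjust(midtone, gamma=1.3, brightness=0.05))
# the dark value more than doubles while the midtone shifts far less in
# relative terms -> blacks look grey, the classic 'washed out' default
```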
Intel Core i7 6900K
Noctua NH-D15
Asus X99A II
32 GB G.Skill TridentZ @ 3400 CL15 CR1
NVidia Titan Xp
Creative Sound BlasterX AE-5
Sennheiser HD-598
Samsung 960 Pro 1TB
Western Digital Raptor 600GB
Asus 12x Blu-Ray Burner
Sony Optiarc 24x DVD Burner with NEC chipset
Antec HCP-1200w Power Supply
Viewsonic XG2703-GS
Thermaltake Level 10 GT Snow Edition
Logitech G502 gaming mouse w/Razer Exact Mat
Logitech G910 mechanical gaming keyboard
Windows 8 x64 Pro
The statements are logical. The 260+ is faster than the 4850, so naturally one would scratch their head when the 4850 starts performing better with heavy AA applied.
He never quite touched on the reason, though. It is because the HD 4000 cards have immense shader power with their 800 SPs, and by design it is difficult to write a driver that keeps them all fully loaded, so when you apply AA it can simply run on the unused shader power. And while 800 ATI SPs =/= 800 Nvidia SPs (Nvidia still has a slight edge here if you do the conversion), Nvidia does not do AA in the shaders like ATI; it does it in the framebuffer.
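"AA in the shaders" essentially means the stored subsamples of each pixel are read back and filtered in a programmable pass instead of being box-averaged by fixed-function hardware. A toy sketch of such a resolve pass (the box filter and sample layout are illustrative assumptions):

```python
def shader_resolve(msaa_buffer):
    """Toy 'resolve in the shaders' pass: each pixel of the MSAA buffer
    holds a list of subsample colors, and a shader pass averages them
    into the final pixel.  Doing this in a programmable pass (rather
    than a fixed-function box resolve) is what lets custom filters and
    high sample counts run on otherwise idle shader ALUs.
    """
    resolved = []
    for row in msaa_buffer:
        resolved.append([sum(samples) / len(samples) for samples in row])
    return resolved

# 1x2 image, 4 subsamples per pixel: an edge pixel and an interior pixel
msaa = [[[0.0, 0.0, 1.0, 1.0], [1.0, 1.0, 1.0, 1.0]]]
print(shader_resolve(msaa))   # [[0.5, 1.0]]
```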
And regarding the "blurry" AF on Nvidia cards... it is also a lot more uniform in its application (lines are straighter), and, in motion, it looks better than ATI's AF.
I have found this to be true, and it has been verified on Anandtech comparing the G80 to the R600 (still applicable: GT200 = a more powerful G80, RV770 = a more powerful R600; the microarchitectures are the same).
Here is G80 AF compared to R600 AF: [AF tester screenshots and source link in the original post]
Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
—Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.
The 4800 cards can do 8x AA more efficiently than Nvidia cards because the AA work has been moved to the shaders since the R600 generation. The 2900 and 3800 cards only had 320 shaders, which were not enough and gave abysmal AA performance, while the 800 shaders now are so much better that the 4850 even beats a GTX 260 Core 216 in many games when it comes to 8x AA.
I actually liked the AF image quality much better on the 4850 as shown here http://alienbabeltech.com/main/?p=3188&page=9
Nvidia's IQ is just too blurry for my tastes--it looks like turning the Level of Detail all the way down. UGH! I'd rather play Doom 3 with a crisp picture!
--two awesome rigs, wildly customized with
5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
--SONY GDM-FW900 24" widescreen CRT, overclocked to:
2560x1600 resolution at 68Hz! (from 2304x1440@80Hz)
Updated List of Video Card GPU Voodoopower Ratings!!!!!