
Thread: Geforce GTX 260+ vs Radeon 4850 (Image Quality)

  1. #26
    Xtreme Member
    Join Date
    Aug 2008
    Location
    Australia
    Posts
    373
    For those who are saying ATi's AA is better, do you guys set the AA in Catalyst or in the game?

  2. #27
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    Quote Originally Posted by SKYMTL View Post
    QFT. IMO, debating this is a complete waste of time.
    lol, wait for it: "smoothness" between the two, like AMD vs. Intel next hahaha
    i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
    MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
    Corsair HX 620, CM 690, Win 7 Ultimate 64bit.

  3. #28
    Xtreme Mentor
    Join Date
    May 2008
    Location
    cleveland ohio
    Posts
    2,879
    Quote Originally Posted by Glow9 View Post
    lol, wait for it: "smoothness" between the two, like AMD vs. Intel next hahaha

    Quote Originally Posted by FUGGER View Post
    I have identified the source of the problem and it is an easy fix, running it by Intel first for clarification.

    This was an exploited weak point on certain setups, not all.

    Only some systems show this, not all.

    He never said what it was, though; we're still waiting for the answer.
    HAVE NO FEAR!
    "AMD fallen angel"
    Quote Originally Posted by Gamekiller View Post
    You didn't get the memo? 1 hour 'Fugger time' is equal to 12 hours of regular time.

  4. #29
    Xtreme Enthusiast
    Join Date
    Dec 2005
    Location
    Peoples Republic of Berkeley (PRB), USA
    Posts
    928
    I like how this guy goes into the review assuming that ATI "isn't doing as much work" on the scene as nV because the interiors of objects are less blurry. Hi, AA's goal here is to make edges less obvious, not to blur the centers of objects.

    As for the implementation, given that ATi’s images show less interior blurring, one possible explanation is that their hardware is somehow detecting texture edges and applying the most anti-aliasing there, but avoiding working too much inside the edges.
    Epic, that's kind of the whole point of antialiasing most things in a game.
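    To make that point concrete, here is a toy Python sketch of a plain 4x multisample box resolve (my own illustration, not either vendor's actual hardware path): interior pixels carry four identical samples, so averaging returns the original color unchanged, while only edge pixels get blended.
    Code:
    # Toy 4x MSAA box resolve: edges get smoothed, interiors stay sharp,
    # because only edge pixels have samples from more than one triangle.
    def resolve_msaa(samples):
        """Average a pixel's coverage samples (RGB tuples)."""
        n = len(samples)
        return tuple(sum(s[i] for s in samples) / n for i in range(3))

    red, blue = (1.0, 0.0, 0.0), (0.0, 0.0, 1.0)
    interior_pixel = [red, red, red, red]   # all samples hit one triangle
    edge_pixel = [red, red, blue, blue]     # samples straddle an edge

    print(resolve_msaa(interior_pixel))     # (1.0, 0.0, 0.0): unchanged
    print(resolve_msaa(edge_pixel))         # (0.5, 0.0, 0.5): edge smoothed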

  5. #30
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    If you want "blurring", IMHO ATI does that even better!

    Wide tent and narrow tent are good in SOME games, like Burnout Paradise, Oblivion and Fallout 3, where they make the jump from realtime-graphics level to something CG-esque. Pretty impressive.
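    For reference, the tent modes are resolve filters that also weight in samples from neighboring pixels. A toy 1-D Python sketch (sample positions and colors are made up; the real filters are 2-D) shows how widening the tent radius trades sharpness for that smoother, CG-like look:
    Code:
    # Toy 1-D tent resolve (the idea behind ATI's narrow/wide tent CFAA).
    # Weight falls off linearly with distance from the pixel center; a
    # wider radius pulls in neighboring pixels' samples.
    def tent_resolve(samples, radius):
        weighted_sum = total_weight = 0.0
        for color, dist in samples:
            w = max(0.0, 1.0 - abs(dist) / radius)
            weighted_sum += w * color
            total_weight += w
        return weighted_sum / total_weight

    # (grayscale color, distance from pixel center); |dist| > 0.5 means
    # the sample belongs to a neighboring pixel
    samples = [(1.0, -0.9), (1.0, -0.3), (1.0, 0.3), (0.0, 0.9)]

    print(tent_resolve(samples, radius=0.5))   # 1.0   -> box-like, sharp
    print(tent_resolve(samples, radius=1.5))   # ~0.83 -> wide tent, smoother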
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  6. #31
    Xtreme Addict
    Join Date
    May 2005
    Location
    Sugar Land, TX
    Posts
    1,418
    Quote Originally Posted by Glow9 View Post
    lol, wait for it: "smoothness" between the two, like AMD vs. Intel next hahaha
    From what I hear, ATI pairs better with AMD and Nvidia with Intel. That makes sense, since AMD makes the ATI cards now. Both brands are really good and the prices are great: you can get top of the line for $3xx and a dual-GPU card for less than $500. I paid over $500 for my single 8800GTX when it was new.

  7. #32
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by Frontl1ne View Post
    For those who are saying ATi's AA is better, do you guys set the AA in Catalyst or in the game?
    I use both: I set narrow tent in CCC (no AA mode), then set AA in-game. I can't see aliasing in-game without SS unless there is cel shading; with 8x MSAA I can see 4x, but it's not big things, they just look off a little on NV cards. In video playback, though, I would notice instantly.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  8. #33
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    115
    It seems Nvidia's default texture LOD might be to blame for the seemingly blurrier AA, since it doesn't produce as much texture shimmering as ATI cards do, at the cost of sharper textures.
    You can especially notice the blurriness in Nvidia's screenshot in the last Crysis comparison.
    http://www.techenclave.com/998496-post53.html
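    The LOD theory is easy to illustrate. Here is a hedged Python sketch of how a driver LOD bias shifts mip selection (the bias values are illustrative, not either vendor's actual defaults):
    Code:
    import math

    # Mipmap LOD selection with a driver LOD bias. A positive bias picks
    # a smaller (blurrier) mip, which suppresses shimmering; a negative
    # bias sharpens but shimmers more.
    def mip_level(texels_per_pixel, lod_bias=0.0, max_level=10.0):
        lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
        return min(max(lod, 0.0), max_level)  # clamp to the mip chain

    footprint = 4.0                       # 4 texels fall under one pixel
    print(mip_level(footprint))           # 2.0 -> nominally correct mip
    print(mip_level(footprint, +0.5))     # 2.5 -> blurrier, less shimmer
    print(mip_level(footprint, -0.5))     # 1.5 -> sharper, more shimmer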

  9. #34
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    I wouldn't say objective, Macadamia; he went into this suspecting ATI didn't actually use AA or AF in some titles, or that it used bad-quality AA/AF.
    In two situations he found out he was wrong; in one, regarding the AF, he thinks he's right and that ATI's filtering is inferior.
    And he praises Nvidia's MSAA without comparing it to normal AA and without mentioning the huge performance hit it causes, which makes it mostly useless.

    He brings up an interesting point, though: Nvidia's AF and AA both make the frame look blurry, which he says is good because it causes less flickering during gameplay. That might actually be true, but I wouldn't call this better vs. worse AA/AF; it's just different preferences. I don't think ATI CAN'T blur textures when applying AA and AF, and I don't think it's that easy... Nvidia has ALWAYS had blurry textures compared to ATI, and it had much worse cases of in-game flickering than ATI while it was at it... so making textures slightly blurry does not automatically mean a better in-game feeling and no flickering textures...
    Last edited by saaya; 02-11-2009 at 04:59 AM.

  10. #35
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    People usually see what they expect to see or what somebody else told them they should see. And switching always has an impact - you either like the old or the new. Same thing happened to me with my recent monitor upgrade, I got so used to my old monitor that the new one looked wrong somehow. But objectively I have no idea which has the more accurate picture. Most of the perceived IQ differences are due to contrast and brightness settings and those are completely subjective and depend on your room setup and lighting conditions and personal preference.

    Look at all the clowns who think they see IQ improvements with every incremental beta release from Nvidia

  11. #36
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    OK, I'm sure some will complain that I'm an AMD fanboy, but I'm not sure these guys understand 3D properly.
    Blurrier transparency-AA'd objects do not mean that the graphics card is applying MORE samples. That makes no sense at all.
    The more samples you use, the sharper the image; the fewer you use, the worse the results. That's why tree leaves look more detailed on the Radeon. And so far I haven't noticed any aliasing on vegetation in any game (Half-Life 2, Far Cry and Crysis).

    As for the normal maps and AF test: they say aliasing doesn't affect that part, but that's not exactly true.
    You can get aliasing on normal and parallax maps just because they are rendered as 3D shapes (with depth) even though they are flat. I've seen numerous cases of bad aliasing on such objects. The only real way to solve them is supersampling, which is very, very power hungry.
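    As a toy illustration of the sampling point (my own Python sketch with made-up alpha values, not any vendor's actual algorithm): supersampling the alpha test yields finer coverage steps along a leaf edge, so more samples mean more edge detail, never more blur.
    Code:
    # Transparency (alpha-test) anti-aliasing via supersampling: more
    # alpha samples per pixel give finer coverage steps along an edge.
    def coverage(alphas, threshold=0.5):
        """Fraction of sample positions whose alpha passes the test."""
        return sum(a >= threshold for a in alphas) / len(alphas)

    one_sample    = [0.4]                                   # single test
    four_samples  = [0.9, 0.6, 0.4, 0.2]                    # 4x sampled
    eight_samples = [0.9, 0.8, 0.6, 0.55, 0.4, 0.3, 0.2, 0.1]

    print(coverage(one_sample))     # 0.0 -> pixel pops fully transparent
    print(coverage(four_samples))   # 0.5 -> partial edge coverage
    print(coverage(eight_samples))  # 0.5, in steps of 1/8 across the edge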

    As for the AF flower test, I'm not sure it was done properly; partially by AMD's fault, partially by the tester's.
    NVIDIA is using HQ mode while AMD is using regular mode. The problem is that HQ mode is disabled by default for HD4850 cards and the control for it is not exposed in CCC. Not sure why, but it's not there. Ray Adams said the HQ mode setting is still valid in ATT. Could anyone ask AMD why HQ AF mode is not available on their high-end cards while HQ was available on older series (I know the X1950 Pro had it)?

    Can someone give me a working download for this AF tester tool? I used to have it but I can't find it anymore. I'll check whether enabling HQ through tweaking boosts AF quality...
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))

  13. #38
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by trinibwoy View Post
    People usually see what they expect to see or what somebody else told them they should see. And switching always has an impact - you either like the old or the new. Same thing happened to me with my recent monitor upgrade, I got so used to my old monitor that the new one looked wrong somehow. But objectively I have no idea which has the more accurate picture. Most of the perceived IQ differences are due to contrast and brightness settings and those are completely subjective and depend on your room setup and lighting conditions and personal preference.
    Yeah, I remember when I switched from CRT to TFT; it took me a while to get adjusted...

    And when I switched from a GeForce4 to a 9500 I didn't notice any difference in image quality, but when I tried a 6600 later on I hated the image quality and sold it quickly. I haven't compared IQ on cards recently; I'm using both Nvidia and ATI at the moment and don't notice a notable difference, just that Nvidia is more blurry with AA.

  14. #39
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    640
    When you go into testing with preconceived ideas, it's quite easy to find exactly what you want to find, and even easier to justify your own biases. That seems to be exactly what happened with the "testing" in the linked article: the "reviewer" wanted to find that the nVidia card was faster. And why wouldn't it be? He's comparing two cards, one of which costs around $60-$100 more depending on brand; what would you expect? If the testing were to be fair, a 4870 1GB should have been compared against his GTX 260 Core 216 (referring back to the first section of the testing). I'd also think the 4870 would have done as well or better in his "image quality" testing... but who am I to talk.

  15. #40
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Oh, and I just discovered that BFG10K was an ex-nVidia focus group member.

    (I.e., back then he probably received free nVidia cards for evangelizing.)

    Whether this changes your opinion of the author and the review is up to you.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  16. #41
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
    Quote Originally Posted by Macadamia View Post
    Oh, and I just discovered that BFG10K was an ex-nVidia focus group member.

    (I.e., back then he probably received free nVidia cards for evangelizing.)

    Whether this changes your opinion of the author and the review is up to you.
    ...

  17. #42
    Xtreme Member
    Join Date
    Dec 2004
    Location
    .ca
    Posts
    476
    Never mind, I saw the difference in the leaves. The 4850 has some kind of border around the leaves (page 4 of the quality test).
    Last edited by TurboDiv; 02-11-2009 at 08:01 AM.
    i9 9900K/1080 Ti

  18. #43
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Macadamia View Post
    Oh, and I just discovered that BFG10K was an ex-nVidia focus group member.

    (I.e., back then he probably received free nVidia cards for evangelizing.)

    Whether this changes your opinion of the author and the review is up to you.
    These kinds of statements need some concrete proof.

  19. #44
    Xtreme Member
    Join Date
    Apr 2008
    Posts
    160
    I have a 4870. In 2D mode its colors are closer to 6500K daylight; that's why HD movies and the like look more vibrant on ATI. When browsing etc., though, the yellowish color tint can irritate too. Without any tweaking of gamma settings etc. in the control panel, ATI is better.

  20. #45
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Macadamia View Post
    Oh, and I just discovered that BFG10K was an ex-nVidia focus group member.

    (I.e., back then he probably received free nVidia cards for evangelizing.)

    Whether this changes your opinion of the author and the review is up to you.
    Are you able to share any more information about this? Perhaps an old post with that in his sig or something?

  21. #46
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    115
    Quote Originally Posted by RejZoR View Post
    OK, I'm sure some will complain that I'm an AMD fanboy, but I'm not sure these guys understand 3D properly.
    Blurrier transparency-AA'd objects do not mean that the graphics card is applying MORE samples. That makes no sense at all.
    The more samples you use, the sharper the image; the fewer you use, the worse the results. That's why tree leaves look more detailed on the Radeon. And so far I haven't noticed any aliasing on vegetation in any game (Half-Life 2, Far Cry and Crysis).

    As for the normal maps and AF test: they say aliasing doesn't affect that part, but that's not exactly true.
    You can get aliasing on normal and parallax maps just because they are rendered as 3D shapes (with depth) even though they are flat. I've seen numerous cases of bad aliasing on such objects. The only real way to solve them is supersampling, which is very, very power hungry.

    As for the AF flower test, I'm not sure it was done properly; partially by AMD's fault, partially by the tester's.
    NVIDIA is using HQ mode while AMD is using regular mode. The problem is that HQ mode is disabled by default for HD4850 cards and the control for it is not exposed in CCC. Not sure why, but it's not there. Ray Adams said the HQ mode setting is still valid in ATT. Could anyone ask AMD why HQ AF mode is not available on their high-end cards while HQ was available on older series (I know the X1950 Pro had it)?

    Can someone give me a working download for this AF tester tool? I used to have it but I can't find it anymore. I'll check whether enabling HQ through tweaking boosts AF quality...
    HQ mode isn't available for the 48xx series. AFAIK, the HQ mode was done away with when the 2900 XT launched, since the default on those cards was equal to the HQ setting on earlier cards. Can you post where Ray Adams specifies using the HQ option with the latest ATI cards?

  22. #47
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    My take on it (and correct me if I am wrong) is that high-quality AF (a.k.a. trilinear filtering) may be tied to the MipMap setting, which is set to High Quality by default. I tested this by enabling the HQ feature in CCC, then compared it to CCC's default settings, and noticed no discernible difference. When set to Quality, I did notice that the surfaces of some textures were not as sharp as before. Whether that's just LOD going from negative to positive or something else, I am still not sure. If there is another method, I am more than willing to read about it.
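    A possible explanation for the Quality-mode sharpness change is the common "brilinear" optimization. Here is a hedged Python sketch of the general technique (an illustration only; this is not documenting ATI's actual Quality/HQ behavior): trilinear always blends two mip levels by the fractional LOD, while brilinear snaps to pure bilinear for most of the range and only blends near the mip transition.
    Code:
    def lerp(a, b, t):
        return a + (b - a) * t

    def trilinear(mip_sharp, mip_blurry, lod_frac):
        # always blends the two nearest mip levels
        return lerp(mip_sharp, mip_blurry, lod_frac)

    def brilinear(mip_sharp, mip_blurry, lod_frac, band=0.3):
        # blend only within a narrow band around the mip transition
        lo, hi = 0.5 - band / 2, 0.5 + band / 2
        t = min(max((lod_frac - lo) / (hi - lo), 0.0), 1.0)
        return lerp(mip_sharp, mip_blurry, t)

    # toy grayscale values: 1.0 = sharp mip's texel, 0.0 = blurrier mip's
    print(trilinear(1.0, 0.0, 0.3))   # 0.7 -> always blending levels
    print(brilinear(1.0, 0.0, 0.3))   # 1.0 -> pure bilinear, looks sharper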
    Last edited by Eastcoasthandle; 02-11-2009 at 11:08 AM.

  23. #48
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Quote Originally Posted by trinibwoy View Post
    Most of the perceived IQ differences are due to contrast and brightness settings and those are completely subjective and depend on your room setup and lighting conditions and personal preference.
    This is true, but the difference between Nvidia and ATI's default config isn't subjective in my opinion.

    The default settings for gamma, contrast and brightness with Nvidia Forceware drivers are too high, which is what leads inexperienced/incompetent people into believing that Nvidia has blurry AA, washed-out colors, etc.

    With ATI on the other hand, you don't really need to tweak the contrast, brightness and gamma settings. Everything looks good right off the bat.
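    As a quick illustration of the gamma point (a toy Python sketch with made-up values; no actual driver defaults are being quoted here): display gamma maps a stored value v to v**(1/gamma), so a higher gamma setting lifts midtones toward white, which reads as washed out.
    Code:
    def apply_gamma(v, gamma):
        # higher gamma lifts midtones toward white (lower perceived contrast)
        return v ** (1.0 / gamma)

    midtone = 0.5
    print(apply_gamma(midtone, 1.0))   # 0.50  -> unchanged
    print(apply_gamma(midtone, 1.4))   # ~0.61 -> lifted, washed-out feel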
    Intel Core i7 6900K
    Noctua NH-D15
    Asus X99A II
    32 GB G.Skill TridentZ @ 3400 CL15 CR1
    NVidia Titan Xp
    Creative Sound BlasterX AE-5
    Sennheiser HD-598
    Samsung 960 Pro 1TB
    Western Digital Raptor 600GB
    Asus 12x Blu-Ray Burner
    Sony Optiarc 24x DVD Burner with NEC chipset
    Antec HCP-1200w Power Supply
    Viewsonic XG2703-GS
    Thermaltake Level 10 GT Snow Edition
    Logitech G502 gaming mouse w/Razer Exact Mat
    Logitech G910 mechanical gaming keyboard
    Windows 8 x64 Pro

  24. #49
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by Particle View Post
    It's interesting that the reviewer keeps talking about how he thought ATi must have just been cheating in all the instances where the 4850 was faster prior to doing his review. Then, any time the nVidia results are blurrier, instead of noting inferior IQ he chalks it up to the nVidia card doing more work than the ATi one. *rolls eyes*
    The statements are logical. The 260+ is faster than the 4850, so naturally one would scratch their head when the 4850 starts performing better with heavy AA applied.

    He never quite touched down on the reason, though. It is because the HD4000 cards have immense shader power with the 800 SPs, and by design, it is difficult to code a driver that maintains a full workload on them. So when you have AA, you can just let it run on the unused shader power. And while 800 ATI SPs =/= 800 nvidia SPs (nvidia still has a slight edge here if you do the conversion), nvidia does not do AA in the shaders like ATI, it does it in the framebuffer.
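    For anyone curious what "AA in the shaders" means in practice, here is a toy Python sketch (my own illustration of the idea described above, not AMD's actual driver code): the resolve is just a per-pixel average over stored samples, so it can be expressed as ordinary ALU work and scheduled onto otherwise idle shader capacity, rather than running in fixed-function ROP hardware.
    Code:
    # MSAA resolve written as per-pixel ALU work. Functionally the same
    # average a ROP resolve computes; the claimed difference is where
    # the work executes.
    def shader_resolve(msaa_buffer):
        """One loop per pixel over its stored color samples."""
        return [sum(samples) / len(samples) for samples in msaa_buffer]

    # 4 grayscale samples per pixel (toy data)
    msaa_buffer = [
        [1.0, 1.0, 1.0, 1.0],   # interior pixel
        [1.0, 1.0, 0.0, 0.0],   # edge pixel
    ]
    print(shader_resolve(msaa_buffer))   # [1.0, 0.5]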

    And regarding the "blurry" AF on nvidia cards... it is also a lot more uniform in its application (lines are straighter) and, in motion, it looks better than ATI AF.

    I have found this to be true, and it has been verified by Anandtech comparing the G80 to the R600 (still applicable: the GT200 is a more powerful G80 and the RV770 a more powerful R600; the microarchitectures are the same).

    Here is G80 AF: [image]

    Here is R600 AF: [image]

    source
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  25. #50
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    The 4800 cards can do 8x AA more efficiently than Nvidia cards because the AA work has been moved to the shaders since the R600 generation. The 2900 and 3800 cards only had 320 shaders, which were not enough and provided abysmal AA performance, while 800 shaders are much better now, with the 4850 even beating a GTX 260 Core 216 in many games when it comes to 8x AA.

    I actually liked the AF image quality much better on the 4850, as shown here: http://alienbabeltech.com/main/?p=3188&page=9
    Nvidia's IQ is just too blurry for my tastes; it looks like turning the Level of Detail all the way down. UGH! I'd rather play Doom 3 with a crisp picture!

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

