Page 6 of 8
Results 126 to 150 of 188

Thread: NVIDIA Says AMD Reduced Image Quality Settings HD 6800 Series For Better Performance

  1. #126
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Portugal
    Posts
    107
    Quote Originally Posted by cegras View Post
    One complaint about ATI I've had for a while is that their games are substantially darker for the same ingame Source gamma setting. Made some dark, gritty HL2 mods hard to play.
    You could, you know, change the gamma settings?
    Don't take life too seriously.....no-one's getting out alive.

  2. #127
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Sheesh, one person posts his own views and experiences when swapping a card and gets absolutely shot to pieces. What the hell happened to this forum, seriously? It's getting worse and worse.

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  3. #128
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Hmmm, seems to me it's always been like that, and I've been here a while.

  4. #129
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by Scorpio[pt] View Post
    you could you know change the gamma settings?
    I'm talking about each card respectively on their brightest possible ingame settings. Setting them in the driver is a no-no; I don't want the hassle of creating game profiles, and that's something I shouldn't have to do.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  5. #130
    Registered User
    Join Date
    Nov 2010
    Posts
    91
    Quote Originally Posted by Toysoldier View Post
    That's absolute nonsense. There is a difference in the default settings on an NV card vs an ATI card. You should try using a Radeon card for some weeks and then switch to an NV card and you would see for yourself. I'm not saying the ATI card can't be made to look like the nVidia card, but it doesn't by default, far from it. And when ATI starts to detect certain AF test programs and then uses a better AF method than the driver normally does, that's when things start to get out of control.
    I'm with ya, Toysoldier. After over 2 years of using Nvidia GPUs (8400GS > 7900GTX > 9800GT > GTS 250), it was easy for me to see the differences between the two chipsets/drivers. One theory I have is that it has something to do with the way AA is applied. Nvidia gives you more options, such as enhance, force and disable. I always had it set to enhance and 16xQ.
    Most noticeable was Need for Speed: Shift. I had to play around with CCC a fair bit to get it to run the game more smoothly. I think Catalyst AI was mainly to blame there.
    NEVER be afraid to speak your mind. Those who feel they need to make an example of you for doing so are all too common and should be ignored.
    pEACe

  6. #131
    Registered User
    Join Date
    Feb 2010
    Location
    NVIDIA HQ
    Posts
    76
    Texture shimmer doesn't appear in a screenshot. Motion is required. Your entire comparison is invalid.

    Quote Originally Posted by Katzenschleuder View Post
    All right. Then let us determine this in a controlled test.

    Which screenshot has been taken by a RV870 and which by a GF110?
    http://img94.imageshack.us/img94/6471/dirt2a.jpg
    http://img220.imageshack.us/img220/48/dirt2b.jpg
    http://img819.imageshack.us/img819/8263/metaf.jpg
    http://img254.imageshack.us/img254/4315/metb.jpg
    http://img38.imageshack.us/img38/2775/mwabz.jpg
    http://img202.imageshack.us/img202/7807/mwby.jpg
    http://img841.imageshack.us/img841/7946/vana.jpg
    http://img534.imageshack.us/img534/1826/vanbi.jpg

    Oh and please don't forget to show us where exactly "The image is just so much sharper and the colors so much fuller." based on these comparisons!

    PS.: Note that the image quality settings used here are equal to Radeon 5870 default and GeForce 580 default as you stated in your comparison.
    NVIDIA Forums Administrator

  7. #132
    Xtreme Member
    Join Date
    Feb 2006
    Location
    La La Land.
    Posts
    250
    Quote Originally Posted by cegras View Post
    I'm talking about each card respectively on their brightest possible ingame settings. Setting them in the driver is a no-no; I don't want the hassle of creating game profiles, and that's something I shouldn't have to do.
    You don't have to create a game profile.
    You can change the global colour settings in CCC once and for all.

    Honestly, the two cards produce very different-looking images on screen.
    It's personal preference; I like the more detailed, sharper, better-contrast image ATI gives by default. I miss it sometimes on my current 470, but it's not a huge deal to be honest. With a little adjustment, both can be made to look similar.
    Some will hate this as always. In fact, bad monitors are to blame as well in some cases.

    Primary Rig
    Intel Xeon W3520 @4200Mhz 24x7, 1.200v load (3845A935)
    Gigabyte X58A-UD7
    Patriot Viper II DDR3 2000 CL8
    Tagan BZ1300
    DeepCool Gamer Storm with 2x120mm DeepCool fans.
    MSI GTX 470 Twin Frozr II
    Zotac GTX 470 AMP edition.
    GPU collection : http://www.xtremesystems.org/forums/...5&postcount=64




    Rig2
    Phenom II x4 965
    MSI 790GX-GD65
    2GBx2 Corsair DDR3 1333
    Tagan tg500-u37
    Arctic Cooling Freezer 7 Pro Rev.2
    XFX 9600GT


  8. #133
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by Funky View Post
    You don't have to create a game profile.
    You can change the global colour settings in CCC once and for all.

    Honestly, the two cards produce very different-looking images on screen.
    It's personal preference; I like the more detailed, sharper, better-contrast image ATI gives by default. I miss it sometimes on my current 470, but it's not a huge deal to be honest. With a little adjustment, both can be made to look similar.
    Some will hate this as always. In fact, bad monitors are to blame as well in some cases.
    It's not a colour problem, it's a gamma problem. Brightness is simply too low relative to nvidia at the same setting. I mean, it's not terrible, but it's a minor annoyance.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  9. #134
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Posts
    800
    Quote Originally Posted by cegras View Post
    It's not a colour problem, it's a gamma problem. Brightness is simply too low relative to nvidia at the same setting. I mean, it's not terrible, but it's a minor annoyance.
    The colour settings also include gamma settings.
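Since the gamma setting keeps coming up in this thread: a minimal sketch of the standard gamma curve that driver-level sliders (such as the one in CCC) apply to each colour channel. The formula and values here are the textbook relation, not AMD's or NVIDIA's actual driver code.

```python
# Standard gamma curve: out = in ** (1.0 / gamma), with channel values
# normalized to [0, 1]. A gamma setting above 1.0 lifts midtones, which
# is why the same in-game brightness can look darker at a lower gamma.

def apply_gamma(value: float, gamma: float) -> float:
    """Map a normalized channel value through a gamma curve."""
    return value ** (1.0 / gamma)

# A dark pixel at two gamma settings: the higher setting brightens it.
dark = 0.2
print(round(apply_gamma(dark, 1.0), 3))  # 0.2 (unchanged)
print(round(apply_gamma(dark, 2.2), 3))  # 0.481
```

Because the curve is nonlinear, shadows move much more than highlights, which matches the complaint that dark, gritty scenes are where the difference shows.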

  10. #135
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Jakarta, Indonesia
    Posts
    924
    Personally, when I tried my friend's GTX 460, the IQ sucked compared to my HD 4870. So nVidia might have over-optimised its driver, according to my personal experience, right? That HAS to be right!



    So much hysteria, so much soap opera.

  11. #136
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    IMO, the bottom line is that lowering quality, even just default quality, isn't cool - no matter what company does it.

  12. #137
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by Katzenschleuder View Post
    All right. Then let us determine this in a controlled test.

    Which screenshot has been taken by a RV870 and which by a GF110?
    http://img94.imageshack.us/img94/6471/dirt2a.jpg
    http://img220.imageshack.us/img220/48/dirt2b.jpg
    http://img819.imageshack.us/img819/8263/metaf.jpg
    http://img254.imageshack.us/img254/4315/metb.jpg
    http://img38.imageshack.us/img38/2775/mwabz.jpg
    http://img202.imageshack.us/img202/7807/mwby.jpg
    http://img841.imageshack.us/img841/7946/vana.jpg
    http://img534.imageshack.us/img534/1826/vanbi.jpg

    Oh and please don't forget to show us where exactly "The image is just so much sharper and the colors so much fuller." based on these comparisons!

    PS.: Note that the image quality settings used here are equal to Radeon 5870 default and GeForce 580 default as you stated in your comparison.
    With Dirt 2 it was number one (you can tell by the better AA on the box, but it needs a 400% zoom). I don't know on the others, but I cannot see anything that is not AA/AF related.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  13. #138
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Aarhus, Denmark
    Posts
    314
    Quote Originally Posted by duploxxx View Post
    The comment I provided was you pointing out that there is no added value going from a 5870 to a 6870 ---> of course not, that is the whole point; the 6870 is a new price range in the market, it is not intended to replace the 58xx series. You still don't get it... and if you already own a 5870 I can't think of any reason to spend another $500+ to buy a new card without waiting 2-3 more weeks. That is what I call a consumer.
    So in your opinion there is no gain in going from a Radeon HD5870 to a GeForce GTX 580 ?
    AMD Ryzen 9 5900X
    ASRock Radeon RX 7900 XTX Phantom Gaming OC
    Asus ROG Strix B550-F Gaming Motherboard
    Corsair RM1000x SHIFT PSU
    32 GB DDR4 @3800 MHz CL16 (4 x 8 GB)

    1x WD Black SN850 1 TB
    1 x Samsung 960 250 GB
    2 x Samsung 860 1 TB
    1x Seagate 16 TB HDD

    Dell G3223Q 4K UHD Monitor
    Running Windows 11 Pro x64 Version 23H2 build 22631.2506

    Smartphone : Samsung Galaxy S22 Ultra

  14. #139
    Xtreme Enthusiast
    Join Date
    Oct 2008
    Posts
    678
    Quote Originally Posted by Toysoldier View Post
    So in your opinion there is no gain in going from a Radeon HD5870 to a GeForce GTX 580 ?
    More like you should wait three weeks so you can be sure whether the GTX 580 is the right choice.

  15. #140
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    Quote Originally Posted by Toysoldier View Post
    So in your opinion there is no gain in going from a Radeon HD5870 to a GeForce GTX 580 ?
    IMO going from a 5870 to a GTX 580 is a worthwhile upgrade, but going from a GTX 480 to a GTX 580 is a waste of money. Anything a 580 can do, a 480 can also do in terms of playing games.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  16. #141
    Registered User
    Join Date
    Dec 2009
    Posts
    63
    AMD puts image quality debate to bed

    There's been a lot of talk about GPUs and image quality lately, and as the party on the receiving end of some of the accusations, AMD felt the need to set the record straight. That's why we were invited to talk to Senior Manager of Software Engineering Andy Pomianowski and Technical Marketing Manager Dave Nalasco about image quality and the ruckus that NVIDIA kicked off last week.

    The settings, they are a-changin'

    Dave explained to us that there had been some changes to the Catalyst drivers to coincide with the release of the HD 6000-series GPUs, and that image quality had been a big part of that. At the heart of all this is Catalyst AI, which controls a whole host of different settings via a single slider.

    Responding to feedback, this single slider was divided into a number of different settings in the latest release, giving users a bit more control. One of the new additions was a slider to control texture filtering with settings for 'High Quality', 'Quality' and 'Performance'.

    High Quality turns off all optimisations and lets the software run exactly as it was originally intended to. Quality - which is now the default setting - applies some optimisations that the team at AMD believes - after some serious testing, benchmarking and image comparisons - will maintain the integrity of the image while increasing the application performance. Lastly, the Performance setting applies even more of these optimisations to squeeze out a few more frames, but risks degrading the image quality just a bit.
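The three-level trade-off described above can be sketched loosely in code. This is an illustration of the general idea only, not AMD's actual driver logic; the cap values and names are invented for the example.

```python
# Loose illustration of how a texture filtering quality setting can
# trade anisotropic-filtering work for speed: lower settings cap the
# amount of filtering work the driver allows per pixel, regardless of
# what the application requested. The caps below are hypothetical.

MAX_ANISO = {
    "high_quality": 16,  # run exactly as the application intended
    "quality": 8,        # the optimised default
    "performance": 4,    # fastest, with the most image-quality risk
}

def effective_aniso(requested: int, setting: str) -> int:
    """Clamp the application's requested anisotropy to the setting's cap."""
    return min(requested, MAX_ANISO[setting])

print(effective_aniso(16, "high_quality"))  # 16
print(effective_aniso(16, "quality"))       # 8
print(effective_aniso(16, "performance"))   # 4
```

The point of the sketch is that "High Quality" is the only setting where the driver is guaranteed to do exactly what the application asked for, which is why it matters for apples-to-apples benchmarking.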

    What do you see?

    Dave acknowledged that some sources had observed visual anomalies when running a few games and benchmarks. He explained that the algorithms that the drivers run - notably anisotropic filtering - are very complex and that despite their best efforts, the image wasn't going to be perfect 100 per cent of the time, even on default settings.

    What he stressed was that, in the opinions of the whole driver development team, the default settings and optimisations still offered the best performance with no noticeable drop in quality for the vast majority of users the vast majority of the time. And for those who were experiencing any problems, High Quality mode would always be there to allow a picture perfect image. This, he made clear, wasn't going to change any time soon.

    And then something strange happened - Andy asked us what we thought. These guys seemed genuinely concerned about what we felt were the best settings to use, whether we'd experienced any problems, and what we would change if we were designing the Catalyst tools. They're clearly committed to delivering the best product that they can, and that means listening to feedback and taking on board what the press, as well as average gamers, think.

    Hopefully, this whole image quality debate can now be put to bed. At least for the time being.

    Source:
    http://www.hexus.net/content/item.php?item=27786
    Asus Crosshair IV | AMD 1090t @ 4.0ghz | 2x2gb G-Skill Trident 1800mhz cl7 |XFX AMD 6970 2gb | 128gb Crucial m4 | Corsiar AX850
    Silverstone SST-FT02B-WRI Fortress | Yamaha A-S500 | Monitor Audio BX2 | ASUS Xonar Essence STX

  17. #142
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    So basically they admit that they have turned it down a tad too much, but in the team's opinion it's still great for the majority of people? Hmm, not sure I fully agree with that...

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  18. #143
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    Quote Originally Posted by Tim View Post
    So basically they admit that they have turned it down a tad too much, but in the team's opinion it's still great for the majority of people? Hmm,not sure I fully agree with that....
    I only see a difference in very few games (Trackmania is an example), and I'm happy that I have this new option for all the other games I play.
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  19. #144
    Xtreme Addict
    Join Date
    Jul 2008
    Location
    US
    Posts
    1,379
    Quote Originally Posted by Funky View Post
    You don't have to create a game profile.
    You can change the global colour settings in CCC once and for all.

    Honestly, the two cards produce very different-looking images on screen.
    It's personal preference; I like the more detailed, sharper, better-contrast image ATI gives by default. I miss it sometimes on my current 470, but it's not a huge deal to be honest. With a little adjustment, both can be made to look similar.
    Some will hate this as always. In fact, bad monitors are to blame as well in some cases.
    The color settings in CCC have been a big problem on CF systems since...maybe...10.6 or so. Move any of those sliders on a multi-GPU system (58xx at least) and you get a nice pink screen. Or that's my experience, at least (as well as the experience of others I've spoken with).

    --Matt
    My Rig :
    Core i5 4570S - ASUS Z87I-DELUXE - 16GB Mushkin Blackline DDR3-2400 - 256GB Plextor M5 Pro Xtreme

  20. #145
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Alberta, Canada
    Posts
    1,264
    I can honestly say that I'm noticing less texture shimmering on a GTX 580 with texture quality set to high quality compared to my previous HD 5870 with Catalyst AI set to high quality (10.10e hotfix). I'd still like to see how the 5800s compare to the 6800s when they both have access to these newer Catalyst AI options.
    Feedanator 7.0
    CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
    LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i

  21. #146
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    I like AMD's response. Thanks to all the complaints, we now have more tools to play with in CCC.
    2500k @ 4900mhz - Asus Maxiums IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  22. #147
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    But the question remains: should reviewers use the HQ setting in their articles?

    I won't venture my opinion just yet since I want to hear what you guys have to say.

  23. #148
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by SKYMTL View Post
    But the question remains: should reviewers use the HQ setting in their articles?

    I won't venture my opinion just yet since I want to hear what you guys have to say.
    I think an article showing the performance and quality difference would be a nice start, and quite necessary to form an opinion in this case.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  24. #149
    Xtreme Addict
    Join Date
    Oct 2007
    Location
    Chicago,Illinois
    Posts
    1,183
    High quality only; it shows who has the best IQ and performance at that IQ level. The other settings are for IQ testing only.



  25. #150
    Xtreme Addict
    Join Date
    Jan 2009
    Posts
    1,445
    OMG, why does this matter so much? This thread has gone on for six pages... about quality settings?
    [MOBO] Asus CrossHair Formula 5 AM3+
    [GPU] ATI 6970 x2 Crossfire 2Gb
    [RAM] G.SKILL Ripjaws X Series 16GB (4 x 4GB) 240-Pin DDR3 1600
    [CPU] AMD FX-8120 @ 4.8 ghz
    [COOLER] XSPC Rasa 750 RS360 WaterCooling
    [OS] Windows 8 x64 Enterprise
    [HDD] OCZ Vertex 3 120GB SSD
    [AUDIO] Logitech S-220 17 Watts 2.1

