
Thread: NVIDIA Says AMD Reduced Image Quality Settings on HD 6800 Series for Better Performance

  1. #26
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Gener_AL (UK) View Post
    And the puppet masters start pulling the strings. The story is contradictory and configuration dependent; the point being that, as usual, ATI image quality at default is better than it was on the 5870, just as the 5870 improved on the 4870.

    Both sides have IQ issues depending on driver, OS, game, and API.

    Nvidia may be opening a can of worms for themselves here, if anything.
    http://translate.google.com/translat...d_6850/s09.php

  2. #27
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    112
    Quote Originally Posted by XSAlliN View Post
    Check his signature...
    LOL. Check all who defend AMD, including the one you quoted.

    AMD fanboys are the most double-standard fans in the world.
    Intel i7 2600K 5GHz watercooled. 2x Asus DirectCU II TOP GTX670 SLI @1250/7000/watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
    8GB Corsair 2000MHz RAM. 4x OCZ Vertex3 120GB SSD. 3x Samsung F1 1TB, all in a Lian Li Tyr PC-X2000 chassis. Logitech diNovo Edge keyboard,
    MX Revolution mouse and Z-5500 Digital 5.1 speakers. Corsair HX-1200W PSU. Samsung 244T 24" + 3x Philips 24" in nVidia Surround

  3. #28
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    Quote Originally Posted by E30M3 View Post
    LOL. Check all who defend AMD, including the one you quoted.

    AMD fanboys are the most double-standard fans in the world.
    An Nvidia fanboy defends "no IQ optimizations" while calling ATI fanboys double-standarded.

    A DOUBLE oxymoron...
    Last edited by Dimitriman; 11-20-2010 at 04:29 AM.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  4. #29
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    Quote Originally Posted by Dimitriman View Post
    Well, for my part I don't expect people working for Nvidia to be unbiased. Even unrelated people have biases, so if you work for them, I think that comes with the job, no?

    Anyway, if the new drivers lower default IQ, then reviewers should be aware, and when Cayman is out they should perform all tests at a high-quality baseline.

    Yes, it was sarcasm... or an attempt at it. Anyway, I agree with what you said.
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  5. #30
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    Quote Originally Posted by E30M3 View Post
    LOL. Check all who defend AMD, including the one you quoted.

    AMD fanboys are the most double-standard fans in the world.
    This is not about defending; it's about realising that this campaign is more about the success of Barts, and a desperate attempt by Nvidia to start a mud-slinging campaign because their products in this price range are inferior right now and they aren't happy with their Christmas sales...

    AMD doesn't take anything away; they give you more options than you had before, and TBH I'm happy to take the performance advantage in BFBC2 because I fail to see a difference in that game...
    Trackmania is another story; in that game I'm happy about the NEW HQ setting, which completely eliminates banding (BTW: banding is present on the 8800GTS in my other PC...)

    If there is one thing you can criticize them for, it's that they don't set HQ by default, but I'm sure most users would take the extra performance over the IQ, because most of them can't even see a difference between mid and high settings (which is sad but true)
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  6. #31
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    So the setting exists, and you just have to turn it on. Sounds like such a big deal that we needed a post from Nvidia with an exclamation point, after we already had a huge thread about it (and not the first one).
    2500k @ 4900mhz - Asus Maximus IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acetal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  7. #32
    Xtreme Addict
    Join Date
    Jan 2007
    Location
    Brisbane, Australia
    Posts
    1,264
    If I had a dollar for every "company x/y uses lower IQ to get higher performance" claim in the last ~10 yrs...

    ...I'd have founded Company Z

  8. #33
    Xtreme Member
    Join Date
    Oct 2009
    Posts
    241


    The point is that ATI's default quality is lower than Nvidia's default, and once you adjust ATI's quality settings to match Nvidia's, you incur an FPS penalty; according to that article, about 10%. And after all that PR marketing from ATI about superior quality compared to the prior 5xxx cards along with good performance, the truth suddenly becomes blurred.

    That's a simple fact. No one is arguing about whether you can set high quality, or about what Nvidia or ATI did in the past; it's about now, and about misleading benchmarks: the information people get when deciding what to buy.
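    To make the arithmetic concrete, here is a minimal Python sketch (all FPS numbers are hypothetical; the ~10% penalty is the figure the article claims) of how matching the IQ baseline can flip a review's standings:

    # Hypothetical example: the effect of a ~10% FPS penalty when raising
    # one card's default driver IQ to match the competitor's default.

    def matched_iq_fps(default_fps, penalty=0.10):
        """FPS after raising image quality to the matched baseline,
        assuming the ~10% performance cost quoted above."""
        return default_fps * (1.0 - penalty)

    # Made-up review numbers, each card benchmarked at its own defaults.
    card_a_default = 60.0  # runs at the lower default IQ
    card_b_default = 57.0  # runs at the higher default IQ

    print(f"As reviewed: A vs B {card_a_default / card_b_default - 1:+.1%}")
    print(f"IQ matched:  A vs B {matched_iq_fps(card_a_default) / card_b_default - 1:+.1%}")
    # As reviewed: A vs B +5.3%
    # IQ matched:  A vs B -5.3%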
    .:. Obsidian 750D .:. i7 5960X .:. EVGA Titan .:. G.SKILL Ripjaws DDR4 32GB .:. CORSAIR HX850i .:. Asus X99-DELUXE .:. Crucial M4 SSD 512GB .:.

  9. #34
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    Quote Originally Posted by -=DVS=- View Post
    The point is that ATI's default quality is lower than Nvidia's default, and once you adjust ATI's quality settings to match Nvidia's, you incur an FPS penalty; according to that article, about 10%. And after all that PR marketing from ATI about superior quality compared to the prior 5xxx cards along with good performance, the truth suddenly becomes blurred.

    That's a simple fact. No one is arguing about whether you can set high quality, or about what Nvidia or ATI did in the past; it's about now, and about misleading benchmarks: the information people get when deciding what to buy.
    That's only partially an issue. Most reviews test at least 4-5 different IQ settings, and what you see is what you get: at the highest IQ settings with AA, the results are as accurate as they can be.
    Now, if you want to complain about 1280x1024, default-IQ, no-AA gaming, then I can't relate, since I don't play at those settings anyway.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  10. #35
    XS_THE_MACHINE
    Join Date
    Jun 2005
    Location
    Denver
    Posts
    932
    Successful troll thread is successful.



    xtremespeakfreely.com

    Semper Fi

  11. #36
    Xtreme Enthusiast
    Join Date
    Jun 2006
    Location
    Space
    Posts
    769
    The problem is review sites. If they don't do a like-for-like review based on image quality, then we're always stuck in this situation.

    I think [H]ardOCP comes closest, with reviews that take settings into consideration, but a review site that lowers/raises settings to give equal image quality is needed (are there any?).

    As for the AMD and Nvidia fanbois in this thread: it's starting to get annoying, as none of you are giving any evidence to back up your arguments. AMD is under no obligation to set "Ultra High IQ" as the default, and neither is Nvidia.

  12. #37
    Xtreme Enthusiast
    Join Date
    Jun 2006
    Location
    Space
    Posts
    769
    Quote Originally Posted by rogueagent6 View Post
    Successful troll thread is successful.

    +1

    Something needs to be done about the constant fanboi arguments. It's getting tiresome when you can't even get past page 1.

  13. #38
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by E30M3 View Post
    But it does not matter when it's AMD who cheats???
    That is correct. I was convinced of that when they made a public statement to the effect of "we wanted to bring all products in line with the lower quality in order to be consistent". That took some balls, but as evidenced in this thread, they knew they could get away with it.

  14. #39
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    LOL @ AMD Droids defending AMD's shoddy IQ

    "B-b-but no one can notice it!!"
    Intel Core i7 6900K
    Noctua NH-D15
    Asus X99A II
    32 GB G.Skill TridentZ @ 3400 CL15 CR1
    NVidia Titan Xp
    Creative Sound BlasterX AE-5
    Sennheiser HD-598
    Samsung 960 Pro 1TB
    Western Digital Raptor 600GB
    Asus 12x Blu-Ray Burner
    Sony Optiarc 24x DVD Burner with NEC chipset
    Antec HCP-1200w Power Supply
    Viewsonic XG2703-GS
    Thermaltake Level 10 GT Snow Edition
    Logitech G502 gaming mouse w/Razer Exact Mat
    Logitech G910 mechanical gaming keyboard
    Windows 8 x64 Pro

  15. #40
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Did you guys read the blog post?

    For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality.
    Hahahaha, that's so funny. More like they were "taught" some hard lessons.

  16. #41
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by Carfax View Post
    LOL @ AMD Droids defending AMD's shoddy IQ

    "B-b-but no one can notice it!!"
    Look at the motives:

    AMD did it for better performance.
    Nvidia pointed it out for marketing gains.
    AMD users either don't care about the difference, fix the settings, or feel a little hurt.
    Nvidia users are crying so they can justify their overpriced purchase.
    2500k @ 4900mhz - Asus Maximus IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acetal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  17. #42
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Motiv View Post
    The problem is review sites. If they don't do a like for like review based on Image quality, then we're always stuck in this situation.
    That's why they include more than FPS numbers in their reviews. There is more to look for in a card than just performance or IQ.

    AMD is under no obligation to set "Ultra High IQ" as the default, and neither is Nvidia.
    Sure, there is no obligation, but is it right to lower image quality to gain performance? We would just end up in a race of degrading IQ.
    Quote Originally Posted by Carfax View Post
    LOL @ AMD Droids defending AMD's shoddy IQ

    "B-b-but no one can notice it!!"
    Insightful post.

  18. #43
    Xtreme Addict
    Join Date
    Jul 2005
    Posts
    1,646
    Well, if you wanted to make a list of Nvidia fanboys, this thread would be a good place to start.

  19. #44
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    You meant AMD fanboys, right?

  20. #45
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    Quote Originally Posted by Vardant View Post
    You meant AMD fanboys, right?
    Lol, he meant the fanboys who think AMD is always doing something bad even though they aren't.
    Last edited by Sn0wm@n; 11-20-2010 at 08:52 AM.
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  21. #46
    Registered User
    Join Date
    Jun 2010
    Posts
    61
    Quote Originally Posted by Vardant View Post
    You meant AMD fanboys, right?
    There's a difference?

  22. #47
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    People can finger-point all they want; the video comparisons on the linked sites tell the same story in any translation.

  23. #48
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    So... the card shows banding in an AF testing program, using an artificial texture whose use Nvidia has ironically decried because it's not realistic?

    Can we get some real games tested, please? And let's try not to focus only on Far Cry or Trackmania.
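    For context, a minimal sketch of how AF testers of this kind typically work (the details here are illustrative assumptions, not taken from the tool in question): each mipmap level is given a flat, distinct colour, so any abrupt transition between levels shows up on screen as a hard colour band that a real game texture would largely hide.

    # Illustrative only: write one solid-colour image per mip level, the
    # way typical AF test tools visualise which mip level gets sampled.
    # Abrupt colour edges in the rendered output reveal filtering "bands".

    def write_ppm(path, size, rgb):
        """Write a solid-colour binary PPM image for one mip level."""
        with open(path, "wb") as f:
            f.write(f"P6 {size} {size} 255\n".encode())
            f.write(bytes(rgb) * (size * size))

    # Arbitrary distinct colours, one per level, 256x256 down to 1x1.
    colours = [(255, 0, 0), (0, 255, 0), (0, 0, 255),
               (255, 255, 0), (255, 0, 255), (0, 255, 255),
               (255, 255, 255), (170, 170, 170), (85, 85, 85)]
    for level, colour in enumerate(colours):
        write_ppm(f"mip{level}.ppm", 256 >> level, colour)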
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  24. #49
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    As long as this is not a problem during normal gameplay, it's mostly irrelevant. Show me a comparison video of a recent game that shows this in a noticeable way, and we can call it a real problem.
    "No, you'll warrant no villain's exposition from me."

  25. #50
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    Quote Originally Posted by EvilOne View Post
    There's a difference?
    There's certainly a huge difference in numbers, for one, but it's the amount of bias they're capable of that really sets them apart. Case in point: the AF issue.

