Page 4 of 8 FirstFirst 1234567 ... LastLast
Results 76 to 100 of 188

Thread: NVIDIA Says AMD Reduced Image Quality Settings HD 6800 Series For Better Performance

  1. #76
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by PaganII View Post
    Agree. It is the reviewers doing the misleading. Like back when Intel Quad first came out and you get the reviews that show how powerful the quad is... ya, @ 640 x 480 ha ha ha. They don't show that a dual core beats it when you raise the resolution.
    Nobody... uses default settings. When you start a game you see what you get at MAX settings and then tone it down if you need higher framerate.
    So the whole argument is a fabrication.
    You change the driver settings for each game? Other than forcing triple buffering and Z-buffering for OpenGL, I don't change anything in the performance settings, and I assume most do the same. You're speaking of in-game settings; this thread is about the driver control panel.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  2. #77
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
    Quote Originally Posted by E30M3 View Post
    AMD = FAIL!
    so amd cheat again
    but it does not matter when its aMd who cheat???
    amd have failed in so many ways recently
    they are much worse than nVidia has ever been

    Why can not they just admit they also have lost this round and move forward
    what's more interesting is why you're having so much trouble putting together a coherent sentence, yet you're posting this on a >9000$ machine.

    Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
    32 GB Patriot Viper Steel 3733 CL14 (1.51v)
    RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
    Tons of NVMe & SATA SSDs
    LG 27GL850 + Asus MG279Q
    Meshify C white

  3. #78
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    112
    Can't we just agree it is wrong to cheat,
    and that when they do, we protest?
    This time it was AMD; next time it may be nVidia doing it.

    They just should not start lowering quality in a race over who is fastest.
    It only affects us and gaming in a bad way.
    Intel i7 2600K 5GHZ Watercooled. 2x Asus DirectCU II TOP GTX670 SLI @1250/7000/Watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
    8GB Corsair 2000Mhz Ram. 4x OCZ Vertex3 120GB SSD. .3xSamsung F1 1TB All in A Lian li Tyr PC-X2000 Chassi. Logitech diNovo Edge keybord
    MX Revolution mouse and Z-5500 Digital 5.1 speakers Corsair HX-1200W PSU Samsung 244T 24"+ 3xPhilips 24¨in nVidia Surround

  4. #79
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    112
    Quote Originally Posted by biohead View Post
    what's more interesting is why you're having so much trouble putting together a coherent sentence, yet you're posting this on a >9000$ machine.
    Hmm you can try to write in a language,
    which is not your native language.
    it is not so easy
    not for me at least

    Intel i7 2600K 5GHZ Watercooled. 2x Asus DirectCU II TOP GTX670 SLI @1250/7000/Watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
    8GB Corsair 2000Mhz Ram. 4x OCZ Vertex3 120GB SSD. .3xSamsung F1 1TB All in A Lian li Tyr PC-X2000 Chassi. Logitech diNovo Edge keybord
    MX Revolution mouse and Z-5500 Digital 5.1 speakers Corsair HX-1200W PSU Samsung 244T 24"+ 3xPhilips 24¨in nVidia Surround

  5. #80
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    Quote Originally Posted by E30M3
    AMD = FAIL!
    so amd cheat again
    but it does not matter when its aMd who cheat???
    amd have failed in so many ways recently
    they are much worse than nVidia has ever been

    Why can not they just admit they also have lost this round and move forward.
    You can keep telling this over and over again, yet I enjoy higher IQ on my 6850 than what was possible on the 5850, which has been proven over and over again.

    It's also proven that the difference between HQ (better than the 5xxx series and my old 8800) and normal quality (worse quality in older games, same quality in new games) is around 5%, which still puts the 68xx series in front of the GTX 460 in terms of performance and price/performance.

    I don't see how AMD lost this round: they own a 3-4 times higher market share than nVidia in the DX11 market, they dethroned NV in overall market share, and their higher-priced products make up a significantly higher proportion of their revenue than nVidia's...

    Sure, nVidia is still pretty competitive right now, but unless you want to buy a card above 400€ (GTX 580), AMD is the way to go due to output options, video playback quality, performance, price and power consumption (that is, from 50€ to 300€), if you don't have any brand preference at all...

    This might change in the next round, and it certainly was the other way round in the 8800 days, but to claim that AMD lost this round is ignorance (just as ignorant as some AMD fanboys continuing to claim that AMD didn't lose against the i7; they hold on to certain price points and certain workloads, but overall they lost, just like nVidia...)
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  6. #81
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    In before nvidia people cite ~70% marketshare leftover from 79xx and 88xx series.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  7. #82
    Xtreme Addict
    Join Date
    Feb 2007
    Location
    Denmark
    Posts
    1,450
    Quote Originally Posted by E30M3 View Post
    AMD = FAIL!
    so amd cheat again
    but it does not matter when its aMd who cheat???
    amd have failed in so many ways recently
    they are much worse than nVidia has ever been

    Why can not they just admit they also have lost this round and move forward
    HAHAHA! Remember good old GeForce FX?

  8. #83
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Frontl1ne View Post
    Can anyone post up links to videos or screenshots displaying this lowering of IQ in games? I can't for the life of me see anything that looks particularly bad while playing Dirt 2 and BFBC2.
    I too would like an answer to this question.

  9. #84
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Eastcoasthandle View Post
    I too would like an answer to this question.
    I have seen the minor flickering mentioned in the articles within Just Cause 2, CoD: MW2 and Dragon Age: Origins with the 10.10 drivers. Videos and pics to come after I get my main system up and running again.

  10. #85
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    And this forum proves once again that an objective opinion is hard to come by.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  11. #86
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    I've been saying it for a while... DirectX should include a default render mode that has to pass a series of tests, so it does NOT get tweaked/optimized...
    That way people can choose to use this mode, or the optimized settings ATI and nVidia offer that boost performance by SLIGHTLY reducing image quality...
    If I'm on a mainstream GPU I'll appreciate the latter, but if I'm playing an older game I want max image quality and no optimizations at all...
    And how can you sell $500+ video cards to people that don't render games the way they were supposed to look, but blur things to boost performance by a few percentage points?
    That's just plain stupid...

    Btw, I don't get those videos... what's the right side of the video?
    The flickering on the left side of the 6000 videos is definitely worse than in the 5000 videos.
    BUT, in the nVidia videos there is a slight flicker both left and right... the left side looks better than the 6000 but not really better than the 5000.
    Last edited by saaya; 11-21-2010 at 04:28 AM.

  12. #87
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by saaya View Post
    Btw, I don't get those videos... what's the right side of the video?

    BUT, in the nVidia videos there is a slight flicker both left and right... the left side looks better than the 6000 but not really better than the 5000.

    Left is a GPU rendering the scene. On the right is the software renderer, which by its very nature doesn't have ANY optimizations.
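    A minimal sketch of the kind of comparison those side-by-side videos imply: score the GPU frame against the software-reference frame pixel by pixel. The frames here are tiny hand-made RGB tuples for illustration; a real tool would load captured screenshots.

```python
def mean_abs_diff(gpu_frame, reference_frame):
    """Per-channel mean absolute difference between a GPU-rendered frame
    and the software-reference frame; 0.0 means pixel-identical output."""
    assert len(gpu_frame) == len(reference_frame)
    total = 0
    count = 0
    for (r1, g1, b1), (r2, g2, b2) in zip(gpu_frame, reference_frame):
        total += abs(r1 - r2) + abs(g1 - g2) + abs(b1 - b2)
        count += 3
    return total / count

# Identical frames score 0; any filtering "optimization" shows up as > 0.
ref = [(10, 20, 30), (40, 50, 60)]
opt = [(10, 20, 30), (40, 48, 60)]
print(mean_abs_diff(ref, ref))  # 0.0
print(mean_abs_diff(opt, ref))  # ~0.333
```

    A single number like this won't capture temporal flicker, which is why the videos matter: flicker is frame-to-frame variation, not a static difference.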

  13. #88
    Xtreme Enthusiast
    Join Date
    Oct 2007
    Location
    Hong Kong
    Posts
    526
    Quote Originally Posted by saaya View Post
    I've been saying it for a while... DirectX should include a default render mode that has to pass a series of tests, so it does NOT get tweaked/optimized...
    That way people can choose to use this mode, or the optimized settings ATI and nVidia offer that boost performance by SLIGHTLY reducing image quality...
    If I'm on a mainstream GPU I'll appreciate the latter, but if I'm playing an older game I want max image quality and no optimizations at all...
    And how can you sell $500+ video cards to people that don't render games the way they were supposed to look, but blur things to boost performance by a few percentage points?
    That's just plain stupid...

    Btw, I don't get those videos... what's the right side of the video?
    The flickering on the left side of the 6000 videos is definitely worse than in the 5000 videos.
    BUT, in the nVidia videos there is a slight flicker both left and right... the left side looks better than the 6000 but not really better than the 5000.
    This is useless.
    The drivers from AMD and NVIDIA can detect the "default quality test" and use the standard algorithm just for that test.

  14. #89
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Location
    Wild West, USA
    Posts
    655
    Quote Originally Posted by Dimitriman View Post
    If amd fanboys behaved exactly the same there would be a new thread on the news forum everyday about how the gtx 580 still gets its ass handed by a 1 year old 5970 and is a power hog.
    Not in minimum FPS. In most reviews I've seen, the 5970 drops way below the 580, and you know that's what makes or breaks a game. I don't care if I can't get 10 fps more at the high end; if FPS fluctuates like crazy, then the game is much less playable. When I play any first-person shooter I want stable FPS, and I definitely don't want it dropping into single digits. I won't post any slides; just check Anand's review. And btw, most CrossFire setups suffer from that. You either agree with me that minimum FPS is more important than the highs for good smooth gaming, or you're a fanboy. That's why I'm not so fond of dual-GPU cards, and would never get a card like the 5970 for gaming. I'd definitely pick up a 5870 instead.
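    The min-vs-average distinction is easy to make concrete. A small sketch, assuming per-frame render times in milliseconds (the numbers are made up for illustration):

```python
def fps_stats(frame_times_ms):
    """Average FPS and worst-case (minimum) FPS from per-frame times."""
    avg_frame = sum(frame_times_ms) / len(frame_times_ms)
    worst_frame = max(frame_times_ms)   # the longest frame sets the minimum FPS
    return 1000.0 / avg_frame, 1000.0 / worst_frame

# Two cards with the same average framerate: the second one stutters.
smooth = [16.0] * 10                    # steady 16 ms frames
spiky = [10.0] * 9 + [70.0]             # same average, one 70 ms hitch
print(fps_stats(smooth))  # (62.5, 62.5)
print(fps_stats(spiky))   # avg 62.5, min ~14.3
```

    This is exactly the dual-GPU complaint: average FPS hides the frame-time spikes you actually feel.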
    Abit IC7 P4 2.8a @4.21 | P4 3.4e @4.9 | Gainward 6800GT GS @486/1386
    Asus P4P800 SE Dothan 730-PM @ 2900 | EVGA 6800 Ultra GS @521/1376

    e8400@4.3G & 8800GTS G92 800/1932/1132 as gaming rig 24/7

    Custom self build chillbox with watercooling @-28c 24/7 | chilled wc " cpu -18c idle/-3c load
    3DMark 2005 Score Dothan & 6800U
    3DMark 2005 Score p4 & 6800GT

  15. #90
    Xtreme Mentor
    Join Date
    Mar 2006
    Location
    Evje, Norway
    Posts
    3,419
    The 5970 has a higher minimum (and average) than the 580 in AvP (DX11), BFBC2 (DX11) and Just Cause 2 (DX10).
    The 580 wins in Dirt 2 (DX11), Lost Planet (DX11) and SC2 (DX9).
    In Metro (DX11) they're about even in minimum fps...

    This is from Hardware Canucks (one of the few review sites I trust).

    So it depends on which games you play, really.

    Edit: Also looked at Anand's 580 review. They only show minimum framerates in two games, one of which is Crysis Warhead, which looks like it needs more than 1GB of RAM at those settings at 2560x1600 (a normal 5970 has only 1GB per GPU), so of course the 580 will do much better there since it has 1.5GB (the 5970 was better than the 580 at 1920x1200).

    Edit2: Lab501, which is also an excellent site, has average and minimum framerates.
    Their 580 vs 5970:
    5970 wins in SC2, CoD4, Just Cause 2, AvP, Mafia 2, Medal of Honor, CoD: Black Ops.
    580 wins in Far Cry 2, Crysis Warhead, Dirt 2, Hot Pursuit, Metro 2033, Hawx 2 (though same minimum), Warhammer 40,000, Battleforge, BFBC2, Darksiders.

    The 5970 was 1 fps ahead in Resident Evil 5, but so close I call that even. Same minimum in Hawx too, but the 580 is slightly ahead in average.
    The 580 is quite a bit ahead in average on Lost Planet 2, but the 5970 still had a higher minimum.
    The 580 is slightly ahead in minimum on Batman: Arkham Asylum, but slightly behind in average.

    You can call anyone who doesn't agree with you a fanboy, but the black-and-white picture you are painting makes you look way more fanboyish than others. Both cards have their strengths and weaknesses. Looking at just the two reviews I checked now, I would say the 580 draws the longest straw, unless you only play certain games where the 5970 shines...
    (and just for the record, I wouldn't buy either of them)
    Last edited by eXa; 11-21-2010 at 10:31 PM.
    Quote Originally Posted by iddqd View Post
    Not to be outdone by rival ATi, nVidia's going to offer its own drivers on EA Download Manager.
    X2 555 @ B55 @ 4050 1.4v, NB @ 2700 1.35v Fuzion V1
    Gigabyte 890gpa-ud3h v2.1
    HD6950 2GB swiftech MCW60 @ 1000mhz, 1.168v 1515mhz memory
    Corsair Vengeance 2x4GB 1866 cas 9 @ 1800 8.9.8.27.41 1T 110ns 1.605v
    C300 64GB, 2X Seagate barracuda green LP 2TB, Essence STX, Zalman ZM750-HP
    DDC 3.2/petras, PA120.3 ek-res400, Stackers STC-01,
    Dell U2412m, G110, G9x, Razer Scarab

  16. #91
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Location
    Wild West, USA
    Posts
    655
    It does that with the new tweaked driver which supposedly gives a 10% boost? That probably has nothing to do with it, eh? When you look at a review where they used full IQ, the picture changes somewhat. The 6800s look pointless, as in most cases the 5800s take the cake. Even in the tessellation benchmarks the 6800s were supposedly redesigned for, they are far from shining.

    And it's not about a d** measuring contest. You can't compare dual GPU to single anyway. And Nvidia has the fastest single-GPU card on the market today.
    99% would never purchase a 5970 or the upcoming 595. Most will probably go CrossFire or SLI first.

    Oh, and I hear the 5970 is not such a smooth gaming card anyway. It has its own share of problems. That was my point. SLI is not perfect either, so I still outright dismiss anything like dual GPU, CrossFire or SLI as a gaming platform. For benchmarking, sure, but for gaming give me a single-GPU card under 230W full load.
    Last edited by railer; 11-22-2010 at 02:02 AM.
    Abit IC7 P4 2.8a @4.21 | P4 3.4e @4.9 | Gainward 6800GT GS @486/1386
    Asus P4P800 SE Dothan 730-PM @ 2900 | EVGA 6800 Ultra GS @521/1376

    e8400@4.3G & 8800GTS G92 800/1932/1132 as gaming rig 24/7

    Custom self build chillbox with watercooling @-28c 24/7 | chilled wc " cpu -18c idle/-3c load
    3DMark 2005 Score Dothan & 6800U
    3DMark 2005 Score p4 & 6800GT

  17. #92
    Xtreme Mentor
    Join Date
    Mar 2006
    Location
    Evje, Norway
    Posts
    3,419
    Quote Originally Posted by railer View Post
    It does that with the new tweaked driver which supposedly gives a 10% boost? That probably has nothing to do with it, eh?
    They both use 10.10 and 262.99 (so does Anand).
    I don't understand your point; shouldn't you use the latest and best drivers?
    You are not making any sense...

    Quote Originally Posted by railer View Post
    When you look at a review where they used full IQ, the picture changes somewhat. The 6800s look pointless, as in most cases the 5800s take the cake. Even in the tessellation benchmarks the 6800s were supposedly redesigned for, they are far from shining.
    Don't let the 68XX naming confuse you; they are not really meant to replace the 58XX. But they still do better than the 58XX in tessellation...

    Quote Originally Posted by railer View Post
    And it's not about a d** measuring contest.
    If it were, I would only have shown you the numbers from the card I own.
    But I don't own either of them, so it's a moot point...
    (And by doing so I would only look un-objective, like a huge fanboy.)

    Quote Originally Posted by railer View Post
    You can't compare dual GPU to single anyway. And Nvidia has the fastest single-GPU card on the market today.
    Now THIS is true fanboy talk.
    Of course you can compare them; they are both graphics cards, right? They also happen to cost roughly the same, both are power hogs, and they are also the best card each maker has at the moment...

    Quote Originally Posted by railer View Post
    99% would never purchase a 5970 or the upcoming 595. Most will probably go CrossFire or SLI first.

    Oh, and I hear the 5970 is not such a smooth gaming card anyway. It has its own share of problems. That was my point. SLI is not perfect either, so I still outright dismiss anything like dual GPU, CrossFire or SLI as a gaming platform. For benchmarking, sure, but for gaming give me a single-GPU card under 230W full load.
    You may of course do all that if you want; no one is forcing you to do anything.
    Quote Originally Posted by iddqd View Post
    Not to be outdone by rival ATi, nVidia's going to offer its own drivers on EA Download Manager.
    X2 555 @ B55 @ 4050 1.4v, NB @ 2700 1.35v Fuzion V1
    Gigabyte 890gpa-ud3h v2.1
    HD6950 2GB swiftech MCW60 @ 1000mhz, 1.168v 1515mhz memory
    Corsair Vengeance 2x4GB 1866 cas 9 @ 1800 8.9.8.27.41 1T 110ns 1.605v
    C300 64GB, 2X Seagate barracuda green LP 2TB, Essence STX, Zalman ZM750-HP
    DDC 3.2/petras, PA120.3 ek-res400, Stackers STC-01,
    Dell U2412m, G110, G9x, Razer Scarab

  18. #93
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    Way to blow things out of proportion: it took a month to find the differences between the two parts, and not a single person who switched from NV to AMD, or from an older AMD card, noticed a big difference...
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  19. #94
    Xtreme X.I.P. Particle's Avatar
    Join Date
    Apr 2008
    Location
    Kansas
    Posts
    3,219
    Quote Originally Posted by Amorphous View Post
    Look at the videos and tell me you wouldn't notice the difference between HD 6870 and GTX 470's IQ. It'll be extremely obvious in every title. Even cranked up, the HD 6800 doesn't compare to the GTX 400's default setting.
    I can't tell a difference among any of the videos on that page. Each video card under each quality setting looks the same as the others and the ALU reference as well to me. Is there a better test that shows an easily discernible difference?
    Particle's First Rule of Online Technical Discussion:
    As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.

    Rule 1A:
    Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.

    Rule 2:
    When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.

    Rule 2A:
    When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.

    Rule 3:
    When it comes to computer news, 70% of Internet rumors are outright fabricated, 20% are inaccurate enough to simply be discarded, and about 10% are based in reality. Grains of salt--become familiar with them.

    Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!

    Random Tip o' the Whatever
    You just can't win. If your product offers feature A instead of B, people will moan how A is stupid and it didn't offer B. If your product offers B instead of A, they'll likewise complain and rant about how anyone's retarded cousin could figure out A is what the market wants.

  20. #95
    Xtreme Member
    Join Date
    May 2009
    Location
    Italy
    Posts
    328
    Last edited by Gilgamesh; 11-22-2010 at 07:01 AM. Reason: edit repost

  21. #96
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Budaors, Hungary.
    Posts
    143
    Funny how nVIDIA makes a scene about this, considering they have used FP16 demotion too since the Rel.260 ForceWare drivers, and that you can't turn it off in the control panel... though you can turn it off with a utility that is not available to the public.

    Got to love marketing BS.

    "We are going to hell, so bring your sunblock..."
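    For context, FP16 demotion means the driver substitutes a narrower render-target format for the FP16 one the game asked for, trading precision for memory bandwidth. A rough back-of-the-envelope sketch (the formats and resolution are illustrative, not measurements from either vendor's driver):

```python
# Why FP16 demotion boosts performance: swapping a 64-bit-per-pixel RGBA16F
# render target for a packed 32-bit R11G11B10F one halves the memory traffic
# of every full-screen write.
def render_target_bytes(width, height, bytes_per_pixel):
    """Bytes touched by one full write of a render target."""
    return width * height * bytes_per_pixel

RGBA16F = 8      # 4 channels x 16-bit float = 8 bytes/pixel
R11G11B10F = 4   # three small floats packed into one 32-bit word (no alpha)

full = render_target_bytes(1920, 1200, RGBA16F)
demoted = render_target_bytes(1920, 1200, R11G11B10F)
print(full // demoted)  # 2 -> half the bandwidth per pass
```

    The catch is that the demoted format drops alpha and precision, which is why demotion can show up as banding or shimmer in HDR-heavy scenes.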

  22. #97
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    Nice try...
    If you wish to test with Need for Speed: Shift or Dawn of War 2, we have enabled support for FP16 demotion – similar to AMD – in R260 drivers for these games. By default, FP16 demotion is off, but it can be toggled on/off with the AMDDemotionHack_OFF.exe and AMDDemotionHack_ON.exe files which can be found on the Press FTP.
    http://www.geeks3d.com/20100916/fp16...e-says-nvidia/

  23. #98
    Extreme BodyBuilder SnW's Avatar
    Join Date
    Aug 2008
    Location
    Holland,NH
    Posts
    271
    Quote Originally Posted by slaveondope View Post
    Ive got your IQ control right here.....



    Haven't seen a jaggy line yet.
    I was going to skip this thread but this made me go reply
    I am a NvidiA fanboy simply because of F@H ...

  24. #99
    Xtreme Member
    Join Date
    Apr 2010
    Location
    Budaors, Hungary.
    Posts
    143
    Quote Originally Posted by Vardant View Post
    It is on, and has been on for a couple of releases, for at least the games mentioned in the article you've linked, as far as I am aware.

    What really baffles me is why they are really doing this kind of work, since both AMD and nVIDIA hardware is capable of full-speed FP16 texture filtering. Though I don't know why I am "surprised" that they are at it again, after missing shaders and objects in games like Crysis and Far Cry 2...

    "We are going to hell, so bring your sunblock..."

  25. #100
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    So you obviously have proof, if you're so adamant about this, even though NV admitted putting it in and said it's off by default and can't be turned on through the control panel, right?

