
Thread: Nvidia DX11 AA-bits petition

  1. #1
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594

    Nvidia DX11 AA-bits petition

    As many of you know, Nvidia has AA-bits that can be used to force antialiasing in (some) games where AA is not natively supported, or to improve the quality of AA compared to the ingame solution.

    The problem: these bits are only available for DX9. There are a handful of DX10 games with predefined bits, and for DX11 there is nothing. Many DX10/11 games exhibit visible aliasing even when the ingame AA is enhanced to SGSSAA, since we can no longer influence how and where the game applies antialiasing.

    Therefore, a petition was started today to get Nvidia to supply those AA-bits for DX10/11 as well.
    https://forums.geforce.com/default/t...iver-profiles/

    Pictures illustrating the shortcomings of ingame AA and the possibilities of forced AA are available in the petition thread over at the Nvidia forums.
    Your support of this petition would be greatly appreciated.
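
    (For readers wondering what an "AA bit" actually is: it is just a DWORD stored in the driver's per-game profile database. Below is a minimal sketch of writing such a value through NVIDIA's public NVAPI driver-settings (DRS) interface; the setting ID here is a deliberate placeholder, not a documented AA-compatibility ID, since tools like Nvidia Inspector carry the real ones.)

    Code:
    // Minimal sketch (C++, public NVAPI SDK): writing one DWORD into a game's
    // driver profile -- under the hood, this is all an "AA bit" is. The caller
    // supplies settingId; no documented AA-compatibility ID is assumed here.
    #include <nvapi.h>
    #include <cstring>
    #include <cwchar>

    bool WriteProfileDword(const wchar_t* profileName, NvU32 settingId, NvU32 value)
    {
        if (NvAPI_Initialize() != NVAPI_OK)
            return false;

        NvDRSSessionHandle session = 0;
        if (NvAPI_DRS_CreateSession(&session) != NVAPI_OK)
            return false;
        NvAPI_DRS_LoadSettings(session);          // read the driver's profile database

        NvAPI_UnicodeString name = {0};
        std::memcpy(name, profileName, (std::wcslen(profileName) + 1) * sizeof(wchar_t));

        NvDRSProfileHandle profile = 0;
        bool ok = false;
        if (NvAPI_DRS_FindProfileByName(session, name, &profile) == NVAPI_OK)
        {
            NVDRS_SETTING setting = {};
            setting.version         = NVDRS_SETTING_VER;
            setting.settingId       = settingId;  // hypothetical AA-bits setting ID
            setting.settingType     = NVDRS_DWORD_TYPE;
            setting.u32CurrentValue = value;      // the compatibility "bits" themselves
            ok = NvAPI_DRS_SetSetting(session, profile, &setting) == NVAPI_OK
              && NvAPI_DRS_SaveSettings(session) == NVAPI_OK;
        }
        NvAPI_DRS_DestroySession(session);
        return ok;
    }

    Calling this with a game's profile name and the right (currently DX9-only) settingId/value pair is essentially what tools like Nvidia Inspector do; the petition asks NVIDIA to publish working pairs for the DX10/11 paths as well.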

  2. #2
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Just a heads up:
    Yes. I'm working on it. If you guys can just keep on posting why this is important to you, it will help out a lot. Our preference is for in game menus to provide fully advanced AA modes but we understand that doesn't always work out.
    https://forums.geforce.com/default/t...91774/#3891774

    So if you haven't already, please support this petition.

  3. #3
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Texas
    Posts
    1,663
    I recently got a GTX 760. I do not recall having this problem with my AMD GPUs. Does AMD have this in their drivers already?
    Core i7 2600K @ 4.6GHz | 16GB G.Skill @ 2133MHz 9-11-10-28-38 1.65v | ASUS P8Z77-V PRO | Corsair 750i PSU | ASUS GTX 980 OC | Xonar DSX | Samsung 840 Pro 128GB | A bunch of HDDs and terabytes | Oculus Rift w/ Touch | ASUS 24" 144Hz G-Sync monitor

    Quote Originally Posted by phelan1777 View Post
    Hail fellow warrior albeit a surat Mercenary. I Hail to you from the Clans, Ghost Bear that is (Yes freebirth we still do and shall always view mercenaries with great disdain!) I have long been an honorable warrior of the mighty Warden Clan Ghost Bear the honorable Bekker surname. I salute your tenacity to show your freebirth sibkin their ignorance!

  4. #4
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    There is no problem per se. Look at Max Payne 3, for example: the ingame MSAA is extremely poor and doesn't smooth all edges. Under DX9 you could force AA with custom bits, and the result was perfect. But that isn't possible anymore for DX10+.
    This concerns both AMD and Nvidia.
    Nvidia has had AA compatibility bits in its drivers practically forever, but due to DirectX policy only for DX9 and lower. We want to change that and get full control over AA back in modern titles. AMD doesn't have any compatibility bits, not even for DX9; they have AA profiles, but those are not as flexible.

  5. #5
    zanzabar
    I am Xtreme
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by Mechromancer View Post
    I recently got a GTX 760. I do not recall having this problem with my AMD GPUs. Does AMD have this in their drivers already?
    Not for DX11, and it is no longer as simple as it used to be. Previously you could only do it in the frame buffer holding the outgoing signal, when no effects were being applied to it; but with DX11 games apply effects in that buffer, and it is protected content.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
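
    (To make zanzabar's point concrete: in D3D10/11 the multisample count is baked into every render target the game itself creates, and modern engines read those targets back for post-processing. A minimal sketch, using nothing but the standard Direct3D 11 headers:)

    Code:
    // The game, not the driver, decides the sample count of its scene target:
    #include <d3d11.h>

    D3D11_TEXTURE2D_DESC SceneTargetDesc()
    {
        D3D11_TEXTURE2D_DESC desc = {};
        desc.Width              = 1920;
        desc.Height             = 1080;
        desc.MipLevels          = 1;
        desc.ArraySize          = 1;
        desc.Format             = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count   = 1;   // the game asked for NO multisampling here
        desc.SampleDesc.Quality = 0;
        desc.Usage              = D3D11_USAGE_DEFAULT;
        // Bound both as a render target AND as a shader input for post-effects:
        desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
        return desc;
    }
    // If the driver silently forced SampleDesc.Count = 4 here, every later shader
    // read of this texture would hit a multisampled resource the game never wrote
    // code for; at minimum an extra ResolveSubresource pass would have to be
    // injected, which is why per-game compatibility bits are needed at all.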

  6. #6
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Good idea, I expressed my interest on their forums as well.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  7. #7
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Yeah, it would be foolish of Nvidia to ignore this petition. I made some good arguments as to why in their forums:

    Quote Originally Posted by RPGWiZaRD
    Signed. In the following I will also make some arguments as to why fixing/implementing this is really in NVIDIA's best interest:

    1. It slows down the adoption of next-gen graphics, since users are willingly "downgrading" to DX9 to get proper AA support (proper AA outweighs the added benefits of DX10/11-exclusive features). This is bad from NVIDIA's marketing standpoint: if anything, you want graphics technology to advance as much as possible, so that new-generation NVIDIA cards can be fully exploited and shown off. The PC gaming industry and NVIDIA are closely bound together; if PC graphics development slows down, NVIDIA's business suffers too, because customers can't justify spending money on expensive new cards that won't be fully utilized anyway.

    2. Full, high-quality AA support has been one of the important differences setting PC gaming apart from console gaming in marketing and appeal. In an already slowly diminishing PC gaming market, we don't need further catalysts that speed up that process and let the console industry gain an even higher market share as PC gaming offers fewer and fewer advantages.

    3. A lot of users currently own cards with very high compute power, especially in the GeForce 700 series, which they often cannot fully take advantage of. Advanced, high-quality AA rendering would be an excellent use of that spare compute power, and it would benefit a LOT of users, as no one really likes jagged edges!

    4. Think of all the positive feedback and marketing possibilities from being able to say "users requested, we delivered!". Cooperative development between users and NVIDIA builds goodwill and growth. NVIDIA has always had a somewhat better reputation for drivers and community engagement; it should be in your best interest to keep it that way, or even improve on it!

    5. This would be quite a nail in the coffin for the competitor. You already have a good record of developing features the competitor doesn't support and possibly never will, and these exclusive features are often big reasons customers turn to the "Green" side.

    6. Proper forced AA support existed before, so why should it be left out now? Going backwards in development is illogical; instead, the focus should be on making forced AA usable as widely as possible, since AA is one of the most important image-quality features of a graphics card.

    7. Just think of all the positive reactions it would get around the Internet among PC gaming enthusiasts, from where the word would spread further. The impact may not be instant, but in the long term it will definitely be big!

    8. You can make very realistic-looking graphics, but as long as jagged edges are present you cannot fully capture a true sense of realism; we are not used to seeing jagged edges in real life, so they become an ever-increasing annoyance the further graphics develop!
    Last edited by RPGWiZaRD; 08-20-2013 at 04:31 AM.
    Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people shared opinions in an objective manner, the world would be a friendlier place

  8. #8
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    These advanced AA modes are one of the few things keeping me on Nvidia cards now that AMD has been moving toward fixing microstutter with CrossFire.

  9. #9
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Quote Originally Posted by zanzabar View Post
    Not for DX11, and it is no longer as simple as it used to be. Previously you could only do it in the frame buffer holding the outgoing signal, when no effects were being applied to it; but with DX11 games apply effects in that buffer, and it is protected content.
    So Nvidia may not be able to accomplish this by itself?


    When I'm being paid I always see my job through.

  10. #10
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by kromosto View Post
    so nvidia may not accomplish this by itself only?
    Really, we should petition Microsoft to patch DX11+ to allow nVidia & AMD to make better use of their hardware capabilities. I don't doubt that, given enough time, money & manpower, nVidia and AMD could get it working in some fashion, but only Microsoft can do it quickly & cheaply.

  11. #11
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Quote Originally Posted by DeathReborn View Post
    I don't doubt that, given enough time, money & manpower, nVidia and AMD could get it working in some fashion, but only Microsoft can do it quickly & cheaply.


    When I'm being paid I always see my job through.

  12. #12
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Be careful what you ask for. I recently tried using Nvidia Inspector to force sparse grid supersample AA to make use of my GPU power, and the temperature shot up to 100 degrees on one of my cards and 90 on the other, compared to regular use with driver/ingame AA only, with which they stay 20-25 degrees cooler.

    I only tried it in games which didn't respond to forced driver AA (a couple of MMOs), and found it wasn't worth having both cards running at 90%+ GPU usage and at temperatures like that, even though it was the most perfect AA I have seen so far. It was equivalent to running Furmark without the voltage and GPU clock reductions that today's graphics cards apply under that much stress.

    Most games do work with 64x SLI AA + supersample transparency enabled in the settings, but I wish I could use all these new and previous AA modes that have been announced; they never get put into the drivers or made available in game settings. Oh, and FXAA is complete rubbish: it blurs text and doesn't make much difference.

    I remember how people used to say Furmark was so bad because it over-stressed the GPUs, and that no game would ever stress them that much. I can pretty much guarantee that forcing 4-8x sparse grid AA in any current game will not only destroy your FPS but also eventually destroy your GPUs, because it loads them up just as much.

    And SLI AA is a very good way to use two cards. I'm not sure whether it reduces or eliminates microstutter, but it disables SLI performance mode and makes the second card handle the AA. It would be interesting to see benchmarks comparing SLI performance mode with regular AA against SLI AA (with the performance benefit of two cards given up) to see which one actually provides better framerates.

  13. #13
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    You should buy better cards that don't overheat when you're using them to their full potential.

    All along the watchtower the watchmen watch the eternal return.

  14. #14
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Surely MSI Lightnings are among the best GTX 680s you could get?

    Sparse grid AA alone makes the cards run vastly hotter than lesser AA methods do; that's not something I realized would happen when I followed a guide on how to activate it.

  15. #15
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by bhavv View Post
    Surely MSI Lightnings are among the best GTX 680s you could get?

    Sparse grid AA alone makes the cards run vastly hotter than lesser AA methods do; that's not something I realized would happen when I followed a guide on how to activate it.
    apparently not if they're running that hot.

    All along the watchtower the watchmen watch the eternal return.

  16. #16
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by STEvil View Post
    apparently not if they're running that hot.
    Yeah, pretty much.

    Those types of coolers are awful in SLI or with any obstructions in the area.

    Quote Originally Posted by bhavv View Post
    Sparse grid AA alone makes the cards run vastly hotter than lesser AA methods do; that's not something I realized would happen when I followed a guide on how to activate it.
    It shouldn't. Maybe you just aren't pushing the cards very hard in most games. I bet that Far Cry 3 would do it. I'm also sure that a higher resolution would do the same.

    I use a pair of 680 Lightnings too, but under water. Those cards got loud and hot in SLI with the stock cooler.
    Last edited by BababooeyHTJ; 08-23-2013 at 06:19 PM.

  17. #17
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    It's only sparse grid AA that makes them run that hot; otherwise temps are in the 70s.

    That may be why Nvidia doesn't add driver support for that AA mode.

    I do need a new case though; my Antec P182 is too outdated and can't fit enough fans.
    Last edited by Mungri; 08-23-2013 at 09:31 PM.

  18. #18
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Then you're not GPU bottlenecked without SGSSAA. It doesn't matter how the GPU load is generated when it comes to heat.

  19. #19
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by boxleitnerb View Post
    Then you're not GPU bottlenecked without SGSSAA. It doesn't matter how the GPU load is generated when it comes to heat.
    Exactly

  20. #20
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Right, but the only thing I was previously aware of causing that much GPU strain was Furmark / Kombustor, and both Nvidia and AMD tweaked their drivers to reduce clock speeds and voltages when running those. I've not played any games that push the GPUs that much.

    My current case has virtually no airflow over the GPU area anyway, but I have a new one ordered that should be a lot better (Corsair 540).

  21. #21
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Lol, SGSSAA doesn't put out Kombustor- or Furmark-type loads.

    You're CPU bottlenecked at your resolution in a lot of games. That's why it's not putting a large load on the GPUs in most of the games you play. Let me guess, you're using 1080p? Let's face it, Nehalem is five years old now.

    My current case has virtually no airflow over the GPU area anyway,
    That is a big part of the problem, and twin-fan coolers like the Lightning's aren't optimal in SLI or with anything anywhere near the fan, like the midplate in your case.

  22. #22
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Actually, I'm thinking it's just something wrong with one of my cards. It's 10-20 degrees hotter regardless of which slot it's placed in, and whether it's the primary or secondary card.

    After just 5-10 minutes of running around in GW2 with SGSSAA, one card reached 89 degrees while the other was only at 67, with up to 95% GPU usage:

    http://i.imgur.com/e3SUDPA.jpg

    And after turning on FXAA in the graphics settings, which disables Nvidia Inspector's SGSSAA (at the red line on the graph), the temperatures dropped to 80/63 and GPU usage fell into the 60s:

    http://i.imgur.com/hazaFi2.jpg

    Originally, when I noticed the temperatures, the cards were installed the other way around: the top one reached 90 degrees and the bottom one 100, on a much hotter day and after over an hour of playing GW2 with SGSSAA. So I swapped them, as the top card was nearer to the intake fan and had more space.

    I thought of reinstalling the coolers with Arctic MX-4, but there are 'warranty void if removed' stickers on the screws, so I've emailed MSI technical support first to ask what to do. I think it's daft if I'd have to send a card back to get it 'fixed' when I could just reseat the cooler myself in a few minutes.

    Quote Originally Posted by BababooeyHTJ View Post
    Lets face it nehalem is five years old now.
    Age alone doesn't matter; there's barely any difference between an overclocked i7 980 and the latest Intel CPUs. The improvements since then have only been around 10% per generation, so it's not as if an i7 980 would already be bottlenecking games. There are still hardly any games out there that can fully utilize an Intel i5 running at 3.5 GHz.
    Last edited by Mungri; 08-29-2013 at 07:21 AM.
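
    (An aside for anyone who wants to reproduce the per-card temperature and usage logs above without screenshotting monitoring tools: a minimal sketch using NVML, NVIDIA's monitoring library that ships with the driver; the header is distributed in NVIDIA's developer kits.)

    Code:
    // Minimal sketch (C++ / NVML): print what the graphs above show, per card --
    // core temperature and GPU utilization.
    #include <nvml.h>
    #include <cstdio>

    int main()
    {
        if (nvmlInit() != NVML_SUCCESS)
            return 1;

        unsigned int count = 0;
        nvmlDeviceGetCount(&count);                  // e.g. 2 for an SLI pair

        for (unsigned int i = 0; i < count; ++i)
        {
            nvmlDevice_t dev;
            if (nvmlDeviceGetHandleByIndex(i, &dev) != NVML_SUCCESS)
                continue;

            unsigned int temp = 0;
            nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp);

            nvmlUtilization_t util = {0, 0};         // .gpu / .memory, in percent
            nvmlDeviceGetUtilizationRates(dev, &util);

            std::printf("GPU %u: %u C, %u%% core load\n", i, temp, util.gpu);
        }

        nvmlShutdown();
        return 0;
    }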

  23. #23
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by bhavv View Post
    Age alone doesn't matter; there's barely any difference between an overclocked i7 980 and the latest Intel CPUs. The improvements since then have only been around 10% per generation, so it's not as if an i7 980 would already be bottlenecking games.
    That's not at all true. I was able to measure a 50% difference in minimums in some games when going from a 4GHz i5 760 to an i5 2500K at 4.8GHz.

    Here are some benchmarks that I did with a pair of 6950s in CrossFire when I switched CPUs. Your cards are much faster than the 6950s I had at the time.



    There are still hardly any games out there that can fully utilize an Intel i5 running at 3.5 GHz.
    IPC is much lower on Nehalem than on a more modern CPU.
    Last edited by BababooeyHTJ; 08-30-2013 at 01:34 AM.

  24. #24
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Also, are you using vsync or anything else that limits FPS? That will reduce the load on the GPUs.

    All along the watchtower the watchmen watch the eternal return.

  25. #25
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Quote Originally Posted by BababooeyHTJ View Post
    I was able to measure a 50% difference in minimums in some games when going from a 4GHz i5 760 to an i5 2500K at 4.8GHz.
    Two games in your comparison, of which only one became playable, does not equal 'CPU bottlenecked in a lot of games'. Stop spreading misinformation based on your biases. If I notice crap minimums like those, I upgrade; so far, upgrading the GPU has been sufficient.

    Quote Originally Posted by STEvil View Post
    also, using vsync or anything that limits FPS? That will reduce load on the GPU's.
    Vsync is always on. IMO people here should try enabling SGSSAA in their games and see the spike in GPU usage and temperatures for themselves; even the guides I read on how to enable it warned that GPU heat would increase significantly with SGSSAA enabled.
    Last edited by Mungri; 08-30-2013 at 01:10 PM.
