
Thread: DX10 to use software AA resolve

  1. #1
    Banned
    Join Date
    Oct 2006
    Posts
    963

    DX10 to use software AA resolve

    The first instance of this is the Call of Juarez DX10 benchmark. "Buy the GTS," they said; "the HD2900XT is rubbish," they said; "it will never compete with the GTX," they said. I went with ATI to see for myself, and thank God I did. Let the NVIDIA backlash begin...
    And this is what the developers of CoJ have to say regarding the issue at hand:

    In this message we would like to comment on some disputed information that was recently published by NVIDIA relating to the DirectX 10 benchmark mode in Call of Juarez.

    Before the arrival of DirectX 10, previous graphics APIs only allowed automatic Multi-Sample Anti-Aliasing (MSAA) resolves to take place in interactive gaming applications. This automatic process always consisted of a straight averaging operation over the samples of each pixel in order to produce the final, anti-aliased image. While this method was adequate for a majority of graphics engines, the use of advanced High Dynamic Range rendering and other techniques such as deferred rendering or anti-aliased shadow buffers requires programmable control over this operation due to the nature of the mathematical operations involved. That is, the previous approach using a simple average can be shown to be mathematically and visually incorrect (and in fact it produces glaring artefacts on occasion).
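
    A minimal C++ sketch makes the arithmetic concrete (the sample values and the Reinhard-style tone mapper below are illustrative assumptions, not anything from the developers' message). Consider one edge pixel whose four MSAA samples straddle a bright HDR light source:

        // One edge pixel with 4 MSAA samples: one covers a bright HDR light
        // (luminance 64.0), three cover a dim surface (luminance 0.1).
        #include <cstdio>

        // A simple Reinhard-style tone mapper, standing in for a real HDR curve.
        double tonemap(double x) { return x / (1.0 + x); }

        int main() {
            const double samples[4] = {64.0, 0.1, 0.1, 0.1};

            // Fixed-function resolve: average the raw HDR samples, then tone-map.
            double avg = (samples[0] + samples[1] + samples[2] + samples[3]) / 4.0;
            std::printf("average-then-tonemap: %.3f\n", tonemap(avg)); // ~0.941, nearly white

            // Shader-assisted resolve: tone-map each sample, then average.
            double out = 0.0;
            for (double s : samples) out += tonemap(s);
            std::printf("tonemap-then-average: %.3f\n", out / 4.0);   // ~0.314, a plausible blend
            return 0;
        }

    The first result is why a straight average produces glaring fringes along bright edges: the one hot sample dominates the pixel. The second weights each sample by what it will actually look like on screen.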

    All DirectX 10 graphics hardware which supports MSAA is required to expose a feature called 'shader-assisted MSAA resolves' whereby a pixel shader can be used to access all of the individual samples for every pixel. This allows the graphics engine to introduce a higher quality custom MSAA resolve operation. The DirectX 10 version of 'Call of Juarez' leverages this feature to apply HDR-correct MSAA to its final render, resulting in consistently better anti-aliasing for the whole scene regardless of the wide variations in intensity present in HDR scenes. Microsoft added the feature to DirectX 10 at the request of both hardware vendors and games developers specifically so that we could raise final image quality in this kind of way, and we are proud of the uncompromising approach that we have taken to image quality in the latest version of our game.
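
    For the curious, here is a minimal C++ sketch of the D3D10 plumbing behind such a shader-assisted resolve (the function name and the HDR format are assumptions for illustration; this is not taken from the CoJ engine, and error handling is omitted):

        // Instead of calling ID3D10Device::ResolveSubresource(), the engine binds
        // the multisampled render target as a shader resource so a pixel shader
        // can read every sample itself. The target must have been created with
        // D3D10_BIND_SHADER_RESOURCE for this to work.
        #include <d3d10.h>

        ID3D10ShaderResourceView* MakeMSAAView(ID3D10Device* dev, ID3D10Texture2D* msaaTarget)
        {
            D3D10_SHADER_RESOURCE_VIEW_DESC desc = {};
            desc.Format        = DXGI_FORMAT_R16G16B16A16_FLOAT;  // example HDR format
            desc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS; // per-sample access

            ID3D10ShaderResourceView* view = nullptr;
            dev->CreateShaderResourceView(msaaTarget, &desc, &view);
            return view;
            // The pixel shader then declares Texture2DMS<float4> and calls
            // Load(pixelCoord, sampleIndex) for each sample, applying whatever
            // HDR-correct weighting it wants before averaging.
        }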

    "ExtraQuality" is a visual quality setting enabled by default in the DX10 version of Call of Juarez. In benchmark mode, "ExtraQuality" mode does two things. First, it increases shadow generation distance in order to apply shadowing onto a wider range of pixels on the screen, resulting in better quality throughout the benchmark run. Second, it increases the number of particles rendered with the geometry shader in order to produce more realistic-looking results, like for example waterfall, smoke and falling leaves. The attached screenshot illustrates those differences when ExtraQuality is disabled. ExtraQuality is designed as a default setting to reflect the visual improvements made possible by DX10 cards and is not meant to be disabled in any way.

    All updates to shaders made in the final version of the Call of Juarez benchmark were made to improve performance, visual quality or both, for example to allow anisotropic texture filtering on more surfaces than before. This includes the use of more complex shading for a wider range of materials. At the same time, we implemented shader code to reduce the cost of the more expensive computations associated with distant pixels. Some materials were also tweaked in minor ways to improve overall image quality. One of the key strengths of NVIDIA's hardware is its ability to perform anisotropic filtering at high performance, so we are puzzled that NVIDIA complains about this change when in effect it plays to their strengths.

    Default settings were chosen to provide an overall good user experience. Users are encouraged to modify the settings in the CoJDX10Launcher as required. Using larger shadow maps is one option that we would encourage users to experiment with, and in our experience changing this setting does not affect NVIDIA's comparative benchmark scores greatly.

    We are disappointed that NVIDIA have seen fit to attack our benchmark in any way. We are proud of the game that we have created, and we feel that NVIDIA can also be proud of the hardware that they have created. Nonetheless, these artistic decisions about a game's settings rightly belong in the hands of the game's developer, not the hardware manufacturer.

    Thank you and don't hesitate to contact us if you have any questions.

  2. #2
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    I believe that's DX10.1, from what I've read around the net. My only concern is whether it will be widely adopted throughout the PC gaming community. Using software AA resolve sounds good on paper, and it makes sense. However, the practicality, the performance increase over traditional methods, and the adoption of it all are really a question mark right now. Do you have a link?
    Last edited by Eastcoasthandle; 06-27-2007 at 03:43 PM.

  3. #3
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Um, in other words, in case you missed it: MSAA does not work properly with HDR and anti-aliased shadowing.

    In other words, G80 is not really a DX10 chip. Sure, it has most of the functionality, but just barely, and it's slow as molasses. Call of Juarez merely warns of what's to come.

    Let the flames begin.

    Anyway, the whole point of DX10 is to break the programming limits imposed by DX9 and earlier versions of DX. This includes features which allow for proper "deferred rendering", which is applying lighting as a final step (HDR or otherwise). Deferred rendering also allows for geometry deformation and lots of other nice visual goodies that just aren't possible in DX9. This is one step closer to real-time raytracing, boys and girls...
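
    For anyone who hasn't run into the term, here is a rough C++ sketch of the idea; the layout and sizes are illustrative assumptions, not from any particular engine:

        // Deferred rendering, pass 1: rasterize geometry and write surface
        // attributes into a "G-buffer". Pass 2: compute lighting per pixel by
        // reading the G-buffer back, so lighting cost no longer scales with
        // scene geometry.
        struct GBufferTexel {
            float albedo[3];   // surface colour
            float normal[3];   // world-space normal
            float depth;       // for reconstructing world position
            float specular;    // material shininess
        };
        // At 1920x1200 that is 1920 * 1200 * sizeof(GBufferTexel) = ~74 MB
        // written and then re-read every frame before any lighting happens.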

    However, in order to use deferred rendering, you need TONS of bandwidth.

    Wait. R600 has that. G80 does not.

    So, in case you missed it, R600 is much better for high-quality, photo-realistic images...G80 can do it too, but not very well.

    And if you don't get it, check out what deferred rendering really is. It's not new; it's actually quite old (Kyro II, anyone?). You'll find some big names asking for functionality like this years ago (Tim Sweeney of Epic/Unreal)...
    Last edited by cadaveca; 06-27-2007 at 03:48 PM.

  4. #4
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Not all DX10 titles are going to use software AA resolve, just in case you didn't know. CoH's DX10 patch uses hardware AA, as does Lost Planet.

    This is like the 8th time you've brought this benchmark up, purecain. We've all read the testimony (and, in a way, NVIDIA's statement rings true, because ONLY the benchmark performance changed in the latest version; the game's performance is higher than the benchmark shows). Fact is, until a built-from-the-ground-up DX10 title shows up, you can't be certain, but thus far 2 out of 3 DX10 titles didn't force software AA resolves.

    DX10.1 will likely be when developers really start forcing it, thanks to DX10.1's enhancements to shader-based AA.

    As for deferred rendering, it can be done just fine on the G80; just check out RoboBlitz.
    Last edited by DilTech; 06-27-2007 at 03:56 PM.

  5. #5
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    They are not "pure" DX10 titles though, are they? They are DX9 titles with a couple of added DX10 features. Very few shaders push even DX9 in either of those apps.


    C'mon now, DilTech, DX10 isn't here yet, at all. Sure, we've got feature-specific apps that show one or two new things, but not a single release makes use of the geometry shader for any real physics effect. And no, the Lost Planet "snow" is not anything DX9 could not have done.


    DX10 isn't just about better performance. Anyone who says so clearly isn't writing DX10 code.

  6. #6
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by cadaveca View Post
    They are not "pure" DX10 titles though, are they? They are DX9 titles with a couple of added DX10 features. Very few shaders push even DX9 in either of those apps.


    C'mon now, DilTech, DX10 isn't here yet, at all. Sure, we've got feature-specific apps that show one or two new things, but not a single release makes use of the geometry shader for any real physics effect. And no, the Lost Planet "snow" is not anything DX9 could not have done.


    DX10 isn't just about better performance. Anyone who says so clearly isn't writing DX10 code.
    I know DX10 isn't quite here yet; that's why I specifically stated "Fact is, until a built-from-the-ground-up DX10 title shows up, you can't be certain".

    DX10 is mainly about performance, but it also added a few things. Deferred rendering isn't one of them, though, as it can be done in DX9.

    I just find it funny that pure is using one of the three "DX10" titles as the basis of his claim that DX10 will force software AA resolves when the other two current ones do not. That's all.

  7. #7
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Ireland
    Posts
    940
    Quote Originally Posted by DilTech View Post
    I just find it funny that pure is using one of the three "DX10" titles as the basis of his claim that DX10 will force software AA resolves when the other two current ones do not. That's all.
    And if they do, I say you eat your hat and an 8800 GTX.

  8. #8
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Some will, some won't. It all comes down to whose hardware a title is optimized for, which is what I've been saying for a while now.

    The ones that are going to matter this year are Hellgate: London, Crysis, BioShock, and UT3.

    Those will be the ones to watch the results from, especially the UE3 engine, because that's going to decide a lot then and there; as we all know, it's to be used on a lot of titles. I will note, though, that AA works in R6:V on the 8800s but not on the HD2900XT, although that's a DX9 title. Of course, Epic are members of the TWIMTBP group, so I have a strong feeling about where that's going to end up.

    As for eating a GTX... why? Even with shader AA the GTX still keeps up (in fact, the GTX's low FPS was higher than the HD2900's), and when hardware AA is used the GTX wins by much larger margins. Again, though, the GTX and HD2900XT aren't meant to compete.
    Last edited by DilTech; 06-27-2007 at 04:17 PM.

  9. #9
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Ireland
    Posts
    940
    Quote Originally Posted by DilTech View Post
    Again, though, the GTX and HD2900XT aren't meant to compete.
    QFT

    Pardon me, I meant GTS. It's late out here; I think I'll head to bed. Anyway, I agree: Crysis and UT3 will be the important ones. However, with what Crysis is offering, both cards may need AA switched off to get playable frame rates because of the huge draw distance and buckets of HDR. But I may be wrong; who knows, let's just wait and see.

  10. #10
    Banned
    Join Date
    Oct 2006
    Posts
    963
    Hardware resolve doesn't offer the quality that shader-based AA does, and it doesn't look realistic in DX10. I've spent the last 6 hours (well, all day really) reading up on this, and it's quite safe to say that DX10 will use shader-based AA as the standard... yipee! ATI said they had worked very closely with Microsoft in the development stages, and it looks like that has paid off... Dil, please don't try to say other devs are going to use the poorer, less programmable method (hardware-based AA resolve), because it isn't logical that they would choose to make their game look worse than it could. Like I said, I've been reading up all day, as I didn't want to post this info only to make myself look stupid. I couldn't believe it myself at first, but the more you dig, the more it is confirmed and backed up.
    All those people who have gone out and bought GTSs and GTXs will be kicking themselves. When I tried to point out the relationship ATI had stated they had in the development of DX10, I got flamed and in the end banned for not accepting NVIDIA propaganda. There is a good chance devs might give you the choice of method, but if they do, you will lose many features that make DX10 look so good.
    Last edited by purecain; 06-29-2007 at 05:48 PM.

  11. #11
    I am Xtreme
    Join Date
    Oct 2005
    Location
    Grande Prairie, AB, CAN
    Posts
    6,140
    Quote Originally Posted by purecain View Post
    All those people who have gone out and bought GTSs and GTXs will be kicking themselves. When I tried to point out the relationship ATI had stated they had in the development of DX10, I got flamed and in the end banned for not accepting NVIDIA propaganda. There is a good chance devs might give you the choice of method, but if they do, you will lose many features that make DX10 look so good.
    A.T.I. A.T.I. A.T.I. A.T.I. A.M.D. A.M.D. A.M.D. A.M.D. A.T.I. A.T.I. A.T.I. A.T.I. A.M.D. A.M.D. LOL COME ON SHOVE IT UP EM.....
    Gimme a break. We've been playing games for 6 months already on nice detail. How long have you had your 2900s? A month. The majority of the people here buy new video cards every 6-10 months anyway, so I highly doubt anyone who bought an 8800 GTS/GTX is regretting their purchase.

    By the time DX10.1 and/or real DX10 games come around, NVIDIA will have newer cards out anyway.

  12. #12
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by purecain View Post
    Hardware resolve doesn't offer the quality that shader-based AA does, and it doesn't look realistic in DX10. I've spent the last 6 hours (well, all day really) reading up on this, and it's quite safe to say that DX10 will use shader-based AA as the standard... yipee! ATI said they had worked very closely with Microsoft in the development stages, and it looks like that has paid off... Dil, please don't try to say other devs are going to use the poorer, less programmable method (hardware-based AA resolve), because it isn't logical that they would choose to make their game look worse than it could. Like I said, I've been reading up all day, as I didn't want to post this info only to make myself look stupid. I couldn't believe it myself at first, but the more you dig, the more it is confirmed and backed up.
    All those people who have gone out and bought GTSs and GTXs will be kicking themselves. When I tried to point out the relationship ATI had stated they had in the development of DX10, I got flamed and in the end banned for not accepting NVIDIA propaganda. There is a good chance devs might give you the choice of method, but if they do, you will lose many features that make DX10 look so good.
    A.T.I. A.T.I. A.T.I. A.T.I. A.M.D. A.M.D. A.M.D. A.M.D. A.T.I. A.T.I. A.T.I. A.T.I. A.M.D. A.M.D. LOL COME ON SHOVE IT UP EM.....
    You might want to start by talking sense and not the gibberish that got you banned from EOCF, as it can get you the same treatment here. That's what you were banned for: not for "not accepting NVIDIA propaganda", but for posting gibberish like what you ended this post with.

    There's nothing illogical about using hardware AA. Why? Performance. Presently, performance is much worse using software AA instead of hardware AA, and that's a fact... It's the same reason the effects found in the DX9 path of Crysis haven't been used before, and why developers waited so long to put motion blur and the like to use: until recently the performance to run these effects wasn't there. A shining example of companies ignoring better techniques is how long it took before companies besides id Software started using unified lighting and shadows... We STILL have new games coming out using static lighting! Why? Performance.

    You may have spent "all day" reading, but you're completely ignoring the fact that 2 of the 3 companies to produce something with DX10 have used hardware AA instead of software AA. You're also ignoring that if performance isn't up to par, it's not logical to use a technique that isn't fast enough under the current API. That's why DX10.1 adds so many enhancements for shader AA: because it's too slow in DX10.

    I know you're all excited by one company's words, but in reality, have you seen the difference between shader AA and hardware AA? If not, how do you know the difference in image quality? How can you take one company's word (a company which, by the way, partnered with ATI for the HD2900XT launch, which is why that engine was used to preview the cards) without seeing physical evidence, when thus far no other company has followed suit?

    See where I'm going with this?
    Last edited by DilTech; 06-27-2007 at 06:47 PM.

  13. #13
    Xtreme Mentor dengyong's Avatar
    Join Date
    Nov 2006
    Location
    A great place again
    Posts
    2,589
    Well said DilTech.

  14. #14
    Xtreme Addict
    Join Date
    Feb 2007
    Location
    Arizona, USA
    Posts
    1,700
    I have a feeling that our perspective on the currently available DX10 cards will change once they release the DX10.1 patch.
    I am thinking that ATI will win the war with this patch. Do not flame me; this is the conclusion I came to after reading what was said in purecain's first post and DilTech's latest post.



  15. #15
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by ColonelCain View Post
    I have a feeling that our perspective on the currently available DX10 cards will change once they release the DX10.1 patch.
    I am thinking that ATI will win the war with this patch. Do not flame me; this is the conclusion I came to after reading what was said in purecain's first post and DilTech's latest post.
    The DX10.1 update won't change a single thing for current cards.

    To be DX10.1 compliant, you must have every feature of DX10.1. If you lack any of the features, DX tells the application that you can't run any of the features found in DX10.1 (it does the same in DX10). This is what Microsoft meant when they said you either have it all or you have nothing. Current cards will still carry the huge performance hit already incurred for shader AA.
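
    As a sketch of how that all-or-nothing rule surfaces to an application, using the D3D10.1 device-creation API as it eventually shipped (the helper name is assumed and error handling is trimmed):

        // If the hardware is missing any DX10.1 feature, creating a 10.1 device
        // simply fails; there is no partial 10.1. The application then falls
        // back to a plain 10.0 device through the same interface.
        #include <d3d10_1.h>

        ID3D10Device1* CreateBestDevice(IDXGIAdapter* adapter)
        {
            ID3D10Device1* dev = nullptr;
            if (SUCCEEDED(D3D10CreateDevice1(adapter, D3D10_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                             D3D10_FEATURE_LEVEL_10_1, D3D10_1_SDK_VERSION, &dev)))
                return dev;  // a true 10.1 part: every 10.1 feature is present

            D3D10CreateDevice1(adapter, D3D10_DRIVER_TYPE_HARDWARE, nullptr, 0,
                               D3D10_FEATURE_LEVEL_10_0, D3D10_1_SDK_VERSION, &dev);
            return dev;      // may still be null if even 10.0 is unsupported
        }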

    In other words, DX10.1 isn't going to help anything for the R600.

  16. #16
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    BTW, purecain...

    The 8800GTX actually takes a SMALLER performance hit from turning on shader AA in CoJ than the HD2900XT... So, technically, that'd mean the 8800 is better at shader AA than the HD2900XT, right?

    Take a look yourself.
    http://www.extremetech.com/article2/...2147119,00.asp

    The minimum framerate only drops by 3, and the average by ~4, on the 8800GTX. On the HD2900XT the drop is 10 at 1280x720 and 6 at 1920x1200. Compare that to the GTS's 4-frame drop at 1280x720 for enabling AA, and its 5-frame drop at 1920x1200.

    So, technically, the 8800's performance hit was smaller using a technique that the HD2900 was "designed and optimized" for.
    Last edited by DilTech; 06-28-2007 at 07:07 AM.

  17. #17
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Then, DilTech, why are they (nV) complaining?


    And again, I really wonder about you "working in 3D" if you think DX10 brings only performance improvements. With that attitude, nothing groundbreaking will be coming from you!

    Oh, and DilTech, can you explain what happens differently with HDR shadows when MSAA is used vs. "software AA"? And why does it come at a penalty?

    You realize, of course, that the R600 is far more CPU-limited than the G80, right? Why is THAT?

  18. #18
    Xtreme Cruncher
    Join Date
    Jul 2006
    Posts
    1,374
    Just to let you know, I imagine the game companies are aware of the hardware that they have to work with. If you think they are going to release a game that no hardware can run, you are mistaken. Also, um, I wonder which video cards are being used to test these games? *winks*

  19. #19
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by cadaveca View Post
    Then, DilTech, why are they (nV) complaining?


    And again, I really wonder about you "working in 3D" if you think DX10 brings only performance improvements. With that attitude, nothing groundbreaking will be coming from you!

    Oh, and DilTech, can you explain what happens differently with HDR shadows when MSAA is used vs. "software AA"? And why does it come at a penalty?

    You realize, of course, that the R600 is far more CPU-limited than the G80, right? Why is THAT?
    NVIDIA complained because they switched AA methods when, visually, frame for frame, there was no difference in quality between the two versions. The only thing it did was cause a larger hit for NVIDIA than they previously had using hardware AA resolves (while ATI stayed at the same numbers, since they already used shader AA), which makes me wonder just how small the hit was with hardware resolves on the 8800s.

    Also, I didn't say DX10 only brought performance enhancements. I said mainly, which is true. Perhaps read my posts before replying to them in a disrespectful manner?

    Your question about HDR shadows: I'm not quite sure exactly what you're trying to ask, as "software AA" is still multi-sampling AA; it's just using a different method for resolving the samples. It comes at a higher penalty because it uses shader power and bandwidth to perform the resolve, compared to having dedicated hardware for the operation and just needing the bandwidth.
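
    Some back-of-the-envelope numbers for that cost (the settings, 1920x1200 with 4x MSAA and an FP16 RGBA target, are assumptions for illustration, not measured figures):

        // Rough read traffic for resolving an HDR MSAA target in a shader.
        #include <cstdio>

        int main() {
            const long long w = 1920, h = 1200;
            const long long samples = 4;        // 4x MSAA
            const long long bytesPerSample = 8; // RGBA16F = 4 channels * 16 bits

            long long perResolve = w * h * samples * bytesPerSample;
            std::printf("bytes read per resolve: %lld (~%.1f MB)\n",
                        perResolve, perResolve / 1e6);       // ~73.7 MB
            std::printf("at 60 fps: ~%.1f GB/s of reads\n",
                        perResolve * 60 / 1e9);              // ~4.4 GB/s
            return 0;
        }

    The raw reads are the same either way; the difference is that the shader version spends ALU cycles and general-purpose bandwidth that the rest of the frame is competing for, while a dedicated resolve unit does the work on the side.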

    As for the HD2900XT being more CPU-limited, this is something I've been curious about myself. I'm thinking perhaps it's because the driver has more to do, since it's trying to keep all the shaders busy and telling them how to handle the AA resolves as well. This is just a guess, though; there's no way to really know without asking ATI themselves, but I'm thinking we'll see that lessen as time goes on. The odd thing is, CF is less CPU-limited than SLI, according to Kinc. Weird how that works.

  20. #20
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    DilTech, first and foremost, you've got to know by now... ignore my "attitude". That's just me; I'm a jerk sometimes without ever meaning to be. We've done this how many times?


    DilTech... MSAA and shader AA resolve very differently, yes. It's this resolve that makes shader AA so bandwidth-starved, but it also allows for properly AA'd shadows with less of a hit, which MSAA does not. In other words, you apply HDR and then MSAA, but with shader AA you can apply AA and then HDR, allowing for shadows that DON'T need AA (I'm ignoring bloom; bloom is not HDR in my books, it's just overdraw). It also allows for soft shadowing and AA together, which MSAA does not, and this is why Techland chose that rendering method for the demo but, because of the performance hit, not for the game.

    I realize you did not say ONLY a performance increase, but your tack implies that this is all we need concern ourselves with, when really the hardware brings the performance (without restrictions on functionality), and DX imposes no programming limitations on the details you choose to use for your app, like soft shadowing and AA, or maybe you want to toss deferred rendering and MegaTextures in there as well.

    In DX9 you'd have to choose which ones you want, as the API only allows for so much precision. Not so with DX10... you can force that data through, fast or not, with no precision limits.


    Now, DilTech, I'm trying to push the hardware out of this discussion. Really, before we can judge what the hardware is doing, we need to know what the software is doing...


    Funny, though... I keep going back to deferred rendering...
    Last edited by cadaveca; 06-28-2007 at 07:56 AM.

  21. #21
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by cadaveca View Post
    DilTech, first and foremost, you've got to know by now... ignore my "attitude". That's just me; I'm a jerk sometimes without ever meaning to be. We've done this how many times?


    DilTech... MSAA and shader AA resolve very differently, yes. It's this resolve that makes shader AA so bandwidth-starved, but it also allows for properly AA'd shadows with less of a hit, which MSAA does not. In other words, you apply HDR and then MSAA, but with shader AA you can apply AA and then HDR, allowing for shadows that DON'T need AA (I'm ignoring bloom; bloom is not HDR in my books, it's just overdraw). It also allows for soft shadowing and AA together, which MSAA does not, and this is why Techland chose that rendering method for the demo but, because of the performance hit, not for the game.
    I think you're still getting it confused... MSAA is multi-sampling AA. Whether the resolves are done by dedicated hardware or by shaders, it's still multi-sampling AA. Thus, the comparison is hardware AA resolve vs. shader AA resolve.

    As for soft-shadows and hardware AA, it's possible. Oblivion does it. I'd know, I play it all the time.

    Finally, the shader AA found in DX10 still doesn't give the developer full control over what does and doesn't get AA. That's being added in DX10.1, and it's going to be a huge addition, because it'll allow the developer to decide which areas need higher levels of AA and which parts of an image don't need AA at all.

    I realize you did not say ONLY a performance increase, but your tack implies that this is all we need concern ourselves with, when really the hardware brings the performance, and DX imposes no limitations on the details you choose to use for your app, like soft shadowing and AA, or maybe you want to toss deferred rendering and MegaTextures in there as well.
    MegaTextures isn't part of DX. In fact, it's presently only being used in an OpenGL engine.

    As for what I imply: I'm not meaning to imply anything. Making an API more efficient can bring performance gains as well; it's not all in the hardware.

    In DX9 you'd have to choose which ones you want, as the API only allows for so much precision. Not so with DX10... you can force that data through, fast or not, with no precision limits.

    Now, DilTech, I'm trying to push the hardware out of this discussion. Really, before we can judge what the hardware is doing, we need to know what the software is doing...


    Funny, though... I keep going back to deferred rendering...
    DX10 adds a few nice things; DX10.1 will be about additional performance with a few more features. Now we just have to wait for developers to get efficient with DX10 before we see what it can truly do.

    I too find it funny that you keep going back to deferred rendering, considering it can be done in DX9 as well.

  22. #22
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Quote Originally Posted by DilTech View Post
    MegaTextures isn't part of DX. In fact, it's presently only being used in an OpenGL engine.
    Um, do you want to retract this comment, or shall I correct you?

  23. #23
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by cadaveca View Post
    Um, do you want to retract this comment, or shall I correct you?
    No, it stands...
    http://en.wikipedia.org/wiki/MegaTexture

    MegaTexture is John Carmack's own technique, first introduced in Quake Wars. No other game currently employs the technique, and the only engine which supports it at present is the Doom 3 engine (known internally at id as ID engine 4), though ID engine 5 will also use it to a much greater degree. I don't think I have to tell you that the Doom 3 engine is OpenGL, do I?

    If you'd like to attempt to correct me, feel free, but I promise you it won't end the way you think it will. You aren't, by chance, thinking of another technique, are you?
    Last edited by DilTech; 06-28-2007 at 02:47 PM.

  24. #24
    Registered User
    Join Date
    Dec 2006
    Posts
    22
    Quote Originally Posted by purecain View Post
    Hardware resolve doesn't offer the quality that shader-based AA does, and it doesn't look realistic in DX10. I've spent the last 6 hours (well, all day really) reading up on this, and it's quite safe to say that DX10 will use shader-based AA as the standard... yipee! ATI said they had worked very closely with Microsoft in the development stages, and it looks like that has paid off... Dil, please don't try to say other devs are going to use the poorer, less programmable method (hardware-based AA resolve), because it isn't logical that they would choose to make their game look worse than it could. Like I said, I've been reading up all day, as I didn't want to post this info only to make myself look stupid. I couldn't believe it myself at first, but the more you dig, the more it is confirmed and backed up.
    All those people who have gone out and bought GTSs and GTXs will be kicking themselves. When I tried to point out the relationship ATI had stated they had in the development of DX10, I got flamed and in the end banned for not accepting NVIDIA propaganda. There is a good chance devs might give you the choice of method, but if they do, you will lose many features that make DX10 look so good.
    A.T.I. A.T.I. A.T.I. A.T.I. A.M.D. A.M.D. A.M.D. A.M.D. A.T.I. A.T.I. A.T.I. A.T.I. A.M.D. A.M.D. LOL COME ON SHOVE IT UP EM.....

    Please grow up.

    Why would you want to spend $500+ on a graphics hardware device to use SOFTWARE anti-aliasing? Just because your $600 ATI card isn't good enough to do hardware AA.....

    And before you call me a fanboy, I've owned:

    an AMD Athlon rig with an NVIDIA GeForce4 Ti 4600 for 5 years,

    an Intel C2D with an ATI X1950XT for half a year,

    and now the same rig with an NVIDIA 8800 GTX, and I can frankly say NVIDIA's are much better.

  25. #25
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Porsche911r101 View Post
    Please grow up.

    Why would you want to spend $500+ on a graphics hardware device to use SOFTWARE anti-aliasing? Just because your $600 ATI card isn't good enough to do hardware AA.....

    And before you call me a fanboy, I've owned:

    an AMD Athlon rig with an NVIDIA GeForce4 Ti 4600 for 5 years,

    an Intel C2D with an ATI X1950XT for half a year,

    and now the same rig with an NVIDIA 8800 GTX, and I can frankly say NVIDIA's are much better.
    Two wrongs don't make a right: he posted banter, and you flamed him back.

    Can we please keep this all on topic?
