Page 2 of 5 (Results 26 to 50 of 111)

Thread: dx10 to use software aa resolve

  1. #26
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Ireland
    Posts
    940
    Originally Posted by Porsche911r101 View Post
    Please grow up.

    Why would you want to spend $500+ on a graphics hardware device to use SOFTWARE antialiasing? Just because your $600 ati card isn't good enough to do hardware AA.....

    And before you call me a fanboy, I've owned:

    an amd athlon rig with an nvidia geforce 4600ti for 5 years,

    an intel c2d with an ati x1950xt for half a year,

    and now the same rig with an nvidia 8800 gtx, and I can frankly say nvidia cards are much better
    what planet are you on plz? cause it deffo aint earth. now i aint flaming, but you ARE biased....it's not flame, it's a fact..

    http://www.newegg.com/Product/Produc...82E16814131054

    i live in europe, and i did 10 seconds of googling to get a US price...409 dollars isn't 500+, and it isn't a 600$ gfx card, it's half that....now before you come on here saying you're so diversified, please get your facts straight and stop spraying b$ all around the place

    you've owned an ati card,
    You OWN an 8800,
    you DO NOT own a 2900, so don't jump on here and spread your allegations, cause if you turn aa on, this card doesn't spontaneously combust...sure it takes a hit, all the cards do...i'm not going to argue about software shaders or hardware shaders cause tbqfh I don't care....ati took a gamble, and i'm going with them. if it goes wrong, i'll buy an nvidia card that does the job. right now, at 320 euros, it's the price i can pay for what i'm getting......

    plz learn to google
    plz be nicer :P, it makes xs a better place to be for everybody

    //back on topic.....

    software AA vs hardware AA.....gosh....i just don't care
    Last edited by Papu; 06-28-2007 at 05:20 PM.

  2. #27
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Quote Originally Posted by DilTech View Post
    No, it stands...
    http://en.wikipedia.org/wiki/MegaTexture

    Megatexture is John Carmack's own technique, first introduced in Quake Wars. No other game currently employs this technique, and the only engine which presently supports it is the Doom 3 engine (known internally at id as id engine 4), though id engine 5 will also use it, to a much greater degree. I don't think I have to tell you that the Doom 3 engine is OpenGL, do I?

    If you'd like to attempt to correct me, feel free, but I promise you it won't end the way you think it will. You aren't by chance thinking of another technique, are you?

    Mr. Carmack said that graphics card drivers have been a big headache for him, and that it became more complicated to determine the real performance of an application because of the multiple “layers of abstraction on the PC”. The lead programmer of id Software called the Xbox 360's more direct approach “refreshing” and even praised Microsoft's development environment “as easily the best of any of the consoles, thanks to the company's background as a software provider”. Nevertheless, Mr. Carmack criticized the decision of Microsoft and Sony to use PowerPC-derivative processors in the next-generation Xbox 360 and PlayStation 3 game consoles.

    John Carmack was at first a very sturdy ally of OpenGL and a critic of Microsoft's DirectX. Even with his most recent title, Doom III, he continued to claim that Microsoft's API did not provide him the feature set OpenGL did, even though the engine was in the end ported to Microsoft's Xbox console. It seems the situation has now changed, and Microsoft's DirectX-like Xbox 360 environment is able to provide Carmack what he wanted.
    -August 2005

    http://www.xbitlabs.com/news/multime...816034824.html


    Notice, Quakewars runs on XBOX360...Xenos does OpenGL?

    One of this year's most eagerly anticipated PC games, Enemy Territory: Quake Wars, has been confirmed for Xbox 360 and PS3.

    Development of the console versions has been out-sourced, Nerve Software handling the 360 version while Z-Axis is looking after the game on PS3. Quake Wars on PC is being developed by Splash Damage.
    http://www.computerandvideogames.com....php?id=157971
    Last edited by cadaveca; 06-28-2007 at 07:18 PM.

  3. #28
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by cadaveca View Post
    -August 2005

    http://www.xbitlabs.com/news/multime...816034824.html

    Notice, Quakewars runs on XBOX360...Xenos does OpenGL?

    http://www.computerandvideogames.com....php?id=157971
    Just to letcha know, I warned you that this wasn't a discussion you wanted to have with me, and I told you this wasn't going to go the way you had hoped it would.

    http://www.idsoftware.com/business/idtech4/

    Quote Originally Posted by idsoftware
    Platforms supported by id Tech 4 include, at minimum, PC, Mac, Xbox 360 and PS3.
    Last time I checked, Mac didn't support DirectX, and neither does the PS3. Technically, even the xbox360 isn't running DirectX, although its specs put it in between DX9-compliant and DX10-compliant. Also, technically, the xbox360 can run OpenGL; it did it with quake 4!!!

    Therefore, yes, I'm telling you that the xbox360 can run OpenGL, blowing your entire argument clear out the window.

    Dilly 2, Cadaveca 0.

    Want to see me go 3 for 3, or are you going to get back to the topic? This has nothing to do with Shader AA resolves, or even DX10.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  4. #29
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Um, think what you want, but you'll never sway my belief in the truth with your words.


    Anyway, it does relate. Work in 3D my butt!

    Um, notice quake4 ran poorly, and development was started back when the first link I posted took place (summer 2005). You see, I pay attention to 3D.

    Now, megatexturing runs on a variety of platforms, and it's not really API-specific, because it doesn't simply deal with large texture overlays; really, it deals with virtual video buffers, taking those 4GB map textures and leaving an 8MB virtualized working set resident in local vidram.
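    To make that concrete, here is a minimal CPU-side sketch of the page cache such a scheme needs. All names and sizes are illustrative assumptions (128 pages of 64 KB, picked to match the 8MB resident figure above), not id's actual code:

    Code:
    // Minimal sketch of a virtual-texture page cache: a huge source texture
    // on disk, with only a small LRU set of pages resident in video memory.
    #include <cstdint>
    #include <cstdio>
    #include <list>
    #include <unordered_map>

    constexpr int PAGE_TEXELS = 128;                           // 128x128 texels per page
    constexpr int PAGE_BYTES  = PAGE_TEXELS * PAGE_TEXELS * 4; // RGBA8 -> 64 KB
    constexpr int CACHE_PAGES = 128;                           // 128 * 64 KB = 8 MB resident

    struct PageCache {
        std::list<uint64_t> lru;  // most recently used page key at the front
        std::unordered_map<uint64_t, std::list<uint64_t>::iterator> slots;

        // Touch a virtual page; on a miss, evict the oldest page and "stream"
        // the new one in (a real engine would async-read from the 4GB source).
        bool request(uint32_t pageX, uint32_t pageY) {
            uint64_t key = (uint64_t(pageX) << 32) | pageY;
            auto it = slots.find(key);
            if (it != slots.end()) {                 // hit: refresh LRU order
                lru.splice(lru.begin(), lru, it->second);
                return true;
            }
            if ((int)lru.size() == CACHE_PAGES) {    // full: evict oldest page
                slots.erase(lru.back());
                lru.pop_back();
            }
            lru.push_front(key);
            slots[key] = lru.begin();
            return false;                            // page had to be streamed
        }
    };

    int main() {
        PageCache cache;
        int misses = 0;
        for (int y = 0; y < 64; ++y)     // walk a 64x64-page region of the map
            for (int x = 0; x < 64; ++x)
                misses += !cache.request(x, y);
        std::printf("%d misses, %d KB streamed, %d KB resident\n",
                    misses, misses * (PAGE_BYTES / 1024),
                    CACHE_PAGES * (PAGE_BYTES / 1024));
    }

    On a miss, a real engine would kick off an asynchronous read from the multi-gigabyte source texture and fall back to a lower-detail page in the meantime.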


    Now, video memory virtualization is one of the major advancements brought by DX10, and it's one of the key components that makes deferred rendering possible in DX10 when it isn't in DX9, due to the performance hit.

    Please be advised, though, that this performance has nothing to do with a gpu's rendering power, and everything to do with the bandwidth of its framebuffer and the bus the card sits on. We wouldn't need Megatexturing, Deferred Rendering, OR Video Virtual Memory if we had a bus like Xenos's proprietary link to Xenon.

    By the way, those three "technologies" are all things that R600 excels at and G80 does not, and this is why we currently see a very large performance gap between G80 and R600 in DX10 applications. BTW, Megatexturing has A LOT in common with Tessellation.

    Anyway, tessellation is a lot like ShaderAA, but in the opposite direction. ShaderAA simply draws at a higher resolution to provide the many samples needed for MSAA (rather than having dedicated hardware interpolate it), and tessellation is just like Megatexturing, with a wee bit of deferred rendering tossed in.
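    The resolve step itself is tiny in code terms. A minimal CPU sketch of a 4x box resolve, purely illustrative of what a pixel shader does in place of the fixed-function hardware:

    Code:
    // Shader-style 4x MSAA box resolve: the "software AA resolve" is just this
    // loop running on the GPU's ALUs instead of in fixed-function hardware.
    #include <array>
    #include <cstdio>

    struct RGB { float r, g, b; };

    // One output pixel from its 4 sub-pixel samples (equal weights = box filter).
    RGB resolve_box(const std::array<RGB, 4>& s) {
        RGB out{0, 0, 0};
        for (const RGB& c : s) {
            out.r += 0.25f * c.r;
            out.g += 0.25f * c.g;
            out.b += 0.25f * c.b;
        }
        return out;
    }

    int main() {
        // An edge pixel: 2 samples hit a white triangle, 2 hit black background.
        std::array<RGB, 4> samples{{{1, 1, 1}, {1, 1, 1}, {0, 0, 0}, {0, 0, 0}}};
        RGB p = resolve_box(samples);
        std::printf("resolved: %.2f %.2f %.2f\n", p.r, p.g, p.b);  // 0.50 grey
    }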

    BTW, DilTech, Quake4 for Xenos is not a "wrapper" port. The problem is that OpenGL at the time did not support unified shaders, so Carmack was left with no choice but to almost re-write the entire engine. But he kinda faked it, and we only get 30FPS on Xenos with idTech4.

    None of these features work well in DX9; to such a degree, in fact, that we might as well call them unsupported, as few applications use them (STALKER uses deferred rendering). In DX10, though, they are more of a "common commodity" among the main 3D engine developers.

    If we look back at titles like FEAR, Quake4, and Unreal2.5, almost 80% of ALU power goes unused, most often due to bandwidth limitations imposed by the gpu's memory buffers. Enter ShaderAA to make use of it, provided the bandwidth is still available. And, fortunately for ATI and their advanced scheduler, in most instances it is, but only in a limited amount (bye-bye 6xMSAA, hello tent and box filters).
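    For reference, the tent filters mentioned here differ from a box resolve only in the weights: samples further from the pixel centre (including samples borrowed from neighbouring pixels) count for less. A sketch with invented sample positions; the real CFAA kernel weights are not documented in this thread:

    Code:
    // Box vs. tent weighting for an AA resolve. A box filter weights every
    // sample inside the pixel equally; a wide tent filter also gathers nearby
    // samples from neighbouring pixels, with weight falling off linearly.
    #include <cstdio>

    // Tent weight: 1 at the pixel centre, 0 at 'radius' pixel-widths away.
    float tent_weight(float dist, float radius) {
        return dist >= radius ? 0.0f : 1.0f - dist / radius;
    }

    int main() {
        // Sample distances from the pixel centre, in pixel widths; the last
        // three would belong to neighbouring pixels under a plain box resolve.
        const float pos[6] = {0.25f, 0.40f, 0.45f, 1.00f, 1.25f, 1.40f};
        float w[6], wsum = 0.0f;
        for (int i = 0; i < 6; ++i) { w[i] = tent_weight(pos[i], 1.5f); wsum += w[i]; }
        for (int i = 0; i < 6; ++i)
            std::printf("sample at %.2f px -> normalized weight %.3f\n",
                        pos[i], w[i] / wsum);
    }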

    This all ties together: when the other "technologies" I mentioned earlier are given the power to run together, thanks to DX10, shaderAA becomes more and more feasible as framebuffer bandwidth requirements go down. Of course, we need a real DX10 app to take advantage of this, and sadly enough, not even 5 xbox360 titles make proper use of Xenos's unified shaders, being designed for other platforms, so it seems a long time coming for anything to hit the PC front in full force!

    But they are coming, soon enough, I suppose.
    Last edited by cadaveca; 06-28-2007 at 08:00 PM.

  5. #30
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by cadaveca View Post
    Um, think what you want, but you'll never sway my belief in the truth with your words.

    [snip]

    But they are coming, soon enough, I suppose.
    Got a link showing that the doom 3 engine was re-written for the xbox360? I want to see that in writing, because everything I've read/been told pretty much points in the direction of it being a straight port using a wrapper. Never have I seen a single article stating the engine was re-worked for the xbox360.

    As for "G80 not being strong" in said techniques, got evidence to back up such a large claim? I don't care about paper numbers, I'm talking actual benchmarks. This round has thus far proven beyond a shadow of a doubt that specs mean nothing. I don't care what the bandwidth numbers are, because NVidia has been showing better performance in titles using bandwidth-heavy techniques (see oblivion) with less bandwidth. Also, the HD2900xt was supposed to be optimized for Shader AA, yet NVidia's parts take a smaller hit using it.

    Finally, about the whole "unified shaders didn't work in opengl" claim, I find that comical, considering I can load up quake 3, which hasn't been patched and doesn't need an update, and run it on unified shaders. A link for that one as well.

    Either way, this is getting far too off topic at this point, you know it and I know it. At this point, this thread will probably be better off if we just agree to disagree and get it done and over with. If you wish to debate this further, we can take it to pm's instead, as honestly this has nothing to do with the thread at hand.
    Last edited by DilTech; 06-28-2007 at 10:36 PM.

  6. #31
    Xtreme Cruncher
    Join Date
    Nov 2006
    Location
    Minnesota
    Posts
    841
    Why take it to PM's? This is probably the best thread to come around in a long time. I'm a fanboy of neither but find the discussion extremely interesting. Pleeeease continue.
    QX 9650 | Maximus II Formula | G.SKILL 4GB (2 x 2GB) PC2 8500 | Sapphire 4850 X2 2GB| Auzen X-Fi Prelude 7.1 | 3x's WD 150GB Raptors & 2x's 1TB Seagate 7200.11 | Silverstone OP1200 | SilverStone TJ07 | Dell 2707WFP

    EK-RES 250 --> Swiftech MCP655 --> Apogee GTZ --> Enzotech Sapphire NB --> Feser X-Changer 360


  7. #32
    Banned
    Join Date
    Oct 2006
    Posts
    963
    my statements are my opinion, and an educated one at that, as i have done my research on the matter. why should i be banned for being happy about a technical matter which has implications in favour of my gpu's architecture? in the first post the dev's statement answers this debate: higher, better quality using shader-based aa resolve. the gtx isn't as good at this, which now puts its performance around the r600 area if not worse. if i had bought a gtx i would be disappointed reading this, but i'm not. i took the gamble and it's paid off....
    no attitude, no flame, just the honest truth.
    ati r600 > gtx in my opinion, like it or not. i'm just going on the non-biased information available.
    let's see if any of the big titles offer both methods, but the quality is always going to be better using the software-based aa resolve. explained in the first post....

  8. #33
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Quote Originally Posted by TouGe View Post
    Why take it to PM's? This is probably the best thread to come around in a long time. I'm a fanboy of neither but find the discussion extremely interesting. Pleeeease continue.

    Because I'm right, and he is not.

    Carmack on Megatexture, May 2006:

    Q12: Would the consoles having less memory than a PC pose a problem for the MegaTexture? Or is something that you guys have already started to work around?

    Answer: If anything, it works out better for the next-generation consoles, because on the PC you could often get away with not doing texture management if you were targeting fairly high end, while on the consoles, you’ve always had to do it. And especially my newer paged virtual texturing which applies to everything instead of just terrain, allows you to have a uniform management of all texture resources there, as well as allowing infinitely sized texture dimensions. So this is actually working out very nicely on the Xbox 360.
    Q17: Is there anything else that you’d like to add?

    Answer: It’s still very exciting the capabilities that are continuously being added to our arsenal here. I am having a really good time working on the Xbox 360 right now, graphic technology-wise. As for the MegaTexture stuff, it is kind of funny that it’s not super demanding of the hardware. As I mentioned, I was kind of surprised that something like this hadn’t been pushed before we got around to it. There are lots more exciting possibilities for the graphics research and we’re still toying around with some fairly fundamental architectural design issues on the Xbox 360.

    And, the PC space is going to be moving even faster than the consoles. The graphics technology is still exciting, and there are still going to be significant things that we can show to people that will make them look at this and say “wow, this is a lot better than the previous generation.” I do think unique texturing is the key for the coming generation.

    There are lots and lots of graphics technologies that we can look at. And maybe you add five or six up and they wind up being something that really gives it a next generation wow. But just by itself, even with no newer presentation technologies, allowing unique texturing on lots and lots of surfaces, I think, is the key enabler for this generation.
    Of course, Carmack was developing Quakewars for 360 at the time...

    Anyway, all these things use virtual video memory, or deferred rendering, or a combination of both.
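    Since deferred rendering keeps coming up, a minimal sketch of its two-pass shape may help (illustrative only, not any particular engine's code): a geometry pass writes surface attributes into a G-buffer, then a lighting pass shades from that buffer, so lighting cost scales with lights and pixels rather than with scene geometry. It is also exactly the setup where a fixed-function MSAA resolve is awkward, since shading happens after rasterization:

    Code:
    // Two-pass deferred rendering sketch: pass 1 fills a G-buffer, pass 2
    // accumulates lighting per pixel per light from the stored attributes.
    #include <cstdio>
    #include <vector>

    struct GBufferTexel { float albedo[3]; float normal[3]; float depth; };
    struct Light { float dir[3]; float color[3]; };

    int main() {
        const int W = 4, H = 4;
        std::vector<GBufferTexel> gbuf(W * H);

        // Pass 1 (geometry): rasterize the scene once, storing surface data.
        for (GBufferTexel& t : gbuf)
            t = {{0.8f, 0.6f, 0.4f}, {0.0f, 0.0f, 1.0f}, 1.0f};

        // Pass 2 (lighting): for each light, read the G-buffer and accumulate.
        std::vector<Light> lights = {{{0, 0, 1}, {1, 1, 1}},
                                     {{1, 0, 0}, {0.2f, 0.2f, 0.5f}}};
        std::vector<float> frame(W * H * 3, 0.0f);
        for (const Light& L : lights)
            for (int i = 0; i < W * H; ++i) {
                const GBufferTexel& t = gbuf[i];
                float ndotl = t.normal[0] * L.dir[0] + t.normal[1] * L.dir[1]
                            + t.normal[2] * L.dir[2];
                if (ndotl < 0.0f) ndotl = 0.0f;       // light facing away
                for (int c = 0; c < 3; ++c)
                    frame[i * 3 + c] += t.albedo[c] * L.color[c] * ndotl;
            }
        std::printf("pixel 0: %.2f %.2f %.2f\n", frame[0], frame[1], frame[2]);
    }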
    Last edited by cadaveca; 06-29-2007 at 08:04 AM.

  9. #34
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by TouGe View Post
    Why take it to PM's? This is probably the best thread to come around in a long time. I'm a fanboy of neither but find the discussion extremely interesting. Pleeeease continue.
    It's not that the discussion isn't interesting; it is. That's why I said if he wished to continue we could take it to PMs. I stated that because it's derailing the topic at hand, that's all.

    Quote Originally Posted by purecain View Post
    my statements are my opinion, and an educated one at that, as i have done my research on the matter. why should i be banned for being happy about a technical matter which has implications in favour of my gpu's architecture? in the first post the dev's statement answers this debate: higher, better quality using shader-based aa resolve. the gtx isn't as good at this, which now puts its performance around the r600 area if not worse. if i had bought a gtx i would be disappointed reading this, but i'm not. i took the gamble and it's paid off....
    no attitude, no flame, just the honest truth.
    ati r600 > gtx in my opinion, like it or not. i'm just going on the non-biased information available.
    let's see if any of the big titles offer both methods, but the quality is always going to be better using the software-based aa resolve. explained in the first post....
    The warning isn't about your opinion, it's about the way you present your opinion. Also, the GTS and GTX took less of a hit when enabling shader aa than the HD2900xt, in case you ignored that part. You can check out the benchmarks yourself: going from no AA to AA enabled, at both 1280x720 and again at 1920x1200, the performance penalty was smaller for the 8800's than for the HD 2900xt.

    Quote Originally Posted by cadaveca View Post
    Because I'm right, and he is not.

    Carmack on Megatexture, May 2006:

    [snip]

    Of course, Carmack was developing Quakewars for 360 at the time...
    Yes, quake wars will be on the 360 as well. It'll also be on the PS3... Of course, Megatexturing is part of id engine 4 and id engine 5, and is a technology patented by John Carmack. For the 360, as far as I can tell, it will be running using a wrapper, as the PC version is DEFINITELY still running on OpenGL, as is the PS3 version.

    By the way, id isn't developing quakewars, Splash Damage is. In fact, neither Splash Damage nor id is handling the xbox360 version. id is currently working on an as-of-yet unannounced game on id engine 5.

    We've established that there's an xbox360 version, but considering the technology is patented by Carmack, no one else can use it. Therefore it's opengl-only as it stands, no? That doesn't mean it's not possible otherwise (Carmack stated he's surprised no one did it after he released doom 3, as he had the idea then but wanted to do unified lights and shadows first); it merely means that no one else CAN do it under any other renderer, because Carmack already owns the technology and it's used on an opengl engine.

    The technique other people are using, which is similar but doesn't use such massive textures, is Texture Streaming (which works FINE in DX9, btw; see UE 3.0). Carmack breaks up giant textures (32kx32k in quakewars, MUCH larger in id engine 5) that stream as needed. They're two different things, which is why I asked if you're thinking of another technique.
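    To illustrate the distinction: streaming of the UE3 sort keeps whole mip levels of an ordinary texture resident, based on how much detail a surface can actually show on screen, rather than paging tiles out of one giant texture. A toy sketch; the mip heuristic and every number here are assumptions for illustration only:

    Code:
    // Distance-based texture streaming sketch: pick the highest-detail mip
    // level worth keeping resident (roughly one texel per screen pixel).
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // A 32k x 32k texture has mip levels 0..15 (32768 down to 1 texel).
    constexpr int FULL_RES = 32768;

    int required_mip(float distance, float texels_needed_at_1m) {
        // Required resolution halves each time the distance doubles.
        float needed = texels_needed_at_1m / std::max(distance, 1.0f);
        float level  = std::log2(float(FULL_RES) / std::max(needed, 1.0f));
        return std::clamp(int(level), 0, 15);
    }

    int main() {
        for (float d : {1.0f, 4.0f, 16.0f, 64.0f}) {
            int mip = required_mip(d, 4096.0f);
            long side = FULL_RES >> mip;              // texels per side
            long kb = side * side * 4 / 1024;         // RGBA8 size of that mip
            std::printf("distance %5.1f -> keep mips %2d..15 (top mip %ld KB)\n",
                        d, mip, kb);
        }
    }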
    Last edited by DilTech; 06-29-2007 at 08:21 AM.

  10. #35
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    But someone had already done a similar technique, by your own wikipedia link. And really, per id's own wiki, idTech4 being OpenGL-only was scrapped. Please try to tell me Quake4 doesn't have a DX9 codepath... First, sound and input were DX in Doom3 (do you not remember SM3.0 being in DX9 only, not OpenGL, at the time?).


    However, you are correct, this is far off topic.

    My point was that until you start seeing the other technologies brought forth by DX10's inception, ShaderAA is not really gonna be that important. But once you figure in everything else, it really makes far more sense to use this type of filtering method (overdraw) with unified shaders, and not with dedicated hardware. nV's shaders are unified in a DIFFERENT way than ATI's, and this is a large part of the reason we see a difference in performance.


    Like I posted earlier...80% of ALU resources go unused in today's most common game engines. Techland saw this and implemented shaderAA-based MSAA so as to make full use of the gpu, because using HardwareAA would not have allowed such high ALU utilization. Never mind that, if you pack the scheduler correctly, shaderAA can be much faster than hardware AA when ALUs would have gone idle waiting for the HardwareAA resolve before applying HDR filtering.
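    A back-of-the-envelope model of that scheduling argument, with every number invented for illustration: if the ALUs idle for most of the frame, a shader resolve can hide in those bubbles, while the fixed-function resolve serializes after shading:

    Code:
    // Toy model: hardware AA resolve adds serialized time after shading,
    // while a shader resolve can pack into idle ALU cycles during the frame.
    #include <algorithm>
    #include <cstdio>

    int main() {
        const double frame_ms      = 10.0;  // wall time of the shading passes
        const double alu_busy_frac = 0.20;  // "80% of ALU resources go unused"
        const double hw_resolve_ms = 2.0;   // fixed-function resolve, serialized
        const double sw_resolve_ms = 6.0;   // the same resolve done as ALU work

        double spare_alu_ms = frame_ms * (1.0 - alu_busy_frac);     // 8 ms idle
        double overflow     = std::max(0.0, sw_resolve_ms - spare_alu_ms);

        std::printf("hardware resolve: %.1f ms/frame\n", frame_ms + hw_resolve_ms);
        std::printf("shader resolve:   %.1f ms/frame (%.1f ms hidden in idle ALUs)\n",
                    frame_ms + overflow, sw_resolve_ms - overflow);
    }

    On these invented numbers the shader resolve is effectively free; if its ALU cost exceeded the spare capacity, only the overflow would lengthen the frame.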
    Last edited by cadaveca; 06-29-2007 at 08:34 AM.

  11. #36
    Xtreme Member
    Join Date
    Nov 2006
    Posts
    134
    COJ isn't anything to buy new hardware for, for sure. The demo has been out since last year, and it seems they haven't done too much since then. It still looks rather bad most of the time, and runs rather badly, whether in DX9 or DX10 mode.

    Let's wait and see how UT3, Crysis, and the host of other games do.
    "And though after my skin worms destroy this body, yet in my flesh shall I see God:" (Job 19:26)
    Q6600 @ 3.46ghz / Asus P5B Deluxe Wifi / 8800gt/ 8gb DDR2800 G.Skill Ram / Thermaltake 750w Toughpower Modular PSU
    Coolermaster Stacker / X-fi / Raptor
    3dmark06 8800gt: 15,119

  12. #37
    Banned
    Join Date
    Oct 2006
    Posts
    963
    you mean let's see if dx10 games use shader-based aa resolve or hardware-based, as that's going to have massive implications for gts/gtx owners. i can't wait for a true dx10 title now..........

  13. #38
    Xtreme Addict
    Join Date
    Jun 2002
    Location
    chicago
    Posts
    1,237
    Quote Originally Posted by purecain View Post
    you mean let's see if dx10 games use shader-based aa resolve or hardware-based, as that's going to have massive implications for gts/gtx owners. i can't wait for a true dx10 title now..........
    Have you read anything that Diltech said? This has been a great read thanks to Diltech and cadaveca, but there are some stubborn fanboys here that aren't really reading what is being said, or are just ignoring it completely so as not to feel disproven.
    You say anyone who bought an 8800 is going to regret it........why? An FX 5900Ultra is a DX9 card......would anyone go out and buy one now thinking they will get the most from DX9 games? This is the first gen of DX10 cards, and there will be many changes in graphics card performance as DX10 becomes more widely used.
    There is a lot of good technical information in this thread, but all of it can boil down to absolutely nothing at this point in the game. A year from now, will we be clamoring to get a 2900XT or 8800GTX? No, we will want the newest card available.
    Cursed be the ground for our sake. Both thorns and thistles it shall bring forth for us. For out of the ground we are taken for the dust we are and to the dust we shall return
    Heat

  14. #39
    Banned
    Join Date
    Oct 2006
    Posts
    963
    go read the OP before you start flaming me spooky. you clearly don't know what you're talking about. 1. the gtx is twice as expensive, 2. the hd uses software-based aa resolve easily and matches the 8800gtx.
    your 8600 is really going to struggle isn't it.....Oo

  15. #40
    Xtreme Addict
    Join Date
    Jun 2002
    Location
    chicago
    Posts
    1,237
    Quote Originally Posted by purecain View Post
    go read the OP before you start flaming me spooky. you clearly don't know what you're talking about. 1. the gtx is twice as expensive, 2. the hd uses software-based aa resolve easily and matches the 8800gtx.
    your 8600 is really going to struggle isn't it.....Oo
    Where did I flame you? I can start if you'd like, as I can see you like to be confrontational, putting down my video card as if it affects me personally, but I have been on this forum for five years and don't wish to be kicked.
    The whole purpose of your OP was clear: "Everybody point and laugh at the nVidia owners". Not in so many words, but it's clear.
    My 8600GTS works just fine for what I do right now. I gave my 7950GT to my son because his card died, so I bought a mid-range card, and I feel all warm and creamy inside about my purchase, so you're not hurting my feelings if that was your intent with such a comment.
    And that only solidifies my point. When I feel I need to, when DX10 is really prevalent, I can buy a new card, one that is probably going to be much better than your 2900XT or any of the current DX10 cards. My point was clear: I would not go out and buy, or recommend to anyone buying, an FX5900 believing it offers the best DX9 performance just because it supports it; obviously later cards will do much better. These are the first DX10 cards and things will improve as always, whether nVidia is the best at it or ATi. I didn't feel bad about buying my first DX8 card, I didn't feel bad about my last DX8 card, and on through DX9, etc. Things will change
    Last edited by SPQQKY; 06-30-2007 at 09:34 AM.

  16. #41
    Banned
    Join Date
    Oct 2006
    Posts
    963
    this was meant to be a thread about the use of shader-based aa and its effects upon current hardware with dx10. i don't see your point spooky...

    this is the kind of info this thread should be getting....

    We asked Richard Huddy, Worldwide Developer Relations Manager of AMD's Graphics Products Group, to go into more detail about why the Radeon HD 2000-series architecture has been optimised for shader-based AA rather than traditional multi-sample AA. He told us that 'with the most recent generations of games we've seen an emphasis on shader complexity (mostly more maths) with less of the overall processing time spent on the final part of the rendering process which is "the AA resolve". The resolve still needs to happen, but it's becoming a smaller and smaller part of the overall load. Add to that the fact that HDR rendering requires a non-linear AA resolve and you can see that the old fashioned linear AA resolve hardware is becoming less and less significant.' Huddy also explained that traditional AA 'doesn't work correctly [in games with] HDR because pixel brightness is non-linear in HDR rendering.'

    While many reviews of the HD 2900XT have made unflattering comparisons between it and Nvidia's GeForce 8800-series, Huddy was upbeat about AMD's new chip. 'Even at high resolutions, geometry aliasing is a growing problem that can only really be addressed by shader-based anti-aliasing. You'll see that there is a trend of reducing importance for the standard linear AA resolve operation, and growing importance for custom resolves and shader-based AA. For all these reasons we've focused our hardware efforts on shader horsepower rather than the older fixed-function operations. That's why we have so much more pure floating point horsepower in the HD 2900XT GPU than NVIDIA has in its 8800 cards... There's more value in a future-proof design such as ours because it focuses on problems of increasing importance, rather than on problems of diminishing importance.'
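    Huddy's "non-linear AA resolve" point fits in a few lines of arithmetic: averaging raw HDR samples and then tone mapping is not the same as tone mapping each sample and then averaging, so a linear (fixed-function) resolve mis-weights bright edges. A sketch using a Reinhard tone-map operator, chosen purely for illustration:

    Code:
    // Why HDR breaks a linear AA resolve: resolve-then-tonemap and
    // tonemap-then-resolve give visibly different edge pixels.
    #include <cstdio>

    float reinhard(float x) { return x / (1.0f + x); }  // maps [0,inf) to [0,1)

    int main() {
        // Edge pixel: one sample on a bright light (HDR value 8), one in shadow.
        float s0 = 8.0f, s1 = 0.1f;

        float linear_resolve = reinhard((s0 + s1) * 0.5f);           // hardware-style
        float shader_resolve = (reinhard(s0) + reinhard(s1)) * 0.5f; // per-sample

        std::printf("resolve then tonemap: %.3f\n", linear_resolve); // ~0.802
        std::printf("tonemap then resolve: %.3f\n", shader_resolve); // ~0.490
    }

    The linear resolve leaves the edge pixel near-white, so the edge barely anti-aliases on screen; resolving after the non-linear tone map restores the expected mid-grey.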
    Last edited by purecain; 07-01-2007 at 04:29 AM.

  17. #42
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    A great place again
    Posts
    2,589
    this was meant to be a thread about the use of shader-based aa and its effects upon current hardware with dx10.
    You made it clear what this thread was meant to be when you said

    A.T.I. A.T.I. A.T.I. A.T.I. A.M.D. A.M.D. A.M.D. A.M.D. A.T.I. A.T.I. A.T.I. A.T.I. A.M.D. A.M.D. LOL COME ON SHOVE IT UP EM.....

  18. #43
    Banned
    Join Date
    Oct 2006
    Posts
    963
    i was happy that dx10 games used my gpu's architecture in favour of the gtx, which will now underperform. i thought ati ati amd amd as i was happy for them as well. what's your problem, oh that's it, you have a gtx.....buzzzzzzzz JK.
    if you can't add any info to the thread then why try to ruin it???????....

  19. #44
    Xtreme Cruncher
    Join Date
    Nov 2006
    Location
    Minnesota
    Posts
    841
    Quote Originally Posted by purecain View Post
    i was happy that dx10 games used my gpu's architecture in favour of the gtx, which will now underperform. i thought ati ati amd amd as i was happy for them as well. what's your problem, oh that's it, you have a gtx.....buzzzzzzzz JK.
    if you can't add any info to the thread then why try to ruin it???????....
    You know, you don't have to keep responding. Those replies add nothing informative at all.

    Hey Cadaveca and Diltech, you guys need to get together and start a new thread concerning DX10 and OpenGL along the lines of your previous posts. There is so much to learn on the subject and I'm waiting to hear more.

  20. #45
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    A great place again
    Posts
    2,589
    Quote Originally Posted by purecain View Post
    i was happy that dx10 games used my gpu's architecture in favour of the gtx, which will now underperform. i thought ati ati amd amd as i was happy for them as well. what's your problem, oh that's it, you have a gtx.....buzzzzzzzz JK.
    if you can't add any info to the thread then why try to ruin it???????....
    There's an old Chinese proverb that says "if I wanted to listen to an ass hole I would fart". JK.

    I won't bother you any more. Have a nice life.

  21. #46
    Banned
    Join Date
    Oct 2006
    Posts
    963
    good point. the reason i keep responding is that i wish for this info to be added to, and i do not want the thread to simply go away or end up inaccurate. this is one of 4 or 5 sites on the entire net that have discussed this matter: the 8800 not being able to function as well as the hd2900 in dx10, as its hardware-based aa resolve is disabled, leaving the card crippled...
    now i just want more info and am raking through forums trying to find it....

  22. #47
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by purecain View Post
    good point. the reason i keep responding is that i wish for this info to be added to, and i do not want the thread to simply go away or end up inaccurate. this is one of 4 or 5 sites on the entire net that have discussed this matter: the 8800 not being able to function as well as the hd2900 in dx10, as its hardware-based aa resolve is disabled, leaving the card crippled...
    now i just want more info and am raking through forums trying to find it....
    Again I state, the 8800's took a smaller performance hit when enabling shader aa resolves than the hd 2900xt. Therefore, technically, that shows the g80 is faster at software aa than the r600.

  23. #48
    Xtreme Mentor
    Join Date
    Apr 2007
    Location
    Idaho
    Posts
    3,200
    Quote Originally Posted by purecain View Post
    good point. the reason i keep responding is that i wish for this info to be added to, and i do not want the thread to simply go away or end up inaccurate. this is one of 4 or 5 sites on the entire net that have discussed this matter: the 8800 not being able to function as well as the hd2900 in dx10, as its hardware-based aa resolve is disabled, leaving the card crippled...
    now i just want more info and am raking through forums trying to find it....
    After sorting through a lot of DX10 documents, the ATi AA seems the better solution.
    "To exist in this vast universe for a speck of time is the great gift of life. Our tiny sliver of time is our gift of life. It is our only life. The universe will go on, indifferent to our brief existence, but while we are here we touch not just part of that vastness, but also the lives around us. Life is the gift each of us has been given. Each life is our own and no one else's. It is precious beyond all counting. It is the greatest value we have. Cherish it for what it truly is."

  24. #49
    Xtreme Member
    Join Date
    Jun 2005
    Location
    Bulgaria, Varna
    Posts
    447
    Quote Originally Posted by DilTech View Post
    Again I state, the 8800's took a smaller performance hit when enabling shader aa resolves than the hd 2900xt. Therefore, technically, that shows the g80 is faster at software aa than the r600.
    Can you point to a reference with bench scores for the DX10 CoJ?

    My 2900XT with Cat 7.6 here gets about a 20% hit, going from 0xAA to 4xAA.
    Last edited by fellix_bg; 07-01-2007 at 01:51 PM.

  25. #50
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    http://www.extremetech.com/article2/...2147119,00.asp

    Take a look yourself at the difference in the hit between nvidia and ati for enabling AA; you'll see what I'm talking about. The HD 2900xt takes a larger hit when enabling shader aa than the G80, be it the 8800gts or 8800gtx.
