Page 1 of 3
Results 1 to 25 of 68

Thread: Shader Model 3.0 Done Right? X1000 lacks "Vertex Texture Fetching."

  1. #1
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,759

    Shader Model 3.0 Done Right? X1000 lacks "Vertex Texture Fetching."

    Quote Originally Posted by HardOCP
    Shader Model 3.0 Done Right?
    ATI has been selling the "Shader Model 3.0 Done Right" slogan along with the launch of their Radeon X1000 line of video cards. (Which we still have not seen for sale without delayed delivery.) ATI proclaims that they have done Shader Model 3.0 "right" by improving dynamic branching performance and turning batches of pixels into threads. However, with some recent news that has come to light we wonder if this slogan is really correct.


    According to The Tech Report, the Radeon X1000 series lacks a Vertex Shader 3.0 feature called "Vertex Texture Fetching." Vertex Texture Fetching is useful when you need a vertex shader to read from texture memory. Certain 3D effects can benefit from this ability, such as true dynamic displacement mapping. Now, this 3D effect may not be used in current games, and in fact may not even be used until the next generation of DX10 video cards. However, the case here is that ATI is claiming they have done Shader Model 3.0 "right," yet they are missing an official Shader Model 3.0 feature. This all seems a bit misleading. How can you claim to do Shader Model 3.0 right and then leave out a feature of the specification? The GeForce 6 and 7 series from NVIDIA support this feature. So where does that leave ATI? Have they really done Shader Model 3.0 right or not?

    It does appear that a workaround is possible. However, this puts more work on the shoulders of the game content developer instead of exploiting an easy Shader Model 3.0 feature in their code. Isn't the point of standards and Shader Model 3.0 to give game content developers and programmers a standard API among graphics cards to make their job a little easier?

    This may not be such a big deal if this particular 3D feature is never exploited until the next generation of video cards. But that isn’t the point. The point here is that ATI has claimed to do Shader Model 3.0 "right," and from what we have seen today, maybe it isn't so "right" at all. “Shader Model 3 The ATI Way?”
    Quote Originally Posted by The TechReport
    Radeon X1000 series lacks vertex texture fetch
    by Scott Wasson - 12:06 pm, October 6, 2005

    This question came up in the late stages of writing my Radeon X1000 series review, and I just got confirmation from ATI yesterday. Turns out that the vertex shaders in the Radeon X1000 series GPUs don't support a notable Shader Model 3.0 feature: vertex texture fetch. As it sounds, this capability allows the vertex shaders to read from texture memory, which is important because texture memory is sometimes treated as general storage in programmable GPUs. Vertex texture fetch is useful for techniques like displacement mapping, where the vertex and pixel shaders need to share data with one another.

    I asked ATI's David Nalasco about this issue, and he suggested a possible workaround for this limitation:

    No, vertex texture fetch is not supported. However, since the X1000 family does all pixel shader calculations with FP32 precision, just like the vertex shader, it is possible to get the same results using the render to vertex buffer capability. Basically, you do a quick pre-pass where you render to a special linear buffer in which each pixel represents a vertex. Textures can be used to modify each vertex through the pixel shader, and the result is then read back into the vertex shader. The result is fast vertex texturing with full filtering support, without requiring any special hardware in the vertex shader engine.

    Note that render to vertex buffer is possible in R4xx as well, but is limited to FP24 which could cause precision issues in some cases.

    Such a workaround would likely involve a performance penalty, but I doubt it would be a major hit. The larger issue is probably just the fact that the workaround would require special consideration from developers, because the GPUs lack a straightforward vertex texture fetch capability.
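    The render-to-vertex-buffer pre-pass Nalasco describes can be sketched in SM 3.0 HLSL. This is only an illustration of the idea, not ATI's actual code: the sampler names, the FP32 buffer setup, and the displacement along a fixed normal are all assumptions for brevity.

    ```hlsl
    // Sketch of the pre-pass: a pixel shader computes a displaced position for
    // each "pixel" of a linear FP32 buffer (one pixel per vertex). The buffer
    // is later rebound as a vertex stream for the real draw call.
    // All names here are illustrative, not from any shipping code.
    sampler2D BasePositions : register(s0);  // mesh positions stored as an FP32 texture
    sampler2D HeightMap     : register(s1);  // displacement source
    float     DisplacementScale;

    float4 main(float2 uv : TEXCOORD0) : COLOR
    {
        float3 pos    = tex2D(BasePositions, uv).xyz;
        float  height = tex2D(HeightMap, uv).r;  // full texture filtering is available here
        float3 normal = float3(0, 1, 0);         // assume displacement along +Y for brevity
        // Written to an FP32 render target; each output pixel becomes one vertex.
        return float4(pos + normal * height * DisplacementScale, 1);
    }
    ```

    The real draw call would then bind that FP32 render target as a vertex stream, so the vertex shader reads the pre-displaced positions without ever touching a texture itself.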
    I'd say that's a pretty big letdown for me personally....

    I'll put the sources up in a sec....

    [H]ardOCP
    Techreport

    So why is ATI always keeping quiet about these things? Instead of doing it right, they just branded it SM3 the ATI way....
    Last edited by Tim; 10-06-2005 at 01:32 PM.

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  2. #2
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,471

    Shader Model 3.0 Done Right? X1000 lacks "Vertex Texture Fetching."

    We'll have to ask Grayskull to clarify this issue further. I'm sure ATI has features that aren't necessarily requirements for SM 3.0 and that NVIDIA doesn't have (like 3Dc, which only the R420 had) - hopefully there won't be much of a drop in performance or quality because of it.

    Perkam

  3. #3
    Xtreme Mentor
    Join Date
    Sep 2005
    Location
    Netherlands
    Posts
    2,772
    I've got to say they're right about "Shader Model 3.0 Done Right" not being true.
    So it's false advertising, more or less.
    They do say no current game uses this feature, and I doubt many sites will pick up on this, but it's still bad that ATI uses the "Shader Model 3.0 Done Right" slogan if it isn't true.

    I'm sure NVIDIA will pick up on this and we'll get a mud fight. Although I hope not.

    edit:
    It would be cool if there were a way to test this feature and see how much performance the workaround costs.
    Time flies like an arrow. Fruit flies like a banana.
    Groucho Marx



    i know my grammar sux so stop hitting me

  4. #4
    Admin
    Join Date
    Feb 2005
    Location
    Ann Arbor, MI
    Posts
    13,106
    Could this be what Crytek was talking about when they said SM3.0 on the ATI cards needs special programming/consideration?

  5. #5
    -150c Club Member
    Join Date
    May 2005
    Location
    Northeast, USA
    Posts
    10,823
    It's a letdown for you? It says the feature may not even be used for what it's designed for at the moment. It's no letdown. It's like the 7800 GTX vs. the X1800 XT; frankly, it doesn't matter to me, as they're both too expensive and I'm an ATI fanboy.


    If you have a cooling question or concern feel free to contact me.

  6. #6
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,759
    Quote Originally Posted by n00b 0f l337
    It's a letdown for you? It says the feature may not even be used for what it's designed for at the moment. It's no letdown. It's like the 7800 GTX vs. the X1800 XT; frankly, it doesn't matter to me, as they're both too expensive and I'm an ATI fanboy.
    The Unreal 3 engine is all about displacement mapping...and that's the first big release next year....

    as they're both too expensive and I'm an ATI fanboy.
    Congrats....


  7. #7
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,622
    Quote Originally Posted by perkam
    I'm sure ATI has features that aren't necessarily requirements for SM 3.0 and that NVIDIA doesn't have (like 3Dc, which only the R420 had)
    3Dc is an open standard, and support for it was enabled a long time ago in the Forceware drivers for GeForce 6 and 7 series cards (albeit they implement it differently).
    Last edited by Cybercat; 10-06-2005 at 02:21 PM.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  8. #8
    Admin
    Join Date
    Feb 2005
    Location
    Ann Arbor, MI
    Posts
    13,106
    Yeah, I agree with Tim....ATI's gonna have a problem or two with the Unreal 3 engine.....could Sweeney have known the R520 wouldn't have support and coded around it for the X1800 in advance, or could this delay the game?

    Granted, there is a workaround using the pixel shaders--but those don't seem to be that strong in the preliminary benchmarks we've had.
    Last edited by Vapor; 10-06-2005 at 02:43 PM.

  9. #9
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Bay Area, CA
    Posts
    1,466
    Quote Originally Posted by Vapor
    ATi's gonna have a problem or two with the Unreal 3 engine.....could Carmack (being the nVidiot he is) have known R520 wouldn't have support and taken advantage of that?
    HUH?

    Do you mean Sweeney?

  10. #10
    Admin
    Join Date
    Feb 2005
    Location
    Ann Arbor, MI
    Posts
    13,106
    Uhhhhhh....yeah....sure. I'm bad with names....I'll edit my post now.

  11. #11
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Bay Area, CA
    Posts
    1,466
    Quote Originally Posted by Vapor
    Uhhhhhh....yeah....sure. I'm bad with names....I'll edit my post now.
    heh, i know whatcha mean, but i couldn't tell if you were switching gears in mid thought...

  12. #12
    Stop...Ban-Hammer Time!
    Join Date
    May 2005
    Posts
    3,726
    Tim Sweeney *specifically* told people at E3 that if they're buying a card this generation in anticipation of UT2k7, they should buy the 7800 GTX...

    Epic is a 100% NVIDIA-loyal company; the odds of them using this workaround are slim to none... Looks like ATI is in even more trouble.

  13. #13
    Admin
    Join Date
    Feb 2005
    Location
    Ann Arbor, MI
    Posts
    13,106
    Quote Originally Posted by DilTech
    Tim Sweeney *specifically* told people at E3 that if they're buying a card this generation in anticipation of UT2k7, they should buy the 7800 GTX...

    Epic is a 100% NVIDIA-loyal company; the odds of them using this workaround are slim to none... Looks like ATI is in even more trouble.
    I remember that video interview...he said he'd definitely get the G70, but did he know about this (could he even have known about this?) or was it just his bias showing?

  14. #14
    Stop...Ban-Hammer Time!
    Join Date
    May 2005
    Posts
    3,726
    Well, generally, video card companies tell game developers about features in their upcoming graphics cards so the game makers can put them to use... I know my boss knew about radiosity in the 7800 GTX way before the 7800 GTX came out, so I'm assuming Epic would've had to have known... ATI would be crazy not to try to get a foot in the door on that engine!

    Either way, don't expect a workaround for it; Epic is one of the most pro-NVIDIA companies on the planet!

    Finally, I'm assuming he said what he said because radiosity is supposed to be used in the Unreal 3 engine, a feature ATI doesn't even have access to. The guy who wrote that algorithm is also an NVIDIA fan, as is anyone who ever owned one of the old Rage cards... This could very well be the reason, though.
    Last edited by DilTech; 10-06-2005 at 03:15 PM.

  15. #15
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Houston, TX
    Posts
    1,559
    Quote Originally Posted by nVidia.com

    GPU Cloth
    This sample demonstrates how to use Shader Model 3.0 to simulate and render cloth on the GPU. The cloth vertex positions are computed through several pixel shaders and saved into a texture. A vertex shader then reads these positions using Vertex Texture Fetch (VTF) to render the cloth.





    Simple Vertex Texture
    This simple example demonstrates the use of the NV_vertex_program3 extension to perform texture look-ups in a vertex program. It uses this feature to perform simple displacement mapping. The example also shows how to implement bilinear filtering of vertex texture fetches.





    Vertex Texture Fetch Water
    This sample demonstrates a technique for simulating and rendering water. The water is simulated via Verlet integration of the 2D wave equation using a pixel shader. The simulation result is used by a vertex shader via vertex texture fetch (VTF). The water surface is rendered by combining screen-space refraction and reflection textures.


    LINK:
    http://download.developer.nvidia.com...les_video.html
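    For comparison, the vertex texture fetch those samples rely on looks roughly like this in SM 3.0 HLSL - a minimal displacement-mapping sketch where every name (HeightMap, DisplacementScale, and so on) is illustrative rather than taken from any shipping code:

    ```hlsl
    // Sketch: displacement mapping via vertex texture fetch (vs_3_0).
    // This is the capability the X1000 series lacks in hardware.
    sampler2D HeightMap : register(s0);  // vertex-stage sampler; NV4x accepts FP32 formats only
    float4x4  WorldViewProj;
    float     DisplacementScale;

    struct VSIn  { float4 pos : POSITION; float3 nrm : NORMAL; float2 uv : TEXCOORD0; };
    struct VSOut { float4 pos : POSITION; float2 uv : TEXCOORD0; };

    VSOut main(VSIn v)
    {
        VSOut o;
        // tex2Dlod is required in a vertex shader: there are no pixel derivatives,
        // so the mip level must be supplied explicitly in the .w component.
        float height = tex2Dlod(HeightMap, float4(v.uv, 0, 0)).r;
        float4 displaced = v.pos + float4(v.nrm * height * DisplacementScale, 0);
        o.pos = mul(displaced, WorldViewProj);
        o.uv  = v.uv;
        return o;
    }
    ```

    On hardware without VTF, the render-to-vertex-buffer pre-pass described earlier in the thread is meant to produce the same displaced positions from the pixel side instead.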

    Looks like a nice feature.
    So is this feature hardware-dependent?
    Is there no way to get ATI's drivers, or the game itself, patched to support VTF?
    Last edited by Turok; 10-06-2005 at 05:14 PM.

  16. #16
    Admin
    Join Date
    Feb 2005
    Location
    Ann Arbor, MI
    Posts
    13,106
    Some good info there Turok...

  17. #17
    Stop...Ban-Hammer Time!
    Join Date
    May 2005
    Posts
    3,726
    Quote Originally Posted by Turok
    LINK:
    http://download.developer.nvidia.com...les_video.html

    Looks like a nice feature.
    So is this feature hardware-dependent?
    Is there no way to get ATI's drivers, or the game itself, patched to support VTF?
    It's a hardware feature, and ATI left it out of their X1800 XT, probably by mishap.

    Basically, they pulled an NV3x (which left out parts of DX9, IIRC)... it's going to bite them clean in the rear come UT2k7.

  18. #18
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,759
    Good info Turok, thanks for sharing


  19. #19
    XS News
    Join Date
    Aug 2004
    Location
    Sweden
    Posts
    2,025
    Hope someone can test this when they get an XL card.
    Is it possible to download these demos?
    Everything extra is bad!

  20. #20
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,471
    Basically, they pulled an NV3x
    Do you really believe 1,500 points more than a GTX in 3DMark05 constitutes "pulling an NV3x"??

    Yes, it lacks one feature, but ATI will tell you they have a couple more the GTX doesn't have. The GTX doesn't have every graphics feature that has ever been released or ever will be released...it has its drawbacks as well; this is just one for ATI.

    I'm hoping ATI clears this up, though...there are enough misconceptions about the R5xx line to have people claiming it's not SM3.0 when they've obviously met at least the minimum requirements for the standard.

    Perkam

  21. #21
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,251
    Quote Originally Posted by perkam
    Do you really believe 1,500 points more than a GTX in 3DMark05 constitutes "pulling an NV3x"??

    Yes, it lacks one feature, but ATI will tell you they have a couple more the GTX doesn't have. The GTX doesn't have every graphics feature that has ever been released or ever will be released...it has its drawbacks as well; this is just one for ATI.

    I'm hoping ATI clears this up, though...there are enough misconceptions about the R5xx line to have people claiming it's not SM3.0 when they've obviously met at least the minimum requirements for the standard.

    Perkam
    But this feature is part of an industry programming standard. It's like having a (DX9) graphics card that doesn't support antialiasing.
    If ATI wants to go another way, OK, let them go... it reminds me of the days when game developers had to support different renderers, like 3dfx's Glide and the PowerVR engine...
    It would suck if one day you had to choose a card for its rendering features and not just its performance.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  22. #22
    XS News
    Join Date
    Aug 2004
    Location
    Sweden
    Posts
    2,025
    That "one day" came years ago, I believe.
    Everything extra is bad!

  23. #23
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Houston, TX
    Posts
    1,559
    All I know is that if this feature either limits the graphical content or doesn't let me play UT2k7, Gears of War, or any other UE3 game, I may end up buying an NVIDIA card for the first time.

    If the difference is only performance-wise, with no eye candy lost, and it manages to stay above the 7800 GTX in performance, then ATI could still have a chance here.

    I'm guessing that if VTF really matters for the UE3 games, ATI will probably make a core similar to the R520 with a minor tweak to add VTF and a few more clocks, and release something like an X1_50 line of cards, or they'll speed up the R580 and add VTF before release.

    It would kill ATI if their cards can't run UE3 games without VTF.
    Their cards would be considered obsolete, and NVIDIA might start making PowerPoint comparisons about it for publicity.
    They would crush ATI even more if they decided to release the 7800 Ultra they said was canceled.
    Last edited by Turok; 10-07-2005 at 06:19 AM.

  24. #24
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,471
    I'm guessing that if VTF really matters for the UE3 games, ATI will probably make a core similar to the R520 with a minor tweak to add VTF and a few more clocks, and release something like an X1_50 line of cards, or they'll speed up the R580 and add VTF before release.
    I dunno about VTF being in it, but before the R580, a little present called the RV560 will be hitting stores in 2K6

    Perkam
    Last edited by perkam; 10-07-2005 at 06:26 AM.

  25. #25
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Posts
    696
    Some points to remember here (and forgive me for the briefness, I'm knocking together a Radeon X1000 technology preview right now which will make mention of this):

    - Vertex texturing is not a required feature for Vertex Shader 3.0 support
    - Vertex texturing is unusably slow on current NVIDIA hardware
    - Vertex texturing requires a large amount of transistors and silicon to be implemented properly without the move to a unified shader architecture
    - ATI are offering 'Render to Vertex Buffer' as an alternative to vertex texturing which can be utilised using a FOURCC identifier. This will offer the same functionality as vertex texturing, but with better performance
    Here's more info

