
Thread: Radiosity on R520, whereforth art thou ??

  1. #1
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058

Radiosity on R520, whereforth art thou ??

Originally posted by ShadowMage:

Why won't radiosity work on ATI hardware? Radiosity is a concept, like HDR, which already worked on R420.

Also, it has been shown that dynamic branching is far superior on R520. Since dynamic branching drastically improves performance on long shaders, R520 will be at an advantage in this aspect.

    Link: http://delivery.acm.org/10.1145/1010...TOKEN=99218155

This scientific paper even has radiosity working on an FX5900, lol.
    Member Diltech's response:

    Quote Originally Posted by DilTech
Real-time radiosity requires certain extensions built into the core, i.e. hardware support, for the algorithm to work.

You see, they could render radiosity on older hardware, but not in real time; it was more like offline 3D rendering (ask ANYONE here who uses 3ds Max or Maya how long it takes to render radiosity!). The guy who came up with the algorithm said in his tech notes that it was ONLY given to NVIDIA, to add hardware support in the 7800s, because he doesn't like ATI at all. Without the hardware extensions, radiosity won't work. Epic will be implementing this algorithm in UE3.0.

Run a search in the news section; I posted it a while ago. NVIDIA has a definite advantage there.
    Perkam
    Last edited by perkam; 10-12-2005 at 10:15 AM.

  2. #2
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
Run a search in the news section; I posted it a while ago. NVIDIA has a definite advantage there.
When a company has exclusive access to a resource voluntarily not made available to its competitors, it's called a monopoly, not an advantage. So yes, NVIDIA has a monopoly here, and all the NVIDIA-sponsored game studios will jump at the opportunity to make a slide sometime in 2006 showing its absence on ATI's next-gen hardware.

ATI has successfully countered this with HDR, which is available to all graphics companies; with its successful implementation in many games, it defeats the need for radiosity, imo.

FYI, ATI has never sponsored a title that limited access for its competitors, which I said a couple of months back too, unlike NVIDIA, which let Splinter Cell: CT ship an SM 3.0-only advanced effects menu. When NVIDIA wanted to implement 3Dc in its future hardware, it was able to access those algorithms freely. It's a foul marketing practice that works, and people don't notice, because NVIDIA's green gets the halo and ATI's red gets painted as the devil all too often.

    Perkam
    Last edited by perkam; 10-12-2005 at 10:16 AM.

  3. #3
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    http://www.gamedev.net/reference/pro...s/rtradiosity/
Note that the link is a bit outdated; I'm looking for the updated one with the full explanation of why he chose NVIDIA and won't give it to ATi.
Notice that he sent it to NVIDIA and Atari/Epic, so NVIDIA could give it hardware support and Atari/Epic could put it in UE3.0. As you can tell from the "nvidia chose to ignore me" remark there, and as you can now clearly see at www.nvidia.com under the 7800 that radiosity has been added (he noted this in the updated article, which I'll hunt down when I get the chance), this article is outdated. I'm merely posting it to show you what I'm talking about.

Realtime radiosity, and why it's needed: it's MUCH more important than HDR, and it's required to make HDR look "realistic" rather than like an oversaturation of light. The reason HDR looks fake, or overdone, is that the light doesn't "bleed" colors from the objects it touches like it does in real life; it carries only the color of the light itself. In real life, every surface a light bounces off contributes a bit of its own color, and that leads to the final coloring of the light your eyes perceive. It's why we can tell the difference between a 3D rendering and the real thing. Without this, you end up with the oversaturated look we currently have with HDR. It's DEFINITELY a huge step forward! And having HDR alone is NOT enough to compensate for missing it.
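For anyone curious what that "color bleeding" actually means mathematically, here's a toy sketch (Python, all patch numbers made up for illustration; this is the textbook radiosity idea, not the patented algorithm being discussed). Each patch's radiosity is its own emission plus the light it gathers from every other patch, tinted by its own reflectance:

```python
# Toy single-bounce radiosity gather (hypothetical data, illustration only).
# B_i = E_i + rho_i * sum_j F_ij * B_j, one iteration, per RGB channel.
# The rho_i tint is the "color bleeding" that plain HDR lacks.

def gather_bounce(emission, reflectance, form_factors):
    """One gather iteration over all patches, starting from direct emission."""
    n = len(emission)
    radiosity = [list(e) for e in emission]  # copy emission as the starting point
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            for c in range(3):  # RGB channels
                radiosity[i][c] += (reflectance[i][c]
                                    * form_factors[i][j]
                                    * emission[j][c])
    return radiosity

# Two patches: a white light (patch 0) and a red wall (patch 1).
# After one bounce the wall is lit red-tinted; anything gathering from
# the wall in a later bounce would pick up that red.
emission = [(1.0, 1.0, 1.0), (0.0, 0.0, 0.0)]     # patch 0 emits white
reflectance = [(0.0, 0.0, 0.0), (0.8, 0.1, 0.1)]  # patch 1 reflects mostly red
F = [[0.0, 0.5], [0.5, 0.0]]                      # made-up symmetric form factors
print(gather_bounce(emission, reflectance, F))
# → [[1.0, 1.0, 1.0], [0.4, 0.05, 0.05]]
```

Real renderers iterate this until the energy converges, over thousands of patches, which is exactly why the offline versions people mention below take hours.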

    http://web.cs.wpi.edu/~matt/courses/...radiosity.html
    There's another explanation of radiosity.

This is a VERY large blow to ATi, and NVIDIA couldn't give it to ATi if they wanted to, as the guy who wrote the webpage in the first link owns the patent on the tech. It's up to him, NOT NVIDIA, to give it up, and he doesn't like ATi. So don't blame it on NVIDIA, perkam; it's not their fault in the slightest.
    Last edited by DilTech; 10-12-2005 at 10:39 PM.

  4. #4
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
Reply to how long it takes with older hardware:

A long time ago I used to build maps for the game Medal of Honor, and when doing a final compile of the whole thing (all the lights, including radiosity lighting) it would literally take A WHOLE DAY to finish (depending on the size of the map, of course). Though I'm sure the lighting was all calculated on the CPU, which was a 2600 T-bred @ 2.4 back then.


  5. #5
    Registered User
    Join Date
    May 2005
    Posts
    3,691
It still takes FOREVER to do even with new hardware, aside from the 7-series, biohead, trust me on that. It takes hours on an 8-way SGI system!

  6. #6
    Xtreme Addict
    Join Date
    Sep 2004
    Location
    Fantasia
    Posts
    1,297
    I'll post this one more time, and sorry to get a bit off-topic. I know it's not DirectX or OpenGL, but it's great for animators nonetheless:

    http://www.saarcor.de/

  7. #7
    Xtreme Addict
    Join Date
    Aug 2004
    Location
    Austin, TX
    Posts
    1,346
I doubt adding a few extensions in hardware can make that big of a difference. Even if it's licensed, that won't stop a few rogue coders from making a radiosity hack. I'd probably expect an unofficial one from Humus.

  8. #8
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    A radiosity hack... that's funny.

You see, this isn't like transparency AA (or, as ATi wants to call it, adaptive AA); it requires extensions to handle the very time-consuming mathematical equations that figure out the exact amount of color that bleeds from every single surface. Doing this in drivers (i.e. a hack) would cause overhead like you wouldn't believe... Why do you think there are so many transistors in the G70/NV47 and the die is so large, enough to account for 32 pipes, even though there are only 24?... Now you know.
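To give a feel for why that math is so heavy: in classic radiosity, every patch exchanges energy with every other patch, so just computing the coupling terms (form factors) is O(n²) in the patch count before you even start iterating. A toy point-to-patch form-factor approximation in Python (illustrative geometry only, nothing to do with the patented real-time algorithm):

```python
# Toy form-factor estimate between two small patches (illustration only).
# F ~= cos(theta_i) * cos(theta_j) * A_j / (pi * r^2)
# A scene with n patches needs roughly n^2 of these, each involving dot
# products, a distance, and (in a real renderer) a visibility test.
import math

def form_factor(pi_pos, ni, pj_pos, nj, area_j):
    """Unoccluded point-to-patch form factor approximation."""
    d = [pj_pos[k] - pi_pos[k] for k in range(3)]  # vector from i to j
    r2 = sum(x * x for x in d)
    r = math.sqrt(r2)
    # Clamp at zero: patches facing away from each other exchange no light.
    cos_i = max(0.0, sum(ni[k] * d[k] for k in range(3)) / r)
    cos_j = max(0.0, -sum(nj[k] * d[k] for k in range(3)) / r)
    return cos_i * cos_j * area_j / (math.pi * r2)

# Two unit-area patches directly facing each other, 1 unit apart:
f = form_factor((0, 0, 0), (0, 0, 1), (0, 0, 1), (0, 0, -1), 1.0)
print(round(f, 4))  # both cosines are 1, so F = 1/pi ≈ 0.3183
```

Multiply that per-pair cost by hundreds of thousands of patch pairs per frame and it's clear why people argue dedicated hardware support matters.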

My boss, as you know, works for Autodesk. On his 8-way Silicon Graphics system (SGI Power Challenge server), it literally takes HOURS to render this. That's on a system MADE for this, with a FULLY multi-threaded application that puts all EIGHT cores at 100% usage! The system literally needed its OWN circuit breaker (it uses two 1500-watt power supplies, more total wattage than macci and sampsa combined have used to set ANY record!). In case you're curious what the system looks like, here's a picture.


.... I'm trying to get him to fold on it. That system puts out literally more heat than a 9250+ BTU space heater.

Point is, don't expect a driver hack to handle this one; it takes some SERIOUS power to do it in real time without those extensions. It's not something that can just be "hacked". Even this beast can't do it, and its only purpose IS computer-generated graphics!
Attached image: sgibig.jpg (58.4 KB)
    Last edited by DilTech; 10-12-2005 at 10:12 PM.

  9. #9
    Xtreme Addict
    Join Date
    Aug 2004
    Location
    Austin, TX
    Posts
    1,346
Like you said before, NVIDIA isn't using "true" radiosity; the company is using a "fake", low-precision version. That's what I meant by "hack".


    EDIT: I'm getting clarification here: http://www.beyond3d.com/forum/showth...904#post595904

I'm sure just about no one here can thoroughly answer the question (aside from maybe Greyskull), so maybe the gfx pros there can help. The reason I posted the link is that people may want to follow the updates in real time, instead of waiting for me to update this thread.
    Last edited by Shadowmage; 10-13-2005 at 05:45 PM.

  10. #10
    Xtreme X.I.P.
    Join Date
    Aug 2004
    Location
    Chile
    Posts
    4,151
I got some awesome pics today from an ATI presentation on an X1800XT (we also got an X1800XL to review); I'll upload them later to show you guys (Ruby demo and Toy Shop demo).
