
Thread: Ian McNaughton goes out against The Way it's Meant to be Played


  1. #1
    Registered User
    Join Date
    Jan 2007
    Posts
    94
    Quote Originally Posted by Final8ty View Post
That's what Microsoft's DX APIs are for, & NV is paying devs to go outside them on some things that DX is quite capable of doing.
It's obviously not sufficient, as most modern deferred-rendering engines have huge issues with enabling AA and require vendor-specific tweaks.
    i7 920 @ 4.2Ghz
    Asus P6T6 Revolution
    3x GTX260
    6x2GB Corsair DDR3-1600
    G.Skill Falcon 128GB SSD
G.Skill Titan 128GB SSD
Seagate 7200.11 1.5TB
    Vista 64 Ultimate

  2. #2
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by aka1nas View Post
It's obviously not sufficient, as most modern deferred-rendering engines have huge issues with enabling AA and require vendor-specific tweaks.
So what you're saying is that if NV did not sponsor Batman, there would be no AA for anyone in the game.

Well, that's a new one.. I didn't know that AA in games has needed to be sponsored all these years, or will from now on.

  3. #3
    Registered User
    Join Date
    Jan 2007
    Posts
    94
    Quote Originally Posted by Final8ty View Post
So what you're saying is that if NV did not sponsor Batman, there would be no AA for anyone in the game.

Well, that's a new one.. I didn't know that AA in games has needed to be sponsored all these years, or will from now on.
Did you actually play any of the first dozen or so UE3-based games? No working AA at all on release in many cases. Deferred rendering broke most contemporary AA methods.
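For anyone who missed the UE3 saga: here is roughly why MSAA and deferred rendering didn't mix on DX9-class hardware. A minimal sketch assuming the Windows DirectX SDK (d3d9.h, link d3d9.lib), not code from any shipping engine: the MSAA query is per surface format, and the wide float formats used for G-buffers commonly reported no multisample support at all. Even where they did, resolving a multisampled G-buffer averages normals and depth, which breaks the lighting pass.
Code:
// Query 4x MSAA support per surface format. On DX9-era hardware the
// fat float formats used for G-buffers often reported no MSAA at all.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    const D3DFORMAT formats[] = { D3DFMT_X8R8G8B8, D3DFMT_A16B16G16R16F };
    const char*     names[]   = { "X8R8G8B8 (plain backbuffer)",
                                  "A16B16G16R16F (typical G-buffer)" };

    for (int i = 0; i < 2; ++i) {
        DWORD quality = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, formats[i],
            TRUE /*windowed*/, D3DMULTISAMPLE_4_SAMPLES, &quality);
        std::printf("%-32s 4xMSAA %s\n", names[i],
                    SUCCEEDED(hr) ? "supported" : "NOT supported");
    }
    d3d->Release();
    return 0;
}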
    i7 920 @ 4.2Ghz
    Asus P6T6 Revolution
    3x GTX260
    6x2GB Corsair DDR3-1600
    G.Skill Falcon 128GB SSD
G.Skill Titan 128GB SSD
Seagate 7200.11 1.5TB
    Vista 64 Ultimate

  4. #4
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by aka1nas View Post
Did you actually play any of the first dozen or so UE3-based games? No working AA at all on release in many cases. Deferred rendering broke most contemporary AA methods.
    I'm not disputing that fact.
The fact is AA has been common for years, & if it has now become a problem & needs paying for its implementation, then developers have taken a leap backwards & added another thorn in the side of PC gaming versus consoles, like the PC needs any help with gamers going in that direction.

  5. #5
    Registered User
    Join Date
    Jan 2007
    Posts
    94
    Quote Originally Posted by Final8ty View Post
    I'm not disputing that fact.
The fact is AA has been common for years, & if it has now become a problem & needs paying for its implementation, then developers have taken a leap backwards & added another thorn in the side of PC gaming versus consoles, like the PC needs any help with gamers going in that direction.
Fair enough; I think we unfortunately have taken that leap back on the PC side lately. The consolers won't notice the lack of real AA in those titles; the devs can just throw a blur shader over it and call it a day.
    i7 920 @ 4.2Ghz
    Asus P6T6 Revolution
    3x GTX260
    6x2GB Corsair DDR3-1600
    G.Skill Falcon 128GB SSD
G.Skill Titan 128GB SSD
Seagate 7200.11 1.5TB
    Vista 64 Ultimate

  6. #6
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Vienna View Post
    Wow, you really remember your history wrong.

    The nVidia cards (FX 5xxx series) were automatically put on the DX8.1 path instead of the "proper" "ati" shader path as you put it, or rather the DX9.0 shader path, because performance on the DX9.0 shader path was HORRIBLE on the nVidia cards.

So, yes, a bit of image quality (DX8.1 vs DX9.0) on the FX 5xxx series was sacrificed, but for good reason.
    Actually Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.

You see, it's a little-known fact, but during the finalization of the DX9 specification NVidia was fighting for FP32 to be the standard while ATi was fighting for FP24. FP32 required 33% more power than FP24 but didn't bring enough to the table, when combined with the rest of the specification, for Microsoft to go for it, and as such Microsoft took FP24. That was the first huge strike against the FX series, which naturally used 33% more horsepower to do the same effects.

Switching the FX series to the ATi code path took it out of FP32 mode and put it into FP16, which made it run DX9 100% fine in HL2, as no effect in HL2 required more than FP16. It was pointed out to Valve on countless occasions before release, and again after release... They never did anything.

    Same thing you're all complaining about now.
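Whatever HL2 actually shipped, "running at FP16" in D3D9 terms means the partial-precision hint. A minimal sketch assuming the legacy DirectX SDK (d3dx9.h, link d3dx9.lib); the pixel shader is a made-up example, but D3DXSHADER_PARTIALPRECISION is the real flag that emits _pp-tagged instructions, which NV3x could execute at FP16 while R300 ignored the hint and ran FP24 regardless.
Code:
// Compile the same ps_2_0 shader with and without the partial-precision
// flag. With the flag, the compiler emits _pp-tagged instructions that
// NV3x may run at FP16; R300 ignores the hint and stays at FP24.
#include <d3dx9.h>
#include <cstdio>
#include <cstring>

static const char* kShader =
    "sampler tex0 : register(s0);                  \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR0 { \n"
    "    return tex2D(tex0, uv) * 0.5f;            \n"
    "}                                             \n";

static void compile(DWORD flags, const char* label) {
    ID3DXBuffer* code = NULL;
    ID3DXBuffer* errs = NULL;
    HRESULT hr = D3DXCompileShader(kShader, (UINT)std::strlen(kShader),
                                   NULL, NULL, "main", "ps_2_0",
                                   flags, &code, &errs, NULL);
    std::printf("%s: %s\n", label, SUCCEEDED(hr) ? "compiled" : "failed");
    if (code) code->Release();
    if (errs) errs->Release();
}

int main() {
    compile(0, "full precision");
    compile(D3DXSHADER_PARTIALPRECISION, "partial precision (_pp / FP16)");
    return 0;
}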

    Quote Originally Posted by Final8ty View Post
    I'm not disputing that fact.
The fact is AA has been common for years, & if it has now become a problem & needs paying for its implementation, then developers have taken a leap backwards & added another thorn in the side of PC gaming versus consoles, like the PC needs any help with gamers going in that direction.
    Well, about that whole AA thing...

Most UE3 titles have problems with AA. I've been playing back through Gears of War PC on my 4850, and turning on DX10 + AA results in the frame rate dipping below 20 fps, sometimes even further. Playing with my old 8800GTX it stayed above 50, always. 1920x1080.....Well, 1920x1080 on the NVidia; ATi still hasn't fixed their HDTV support, and it's been YEARS, it reverts back to 1080i in quite a few games.

    Basically, I'm just saying, until we know all the facts here it's just random flaming and speculation. It may truly have issues in certain parts of the game on ATi hardware.
    Last edited by DilTech; 09-29-2009 at 02:31 AM.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  7. #7
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Sydney, Australia
    Posts
    166
AMD prides itself on supporting open standards, and our goal is to advance PC gaming regardless of whether people purchase our products.
As pointed out by another, it's AMD, not ATI.
    Batman: Arkham Asylum
In this game, Nvidia has an in-game option for AA, whereas gamers using ATI Graphics Cards...
*DON'T!!!* ...are required to force AA on in the Catalyst Control Center.
So it works, but only the Nvidia boys get this option in-game.
The advantage of in-game AA is that the engine can apply AA selectively per scene, whereas forced AA in CCC has to brute-force AA onto every scene and object, requiring much more work.
Nice way to decrease ATI consumers' performance, Nvidia.
Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo. By tricking the application, we were able to get the in-game AA option, where our performance was significantly enhanced. This option is not available for the retail game, as it is protected by SecuROM.
This is the golden quote, though; read it carefully and fully appreciate what has been done here, in particular to the innocent ATI users.

    Anybody who defends these business practices is ANTI-CONSUMER. <--- full stop
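To make the mechanism concrete: D3D9 exposes the adapter's PCI vendor ID, so gating a menu option on the vendor is a few lines of code. This sketch is NOT Batman's actual code, just an illustration of the kind of check McNaughton describes, and of why spoofing the ID brings the in-game option back. Assumes the DirectX SDK (d3d9.h, link d3d9.lib).
Code:
// NOT Batman's actual code: a sketch of gating a feature on the PCI
// vendor ID that D3D9 reports, which is why spoofed IDs re-enable AA.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

    const DWORD kNvidia = 0x10DE;  // PCI vendor IDs
    const DWORD kAti    = 0x1002;

    std::printf("Adapter: %s (vendor 0x%04lX)\n", id.Description, id.VendorId);
    if (id.VendorId == kAti) {
        std::printf("ATI card detected: in-game AA option hidden; "
                    "force AA in the Catalyst Control Center instead.\n");
    } else if (id.VendorId == kNvidia) {
        std::printf("NVIDIA card detected: in-game AA option shown.\n");
    }

    d3d->Release();
    return 0;
}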
    CoolerMaster Stacker 830SE|Antec Signature 850W|Gigabyte X58A-UD5 F5 slic2.1
    Intel Core i7 930 16x200@3,200Mhz|vcore 1.14|Intel Stock CPU Cooler
    GSKILL DDR3 Perfect Storm 2000 @6-6-6-16-1T 1600Mhz|ATI 5870 1024MB 850/1200
    Windows 7 Ultimate x64 bootdisk: Crucial RealSSD-C300 128GB SATA-III

  8. #8
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by DilTech View Post
    Actually Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.

1. You see, it's a little-known fact, but during the finalization of the DX9 specification NVidia was fighting for FP32 to be the standard while ATi was fighting for FP24. FP32 required 33% more power than FP24 but didn't bring enough to the table, when combined with the rest of the specification, for Microsoft to go for it, and as such Microsoft took FP24. That was the first huge strike against the FX series, which naturally used 33% more horsepower to do the same effects.

2. Switching the FX series to the ATi code path took it out of FP32 mode and put it into FP16, which made it run DX9 100% fine in HL2, as no effect in HL2 required more than FP16. It was pointed out to Valve on countless occasions before release, and again after release... They never did anything.

    Same thing you're all complaining about now.



    Well, about that whole AA thing...

3. Most UE3 titles have problems with AA. I've been playing back through Gears of War PC on my 4850, and turning on DX10 + AA results in the frame rate dipping below 20 fps, sometimes even further. Playing with my old 8800GTX it stayed above 50, always. 1920x1080.....Well, 1920x1080 on the NVidia; ATi still hasn't fixed their HDTV support, and it's been YEARS, it reverts back to 1080i in quite a few games.

    Basically, I'm just saying, until we know all the facts here it's just random flaming and speculation. It may truly have issues in certain parts of the game on ATi hardware.
I logged in to this account because correcting mods is fun, and there really are parts of what you said that are skewed beyond proportion; and then there are the, uhh... challenged people who will just QFT you without even reading closely.

1-1. Wrong. MS took FP24 only because nVidia refused to license chip IP to Microsoft, thus forcing MS to BUY chips for the Xbox from nVidia, while costs stayed roughly the same and no die shrink could be done, aka no Xbox Slim ever. This was also partly why MS was in the red so badly on the original Xbox. Part of why nVidia didn't catch wind of it was miscommunication, AND MS taking a well-deserved revenge stance.

1-2. FP16 was NEVER on the board for DX9; it was only used for nVidia's proprietary Cg shading language. Thus developers could not target FP16 if they wanted to use Microsoft's HLSL. You're suggesting Valve waste more time compiling and unifying for a proprietary language, for a new series of cards that wasn't particularly going anywhere. Wow.

1-3. Do you seriously think Valve would spend time on an architecture that's so unbalanced in the first place, just to make it slightly less of an epic fail? ATI or not, they would NOT have juggled a myriad of shader languages while trying to keep artistic cohesion. Would FP16 have made the nVidia cards fly anyway? There's little point compared to the DX8.1 path. Just when you thought the days of Glide vs. whatever were over, you expect them to pull this BS?

2-1. Switching HL2 to ATI's codepath put the GeForce FX at FP32. Again, there is NO official FP16 codepath at all. The FXes running at FP32 were epic fails.

2-2. Who cares? nVidia, of course. If you didn't get the memo, illegal shader code swapping was the rageeeee.

    So nVidia began swapping.
    And swapping.
    And swapping.
Oh, just in 3DMark03, by the way. The nature test. They didn't have the balls to approach Valve and ask whether THEY could code the Cg path with partial precision (like ATI did with whatever DX10.1 game: they analyze the code and give suggestions); they just kept silent. And kept swapping for benchmark scores. Cliff's notes version: cheaters.

2-3. MS chose FP24 for a reason. nVidia still recommended FP32 be used for a reason. I think I saw plenty of texture color banding and such, although the game was playable. Claiming that you just need to code some alien shader path separately: is that really justified?
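On the banding point: FP16 stores a 10-bit mantissa against 16 bits for ATI's FP24 and 23 for FP32, so shades get quantized much more coarsely. A quick CPU-side sketch of the round-off; this is an approximate model that ignores the implicit leading bit, and the FP24 width is the commonly cited R300 layout, so treat the exact widths as an assumption.
Code:
// Approximate model of rounding a shade to a given mantissa width.
// Ignores the implicit leading bit, so it is slightly conservative.
#include <cstdio>
#include <cmath>

static double quantize(double v, int bits) {
    if (v == 0.0) return 0.0;
    int e;
    double m = std::frexp(v, &e);          // v = m * 2^e, m in [0.5, 1)
    double scale = std::ldexp(1.0, bits);  // 2^bits
    return std::ldexp(std::round(m * scale) / scale, e);
}

int main() {
    const int   mantissa[] = { 10, 16, 23 };  // FP16, FP24 (R300), FP32
    const char* name[]     = { "FP16", "FP24", "FP32" };

    double shade = 0.7312891;  // a mid-grey after a few lighting multiplies
    for (int i = 0; i < 3; ++i) {
        double q = quantize(shade, mantissa[i]);
        std::printf("%s: %.9f (error %+.2e)\n", name[i], q, shade - q);
    }
    return 0;
}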

Now that ATI cards have every DirectX spec-compliant feature (and more, of course, with 11), there is no reason for such AA bull**** to happen. Trying to justify the GeForce FX HL2 fiasco with this, AND sympathizing with nVidia, is incredible (and of course, hilarious from my side).

3-1. Good God. Don't you know you're hitting a vRAM capacity bottleneck? Most non-CryEngine 2 engines use MORE vRAM under DX10, and you're running at 1080p with 4xAA. And you call it an engine problem.



I wouldn't have an issue if you didn't act like you were speaking the ultimate truth. And to think that people would QFT you on this.


Oh, speaking of GT300, I presume that when the reviews come, the review thread itself will stay in the news section, eh?


P.S.: On a less hostile note, Catalyst 9.9 seems to have fixed the HDTV modes. I can't claim 100% accuracy as I don't connect to one, but I think I remember some positive chatter about that over a supposedly bad driver release.
    Last edited by Macadamia; 09-29-2009 at 04:10 AM.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  9. #9
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by DilTech View Post
    Actually Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.


    Well, about that whole AA thing...

Most UE3 titles have problems with AA. I've been playing back through Gears of War PC on my 4850, and turning on DX10 + AA results in the frame rate dipping below 20 fps, sometimes even further. Playing with my old 8800GTX it stayed above 50, always. 1920x1080.....Well, 1920x1080 on the NVidia; ATi still hasn't fixed their HDTV support, and it's been YEARS, it reverts back to 1080i in quite a few games.

    Basically, I'm just saying, until we know all the facts here it's just random flaming and speculation. It may truly have issues in certain parts of the game on ATi hardware.
Your problems with HDTVs and ATI cards are not what the topic is about; it could be a specific ATI-card-to-HDTV-model compatibility problem, which does happen even with monitors.
I have no problem with my ATI card on my HDTV.

Gears of War is a dog to run in DX10, and that's still not my point; as I have already said, there is a problem, but paying for AA support is not what the future should be, because it was not like that in the past.
    Last edited by Final8ty; 09-29-2009 at 04:07 AM.

  10. #10
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by DilTech View Post
    Actually Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.

You see, it's a little-known fact, but during the finalization of the DX9 specification NVidia was fighting for FP32 to be the standard while ATi was fighting for FP24. FP32 required 33% more power than FP24 but didn't bring enough to the table, when combined with the rest of the specification, for Microsoft to go for it, and as such Microsoft took FP24. That was the first huge strike against the FX series, which naturally used 33% more horsepower to do the same effects.

Switching the FX series to the ATi code path took it out of FP32 mode and put it into FP16, which made it run DX9 100% fine in HL2, as no effect in HL2 required more than FP16. It was pointed out to Valve on countless occasions before release, and again after release... They never did anything.

    Same thing you're all complaining about now.
Not the same thing, for a simple reason: why does HL2 use FP24/32? Because the minimum spec for DX9.0 is FP24! The FX series could only do FP16/FP32; that's Nvidia's fault! Valve used FP32 for HL2 on the FX series because the DX9.0 spec commands it and the FX series can't do FP24. So much for rewriting history.
