
Thread: Ian McNaughton goes out against The Way it's Meant to be Played

  1. #151
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by aka1nas View Post
Absolutely, it's called a vendor supporting their product. Why would you as a software developer want to work with advanced functionality (only available from one of the two vendors) when they won't give you the time of day with regards to support?
Read the whole thread, my friends; I already responded about that!
A. Creed was a TWIMTBP game. End of story. Exactly the same story as Batman: AA.

    Quote Originally Posted by aka1nas View Post
    You can't really excuse AMD/ATI for "not being big enough" or not being in a financial position to provide partner support for their products. This is a basic part of being a technology vendor and they have failed miserably in both incarnations of the company.
So you keep repeating again and again that AMD doesn't support developers, because some folks said so. There are already tons of links in this thread showing that they do.
Or just go here and you will see: http://game.amd.com/us-en/play_games.aspx
It just seems that Nvidia pays much more to gain support!

  2. #152
    Xtreme Member
    Join Date
    Oct 2004
    Posts
    171
If you needed more proof of Nvidia's business practices:

    "...starting with the Release 186.00 of its graphics drivers - nVidia disabled PhysX if you have an ATI Radeon card in the same system."
    http://www.brightsideofnews.com/news...evelopers.aspx

  3. #153
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Farinorco View Post
As JohnJohn has said, I said "consumers", not "NVIDIA consumers".

I don't know about others, but I consider myself a consumer of certain kinds of products, not of brands. I'm a graphics hardware consumer, not an NVIDIA or ATI consumer.

As a consumer, when I want to buy a new piece of hardware, I choose the product that I think is best for my budget, be it from NVIDIA, AMD, Intel, or VIA. So if someone is artificially making some software not work fully with certain hardware that I could normally choose, when that software is perfectly compatible with that hardware, that someone is screwing me. As a consumer.



    Think about this hypothetical situation:

AMD has put money (work, code, it's the same) into the developers of DIRT 2 to use DX11 code. So they do the same thing NVIDIA did with Batman AA: the DX11 features (or some of them) can't be enabled on NVIDIA hardware (for when they have some).

Even if that were possible (I don't think AMD is in a position to do something like that, as is the case with NVIDIA)... wow, GREAT. Now we have fixed everything. Now not only can I not use AA in B:AA if I have ATi, but I can't use DX11 in DIRT 2 if I have NVIDIA. Excellent! My situation as a consumer has improved hugely since AMD "has improved their relationships with developers to the degree of NVIDIA".

    The problem here is not the relationships with the developers, but what kind of things should/shouldn't be allowed to be done with those relationships.
    Well put.

  4. #154
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by aka1nas View Post
It's obviously not sufficient, as most of the modern deferred-rendering engines have huge issues with enabling AA and require vendor-specific tweaks.
So what you're saying is that if NV had not sponsored Batman, then there would be no AA for anyone in the game.

Well, that's a new one... I didn't know that AA in games has always needed to be sponsored over the years, or will from now on.

  5. #155
    Registered User
    Join Date
    Jan 2007
    Posts
    94
    Quote Originally Posted by Final8ty View Post
So what you're saying is that if NV had not sponsored Batman, then there would be no AA for anyone in the game.

Well, that's a new one... I didn't know that AA in games has always needed to be sponsored over the years, or will from now on.
    Did you actually play any of the first dozen or so UE3-based games? No working AA at all on release in many cases. Deferred-rendering broke most contemporary AA methods.
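For anyone wondering why deferred rendering and MSAA don't mix, here is a rough D3D9-level sketch of the limitation (purely illustrative: the resolution, formats and function name are made up, and this is not code from UE3 or any title in this thread):

Code:
#include <d3d9.h>

// Why D3D9-era deferred renderers struggle with MSAA, in two calls.
void CreateTargets(IDirect3DDevice9* device)
{
    // A multisampled render-target *surface* is easy to create...
    IDirect3DSurface9* msaaSurface = nullptr;
    device->CreateRenderTarget(1920, 1080, D3DFMT_A8R8G8B8,
                               D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                               &msaaSurface, nullptr);

    // ...but a deferred renderer has to *sample* its G-buffer later, which
    // means render-target textures, and CreateTexture() has no multisample
    // parameter at all: textures are always single-sample in D3D9.
    IDirect3DTexture9* gbufferTex = nullptr;
    device->CreateTexture(1920, 1080, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT,
                          &gbufferTex, nullptr);

    // So the lighting pass only ever sees one sample per pixel, the usual
    // "render multisampled, resolve at the end" model breaks down, and you
    // get the vendor-specific workarounds described above.
}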
    i7 920 @ 4.2Ghz
    Asus P6T6 Revolution
    3x GTX260
    6x2GB Corsair DDR3-1600
    G.Skill Falcon 128GB SSD
G.Skill Titan 128GB SSD
Seagate 7200.11 1.5TB
    Vista 64 Ultimate

  6. #156
    Xtreme Addict
    Join Date
    Aug 2008
    Location
    Hollywierd, CA
    Posts
    1,284
    Quote Originally Posted by DilTech View Post
First, ATi did try to do their own program... The problem is, they can't afford the test lab that NVidia offers to every developer who signs up to the program. It failed absolutely miserably, and outside of HL2 and Guild Wars, it didn't get into any hot-selling games IIRC. Sure, they took CoJ from NVidia, but how many copies did that title even sell? Most people just downloaded it and benchmarked it, and if you remember, ATi had the developers force shader AA even though for that title it didn't make a SINGLE visible change, while causing a much bigger hit when enabled on NVidia hardware. ATi also pulled a few things in HL2 that you could fix by making the game think you were running an ATi card when you were running an FX card. In other words, ATi attempted to strong-arm as well; the problem is they don't have the backing to do such things.

Now, on with the show...

You see, while it's sad that NVidia got added features put into the new Batman, one must remember that, like it or not, the new Batman IS one of the big games this year. The fact that there are features not working on ATi's cards is something people reading the reviews SHOULD know; whether they claim it to be fair or unfair, it STILL EXISTS. As reviewers, it's their responsibility to inform the consumer of issues that exist with a card and various titles... By not doing so, they prove themselves biased...

Moving right along, about the whole RE5 thing: it's not exactly like NVidia has this massive frame boost in said title over ATi. ATi apparently just has issues in their driver in said title (wouldn't know, I beat it on console back in March). As such, pointing this out and claiming it's unfair is a joke. You see, NVidia is putting out quite a bit of cash and hardware to test their hardware on it early... If ATi did the same (like they tried to do once, but were WAY too picky about who they'd work with), then they'd be in the same boat. Basically, NVidia is reinvesting some of that cash you spend on your card to make sure it works properly with games, and I see nothing wrong with that.

As for NFS: Shift... It's EA. Seriously... You aren't going to get them to change the code of their engine because of how it reacts with your hardware. You surely aren't going to get them to fix bugs; they won't even do that on closed console games! Even more so if you aren't going to help them do it. Want it working right on your hardware? Fix it in your drivers. There's really no reason why it should interact fine with one line of DX10 hardware and not the other, unless the problem lies with the driver and NOT the programming... Remember, DX10 was created so as to NOT need separate code paths for the two vendors, yet they want the code changed so it runs better on their hardware?

Man, I just realized how snobby this all sounded, but again I point out that I do have an ATi card. This whole "well, this company did this, this, and this" attitude between the two gets us nowhere. Shut up and let your hardware speak for you; don't give us excuses because you can't do this or that... Seriously.
    QFT

Has everyone forgotten that ATI just gave $1 mil to Codemasters? They do the exact same thing, but I would consider them much less open about it. Nvidia announces their partners and they place a [nVIDIA TWIMTBP] logo on the box, so the consumer knows Nvidia had a hand in its development. ATI does no such thing, but it gives them the ability to claim the high ground. The AA issue in B:AA is a bit sad, and should be mentioned in reviews, but otherwise I see no news here...

    I am an artist (EDM producer/DJ), pls check out mah stuff.

  7. #157
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by aka1nas View Post
    Did you actually play any of the first dozen or so UE3-based games? No working AA at all on release in many cases. Deferred-rendering broke most contemporary AA methods.
I'm not disputing that fact.
The fact is AA has been common for years, and if it has now become a problem and needs paying for its implementation, then developers have taken a leap backwards, and it's another thorn in the side of PC gaming versus consoles, like the PC needs any help with gamers going in that direction.

  8. #158
    Xtreme Enthusiast
    Join Date
    Jan 2008
    Posts
    614
    Wow that is messy. If this kind of thing is allowed to go unregulated, the consumer ends up being screwed by companies.
    Modded Cosmos. | Maximus II Formula. Bios 1307| 2x2 Mushkin XP ASCENT 8500 | Q9550-E0- 4.10 + TRUE | Visiontek HD4870X2 | LN32A550 1920x1080 | X-FI Extreme Gamer | Z5300E | G15v.1 | G7 | MX518 | Corsair HX1000 | X25-MG2 80G | 5xHDD
    ____________________________________
    Quote Originally Posted by saaya View Post
    most people dont care about opencl, physix, folding at home and direct compute... they want cool explosions and things blowing up and boobs jumping around realistically... .

  9. #159
    Registered User
    Join Date
    Jan 2007
    Posts
    94
    Quote Originally Posted by Final8ty View Post
I'm not disputing that fact.
The fact is AA has been common for years, and if it has now become a problem and needs paying for its implementation, then developers have taken a leap backwards, and it's another thorn in the side of PC gaming versus consoles, like the PC needs any help with gamers going in that direction.
    Fair enough, I think we unfortunately have taken that leap back on the PC side lately. The consolers won't notice the lack of real AA on those titles, the devs can just throw a blur shader effect over it and call it a day.
    i7 920 @ 4.2Ghz
    Asus P6T6 Revolution
    3x GTX260
    6x2GB Corsair DDR3-1600
    G.Skill Falcon 128GB SSD
G.Skill Titan 128GB SSD
Seagate 7200.11 1.5TB
    Vista 64 Ultimate

  10. #160
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
    Quote Originally Posted by Farinorco View Post
As JohnJohn has said, I said "consumers", not "NVIDIA consumers".

I don't know about others, but I consider myself a consumer of certain kinds of products, not of brands. I'm a graphics hardware consumer, not an NVIDIA or ATI consumer.

As a consumer, when I want to buy a new piece of hardware, I choose the product that I think is best for my budget, be it from NVIDIA, AMD, Intel, or VIA. So if someone is artificially making some software not work fully with certain hardware that I could normally choose, when that software is perfectly compatible with that hardware, that someone is screwing me. As a consumer.



    Think about this hypothetical situation:

AMD has put money (work, code, it's the same) into the developers of DIRT 2 to use DX11 code. So they do the same thing NVIDIA did with Batman AA: the DX11 features (or some of them) can't be enabled on NVIDIA hardware (for when they have some).

Even if that were possible (I don't think AMD is in a position to do something like that, as is the case with NVIDIA)... wow, GREAT. Now we have fixed everything. Now not only can I not use AA in B:AA if I have ATi, but I can't use DX11 in DIRT 2 if I have NVIDIA. Excellent! My situation as a consumer has improved hugely since AMD "has improved their relationships with developers to the degree of NVIDIA".

    The problem here is not the relationships with the developers, but what kind of things should/shouldn't be allowed to be done with those relationships.
I agree for the most part..

But you're not screwed by ATI making DiRT 2 use DX11. Nvidia has had (and is having) problems releasing their DX11 cards, but they WILL be out; it's not as if, by the game having DX11, Nvidia will never ever be able to use it :/
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  11. #161
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by Jamesrt2004 View Post
But you're not screwed by ATI making DiRT 2 use DX11. Nvidia has had (and is having) problems releasing their DX11 cards, but they WILL be out; it's not as if, by the game having DX11, Nvidia will never ever be able to use it :/
I think his point is: since ATI gave resources to develop DX11 in DiRT 2, why couldn't they tell Codemasters to disable it on Nvidia cards even though they could use it?
It would be just as "fair" as Rocksteady disabling AA in Batman: AA although ATI cards can use it (because someone gave them money to help develop it...).

  12. #162
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Vienna View Post
    Wow, you really remember your history wrong.

    The nVidia cards (FX 5xxx series) were automatically put on the DX8.1 path instead of the "proper" "ati" shader path as you put it, or rather the DX9.0 shader path, because performance on the DX9.0 shader path was HORRIBLE on the nVidia cards.

So, yes, a bit of image quality (DX8.1 vs DX9.0) on the FX 5xxx series was sacrificed, but for good reason.
Actually, Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.

You see, it's a little-known fact, but during the finalization of the DX9 specification, NVidia was fighting for FP-32 to be the standard while ATi was fighting for FP-24. FP-32 required 33% more power than FP-24 but didn't bring enough to the table, when combined with the rest of the specification, for Microsoft to go for it, and as such Microsoft took FP-24. That was the first huge strike against the FX series, which naturally used 33% more horsepower to do the same effects.

Switching the FX series to the ATi code path took it out of FP-32 mode and put it into FP-16, which made it run DX9 100% fine in HL2, as no effect in HL2 required more than FP-16. It was pointed out to Valve on countless occasions before release, and again after release... They never did anything.

    Same thing you're all complaining about now.

    Quote Originally Posted by Final8ty View Post
I'm not disputing that fact.
The fact is AA has been common for years, and if it has now become a problem and needs paying for its implementation, then developers have taken a leap backwards, and it's another thorn in the side of PC gaming versus consoles, like the PC needs any help with gamers going in that direction.
Well, about that whole AA thing...

Most UE3 titles have problems with AA. I've been playing back through Gears of War PC on my 4850, and turning on DX10 + AA results in the frame rate dipping below 20 fps, sometimes even further. Playing with my old 8800GTX it stayed above 50, always, at 1920x1080p... Well, 1920x1080p on the NVidia; ATi still hasn't fixed their HDTV support, and it's been YEARS, it reverts back to 1080i in quite a few games.

Basically, I'm just saying, until we know all the facts here it's just random flaming and speculation. It may truly have issues in certain parts of the game on ATi hardware.
    Last edited by DilTech; 09-29-2009 at 02:31 AM.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  13. #163
    Registered User
    Join Date
    May 2009
    Location
    Amsterdam
    Posts
    45
Wow, some people really seem to want to miss the point.
Blaming ATI because ATI does not spend enough time "helping" developers is ridiculous. ATI should help developers more, but that does not change the fact that it is unacceptable to stop a piece of code from running on hardware based solely on a vendor ID check.

I helped code a few engines for small DirectX 9 games, and let me assure you that a developer is never supposed to base a feature check on just a hardware vendor ID check.
That is just stupidity.

The sole reason a hardware vendor ID check would be used is when you want certain features to only work on hardware from company X.
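To make the distinction concrete, here is roughly what the two checks look like in Direct3D 9 (a minimal sketch with made-up names; it assumes an IDirect3D9* created with Direct3DCreate9 and is not taken from Batman: AA or any other title):

Code:
#include <d3d9.h>

bool CanEnableMsaa(IDirect3D9* d3d9)
{
    // The objectionable version: gate the feature on who made the GPU.
    D3DADAPTER_IDENTIFIER9 id = {};
    d3d9->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    bool vendorGate = (id.VendorId == 0x10DE);   // 0x10DE = NVIDIA, 0x1002 = ATI

    // The normal version: ask the runtime whether the device can actually do it.
    DWORD qualityLevels = 0;
    bool capsCheck = SUCCEEDED(d3d9->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A8R8G8B8,
        TRUE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, &qualityLevels));

    (void)vendorGate;   // shown only for contrast; a vendor gate has no place here
    return capsCheck;
}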


If Nvidia helped this developer get AA to work, that is very nice of Nvidia. That does not give either Nvidia or the developer the right to make it NOT work on different hardware. That is just screwing over your customer base.

So fugger, explain to me: what if ATI started spending more effort and money on developer relations and also resorted to nasty tricks like this? A world in which every game is crippled on either Nvidia or ATI hardware. Does that sound ideal to you?

  14. #164
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Sydney, Australia
    Posts
    166
AMD prides itself on supporting open standards and our goal is to advance PC gaming regardless of whether people purchase our products.
As pointed out by another, it's AMD, not ATI.
    Batman: Arkham Asylum
In this game, Nvidia has an in-game option for AA, whereas gamers using ATI graphics cards...
*DON'T!!!* ...are required to force AA on in the Catalyst Control Center.
So it works, but only the Nvidia boys get this option in-game.
    The advantage of in-game AA is that the engine can run AA selectively on scenes whereas Forced AA in CCC is required to use brute force to apply AA on every scene and object, requiring much more work.
Nice way to decrease ATI consumers' performance, Nvidia.
Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo. By tricking the application, we were able to get the in-game AA option, where our performance was significantly enhanced. This option is not available for the retail game as there is SecuROM.
    This is the golden quote though, read it carefully and fully appreciate what has been done here, in particular to the innocent ATI users.
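For reference, the "selectively" part of that quote maps onto a real D3D9 mechanism. Here is a rough sketch of what engine-controlled AA can do and a driver override can't (the pass names are made up; this is not Rocksteady's code):

Code:
#include <d3d9.h>

// Assumes the back buffer was created with D3DMULTISAMPLE_4_SAMPLES.
void RenderFrame(IDirect3DDevice9* device)
{
    // The engine knows which passes benefit from multisampling...
    device->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, TRUE);
    // ... draw the world geometry here ...

    // ...and can switch it off where it is wasted effort.
    device->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, FALSE);
    // ... draw full-screen post-processing and the UI here ...
}

// A driver-level override (forcing AA in CCC) knows nothing about the frame,
// so it multisamples every scene and object, which is the extra cost the
// quote describes.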

    Anybody who defends these business practices is ANTI-CONSUMER. <--- full stop
    CoolerMaster Stacker 830SE|Antec Signature 850W|Gigabyte X58A-UD5 F5 slic2.1
    Intel Core i7 930 16x200@3,200Mhz|vcore 1.14|Intel Stock CPU Cooler
    GSKILL DDR3 Perfect Storm 2000 @6-6-6-16-1T 1600Mhz|ATI 5870 1024MB 850/1200
    Windows 7 Ultimate x64 bootdisk: Crucial RealSSD-C300 128GB SATA-III

  15. #165
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by DilTech View Post
Actually, Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.

1. You see, it's a little-known fact, but during the finalization of the DX9 specification, NVidia was fighting for FP-32 to be the standard while ATi was fighting for FP-24. FP-32 required 33% more power than FP-24 but didn't bring enough to the table, when combined with the rest of the specification, for Microsoft to go for it, and as such Microsoft took FP-24. That was the first huge strike against the FX series, which naturally used 33% more horsepower to do the same effects.

2. Switching the FX series to the ATi code path took it out of FP-32 mode and put it into FP-16, which made it run DX9 100% fine in HL2, as no effect in HL2 required more than FP-16. It was pointed out to Valve on countless occasions before release, and again after release... They never did anything.

Same thing you're all complaining about now.



Well, about that whole AA thing...

3. Most UE3 titles have problems with AA. I've been playing back through Gears of War PC on my 4850, and turning on DX10 + AA results in the frame rate dipping below 20 fps, sometimes even further. Playing with my old 8800GTX it stayed above 50, always, at 1920x1080p... Well, 1920x1080p on the NVidia; ATi still hasn't fixed their HDTV support, and it's been YEARS, it reverts back to 1080i in quite a few games.

Basically, I'm just saying, until we know all the facts here it's just random flaming and speculation. It may truly have issues in certain parts of the game on ATi hardware.
I logged into this account because correcting mods is fun, and there really are parts of what you said that are skewed beyond proportion, and then there are the, uhh... challenged people who will just QFT you without even reading carefully.

1-1. Wrong. MS took FP24 partly because nVidia refused to license chip IP to Microsoft, which forced MS to BUY chips for the Xbox from nVidia while costs stayed roughly the same and no die shrink could be done, i.e. no Xbox Slim ever. This was also partly why MS was in the red so badly on the original Xbox. Part of why nVidia didn't catch wind of it was miscommunication, AND MS taking a deserved revenge stance.

1-2. FP16 was NEVER on the table for DX9; it was only used by nVidia's proprietary Cg shading language. Thus developers could not target FP16 if they wanted to use Microsoft's HLSL. You're suggesting Valve should waste more time compiling and unifying for a proprietary language for a new series of cards that weren't particularly going anywhere. Wow.

1-3. Do you seriously think Valve would spend time on an architecture that's so unbalanced in the first place just to make it slightly less of an epic fail? ATI or not, they would NOT have used a myriad of shader languages and tried to keep artistic cohesion. Would FP16 have made the nVidia cards fly anyway? There's little point compared to the DX8.1 path. Just when you thought the days of Glide vs. whatever were over, you expect them to do this BS?

2-1. Switching HL2 to ATI's code path put the GeForce FX at FP32. Again, there is NO official FP16 code path at all. The FXes running at FP32 were epic fails.

2-2. Who cares? nVidia, of course. If you didn't get the memo, illegal shader code swapping was all the rage.

So nVidia began swapping.
And swapping.
And swapping.
Oh, just in 3DMark03, by the way. The Nature test. They didn't have the balls to approach Valve and ask whether it was possible for THEM to code the Cg path with partial precision (like ATI does with whatever DX10.1 game: they analyze the code and give suggestions); they just kept silent. And kept swapping for benchmark scores. Cliff's Notes version: cheaters.

2-3. MS chose FP24 for a reason. nVidia still recommended FP32 be used for a reason. I think I saw lots of texture color banding and such, although the game was playable. Claiming that you just need to code some alien shader code separately... is that really justified?

Now that ATI cards have every DirectX spec-compliant feature (and more, of course, with 11), there is no reason for such AA bull**** to happen. To try to justify the GeForce FX HL2 fiasco with this, AND sympathize with nVidia, is incredible (and of course, hilarious from where I'm sitting).

3-1. Good God. Don't you know you're hitting a VRAM capacity bottleneck? Most non-CryEngine 2 engines use MORE VRAM under DX10, you're running at 1080p with 4xAA, and you call it an engine problem.



I wouldn't have an issue if you didn't act like you were speaking the ultimate truth. And to think that people would QFT you on this.


Oh, speaking of GT300, I presume that when the reviews come, the review thread itself will stay in the news section, eh?


P.S. On a less hostile note, Catalyst 9.9 seems to have fixed the HDTV modes. I can't claim 100% accuracy as I don't connect to one, but I think I remember some positive chatter about that regarding a supposedly bad driver release.
    Last edited by Macadamia; 09-29-2009 at 04:10 AM.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  16. #166
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by DilTech View Post
Actually, Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.


Well, about that whole AA thing...

Most UE3 titles have problems with AA. I've been playing back through Gears of War PC on my 4850, and turning on DX10 + AA results in the frame rate dipping below 20 fps, sometimes even further. Playing with my old 8800GTX it stayed above 50, always, at 1920x1080p... Well, 1920x1080p on the NVidia; ATi still hasn't fixed their HDTV support, and it's been YEARS, it reverts back to 1080i in quite a few games.

Basically, I'm just saying, until we know all the facts here it's just random flaming and speculation. It may truly have issues in certain parts of the game on ATi hardware.
Your problems with HDTVs and ATI cards are not what this topic is about; it could be a specific ATI-card-to-HDTV-model compatibility problem, which does happen even with monitors.
I have no problem with my ATI card on my HDTV.

Gears of War is a dog to run in DX10, and it's still not my point, as I have already said that there is a problem; but paying for AA support is not what the future should be, because it was not like that in the past.
    Last edited by Final8ty; 09-29-2009 at 04:07 AM.

  17. #167
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by DilTech View Post
Actually, Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.

You see, it's a little-known fact, but during the finalization of the DX9 specification, NVidia was fighting for FP-32 to be the standard while ATi was fighting for FP-24. FP-32 required 33% more power than FP-24 but didn't bring enough to the table, when combined with the rest of the specification, for Microsoft to go for it, and as such Microsoft took FP-24. That was the first huge strike against the FX series, which naturally used 33% more horsepower to do the same effects.

Switching the FX series to the ATi code path took it out of FP-32 mode and put it into FP-16, which made it run DX9 100% fine in HL2, as no effect in HL2 required more than FP-16. It was pointed out to Valve on countless occasions before release, and again after release... They never did anything.

Same thing you're all complaining about now.
Not the same thing, for a simple reason: why does HL2 use FP24/FP32? Because the minimum spec for DX9.0 is FP24! The FX series could only do FP16/FP32; that's Nvidia's fault! Valve used FP32 for HL2 with the FX series because the DX9.0 spec demands at least FP24 and the FX series can't do that. End of rewriting history.

  18. #168
    Registered User
    Join Date
    May 2005
    Posts
    3,691
A quick Google search will verify what I'm talking about with the HDTV issue... Go ahead, fire up Stalker: Clear Sky or Gears of War PC on your ATi card hooked to an HDTV via HDMI, and have your TV tell you what it's running at when you select 1080... Bet it says 1080i. Tested on 2 HDTVs, with 2 separate ATi cards, 2 different HDMI adapters, and Cat 9.9...

I know how to fix it, but I need a few ATi driver gurus to help; I have a thread going in the ATi section.

Either way, it's off topic; I'm just pointing out that ATi has more important issues to worry about, and Gears of War was pointed out because it's also a UE3 title with issues on ATi cards when it comes to running AA.

Also, half precision was allowed with DX9. Of course, who cares, the FX cards sucked anyway. The truth is, though, ATi has done the same as NVidia in said situation. It literally would have taken only 2 seconds to set the FX series to half precision, with no loss in quality.
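For context, "setting it to half precision" on the D3D9 toolchain roughly means compiling the shaders with the partial-precision flag, something like the sketch below (the shader source and function name are invented; this says nothing about whether Valve could or should have shipped it):

Code:
#include <d3d9.h>
#include <d3dx9.h>   // link with d3dx9.lib

// A made-up one-line ps_2_0 shader, standing in for a real effect.
const char kSrc[] =
    "float4 main(float2 uv : TEXCOORD0) : COLOR0 { return float4(uv, 0, 1); }";

HRESULT CompileAtPartialPrecision(LPD3DXBUFFER* shader, LPD3DXBUFFER* errors)
{
    // D3DXSHADER_PARTIALPRECISION marks the generated instructions with the
    // _pp modifier. GeForce FX parts run those at FP16 instead of FP32, while
    // hardware with a single precision mode (e.g. R300's FP24 pipeline) just
    // ignores the hint.
    return D3DXCompileShader(kSrc, sizeof(kSrc) - 1,
                             nullptr, nullptr, "main", "ps_2_0",
                             D3DXSHADER_PARTIALPRECISION,
                             shader, errors, nullptr);
}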
    Last edited by DilTech; 09-29-2009 at 04:27 AM.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  19. #169
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by DilTech View Post
Either way, it's off topic; I'm just pointing out that ATi has more important issues to worry about, and Gears of War was pointed out because it's also a UE3 title with issues on ATi cards when it comes to running AA.
Sure, a winter 2006 title is a much more important issue to fix than a winter 2009 title.

    Quote Originally Posted by DilTech View Post
Also, half precision was allowed with DX9. Of course, who cares, the FX cards sucked anyway. The truth is, though, ATi has done the same as NVidia in said situation.
Lol! Your argument was HL2, and it has been proven wrong. HL2 just respects the DX9.0 spec, and anyway you can use FP16 in HL2 via the command line for both ATI and Nvidia cards. What's the command line for using AA the same way Nvidia runs it in Batman: AA?

    Quote Originally Posted by DilTech View Post
It literally would have taken only 2 seconds to set the FX series to half precision, with no loss in quality.
Sure the quality is the same? 2 seconds of Google searching!
    http://www.neowin.net/forum/index.ph...&pid=585020109

  20. #170
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by AbelJemka View Post
Sure, a winter 2006 title is a much more important issue to fix than a winter 2009 title.
It happens in Stalker: Clear Sky as well, and other titles, I'm sure. I haven't bought too many games as of late because most haven't been worth playing, period, so I can't test more, but it's not just ONE game. It even happens to some people when watching Blu-ray movies!!! Read up on the ATi under/overscan issue with HDTVs; it's caused by the video card reverting to 1080i. There are threads all over AMD's actual forum about the issue.

This has been an ongoing issue with ATi cards for quite some time. I've even told them how to fix it, yet they ignore it altogether.

Quote Originally Posted by AbelJemka View Post
Lol! Your argument was HL2, and it has been proven wrong. HL2 just respects the DX9.0 spec, and anyway you can use FP16 in HL2 via the command line for both ATI and Nvidia cards. What's the command line for using AA the same way Nvidia runs it in Batman: AA?

Sure the quality is the same? 2 seconds of Google searching!
http://www.neowin.net/forum/index.ph...&pid=585020109
Maybe, maybe not. See, I didn't use an FX card (I had a 9800 Pro in those days). I do remember the huge talk at Guru3D, and people there claiming no quality loss. If that's not the case, then my mistake; obviously people on forums didn't know what they were talking about.
    Last edited by DilTech; 09-29-2009 at 05:10 AM.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  21. #171
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    So, what's the last word on Macadamia vs. Diltech? I'm curious to know who is more right.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  22. #172
    Registered User
    Join Date
    May 2005
    Posts
    3,691
Oh, and Macadamia, I'm running a 1GB 4850, and the 8800GTX had 768MB of RAM. We all know the UE3 engine isn't a VRAM hog... So how exactly am I VRAM-restricted in Gears of War, which, mind you, is the same engine as Batman?

I'll give you the benefit of the doubt on the rest, as I didn't run an FX series card, but you're outright wrong on the issue with Gears...
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  23. #173
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    Quote Originally Posted by DilTech View Post
Oh, and Macadamia, I'm running a 1GB 4850, and the 8800GTX had 768MB of RAM. We all know the UE3 engine isn't a VRAM hog... So how exactly am I VRAM-restricted in Gears of War, which, mind you, is the same engine as Batman?
BAA doesn't run on the same engine as GOW: GOW 1 is UE3, GOW 2 is UE3.25, and BAA is UE3.5.

  24. #174
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Toronto, Canada
    Posts
    1,938
What's happening here is a console-type war on a once-open platform. AA and high resolutions are the last two things that PC gamers have to set their experience apart from playing the same games on consoles. Now, if you don't have a certain card... you don't get the same in-game support? That's the reason why my PS3 game collection just keeps getting larger while my PC game collection has been at a standstill since BF2.
    GB 790XTA UD4
    GSkill Pi Black 2000 Cas9
    ASUS 4870
    Enermax Revolution 1050+





    http://www.heatware.com/eval.php?id=67661

  25. #175
    Xtreme Addict
    Join Date
    Jun 2005
    Posts
    1,095
OK, so before it gets lost in the noise, let's recap the accusations here. So you guys are saying:

1- NVidia paid money to Eidos, the makers of Batman: AA, to deliberately cripple ATI graphics cards' performance by not enabling the in-game AA option for ATI cards.
2- Eidos, which depends on gamers with every brand of graphics card, said: "Wow! What a great idea! Let's abandon half of our customers and risk being exposed, boycotted and possibly sued" and took the deal. Because, as we all know, only the xtreme people like the ones in this forum can uncover such a dastardly plot and realize there is a whole option missing for one brand of graphics card. Eidos was sure that no one would ever notice.
3- There is absolutely no other technical (or at least sensible) explanation for why the developers disabled in-game AA for ATI cards except to damage their performance. So no driver issues, no AA malfunction on ATI cards that made leaving AA to CCC the better option, nor any other explanation. The only reason is to be pure evil and kill ATI.
4- AMD knew this, but being the poor sissy boys constantly bullied by the evil giant NVidia, could not do anything about it. People who sue each other over the color of their pants just bent over and took it. No contacting Eidos and threatening to expose them, no filing complaints, nothing. Just a mere mention in a blog. So there is a new game being developed and ATI has no idea (they never get to view the code or test the game on their cards) that the AA option is disabled for their cards until the game hits the stores.

Just wanted to make sure whether it still looks as ridiculous when you put everything together.
