Results 101 to 125 of 444

Thread: Nvidia responds to Batman:AA

  1. #101
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Chumbucket843 View Post
I am assuming that these would be your posts.
    Haha, nope, mine would be gone too. I'm a developer, but not of games.

  2. #102
    Xtreme Addict
    Join Date
    Aug 2008
    Location
    Hollywierd, CA
    Posts
    1,284
    Quote Originally Posted by Mads321 View Post
I think that Nvidia should keep to their own and simply get the hell out of software development - PERIOD! Because this is the first and crucial step against open standards, which could end up in a scenario where one must have a certain GFX to be able to enjoy a certain game.
    Really? I have to say that when I spend money on a high-dollar Nvidia card, I enjoy knowing that Nvidia has spent lots of money, time, and effort making sure the card will work well with many of today's most popular games. I have no problem with Nvidia lending devs a hand with validation and testing; if the competition isn't devoted to ensuring their product will satisfy their customers, then so be it.

    Quote Originally Posted by Mads321 View Post
If you ask me, it should be illegal to work as closely together as Eidos and Nvidia have done in this case without explicitly "warning" buyers of both the game and the GFX that this partnership has been going on behind the curtains. If not, then it's only a matter of time before the aforementioned scenario of needing x GFX to play x game becomes a reality.

    So yeah, give Nvidia the finger, don't buy the game or any of the numerous Batman-themed or bundled Nvidia cards out there, and move on.
    Again, really? Nvidia DOES warn you that they helped with development; they place the [nVIDIA TWIMTBP] logo on the back of the box and in the opening credits of the game! Again, as an Nvidia customer, I like this. When I buy a game with the TWIMTBP logo, I know my card will have no problem playing the game with eye candy and awesome framerates. AMD simply GIVES devs money (Codemasters, DiRT 2, $1,000,000) for development, and no one would ever know without some research. Also, I believe the "slippery slope" argument is a bit stale here, as this has been going on for years from both sides, and the "x game for y hardware" prediction has never come true - not with HL2, FC2, DoW2, or B:AA.

    I am an artist (EDM producer/DJ), pls check out mah stuff.

  3. #103
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
I've never seen an ATI-branded game not work with NV parts, except for DX10.1 or 11 features; but if NV supported those, I'd bet they would work too.

    And having an NV logo on the game shouldn't mean a warning that if you have a non-NV card you will be missing basic features.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  4. #104
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Denmark
    Posts
    110
Really? I have to say that when I spend money on a high-dollar Nvidia card, I enjoy knowing that Nvidia has spent lots of money, time, and effort making sure the card will work well with many of today's most popular games. I have no problem with Nvidia lending devs a hand with validation and testing; if the competition isn't devoted to ensuring their product will satisfy their customers, then so be it.
    That's obviously not what I meant. Of course Nvidia should be working their asses off, ensuring new games run great on their cards - all GFX manufacturers should be.

    Where the line is crossed is when the game ends up with standard features working exclusively on Nvidia's GFX. That's what I mean when I say Nvidia should "simply get the hell out of software development".

    Yes, they should be there for optimizing purposes, NOT for making cheesy deals under the table that get standard features consciously disabled just to favour Nvidia's cards over the competition.

    Again, really? Nvidia DOES warn you that they helped with development; they place the [nVIDIA TWIMTBP] logo on the back of the box and in the opening credits of the game! Again, as an Nvidia customer, I like this. When I buy a game with the TWIMTBP logo, I know my card will have no problem playing the game with eye candy and awesome framerates. AMD simply GIVES devs money (Codemasters, DiRT 2, $1,000,000) for development, and no one would ever know without some research. Also, I believe the "slippery slope" argument is a bit stale here, as this has been going on for years from both sides, and the "x game for y hardware" prediction has never come true - not with HL2, FC2, DoW2, or B:AA.
    Yes indeed, again! The TWIMTBP logo tells me that Nvidia has helped optimize the game, NOT consciously disabled features just to favour Nvidia's cards over the competition. That is exactly where the line is crossed.

    Who cares how much money they throw at the developers, as long as they don't expect their cards to run standard features exclusively.

  5. #105
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,838
If Nvidia didn't help them, the game would probably run worse for everyone, and the game just wouldn't have AA for anyone, like GTA 4.
    If you think there was some sabotage or conspiracy or something, you don't understand that developers typically have near-total control over their own games, and that developers typically want their game to be as good as possible. Because, you know, they get the vast majority of their money from sales of the game, not from hardware companies.

    So I'm really not buying the narrative that people are trying to create here: an ominous, evil Nvidia with immense control over how games are made, and an innocent ATI who could be doing so well in these games if it weren't for evil Nvidia and their shysterism.
    Last edited by grimREEFER; 09-29-2009 at 06:14 PM.
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis

  6. #106
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Alberta, Canada
    Posts
    1,264
    Quote Originally Posted by randomizer View Post
    This is what happens when you use deferred rendering. Same thing with STALKER, AA is dodgy.
    ^ This.

Rockstar did the same with their RAGE engine (which is why GTA 4 is a jaggy fest from the early '90s). Deferred rendering is ideal when you want high light counts, as the render cost for full-scene dynamic lighting is much, much lower. The devs usually consider that a fair compromise for all of those fancy lights and shadows. However, with DX10/11, AA *can* be supported (key word: can; as mentioned, GoW's PC DX10 AA support is sketchy).
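    To put a rough number on that "much lower render cost" claim, here is a back-of-the-envelope sketch (my own illustration with made-up numbers, not taken from UE3 or any real engine): forward shading pays roughly lights x shaded fragments (including overdraw), while deferred shading fills a G-buffer once and then pays lights x screen pixels; naive MSAA would multiply the deferred numbers by the sample count, which is exactly the cost these engines avoid by not offering in-game AA.

    Code:
    // Back-of-the-envelope cost model for forward vs. deferred shading.
    // Purely illustrative numbers; real engines differ wildly.
    #include <cstdio>

    int main() {
        const long long pixels   = 1920LL * 1080;  // screen resolution
        const long long overdraw = 3;              // avg. fragments shaded per pixel
        const long long lights   = 64;             // dynamic lights touching the scene
        const long long msaa     = 4;              // 4x MSAA samples

        // Forward: every shaded fragment evaluates every light affecting it.
        long long forward_cost  = pixels * overdraw * lights;

        // Deferred: fill the G-buffer once, then light each screen pixel per light.
        long long deferred_cost = pixels * overdraw          // G-buffer fill
                                + pixels * lights;           // lighting pass

        // Naive MSAA on a deferred renderer: G-buffer storage and the lighting
        // pass both scale with the sample count - the cost that gets skipped
        // when an engine simply doesn't expose in-game AA.
        long long deferred_msaa = pixels * msaa * overdraw
                                + pixels * msaa * lights;

        std::printf("forward          : %lld fragment-light evaluations\n", forward_cost);
        std::printf("deferred         : %lld (G-buffer + lighting)\n", deferred_cost);
        std::printf("deferred + 4xMSAA: %lld (every sample stored and lit)\n", deferred_msaa);
        return 0;
    }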

I still think their response is a tad dismissive and ignorant, but perhaps that is just me...

I remember BF2's release. Nvidia had a TWIMTBP ad inside the boxes saying something along the lines of "With GeForce 7 series GPUs you get advanced effects not available with the competition." Now, I believe in that case it was merely due to the lack of SM3 on current-gen ATI hardware, but it reminded me of this a tad.
    Feedanator 7.0
    CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
    LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i

  7. #107
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by grimREEFER View Post
If Nvidia didn't help them...the game just wouldn't have AA for anyone


Yep, and none of the complainers would care. But because Nvidia put in the time, effort, $$$, etc. to get it for their customers, all of a sudden it's a goddamn tragedy.

  8. #108
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    QLD
    Posts
    942
Honestly, as time goes by the ATI shills are getting more and more pathetic. Nvidia is a business, and frankly I'm surprised they don't do more underhanded back-alley dealing to get your three-sizes-too-small panties in a bunch. Just use the hack, be more mature with your complaints, or drop it already. The mindless partisanship that keeps getting more acute is really getting on my nerves.

  9. #109
    Xtreme Addict
    Join Date
    Feb 2004
    Posts
    1,176
Have to disagree and agree with some of you. Nvidia is about business, but what they fail to understand is that further polarizing the industry on yet more fronts just causes PC gaming to falter further from consumer frustration.

  10. #110
    Xtreme Member
    Join Date
    Apr 2004
    Posts
    100
    Quote Originally Posted by Shadov View Post
Honestly, I'm just worried this will end up in a huge auction house for game titles.

    Wanted to try Batman 2? Sorry buddy, you have an ATI Radeon card; this game runs only on GeForce. Interested in Dirt 3 or Alien vs Predator 2? Sorry, that game works only on ATI Radeon.

    While both companies would survive, this is very bad for consumers.

    Someone has to make it clear to Nvidia not to set such business practices before others adopt them and things start going downhill...
    /agree - and notice that with these "games meant to be played", any time there is a problem with an ATI driver that isn't ATI's fault, the developer never issues a fix... almost always denying that the bug even exists.

  11. #111
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by jstutman View Post
/agree - and notice that with these "games meant to be played", any time there is a problem with an ATI driver that isn't ATI's fault, the developer never issues a fix... almost always denying that the bug even exists.
    This just happened. I don't think you have one of the six.
    Quote Originally Posted by trinibwoy View Post
    Is there a filter for posts from people who don't know jack about game development and therefore shouldn't comment on these topics? This thread would be about 6 posts long.

  12. #112
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Location
    North USA
    Posts
    670
    Quote Originally Posted by trinibwoy View Post
    Is there a filter for posts from people who don't know jack about game development and therefore shouldn't comment on these topics? This thread would be about 6 posts long.
By all means, please enlighten all of us, oh master of knowledge. Your three troll posts in this thread so far haven't added much value to my life. Are you withholding information that would settle this debate once and for all?

    Why don't you tell us how this AA algorithm is both superior and, at the same time, inappropriate for ATI cards? Tech specs please; this is, after all, a technology forum!

    Now is your time to shine and show us all how much you know about game development!
    Asus P6T-DLX V2 1104 & i7 920 @ 4116 1.32v(Windows Reported) 1.3375v (BIOS Set) 196x20(1) HT OFF
    6GB OCZ Platinum DDR3 1600 3x2GB@ 7-7-7-24, 1.66v, 1568Mhz
    Sapphire 5870 @ 985/1245 1.2v
    X-Fi "Fatal1ty" & Klipsch ProMedia Ultra 5.1 Speaks/Beyerdynamic DT-880 Pro (2005 Model) and a mini3 amp
    WD 150GB Raptor (Games) & 2x WD 640GB (System)
    PC Power & Cooling 750w
    Homebrew watercooling on CPU and GPU
    and the best monitor ever made + a Samsung 226CW + Dell P2210 for eyefinity
    Windows 7 Utimate x64

  13. #113
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Canada
    Posts
    1,397
    Quote Originally Posted by zanzabar View Post
I've never seen an ATI-branded game not work with NV parts, except for DX10.1 or 11 features; but if NV supported those, I'd bet they would work too.

    And having an NV logo on the game shouldn't mean a warning that if you have a non-NV card you will be missing basic features.
    As opposed to already knowing that the game was built on Unreal Engine 3, and therefore missing a basic feature anyway?
    i7 2600K | ASUS Maximus IV GENE-Z | GTX Titan | Corsair DDR3-2133

  14. #114
    Xtreme Member
    Join Date
    Apr 2004
    Posts
    100
Granted, when I made my comment I didn't have "facts" to back it up, but anyone who has played a PC game from a large developer in the last several years should see how the "$$$$" comes before the games, with a few exceptions. Why so many patches for bugs that could have been caught by an indie developer? Why do they release half-completed games, and why are some games programmed to benefit one vendor and not the other? It all boils down to the money game.

  15. #115
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Location
    Grimsby, UK
    Posts
    666
    So what this boils down to is money.

AMD/ATI didn't give the software developer money, so they don't include the code.

    Nvidia pays a software developer to include code that gives AA.

    Well, seeing as I own an AMD/ATI graphics card, I see no point in buying the game if it doesn't contain all the features; the only people who really lose out are the people selling the game... tough luck, there are plenty of other games I can buy. I'd probably clear it in a few days and get rid of it anyway.

    Other games don't get restricted like this. I wonder how much money Nvidia paid; at the end of the day, if everyone were like me, the people who made the game would end up losing out anyway.

    It's a pity Nvidia don't spend more money on sorting out their drivers: having to switch drivers to get higher fps in one game versus another, or having to disable SLI so a game doesn't crash, or SLI working with one driver and getting broken in the next - that's why I got rid of my GTX 295.
    i5 2500K @ 4.9GHz MSI Z77 MPower G.Skill Trident 8GB 2400C10
    EVGA GTX 1070 SC 8GB @ 1784/4004MHz Corsair HX 750 PSU
    Samsung 830 256GB Creative ZxR Thermalright Silver Arrow
    NEC 24WMGX3 24" TFT Fractal Design Define S Win 7/10 64bit

  16. #116
    Xtreme Addict
    Join Date
    Jun 2005
    Posts
    1,095

    Question

Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo. By tricking the application, we were able to get the in-game AA option, and our performance was significantly enhanced.
    How do you do this? Can somebody help me out. I'd like to try. I am downloading the demo now.

  17. #117
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Denmark
    Posts
    110
    Quote Originally Posted by SamHughe View Post
    How do you do this? Can somebody help me out. I'd like to try. I am downloading the demo now.
    Dunno if this works, but could be worth a shot:

    http://www.techpowerup.com/downloads...tor_v1.22.html
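
    For the curious, what these tools are spoofing is the PCI vendor/device ID the game reads from the graphics adapter (0x10DE is Nvidia, 0x1002 is ATI/AMD). Below is a minimal, hypothetical sketch of that kind of vendor check using DXGI - I'm not claiming Batman:AA does exactly this, but a device-ID changer works by feeding the application a different ID at this point. The file name and build line are just my example.

    Code:
    // Minimal sketch: read the GPU vendor ID the way a game might before
    // deciding which features to expose. Hypothetical example, not code
    // from Batman:AA. Build on Windows with: cl /EHsc vendor.cpp dxgi.lib
    #include <dxgi.h>
    #include <cstdio>

    int main() {
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
            return 1;

        IDXGIAdapter* adapter = nullptr;
        if (factory->EnumAdapters(0, &adapter) == DXGI_ERROR_NOT_FOUND) {
            factory->Release();
            return 1;
        }

        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);

        // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD, 0x8086 = Intel.
        const bool isNvidia = (desc.VendorId == 0x10DE);
        std::printf("VendorId = 0x%04X -> %s\n", desc.VendorId,
                    isNvidia ? "in-game AA option would be offered"
                             : "in-game AA option would be hidden");

        adapter->Release();
        factory->Release();
        return 0;
    }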

  18. #118
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by Shadov View Post
And this explains why the Batman demo could run AA fine on an ATI Radeon after changing the device_id. NOT!

    I'm sorry, but this is a typical PR response they have given you, Fugger.
    Regardless, it probably doesn't work on an ATI card exactly as it does on Nvidia hardware. It's no secret that Unreal Engine 3 does not natively support AA (it uses deferred lighting), and the claim that being TWIMTBP-certified locks ATI out of working with the developers is completely untrue.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  19. #119
    Xtreme Member
    Join Date
    Mar 2009
    Location
    Miltown, Wisconsin
    Posts
    353
I see this bull as a piss-poor move on Nvidia's part. I believe this is all connected with Batman: I think Batman was supposed to be a "revolutionary" turning point for Nvidia, and that's why they killed PhysX on ATI cards at the same time. I have an Ageia PPU alongside my ATI card that has worked for years until now, and this really stinks. They basically made the game an Nvidia-only game, and if you own an ATI card they're just fooling you into thinking it's the same game. HONESTLY, IF I BUY THE GAME, I SHOULD BE ENTITLED TO THE WHOLE GAME! Not just some parts of it, and I should be able to use it any way I like. It should be sold as a different game for less money if you own ATI hardware, since I'm paying for the extra development done by Nvidia and not using it. If you're an NVIDIOT, you can suck the part from my butt to my belly. I can't see this helping us as consumers one bit, and believe me, this is a line in the sand. I vote with my dollars, and ATI, you deserve them. ATI is advancing technology, not blocking it. If you think adopting proprietary standards with "conditions" helps us, then there is no hope for us. Ever wonder why Blu-ray discs cost so much? I fight for freedom, not senseless herding. ATI is back, bigger and better than ever, and that's why we're stuck with all this Nvidia nonsense.

  20. #120
    Xtreme Enthusiast
    Join Date
    Jan 2008
    Posts
    637
    Quote Originally Posted by SamHughe View Post
    How do you do this? Can somebody help me out. I'd like to try. I am downloading the demo now.
I remember using a tool called 3DAnalyze back when I had a GeForce 4 MX and wanted a game to think it was a GeForce 5200 :P

    You can set the Device ID to be used with each game, alongside many other things I recommend NOT touching (mostly pixel and vertex shader forcing and manipulation, back in the early DX9 era).

  21. #121
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    The way I see it:

    Prologue: Unreal 3 has problems with AA
Chapter 1: NVidia works with the devs to solve it. This means investing development time (= money) into the game
    Chapter 2: NVidia addresses the problem
    Chapter 3: Since there would be no in-game AA if NVidia hadn't invested the time, they obviously don't want ATI/AMD to also take advantage of NVidia's investment

    Epilogue: NVidia invests money in an issue and doesn't want ATI to benefit from that investment, so ATI users don't get in-game AA.

    Correct me if I'm wrong but I see nothing wrong with this.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  22. #122
    Registered User
    Join Date
    Oct 2008
    Posts
    39
    Quote Originally Posted by Sadasius View Post
It's like I've said many times: vote with your wallet. You as the consumer basically train how businesses act. If they get out of hand and the product doesn't get sold, then they shape up. If you bit$h but buy it anyway, then you are really not doing anything but helping support their bad behaviour. I for one will not buy this game, as it is an obvious stepping stone to hurting us as consumers and will help cause a rift in games. The line was drawn in the sand for me.
    Why wouldn't you buy the game, mate? You're missing out; it's a great game with or without PhysX and AA.

  23. #123
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Bayamon,PR
    Posts
    257
    Quote Originally Posted by annihilat0r View Post
    The way I see it:

    Prologue: Unreal 3 has problems with AA
Chapter 1: NVidia works with the devs to solve it. This means investing development time (= money) into the game
    Chapter 2: NVidia addresses the problem
    Chapter 3: Since there would be no in-game AA if NVidia hadn't invested the time, they obviously don't want ATI/AMD to also take advantage of NVidia's investment

    Epilogue: NVidia invests money in an issue and doesn't want ATI to benefit from that investment, so ATI users don't get in-game AA.

    Correct me if I'm wrong but I see nothing wrong with this.
I will correct you. Basically, the consumer is affected by their hunger for more money, and it cripples choice for us, the consumers. It's like branding is starting up all over the place: here, you get full use of the game since you bought our product; oh, sorry, you only get half the game because you failed to buy our product. It's just cheap and unethical. I'm sure people with deep pockets don't give a damn, but for everyday consumers it's a big deal. Same if ATI takes the same steps. PhysX I understand if they block it, but why the AA? I'm sure if they decided to share, they would make more profit out of it than by doing it the way they are now.
    Last edited by LC_Nab; 09-29-2009 at 10:04 PM.

  24. #124
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Denmark
    Posts
    110
    Quote Originally Posted by annihilat0r View Post
    The way I see it:

    Prologue: Unreal 3 has problems with AA
Chapter 1: NVidia works with the devs to solve it. This means investing development time (= money) into the game
    Chapter 2: NVidia addresses the problem
    Chapter 3: Since there would be no in-game AA if NVidia hadn't invested the time, they obviously don't want ATI/AMD to also take advantage of NVidia's investment

    Epilogue: NVidia invests money in an issue and doesn't want ATI to benefit from that investment, so ATI users don't get in-game AA.
Prologue: Unreal 3 does not include AA at all. Why would this be Nvidia's problem? Saying it's OK for them to help Eidos develop it is one thing, but saying it's OK AND making it exclusive to Nvidia is another. It's like saying it would be OK for a company that makes toasters to support only one kind of bread because they helped write the recipe - you catch my drift.

    Chapter 1: Nvidia puts (perhaps too much) money into the development of the game, and in return gets Eidos to detect whether there is an Nvidia card in your computer and disable AA if there isn't.

    Chapter 2: Nvidia buys an insane number of copies of the game to bundle with their cards.

    Chapter 3: The money Nvidia has now indirectly invested in Eidos may or may not have an influence on features in the game itself, but by no means should any kind of lobbying (let's face it, it reeks of it) allow a game developer to deny competitors any features in-game just because Nvidia is throwing them a party.

    If we as consumers accept this, then PC gaming as we know it will become increasingly expensive for the end user in the years to come.

    We can only dread the outcome should ATI decide to fight fire with fire in this situation. This has nothing to do with fanboyism, trolling or favouring one over the other. It's simply stating what you will, and what you will NOT, accept as a consumer - from any party!

    The extra creepy part of this is that you almost can't avoid getting the game in a bundle with Nvidia's cards these days, so you put money in both Eidos's and Nvidia's pockets. It's a double whammy.

    Quote Originally Posted by annihilat0r View Post
    Correct me if I'm wrong but I see nothing wrong with this.
    Consider yourself corrected
    Last edited by Mads321; 09-29-2009 at 10:14 PM.

  25. #125
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by Mads321 View Post
Prologue: Unreal 3 does not include AA at all. Why would this be Nvidia's problem? Saying it's OK for them to help Eidos develop it is one thing, but saying it's OK AND making it exclusive to Nvidia is another. It's like saying it would be OK for a company that makes toasters to support only one kind of bread because they helped write the recipe - you catch my drift.

    Chapter 1: Nvidia puts (perhaps too much) money into the development of the game, and in return gets Eidos to detect whether there is an Nvidia card in your computer and disable AA if there isn't.

    Chapter 2: Nvidia buys an insane number of copies of the game to bundle with their cards.

    Chapter 3: The money Nvidia has now indirectly invested in Eidos may or may not have an influence on features in the game itself, but by no means should any kind of lobbying (let's face it, it reeks of it) allow a game developer to deny competitors any features in-game just because Nvidia is throwing them a party.

    If we as consumers accept this, then PC gaming as we know it will become increasingly expensive for the end user in the years to come.

    We can only dread the outcome should ATI decide to fight fire with fire in this situation. This has nothing to do with fanboyism, trolling or favouring one over the other. It's simply stating what you will, and what you will NOT, accept as a consumer - from any party!



    Consider yourself corrected
Meh. If there's no AA, that is not NVidia's problem; but if they can work with the developer to include AA exclusively for NVidia, that can be NVidia's advantage.

    You're talking as if the game had AA by itself and NVidia paid money to disable it on ATI hardware, but then you also say the game didn't have AA at all.

    Fact 1: This game would not have any in-game AA at all if it weren't for NVidia's investments (development => work hours => money)

    Fact 2: NVidia isn't a charity organization working for the betterment of our feelings - it won't invest money in something if it doesn't make them compete better in the hardware arena.

    NVidia's options:

1. Do nothing; let the game stay AA-less. Outcome: no money spent. AA-wise, competitively, there is no difference between NVidia and ATI.

    2. Invest money, put AA in the game, and allow it for everyone. Outcome: AA-wise, competitively, there is still no difference between NVidia and ATI, but NVidia has spent money.

    3. Invest money, put AA in the game, and make it exclusive to the owners of YOUR cards. Outcome: AA-wise, competitively, NVidia now has an advantage that justifies the money spent.

    Now, I don't think any one of you is stupid enough to say (prove me wrong if you will) that NVidia are cold-hearted evil people because they have not taken the 2nd route (lose money, gain no advantage). The only logical routes to take are 1 or 3.

If they had taken 1, ATI owners would still be AA-less, so no difference for them. And if that were the case, would you be complaining that "NVidia hasn't spent money to enable AA"? No, if they had gone with route 1, I do not see anyone complaining about that.

    But they have taken route 3, which didn't rob ATI users of anything at all, but merely added a feature for NVidia users.

    Problem officer? I see nothing wrong with that.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"
