
Thread: Ian McNaughton comes out against The Way It's Meant to be Played

  1. #26
    Xtreme Owner Charles Wirth
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,656
    I know first hand that Nvidia gives developers the support they need; this has been in place for quite a while.

    Saying Nvidia is "buying" developers is false to the point of stupidity. Nvidia may pay for the ad space for the TWIMTBP logo (I am not sure), much like the Gigabyte logo at the top of this site. I am not paid to screw the other manufacturers; I use ASUS in my gaming rig.

    DX10.1 wasn't shady at all? And did it change your gaming experience any?

    We will always see the top rivals battle it out; it keeps us happy with the best products brought to market faster at a fair price.
    Last edited by Charles Wirth; 09-28-2009 at 06:27 AM.
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GTX 1080 Ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  2. #27
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    216
    Quote Originally Posted by FUGGER
    I know first hand that Nvidia gives developers the support they need; this has been in place for quite a while.

    Saying Nvidia is "buying" developers is false to the point of stupidity. Nvidia may pay for the ad space for the TWIMTBP logo, much like the Gigabyte logo at the top of this site. I am not paid to screw the other manufacturers; I use ASUS in my gaming rig.

    DX10.1 wasn't shady at all? And did it change your gaming experience any?
    Additionally, the in-game AA option is removed when ATI cards are detected. We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo. By tricking the application, we were able to get the in-game AA option, and our performance was significantly enhanced. This option is not available in the retail game, as there is SecuROM protection.

    Have you read this at all?! They paid devs to detect the competitor's cards and gimp them. How the fsck is that OK?

  3. #28
    Xtreme Owner Charles Wirth
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,656
    SimBy, nope, but you did miss "Nvidia working closely with game developers, validating in-game features."

    Key word: validating.
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GTX 1080 Ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  4. #29
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    I don't understand how some people are defending this.

    Some of you are saying that NVIDIA paying to improve the performance of some games on their hardware is fair, that it's business. I agree. But that is not what is happening here.

    Purposely damaging the performance or features of the software on competitors' hardware is not fair, and it's not something that benefits the consumers in any way. That is: us.

    I don't understand how anybody who is not an NVIDIA employee, doesn't hold a huge investment in NVIDIA shares, or isn't in some kind of romantic relationship with an NVIDIA employee, can accept that, let alone defend it.

    This is not the first place where I've read about Batman: AA purposely disabling AA when an ATi card is installed in the system.

    This is absolutely embarrassing for both NVIDIA and the developers of the game. I don't know about the other games named here (I doubt an ATi representative is exactly unbiased), but if it was done in one title, who knows about the others...

    This is little short of including a 10 ms wait call in each iteration of the game loop whenever a competitor's product is detected, just to damage its performance.
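
    To make that analogy concrete, here is a purely hypothetical C++ sketch of the kind of thing I'm describing; DetectCompetitorGpu is an invented placeholder, and nobody is claiming code like this exists in any shipped game:

    #include <chrono>
    #include <thread>

    // Invented placeholder; a real check might read the adapter's PCI vendor ID.
    bool DetectCompetitorGpu() { return false; }

    void GameLoopIteration()
    {
        // ... normal update and render work ...
        if (DetectCompetitorGpu())
            std::this_thread::sleep_for(std::chrono::milliseconds(10)); // deliberate 10 ms drag per frame
    }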

    I'm not by any means an ATi defender, or an NVIDIA hater, but as a consumer, things like this turn my stomach.

  5. #30
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    People are getting mad because a hardware company is offering time and money to help games get smooth results. What's so bad? Disabling the in-game AA was done because they couldn't verify with ATI that it wouldn't cause any issues (this is my interpretation of it, since I don't know whether anyone who helped build B:AA could have done this by simply putting it on an ATI rig and testing on site).

    When in the last 5 years has any game run more than 20% better on one company's hardware? (Considering both cards are about equal on average.)

  6. #31
    Xtreme Cruncher
    Join Date
    May 2007
    Posts
    339
    Quote Originally Posted by FUGGER
    SimBy, nope, but you did miss "Nvidia working closely with game developers, validating in-game features."

    Key word: validating.
    Quote Originally Posted by Manicdan
    People are getting mad because a hardware company is offering time and money to help games get smooth results. What's so bad? Disabling the in-game AA was done because they couldn't verify with ATI that it wouldn't cause any issues (this is my interpretation of it, since I don't know whether anyone who helped build B:AA could have done this by simply putting it on an ATI rig and testing on site)
    Isn't the issue 'where do you draw the line'?

    What if they disabled anything other than 'low' texture quality on ATi hardware, saying they hadn't validated it for ATi hardware?

    Surely the point of a platform / API etc., is that you can pretty much abstract the hardware and get on with the coding without worrying about the bare metal? Or am I just being naive?

  7. #32
    Xtreme Owner Charles Wirth
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,656
    I am not the game developer; I am sure this issue was something they went over and maybe found a problem with. They must have taken the backlash this would cause into account too, so an answer from the game developers should be forthcoming, given that this is a hot topic.
    Last edited by Charles Wirth; 09-28-2009 at 06:52 AM.
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GTX 1080 Ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  8. #33
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Ummm, all the complainers talking about "ruining the game experience" need to get a clue. ATi card owners get the exact same excellent experience that console owners do. What Nvidia did was get more features added for their customers. How exactly are Nvidia's customers getting screwed here?

  9. #34
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    Well, I'm sure Nvidia tosses money at game developers rather than working with them. I can't imagine Nvidia devoting that much time and effort. If Nvidia really wanted to play fair, then ATI owners would be able to use their old Nvidia cards for PhysX.
    i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
    MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
    Corsair HX 620, CM 690, Win 7 Ultimate 64bit.

  10. #35
    Xtreme Owner Charles Wirth
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,656
    Glow9,

    "I can't imagine Nvidia devoting that much time and effort."

    They do.
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GTX 1080 Ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  11. #36
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Are you sure about that? I'm tempted to put a lot more faith in Glow9's imagination and his feelings about the whole thing, facts be damned!

  12. #37
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by RazzleUltra
    Isn't the issue 'where do you draw the line'?
    As far as they can go without making obvious asses of themselves. I think they are close to that limit, but they haven't hit it yet.

    To other people:

    I'm all for Nvidia paying to help get a game developed; Nvidia is really big on trying to get features that keep a fanbase, so they always have their dedicated customers.

    If PhysX became important (which it could, but I doubt it), I would buy an Nvidia card. If I needed CUDA just a few times, then I would buy Nvidia. But I'm just a simple gamer who looks at $ per frame, is very familiar with the OCing tools for ATI cards, and hasn't found a reason to learn the new tricks that would be needed when switching to an Nvidia card.

    Who here honestly expects ATI and Nvidia tech demos to work on their opponents' cards?

  13. #38
    Xtreme Enthusiast
    Join Date
    Jun 2005
    Posts
    960
    The EU will be VERY interested in studying this case.

  14. #39
    Xtreme Member
    Join Date
    Feb 2005
    Location
    Brighton, UK
    Posts
    210
    Nvidia can go around tainting crops, causing wildfires, slashing tires, etc., for all I care. I refuse to support ATi after having an X800XT P.E. on order for 17 months.

  15. #40
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by Manicdan
    People are getting mad because a hardware company is offering time and money to help games get smooth results. What's so bad?
    No, people are getting annoyed because of detected sabotage.

    I find it logical that NVIDIA invests money to help developers optimize their code, and I find it logical that NVIDIA doesn't invest a single minute in optimizations for its competitors. That's all good. Sabotage is not.

    If hardware companies are allowed to pay software developers to damage performance (or features) that would work perfectly on a competitor's hardware, we will end up with "games for NVIDIA cards" and "games for ATi cards", as if they were "games for Xbox 360" and "games for PS3".

    That's no good.

    Disabling the in-game AA was done because they couldn't verify with ATI that it wouldn't cause any issues (this is my interpretation of it, since I don't know whether anyone who helped build B:AA could have done this by simply putting it on an ATI rig and testing on site)
    That would be a very, very bad excuse that any game developer would refute in a heartbeat. I, as a hobbyist game programmer myself, can assure you that there is no need to contact any hardware maker to verify whether some code works on their hardware. That's the exact reason intermediate APIs such as Direct3D, OpenGL, or the high-level shader languages of both exist.
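
    For instance, Direct3D 9 itself gives you a vendor-neutral way to ask whether the installed card supports a given level of multisampling, without ever contacting ATI or NVIDIA. A minimal sketch, assuming a plain D3D9 setup (illustrative only; B:AA's actual AA path is a custom one, as noted elsewhere in this thread):

    #include <cstdio>
    #include <d3d9.h>   // link against d3d9.lib

    // Ask the API, not the vendor: does the default adapter support 4x MSAA
    // for a common back-buffer format in windowed mode?
    bool Supports4xMsaa(IDirect3D9* d3d)
    {
        DWORD quality = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8, TRUE,
            D3DMULTISAMPLE_4_SAMPLES, &quality);
        return SUCCEEDED(hr);   // answers the same way on any vendor's driver
    }

    int main()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;
        std::printf("4x MSAA: %s\n", Supports4xMsaa(d3d) ? "supported" : "not supported");
        d3d->Release();
        return 0;
    }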

    Yes, it can happen that some code that should work on certain hardware doesn't, because of bugs on either side (your code, their hardware). Then disabling that code would be a shoddy but quick solution (sometimes deadlines are like prisons), at least provisionally.

    But disabling it just in case? That makes me laugh. And why don't they disable the rest of the shaders too? You know, that AA filtering has been proven to work well on ATi hardware with the help of some tricks (including the crack to bypass the SecuROM in the game, of course)...

  16. #41
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by Farinorco
    No, people are getting annoyed because of detected sabotage.
    I don't see sabotage, I just see a company helping out. Nothing more. Unless we get the exact reason from a B:AA developer for why AA was disabled, it's all speculation, and I'm speculating in favor of the idea that this is taking a molehill and making it into a mountain.

  17. #42
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    AA was not disabled. It was simply enabled only for Nvidia hardware. Big difference there. Otherwise nobody would have gotten AA. Now Nvidia users get it. ATi users don't lose anything in the deal.

  18. #43
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by trinibwoy
    AA was not disabled. It was simply enabled only for Nvidia hardware. Big difference there. Otherwise nobody would have gotten AA. Now Nvidia users get it. ATi users don't lose anything in the deal.
    No. That AA code is shader code (AFAIK).

    I think you know enough about these things (more than me, I think), so:

    If you have a piece of code, written with a standard API, that should work on any hardware compatible with that API, and you include an absolutely unneeded line such as (either way it's the same):

    If myHW is detected, then papapapa...
    or
    If not competitorsHW is detected, then papapapa...

    Do you really think that nobody is losing anything with "the deal"? Come on, man...

    Including extra code to ensure that certain features which should work on standard hardware only work with one brand of hardware, is not disabling it?
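
    To put that pseudocode in concrete terms, here is a minimal sketch of what such a vendor-ID gate could look like in Direct3D 9. This is not the game's actual code; the "show or hide the option" part is invented for illustration, and 0x10DE / 0x1002 are simply the public PCI vendor IDs of NVIDIA and ATI/AMD:

    #include <cstdio>
    #include <d3d9.h>   // link against d3d9.lib

    // Read the PCI vendor ID that the installed adapter reports to Direct3D.
    bool IsNvidiaAdapter(IDirect3D9* d3d)
    {
        D3DADAPTER_IDENTIFIER9 id = {};
        if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
            return false;                 // can't identify the card
        return id.VendorId == 0x10DE;     // 0x10DE = NVIDIA, 0x1002 = ATI/AMD
    }

    int main()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;
        // The AA shaders themselves are ordinary, vendor-agnostic API code;
        // only this branch decides who gets to see the option. Spoof the
        // reported IDs (as the demo testers did) and the branch flips.
        std::printf("in-game AA option: %s\n", IsNvidiaAdapter(d3d) ? "shown" : "hidden");
        d3d->Release();
        return 0;
    }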
    Last edited by Farinorco; 09-28-2009 at 07:22 AM.

  19. #44
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    You're playing with words, Trinibwoy!
    How does "you can't enable AA with ATI cards but you can enable AA with Nvidia cards" imply that nobody would have gotten AA? Which games are out now without an AA option?

  20. #45
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by Farinorco
    No. That AA code is shader code (AFAIK).

    I think you know enough about these things (more than me, I think), so:

    If you have a piece of code, written with a standard API, that should work on any hardware compatible with that API, and you include an absolutely unneeded line such as (either way it's the same):

    If myHW is detected, then papapapa...
    or
    If not competitorsHW is detected, then papapapa...

    Do you really think that nobody is losing anything with "the deal"? Come on, man...

    Including extra code to ensure that certain features which should work on standard hardware only work with one brand of hardware, is not disabling it?
    +1
    And the simple fact that in the demo you can enable AA on ATI cards by switching IDs with Nvidia cards proves it.

    What's the next step?
    If an ATI card is detected, the game can't be launched?

  21. #46
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Farinorco
    No. That AA code is shader code (AFAIK).
    It's regular hardware-accelerated AA.

    If you have a piece of code, written with a standard API, that should work on any hardware compatible with that API
    An API just asks the hardware and driver to do something. There's no guarantee at all that it does it properly. If what you said were true, there would never be bugs in games that only affect one vendor's hardware.

    Including extra code to ensure that certain features which should work on standard hardware only work with one brand of hardware, is not disabling it?
    It's not disabling it if you wouldn't have gotten it in the first place. What you're saying is that adding it for Nvidia but not adding it for ATi is the same as disabling it. That's obviously incorrect.
    Last edited by trinibwoy; 09-28-2009 at 07:33 AM.

  22. #47
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by AbelJemka
    You're playing with words, Trinibwoy!
    How does "you can't enable AA with ATI cards but you can enable AA with Nvidia cards" imply that nobody would have gotten AA? Which games are out now without an AA option?
    Nearly every single UE3 game? Mass Effect? UT3?

  23. #48
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by AbelJemka
    You're playing with words, Trinibwoy!
    How does "you can't enable AA with ATI cards but you can enable AA with Nvidia cards" imply that nobody would have gotten AA? Which games are out now without an AA option?
    Not a fair comparison either. The B:AA AA (what a mess of AAs) isn't standard MSAA or any other standard solution, but custom AA programmed by the developers.

    The thing is: that AA is programmed over a standard application programming interface (API) that is compatible with lots of hardware. Including both NVIDIA and ATi hardware.

    Trinibwoy is probably saying that since NVIDIA has probably helped the game developers with that custom AA code, it does nobody any harm if they purposely disable that code on any other brand of hardware (or, as he says, if they "only enable it on their own hardware"), even though it was developed on, and works well on, all standard hardware.

  24. #49
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by trinibwoy
    Nearly every single UE3 game? Mass Effect? UT3?
    If I remember correctly, Dead Space offered no AA option either...
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  25. #50
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by trinibwoy
    It's regular hardware-accelerated AA.
    Are you sure about that? What I've read is that it's custom AA, more optimized for the game, that doesn't apply to every object on screen, but it's all the same. What I mean is: it's just standard code. It should run, and indeed it does. It's disabled on purpose.

    An API just asks the hardware and driver to do something. There's no guarantee at all that it does it properly. If what you said were true, there would never be bugs in games that only affect one vendor's hardware.
    Just the same as every other piece of their code. Why didn't they disable all the shaders? Heck, why didn't they disable the Direct3D rendering calls? There's no guarantee that the hardware can do it, but it is assumed it does.

    I have said it before: yes, you can have unexpected behaviour because of bugs (on your side or on the hardware side). And I can understand disabling something proven buggy as a shoddy solution. But when it's proven to work right?

    It's not disabling it if you wouldn't have gotten it in the first place. What you're saying is that adding it for Nvidia but not adding it for ATi is the same as disabling it. That's obviously incorrect.
    No, Trinibwoy, no. Don't play with words. You don't need to "enable it" for every platform, one by one. You know perfectly well that standard code written over a standard API doesn't need any kind of specific instructions to be enabled for each platform or piece of hardware.

    On the contrary, you need specific instructions to make it run only on certain hardware (which is the same as not allowing it to run on other hardware).

    It's disabling, because if you don't do anything, it works for everybody. If you leave it at the default, it works for everybody. You have done additional work to keep it from running on any other hardware. So the purpose of that additional work is not to allow it to run on your hardware (enabling), but to stop it from running on any other hardware (disabling).
