Page 3 of 9
Results 51 to 75 of 201

Thread: Ian McNaughton goes out against The Way it's Meant to be Played

  1. #51
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Farinorco View Post
    Are you sure about that? What I've read is that it's custom AA that is more optimized for the game and doesn't apply to every object on the screen, but otherwise it's the same. What I mean is: it's just standard code. It should run, and indeed it does. It's disabled on purpose.
    I know what you're referring to. But that was about the difference between applying AA selectively versus forcing AA on everything, which is what AMD users have to do. Since Nvidia cards only apply AA where the developer asks for it, they do less work. But in both cases it's still regular hardware AA. It's just the difference between in-game and control-panel AA.

    No, Trinibwoy, no. Don't play with words. You don't need to "enable it" for every platform, one by one. You know perfectly well that standard code written over a standard API doesn't need any kind of specific instructions to be enabled for each platform/HW piece.
    Who said anything about need? If Nvidia came on the scene and said "hey guys, let's get AA up and running", there's no reason for them to accommodate ATi hardware other than some misguided sense of fairness. Like I said before, ATi users don't lose anything by Nvidia gaining AA.
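
    trinibwoy's cost argument above (in-game AA applied only where the developer asks, versus control-panel AA forced on everything) can be sketched as a toy count. The scene contents and the function here are entirely made up for illustration; this is not real engine code:

    ```python
    # Toy illustration (hypothetical scene, not real engine code):
    # selective in-game AA multisamples only the surfaces the developer tagged,
    # while forced control-panel AA must multisample every surface.
    scene = [("character", True), ("ui_overlay", False), ("skybox", False)]

    def aa_work(scene, selective):
        """Count how many surfaces get multisampled."""
        return sum(1 for _, wants_aa in scene if wants_aa or not selective)

    print(aa_work(scene, selective=True))   # in-game AA: 1 surface
    print(aa_work(scene, selective=False))  # control-panel AA: 3 surfaces
    ```

    Either way it's the same hardware AA; only the amount of work differs.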

  2. #52
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by trinibwoy View Post
    Who said anything about need? If Nvidia came on the scene and said "hey guys, let's get AA up and running", there's no reason for them to accommodate ATi hardware other than some misguided sense of fairness. Like I said before, ATi users don't lose anything by Nvidia gaining AA.
    That code runs on ATi cards because it's compatible with them. That code is there. So ATi owners have a game with code that works on their hardware, but that they can't use because...

    Because the developers have put additional code in the game to ensure that if it's not NVIDIA hardware running the game, then you can't have access to that part of the game?

    OK, so, from your POV, if NVIDIA (or ATi) helps someone make a DirectX game, then only a misguided sense of fairness would keep them from including some lines to make it NVIDIA-only (or ATi-only), so it doesn't run on any other machine? Is that right to you?

    Where's the line, then? AA yes, but shaders no? Or AA and shaders yes, but basic geometry no? Or maybe the whole game?


    EDIT: And you're playing with words again. I said "need", and you understood perfectly, because I'm talking about code. You "need" to code something to make it work if it isn't done by default. You "don't need" to code it if it's handled by default. I think it's pretty clear that I mean this code works for D3D hw in general; there's no difference between NVIDIA and ATi hardware at that level. So your code works on all that hardware, unless you purposely disable it for certain hw. That's what I mean by "need".
    Last edited by Farinorco; 09-28-2009 at 08:10 AM.

  3. #53
    Registered User
    Join Date
    Mar 2009
    Posts
    72
    Seems pretty clear Nvidia is throwing money at devs to get an edge in benchmarks. The forced AA for ATI cards in Batman is open and shut. Claiming they are merely providing devs with the resources they need is false to the point of stupidity.

  4. #54
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,838
    I don't see what the big deal is; I usually use control panel AA because it's better.
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis

  5. #55
    Xtreme Owner Charles Wirth's Avatar
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,653
    To be fair,

    "Because the developers have put additional code in the game to ensure that if it's not NVIDIA hardware running the game"

    This could read,

    Because the developers have put additional code to ensure that if it is Nvidia hardware, it will enable the option.

    ATI users are upset, rightfully so. Without clarification from the developer on why, it is all wasted bandwidth in futile semantics.
    Last edited by Charles Wirth; 09-28-2009 at 08:29 AM.
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GRX 1080ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  6. #56
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Farinorco View Post
    Because the developers have put additional code in the game to ensure that if it's not NVIDIA hardware running the game, then you can't have access to that part of the game?
    Not sure how many different ways this can be said, but I'll repeat it one last time: the developers did not add any code to the base game to prevent ATi cards from getting AA.

    And you're right. They didn't "need" to restrict it to Nvidia hardware but chose to do so. We can faff about for days over their reasons for doing so, but in the end we still won't know.

  7. #57
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by FUGGER View Post
    To be fair,

    "Because the developers have put additional code in the game to ensure that if it's not NVIDIA hardware running the game"

    This could read,

    Because the developers have put additional code to ensure that if it is Nvidia hardware, it will enable the option.

    ATI users are upset, rightfully so. Without clarification from the developer on why, it is all wasted bandwidth in futile wordplay.
    Fugger, it can be read either way, because it means exactly the same both ways: "you are including code specifically so that something that would normally run on any standard hw runs only on one brand's specific hw".

    You can word it however you want, but it's as simple as:

    I have a piece of code, "<BIGGAPIECEOFCODE>".
    "<BIGGAPIECEOFCODE>" is code which runs on any D3D compatible hardware.

    Ouch, I don't want "<BIGGAPIECEOFCODE>" running on any D3D compatible hardware. I want it running only on NVIDIA hardware. Mmmmmh, let's do:

    If (NVIDIAhwRunningTheGame)
    {
        "<BIGGAPIECEOFCODE>"
    }
    Else "Youred".

    Now, everyone can word it however he wants. That doesn't change what is done.
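
    Written as runnable code, the gating sketched above looks like this. A minimal sketch: the two PCI vendor IDs are real, but the function names and return strings are invented for illustration; this is not the game's actual code:

    ```python
    # PCI vendor IDs (these two constants are real; everything else is illustrative).
    NVIDIA_VENDOR_ID = 0x10DE
    ATI_VENDOR_ID = 0x1002

    def big_piece_of_code():
        """Stand-in for "<BIGGAPIECEOFCODE>": vendor-neutral D3D work."""
        return "AA enabled"

    def maybe_enable_aa(vendor_id):
        # Read it as "enable if NVIDIA" or "disable unless NVIDIA";
        # either wording is the same check on the detected vendor ID.
        if vendor_id == NVIDIA_VENDOR_ID:
            return big_piece_of_code()
        return "AA option hidden"

    print(maybe_enable_aa(NVIDIA_VENDOR_ID))  # AA enabled
    print(maybe_enable_aa(ATI_VENDOR_ID))     # AA option hidden
    ```

    The feature code itself is vendor-neutral; only the wrapper around it restricts who gets it, which is the whole point of the argument.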

  8. #58
    Xtreme Owner Charles Wirth's Avatar
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,653
    There's lots of code to detect hardware type and enable features for that hardware.

    Going back to the obvious, if ATI did not support the developer...

  9. #59
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Oh, well, so we all (trinibwoy, fugger and me, and probably everybody else) finally agree that the game (the AA option in the game, rather) is fully compatible with every D3D compatible device, but it only runs on NVIDIA hw because the developers have purposely chosen not to let it run on ATi hw (even though it's perfectly compatible). Riiiiiiight.

    And you are defending that as perfectly fine, and that it doesn't impose any penalty on any consumer. Curious. Just out of curiosity:

    What do you think is right/wrong? AA is right, from your POV. OK. Shaders/other eye candy? Textures (we could play with meshes only...)? The game itself running only on certain hw? At what point does the consumer start being harmed?

  10. #60
    Xtreme Owner Charles Wirth's Avatar
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,653
    Yes, we all agree that ATI has poor to non-existent developer support or relations, to let things like this happen.

    Best to let the developer answer as to why; my best guess is failed validation. Perfectly compatible? Unknown at this time.
    Last edited by Charles Wirth; 09-28-2009 at 08:33 AM.

  11. #61
    Xtreme Addict
    Join Date
    Feb 2008
    Location
    America's Finest City
    Posts
    2,078
    Quote Originally Posted by Mad1723 View Post
    Ian just explains the problems many players have recently encountered with Nvidia's TWIMTBP games.

    Am I the only one tired of Nvidia's shady acting to avoid looking bad? I know it's business and that nearly everything is permitted, but it's starting to look like Nvidia is using some shady techniques....


    Source: http://blogs.amd.com/play/2009/09/11...u-should-care/ (the comment in question is further down in the comments)
    It looks like maybe they should be working with game developers more closely instead of complaining that Nvidia is already doing that. I mean, he claims he's submitting complaints, but it sounds like some of those could have been addressed if they had taken the time to work with the developers beforehand.
    Quote Originally Posted by FUGGER View Post
    I am magical.

  12. #62
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by FUGGER View Post
    Yes, we all agree that ATI has poor to non-existent developer support or relations, to let things like this happen.

    Best to let the developer answer as to why; my best guess is failed validation. Perfectly compatible? Unknown at this time.
    The last one before I go out to smoke a cigarette: what I mean is, I'm not arguing legality, or that I don't understand why it happens. What I'm trying to say is that such a practice is detrimental to the consumer. NVIDIA is playing its cards well to gain a marketing advantage by screwing consumers. And if ATi does the same on the other side of things, we are all ed up...

    We have standardization to allow freedom of choice in hw and intercompatibility across all of it. If companies start to punish consumers for not having their hw, we all lose:

    Suppose that B:AA doesn't let people who have ATi cards use AA (wow, difficult to imagine) when it could, because it's compatible. Because ATi didn't pay the developers.

    Now suppose that DIRT 2 doesn't let NVIDIA users use DX11 features (even when they have a DX11 card). Because NVIDIA didn't pay the developers.

    Now suppose these payments come to influence every developer. Don't you think this practice is to the detriment of consumers? Are you serious?

    I don't care about ATi or AMD or NVIDIA or any of them. I'm worried about the situation for the consumer, and I'm seeing lots of consumers defending this practice. Oh, it's really easy to avoid, I know: buy only NVIDIA hw and you will have no problems.

    And when AMD starts to do the same, because otherwise they can't sell hw?

    Poor or non-existent developer relations? That has nothing to do with it. And it's been tried and seen working on ATi cards, so it's compatible; that's not an unknown.


    EDIT (before the cigarette): AMD seems to have a very good relationship with Blizzard. What if Diablo 3 only ran on ATi hw?
    Last edited by Farinorco; 09-28-2009 at 08:50 AM.

  13. #63
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Eh...

    First, about that whole A. Creed thing. Did you guys forget that with that path certain shaders didn't work properly (dust particles and the like) and lights bled THRU walls? The shader path was removed because it didn't work properly, and ATi didn't offer Ubisoft the support to help make it work.

    Also, I can tell you first hand that NVidia supports developers with things besides money. Certain people here will tell you about the toy we had in our hands back when the 7800GTX 512MB came out. They give developers hardware that never sees the light of day as far as consumers know, just to make sure they have a way to test their code that isn't a software renderer. I can tell you right now that ATi has never put a single piece of hardware in our hands... and it's not like we haven't asked for anything to test on.

    To those who say "well, NVidia can't possibly give that much help"... they do. They have a test lab with just about every possible configuration of nvidia hardware, to make sure your application is going to work across a wide spectrum of hardware. You also have to remember that NVidia employs a lot more people than ATi. They have people whose sole job IS testing said applications, finding the bugs, verifying whether it's the game code or the driver, and giving the developer a list of possible fixes. If anything, NVidia does more for the game industry than ATi has ever dreamed of doing, and pays for it with the money consumers spend on their video cards... How is that bad for the consumer? If ATi did the same, perhaps PC gaming would actually move back up in terms of how many game studios are still out there.

    The fact that people here still believe TWIMTBP is merely a money-hat program makes me laugh. Developers have freely explained exactly how it works time and time again. Also, about the whole AA thing for ATi cards: is anyone anywhere close to sure it never causes an issue? I mean, I could always try it myself, strip the SecuROM, tell it my card is an NVidia, and play through the entire game again.
    Last edited by DilTech; 09-28-2009 at 08:46 AM.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  14. #64
    Registered User
    Join Date
    Jul 2008
    Location
    San Antonio
    Posts
    20
    So developers hold the video card companies hostage now?

    Pay up or no in game AA option for you.

    Interesting.
    Asus R2G | i7 920 D0 @ 4.0 | Asus HD 6950 | 6GB Corsair DDR3-1600 C7 | OCZ Vertex 120GB | Antec CP-850 | Antec 1200 v3 | Liteon SATA DVD Burner | Win 7 Pro X64 | BenQ FP241VW |
    No matter where you go, there you are.

  15. #65
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    473
    Quote Originally Posted by FUGGER View Post
    I know first hand Nvidia gives developers the support they need; this has been in place for quite a while.

    Saying Nvidia is "buying" developers is false to the point of stupidity. Nvidia may pay for the ad space for the TWIMTBP logo (I am not sure), much like the Gigabyte logo at the top of this site. I am not paid to screw the other manufacturers; I use ASUS in my gaming rig.

    DX10.1 wasn't shady at all? And did it change your gaming experience any?

    We will always see the top rivals battle it out; it keeps us happy with the best products, faster to market at a fair price.
    Assassin's Creed lost DX10.1 support thanks to Nvidia's money, supposedly. DX10 was ruined because of Nvidia, supposedly.

    You don't think that Batman missing AA support only for ATI cards is fishy?
    Quote Originally Posted by Bobbylite View Post
    with great MHZ comes great responsibility
    CPU:Q6600 G0 @ 3.825
    Motherboard:Asus P5E X38
    Memory:2x2GB OCZ Reapers DDR2 1066
    Graphics Card:Asus 4850
    Hard Drive:2xSegate 500gb 32MB Cache raid0
    Power Supply:Xion 800W
    Case:3DAurora
    CPU cooling: D-tek Fuzion V2 (Quad insert removed)
    GPU cooling: mcw60
    Monitor:24" LG

  16. #66
    Xtreme Owner Charles Wirth's Avatar
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,653
    Get real, Wrangler: game companies make money by selling games. I am sure they want to sell as many copies as possible and deliver the best experience for both.

    Favoring who you get your support from is logical.

  17. #67
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Metacritic gave this game 85% from reviewers, but so far it's 4.8 from users. Looks like this game has upset some folks.

  18. #68
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    And why do people think that a graphics company must support game developers? ATi is short on money, that's obvious; you can't expect them to spread money right and left, unless you want to buy a card for like 800 dollars.

  19. #69
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by DilTech View Post
    Eh...

    First, about that whole A. Creed thing. Did you guys forget that with that path certain shaders didn't work properly (dust particles and the like) and lights bled THRU walls? The shader path was removed because it didn't work properly, and ATi didn't offer Ubisoft the support to help make it work.

    Also, I can tell you first hand that NVidia supports developers with things besides money. Certain people here will tell you about the toy we had in our hands back when the 7800GTX 512MB came out. They give developers hardware that never sees the light of day as far as consumers know, just to make sure they have a way to test their code that isn't a software renderer. I can tell you right now that ATi has never put a single piece of hardware in our hands... and it's not like we haven't asked for anything to test on.

    To those who say "well, NVidia can't possibly give that much help"... they do. They have a test lab with just about every possible configuration of nvidia hardware, to make sure your application is going to work across a wide spectrum of hardware. You also have to remember that NVidia employs a lot more people than ATi. They have people whose sole job IS testing said applications, finding the bugs, verifying whether it's the game code or the driver, and giving the developer a list of possible fixes. If anything, NVidia does more for the game industry than ATi has ever dreamed of doing, and pays for it with the money consumers spend on their video cards... How is that bad for the consumer? If ATi did the same, perhaps PC gaming would actually move back up in terms of how many game studios are still out there.

    The fact that people here still believe TWIMTBP is merely a money-hat program makes me laugh. Developers have freely explained exactly how it works time and time again. Also, about the whole AA thing for ATi cards: is anyone anywhere close to sure it never causes an issue? I mean, I could always try it myself, strip the SecuROM, tell it my card is an NVidia, and play through the entire game again.
    It's money. In the software development industry (probably in any other, too), work is money:

    Any help given, any code optimized, any work done is measured in man-months. Hours of work that someone has to do. And a worker's hours of work have to be paid = money.

    An engine is money. A piece of shader code is money. Hours of testing are money. That's why third-party companies charge for their libraries and APIs, and why programmers and testers charge for their work.

    So yes, TWIMTBP is money.

    ((((And now i FOR SURE go to smoke my cigarette)))))

    EDIT: And it becomes bad for the consumer when, in exchange for this money, they start asking for features that would normally be compatible with all standard hw to be restricted to their own.
    Last edited by Farinorco; 09-28-2009 at 08:58 AM.

  20. #70
    Xtreme Addict
    Join Date
    Dec 2005
    Location
    UK
    Posts
    1,713
    What do you expect from a company that considers opening a can of whoop ass a legitimate strategy?
    TAMGc5: PhII X4 945, Gigabyte GA-MA790X-UD3P, 2x Kingston PC2-6400 HyperX CL4 2GB, 2x ASUS HD 5770 CUcore Xfire, Razer Barracuda AC1, Win8 Pro x64 (Current)

    TAMGc6: AMD FX, Gigabyte GA-xxxx-UDx, 8GB/16GB DDR3, Nvidia 680 GTX, ASUS Xonar, 2x 120/160GB SSD, 1x WD Caviar Black 1TB SATA 6Gb/s, Win8 Pro x64 (Planned)

  21. #71
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Well, from what I can gather, Nvidia better supports and works more closely with developers to ensure things work properly with their hardware.

    So the solution is simple: ATI/AMD need to support and work more closely with developers to ensure things work properly.

    Does it make Nvidia a criminal to put forth effort, time & resources to collaborate with the developer?

    Should the answer not be that ATI needs to step up its efforts, time & resources to work with developers to fully utilize and validate their own hardware as well?
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  22. #72
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Farinorco View Post
    NVIDIA is playing well their cards to gain a marketing advantage by screwing consumers.
    How exactly did Nvidia screw their consumers?

  23. #73
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Quote Originally Posted by DilTech View Post
    Eh...

    First, about that whole A. Creed thing. Did you guys forget that with that path certain shaders didn't work properly (dust particles and the like) and lights bled THRU walls? The shader path was removed because it didn't work properly, and ATi didn't offer Ubisoft the support to help make it work.

    Also, I can tell you first hand that NVidia supports developers with things besides money. Certain people here will tell you about the toy we had in our hands back when the 7800GTX 512MB came out. They give developers hardware that never sees the light of day as far as consumers know, just to make sure they have a way to test their code that isn't a software renderer. I can tell you right now that ATi has never put a single piece of hardware in our hands... and it's not like we haven't asked for anything to test on.

    To those who say "well, NVidia can't possibly give that much help"... they do. They have a test lab with just about every possible configuration of nvidia hardware, to make sure your application is going to work across a wide spectrum of hardware. You also have to remember that NVidia employs a lot more people than ATi. They have people whose sole job IS testing said applications, finding the bugs, verifying whether it's the game code or the driver, and giving the developer a list of possible fixes. If anything, NVidia does more for the game industry than ATi has ever dreamed of doing, and pays for it with the money consumers spend on their video cards... How is that bad for the consumer? If ATi did the same, perhaps PC gaming would actually move back up in terms of how many game studios are still out there.

    The fact that people here still believe TWIMTBP is merely a money-hat program makes me laugh. Developers have freely explained exactly how it works time and time again. Also, about the whole AA thing for ATi cards: is anyone anywhere close to sure it never causes an issue? I mean, I could always try it myself, strip the SecuROM, tell it my card is an NVidia, and play through the entire game again.
    It's so nice when somebody posts exactly your thoughts. I fully, one-hundred-percent agree. Couldn't have said it better.

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  24. #74
    Xtreme Addict
    Join Date
    Oct 2007
    Location
    Chicago,Illinois
    Posts
    1,182
    WW III is coming. TWIMTBP TITLES SUCK ANYWAY. But just in case something good comes out of it, I will build a nvidia rig, ONE DAY.



  25. #75
    D.F.I Pimp Daddy
    Join Date
    Jan 2007
    Location
    Still Lost At The Dead Show Parking Lot
    Posts
    5,182
    Quote Originally Posted by Xoulz View Post
    Honestly, what's the big deal?

    This clandestine method is needed to help nVidia's re-branded hardware look as fast as ATi's last gen...



    Jen-Hsun Huang has lost his touch..

    What's the big deal? It's called cheating and market rigging... in short, entirely unethical & deceitful business practices!

    This is typical, and exactly what Intel/Microsoft do; the majority of code writers accommodate and favor Intel because of its monopoly power in the marketplace.

    Regardless of who's doing what, there needs to be a level playing field, and rules, standards and guidelines need to be followed to ensure everyone gets a fair shake.
    SuperMicro X8SAX
    Xeon 5620
    12GB - Crucial ECC DDR3 1333
    Intel 520 180GB Cherryville
    Areca 1231ML ~ 2~ 250GB Seagate ES.2 ~ Raid 0 ~ 4~ Hitachi 5K3000 2TB ~ Raid 6 ~

