Page 6 of 9
Results 126 to 150 of 201

Thread: Ian McNaughton goes out against The Way it's Meant to be Played

  1. #126
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
There is no need to defend Nvidia or ATI; this is the developer's choice in the end regardless.

    Until some form of "real" confirmation regarding the issue comes from people who are actually in the know, it's simply fuel for another hatefest for the Nvidia haters; pretty much the same group posts hate in pretty much every Nvidia-related thread.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  2. #127
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    QLD
    Posts
    942
    Quote Originally Posted by Shadov View Post
    For example, Batman with Physics workaround on a CPU (note the word workaround):

    http://www.youtube.com/watch?v=AUOr4cFWY-s

    But guess what... Its a "The way its meant to be payed" game so most features are gone once you disable proprietary PhysX.

    Talk about artificially limiting gaming experience!

If I had a dime for every time I felt like the developers of a game purposely shackled it in one way or another, I would be rich. Stop blaming Nvidia for something they had nothing to do with; it was obviously the game developers using PhysX as a marketing tool for their own gains.

  3. #128
    Registered User
    Join Date
    Jan 2006
    Posts
    88
I remember when ATI did this with Valve games; Half-Life 2 in particular ran slow on Nvidia hardware, but when the game was tricked into running the ATI shader path on Nvidia cards the performance shot up considerably.

    People tend to forget that things like this have always happened in the past and will continue to happen in the future. I won't be surprised if DiRT 2 has some ATI bias in it.
    Opteron 170@ 265 x 10 @1.375v
    Asrock 939N68PV-GLAN, GEIL 2gb 400mhz ram
    8800GT accelero s1, XP X64. Antec p182 mini
    velociraptor 150gb, Corsair 620w

  4. #129
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by highoctane View Post
Until some form of "real" confirmation regarding the issue comes from people who are actually in the know, it's simply fuel for another hatefest for the Nvidia haters; pretty much the same group posts hate in pretty much every Nvidia-related thread.
    Ian McNaughton would not blog about the issue if the detection was not taking place. He would probably be fired for making baseless claims.

But Nvidia will never own up to this; they will just claim the detection is there to "improve the user experience on Nvidia hardware, and to ensure a decent playable experience on other hardware".

  5. #130
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by Starscream View Post
    There is only 1 company who supports DX10.1 so they ofc go to this one company for help and the company refuses this help.
Proof? You're just speculating!

    I will give you two facts:
    1. A.Creed was a TWIMTBP game http://fr.nzone.com/object/nzone_ass...d_home_fr.html

    2. HAWX was not a TWIMTBP game http://hawxgame.fr.ubi.com/

    Both games were from Ubi Soft and both were initially DX10.1, but one had its DX10.1 support dropped.
    Either because ATI had not invested in the game (DilTech and you seem to think this), or because Nvidia invested more (as I think...).

  6. #131
    no sleep, always tired TheGoat Eater's Avatar
    Join Date
    Oct 2006
    Location
    Iowa, USA
    Posts
    1,832
    Quote Originally Posted by FUGGER View Post
    now Guiness is just reaching with that one LOL
    But that's not all, Batman: Arkham Asylum has made history by securing a Guinness World Record for ‘Most Critically Acclaimed Superhero Game Ever’!


  7. #132
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by Cyber-Mav View Post
I remember when ATI did this with Valve games; Half-Life 2 in particular ran slow on Nvidia hardware, but when the game was tricked into running the ATI shader path on Nvidia cards the performance shot up considerably.

    People tend to forget that things like this have always happened in the past and will continue to happen in the future. I won't be surprised if DiRT 2 has some ATI bias in it.
Another guy trying to rewrite history...
    The Nvidia FX series ran the game in DX8.1 mode, and you could force it into DX9.0 mode by using the ATI code path. That's true.
    But the Nvidia 6 series ran the game in DX9.0 mode without this trick, and was still behind the ATI X8xx series.

    Quote Originally Posted by gamervivek View Post
    on a similar note nivida's dx10.1 chip showed improvement for battleforge with the dx10.1 path.
    http://www.pcgameshardware.com/aid,6...iewed/Reviews/
ATI did not pay enough to get a line of code like:
    {If Nvidia_DX10.1 card run DX9.0 code
    end}
    What a pity
    Last edited by AbelJemka; 09-28-2009 at 03:31 PM.

  8. #133
    Registered User
    Join Date
    May 2009
    Posts
    5
    Quote Originally Posted by Clairvoyant129 View Post
Do you think Nvidia just pays developers money to slap the TWIMTBP logo on? There is extensive validation and testing to ensure the best possible experience on Nvidia hardware. Just because AMD has horrible developer support, that somehow translates to it being Nvidia's fault?

    I do agree, however, that the missing AA option for ATI cards is a little extreme and should be patched out right away.
The developers should only concentrate on the API provided by the OS makers, and the hardware makers' job is to ensure their hardware supports the latest API. That is called abstraction and isolation; that is how the division of labor works in IT. That is the design principle in everything from hardware to software systems.

    Now the problem is that Nvidia is not competitive enough to ensure their cards conform to the newest tech, so they have to pay some dirty money to make their weaker cards look less weak; Nvidia is the rule breaker here.

    EDIT: Removed insults, removed dragonso from this section.
    Last edited by Charles Wirth; 09-28-2009 at 03:27 PM.

  9. #134
    Xtreme Owner Charles Wirth's Avatar
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,656
Keep it clean and civil.
    No warnings; you will be removed from this section permanently.
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
RTX 2080 Ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
GTX 1080 Ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  10. #135
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by AbelJemka View Post
ATI did not pay enough to get a line of code like:
    {If Nvidia_DX10.1 card run DX9.0 code
    end}
    What a pity
Really now? You realize D3D is at the hardware level. If you think this happens, what makes you think ATI doesn't do it too?

  11. #136
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    Quote Originally Posted by Starscream View Post
Read DilTech properly.
    He doesn't mean invest as in pure money; he means it as in sending some people with some hardware over to fix the problem.
    Because at the moment ATI isn't giving any kind of real support to the game developers.

    You should see it from the point of view of Ubisoft.
    They had a problem making DX10.1 work properly in their game. There is only one company who supports DX10.1, so of course they go to this one company for help, and the company refuses to help.

    So instead of putting a broken feature (DX10.1) into the game, which will surely have loads of people complaining, they take it out instead.
But I think the point that AMD is making is: why should you have to work with developers to get something like DX10.1, or any other STANDARD feature, working?

    Think of it this way: AMD likes the approach where there is an open standard that can be adhered to. For example, with DirectX you simply make your card work with DirectX, and a developer doesn't need your hardware. All they need is to program for the standard, and if the cards adhere to the standard, it works.

    No need for payouts, or special relationships for code paths that add some new dust particles, run .1% faster, or are optimized for your card. If something is good, then we should raise the bar, right?

    I understand there is some point where you introduce new features, and this would require working with developers to bring a new feature to a game.

    But there is a difference in philosophy here between Nvidia and AMD: Nvidia wants to be the influence, making proprietary features or ways of doing things, while AMD prefers to work with others to make better standards and then do better within those standards.

    In case I am not making my point very well: on one hand, you have AMD with a great feature (Tessellation) for their cards. They didn't go to developers and pay them money to get Tessellation to work in their games, or push Tessellation as the new standard through marketing. They went to Microsoft and said: look at this feature, we want to bring it to you so that you can make it a standard feature (of DX11), so developers only need to work with DX11 to get the feature to work on any DX11-capable card.

    Then you have Nvidia, on the other side, who has a great feature (PhysX) and goes out to developers, pays them money to get the game to work with PhysX, then goes out marketing how great PhysX is and how it works in this game. They don't try to improve the standard for consumers; they simply try to get more people to buy their cards and do it their way. They even pay developers to deactivate standard DX features so they do not work on competitors' cards.
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  12. #137
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Starscream View Post
Read DilTech properly.
    He doesn't mean invest as in pure money; he means it as in sending some people with some hardware over to fix the problem.
    Because at the moment ATI isn't giving any kind of real support to the game developers.
Actually they are, and it's kind of sad that certain people keep posting the contrary even though they KNOW better.
    Sure, maybe their relations aren't on the same level as Nvidia's, but they do have a specific department set up for this type of thing.

    The problem is the stuff that goes on behind the scenes, and for some reason certain devs not wanting AMD/ATI's take on situations or their help/support.
    Yes, they are in constant communication with the vast majority of game devs.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  13. #138
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Birmingham AL.
    Posts
    1,079
    Quote Originally Posted by FUGGER View Post
    Posted this;

    http://forums.eidosgames.com/showpos...58&postcount=1

    I also have an email into Eidos and Nvidia.
Good idea. I wonder if they will even bother with this or just remove it. I look forward to what they have to say, if they bother answering.
    Particle's First Rule of Online Technical Discussion:
    As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.

    Rule 1A:
    Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.

    Rule 2:
    When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.

    Rule 2A:
    When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.

    Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!

  14. #139
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by Chumbucket843 View Post
Really now? You realize D3D is at the hardware level.
    What is the link between D3D and the HL2 code path?
    Quote Originally Posted by Chumbucket843 View Post
Really now? You realize D3D is at the hardware level. If you think this happens, why don't you think ATI does it too?
    Give me a reason to think ATI does it.

    The last three ATI-sponsored games (BattleForge, Left 4 Dead and HAWX) work perfectly on Nvidia hardware.
    This link http://www.pcgameshardware.com/aid,6...iewed/Reviews/ proves that fact even more.

  15. #140
    Xtreme Enthusiast
    Join Date
    Mar 2009
    Location
    Toronto ON
    Posts
    566
    Additionally, the in-game AA option was removed when ATI cards are detected.

    We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo.

    By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced.

This option is not available for the retail game as there is SecuROM protection.
    I'm really surprised some people approve of the above. I wonder if the same people would say it's OK if the situation were reversed.
    Last edited by Heinz68; 09-28-2009 at 03:55 PM.
    Core i7-4930K LGA 2011 Six-Core - Cooler Master Seidon 120XL ? Push-Pull Liquid Water
    ASUS Rampage IV Black Edition LGA2011 - G.SKILL Trident X Series 32GB (4 x 8GB) DDR3 1866
    Sapphire R9 290X 4GB TRI-X OC in CrossFire - ATI TV Wonder 650 PCIe
    Intel X25-M 160GB G2 SSD - WD Black 2TB 7200 RPM 64MB Cache SATA 6
    Corsair HX1000W PSU - Pioner Blu-ray Burner 6X BD-R
    Westinghouse LVM-37w3, 37inch 1080p - Windows 7 64-bit Pro
    Sennheiser RS 180 - Cooler Master Cosmos S Case

  16. #141
    Banned
    Join Date
    Jul 2008
    Posts
    162
The developers should be coding for the API; that's the whole point of having DX and OGL, they are standards. I think Nvidia should look into resurrecting Glide.

  17. #142
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
Rocksteady's statement here (made between the demo and the retail release):
    The form of anti-aliasing implemented in Batman: Arkham Asylum uses features specific to NVIDIA cards. However, we are aware some users have managed to hack AA into the demo on ATI cards. We are speaking with ATI/AMD now to make sure it’s easier to enable some form of AA in the final game.
    http://forum.beyond3d.com/showpost.p...3&postcount=38
    Bring... bring the amber lamps.

  18. #143
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Vancouver,British Columbia, Canada
    Posts
    1,178
Perhaps Microsoft needs to start regulating the game industry; they make DirectX, after all. How come no one has mentioned them....

    Too many things are optional in DX9; hopefully from DX11 and up it will be strict enough to not allow this kind of PissX.


    World Community Grid's mission is to create the world's largest public computing grid to tackle projects that benefit humanity.
    Our success depends upon individuals collectively contributing their unused computer time to change the world for the better.

  19. #144
    Xtreme Member
    Join Date
    Oct 2006
    Posts
    247
    Quote Originally Posted by Cyber-Mav View Post
I remember when ATI did this with Valve games; Half-Life 2 in particular ran slow on Nvidia hardware, but when the game was tricked into running the ATI shader path on Nvidia cards the performance shot up considerably.

    People tend to forget that things like this have always happened in the past and will continue to happen in the future. I won't be surprised if DiRT 2 has some ATI bias in it.
Wow, you really remember your history wrong.

    The nVidia cards (FX 5xxx series) were automatically put on the DX8.1 path instead of the "proper" "ATI" shader path as you put it, or rather the DX9.0 shader path, because performance on the DX9.0 shader path was HORRIBLE on the nVidia cards.

    So, yes, a bit of image quality (DX8.1 vs DX9.0) was sacrificed on the FX 5xxx series, but for good reason.

    Core 2 Duo(Conroe) was based on the Intel Core Duo(Yonah) which was based on the Pentium M(Banias) which was based on the Pentium III(Coppermine).

    Core 2 Duo is a Pentium III on meth.

  20. #145
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Bayamon,PR
    Posts
    257
    Quote Originally Posted by Dainas View Post
If I had a dime for every time I felt like the developers of a game purposely shackled it in one way or another, I would be rich. Stop blaming Nvidia for something they had nothing to do with; it was obviously the game developers using PhysX as a marketing tool for their own gains.
It's not the fact that they used PhysX; we don't care about that. What we care about is that they made it block some features that will not work if you are not using an Nvidia card. That is the issue.

  21. #146
    Xtreme Addict
    Join Date
    Jun 2002
    Location
    Ontario, Canada
    Posts
    1,782
    Quote Originally Posted by SocketMan View Post
Perhaps Microsoft needs to start regulating the game industry; they make DirectX, after all. How come no one has mentioned them....

    Too many things are optional in DX9; hopefully from DX11 and up it will be strict enough to not allow this kind of PissX.
    I was under the impression that from DX10 and onward, your card(s) had to support the complete set of DX features to obtain Microsoft certification. Correct me if I'm wrong.

  22. #147
    Registered User
    Join Date
    Jan 2007
    Posts
    94
    Quote Originally Posted by AbelJemka View Post
    Lol at your post DilTech!
    You basically say that ATI would have to "invest" (pay Ubi Soft so...) to make them correctly implement DX10.1 mode in A.Creed
Absolutely, it's called a vendor supporting their own product. Why would you as a software developer want to work with advanced functionality (only available from one of the two vendors) when they won't give you the time of day with regard to support?

    You can't really excuse AMD/ATI for "not being big enough" or not being in a financial position to provide partner support for their products. This is a basic part of being a technology vendor, and they have failed miserably in both incarnations of the company.
    i7 920 @ 4.2Ghz
    Asus P6T6 Revolution
    3x GTX260
    6x2GB Corsair DDR3-1600
    G.Skill Falcon 128GB SSD
G.Skill Titan 128GB SSD
    Seagate 7200.11 1.5TB
    Vista 64 Ultimate

  23. #148
    Registered User
    Join Date
    Jan 2007
    Posts
    94
    Quote Originally Posted by freeloader View Post
    I was under the impression that from DX10 and onward, your card(s) had to support the complete set of DX features to obtain Microsoft certification. Correct me if I'm wrong.
That was DX10 only. DX11 has already reversed that, as DX10 cards "already" support DX11 sans the new functionality.
    i7 920 @ 4.2Ghz
    Asus P6T6 Revolution
    3x GTX260
    6x2GB Corsair DDR3-1600
    G.Skill Falcon 128GB SSD
G.Skill Titan 128GB SSD
    Seagate 7200.11 1.5TB
    Vista 64 Ultimate

  24. #149
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Brother Esau View Post
What's the big deal? It's called cheating and market rigging ...in short, entirely unethical and deceitful business practices!

    This is typical and exactly what Intel and Microsoft do, and the majority of code writers accommodate and favor Intel because of their monopoly power in the marketplace.

    Regardless of who's doing what, there needs to be a level playing field, and rules, standards and guidelines need to be followed to ensure everyone gets a fair shake.
That's what Microsoft's DX APIs are for, and NV is paying devs to go outside them on some things that DX is quite capable of doing.

  25. #150
    Registered User
    Join Date
    Jan 2007
    Posts
    94
    Quote Originally Posted by Final8ty View Post
That's what Microsoft's DX APIs are for, and NV is paying devs to go outside them on some things that DX is quite capable of doing.
It's obviously not sufficient, as most of the modern deferred-rendering engines have huge issues with enabling AA and require vendor-specific tweaks.
    i7 920 @ 4.2Ghz
    Asus P6T6 Revolution
    3x GTX260
    6x2GB Corsair DDR3-1600
    G.Skill Falcon 128GB SSD
G.Skill Titan 128GB SSD
    Seagate 7200.11 1.5TB
    Vista 64 Ultimate
