
Thread: Ian McNaughton goes out against The Way it's Meant to be Played

  1. #101
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    If someone has to clarify why AA is disabled on ATI hardware, it is the game developer, not some dudes like us on a forum...

    As stated a little back, this is Assassin's Creed DX10.1 all over again, except this time we don't know what is broken in Batman, whereas a year ago (or was it two?) we did know that the AC DX10.1 path was broken (lights through walls, absence of dust/particles, and missing rendering passes). So I think we should hold our words until the devs comment and explain to all of us enthusiasts what happened.
    Are we there yet?

  2. #102
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by trinibwoy View Post
    How exactly did Nvidia screw their consumers?
    As JohnJohn has said, I said "consumers", not "NVIDIA consumers".

    I don't know about others, but I consider myself a consumer of certain kinds of products, not of brands. I'm a graphics hardware consumer, not an NVIDIA or ATI consumer.

    As a consumer, when I want to buy a new piece of hardware, I choose the product that I think is best for my budget, be it from NVIDIA, AMD, Intel, or VIA. So if someone is artificially keeping some software from working fully with certain hardware which I could normally choose, when that software is perfectly compatible with that hardware, that someone is screwing me. As a consumer.

    Quote Originally Posted by SamHughe View Post
    I have to agree. All I hear is excuses, excuses, excuses. ATI can make good cards, so why not try to establish better relationships with the game developers? ATI should stop playing the same old "we're the underdogs, nobody likes us, so please support us" game. You are the second largest manufacturer of graphics cards, with thousands of customers; how hard is it for you to connect with the game developers and get a demo from them so you can work out the problems?
    Why would people refuse to work with you?

    Having said that, I hope ATI changes their strategy and stays a formidable competitor to NVIDIA. Remember the socket 939 dominance days? Conroe came out as a result of that.
    Think about this hypothetical situation:

    Suppose AMD has put money (work, code, it's the same) into the developers of DiRT 2 to get DX11 code used, and they then do the same thing NVIDIA did with Batman: AA: the DX11 features (or some of them) can't be enabled on NVIDIA hardware (for when they have some).

    Even if that were possible (I don't think AMD is in a position to do something like that, as NVIDIA is)... wow, GREAT. Now we have fixed everything. Now not only can I not use AA in B:AA if I have ATi, I also can't use DX11 in DiRT 2 if I have NVIDIA. Excellent! My situation as a consumer has improved hugely now that AMD "has improved their relationships with developers to the degree of NVIDIA".

    The problem here is not the relationships with the developers, but what kinds of things should and shouldn't be allowed to be done through those relationships.
    Last edited by Farinorco; 09-28-2009 at 11:35 AM.

  3. #103
    Xtreme Addict
    Join Date
    May 2004
    Posts
    1,755
    Farinorco buddy, you've been incredibly patient and shown a lot of common sense when dealing with so much bad faith, and IMHO you summed it all up.
    Crosshair IV Formula
    Phenom II X4 955 @ 3.7G
    6950~>6970 @ 900/1300
    4 x 2G Ballistix 1333 CL6
    C300 64G
    Corsair TX 850W
    CM HAF 932

  4. #104
    Xtreme Member
    Join Date
    Jul 2006
    Location
    Ontario, Canada
    Posts
    251
    Quote Originally Posted by DilTech View Post
    Eh...

    First, about that whole A. Creed thing. Did you guys forget that with that path certain shaders didn't work properly (dust particles and the like) and lights bled THRU walls? The shader path was removed because it didn't work properly, and ATi didn't offer Ubisoft the support to help them make it work.

    Also, I can tell you first hand that NVidia supports developers with things besides money. Certain people here will tell you about the toy we had in our hands back when the 7800GTX 512MB came out. They give developers hardware that never sees the light of day as far as consumers know, just to make sure they have a way to test their code that isn't a software renderer. I can tell you right now that ATi has never put a single piece of hardware in our hands... and it's not like we haven't asked for anything to test on.

    To those who say "well, NVidia can't possibly give that much help"... they do. They have a test lab with just about every possible configuration of NVidia hardware to test your application on, to make sure it's going to work across a wide spectrum of hardware. You also have to remember that NVidia employs a lot more people than ATi. They have people whose sole job IS testing said applications, finding the bugs, verifying whether it's the game code or the driver, and giving a list of possible fixes to the developer. If anything, NVidia does more for the game industry than ATi has ever dreamed of doing, and pays for it with the money the consumer spends on their video cards... How is that bad for the consumer? If ATi would do the same, perhaps PC gaming would actually move back up in terms of how many game studios are still out there.

    The fact that people here still believe TWIMTBP is merely a money-hat program makes me laugh. Developers have freely explained exactly how it works time and time again. Also, about the whole AA thing for ATi cards: is anyone anywhere close to sure it never causes an issue? I mean, I could always attempt it myself, strip the SecuROM, tell it my card is an NVidia, and play through the entire game again.

    Completely agree with this.

    Everybody who thinks Nvidia is doing something illegitimate is living in another dimension. You forget that hardware is nothing without software and software is nothing without hardware. One has to complement the other, and when a hardware manufacturer gives a software developer the tools, and the software developer gives the hardware maker the code, you get a perfect application experience. Intel follows the same practices. If the software you use is optimized for Intel and Nvidia, how can you argue "I want to buy AMD/ATI hardware AND have the best application experience"...

    If you are wondering why AMD/ATI aren't maintaining developer relationships, it's simple: they have no more money. Think back to when the Athlon 64 was the fastest; there was a boatload of applications developed with extensions for those processors. Think back to when the Radeon 8 and 9 series were released: the entire Source engine was developed around ATI hardware.

    And now?

    Intel Core i7 3770K | Asus Maximus V Gene | 16GB DDR3-1600 | Asus GTX 670 directCU TOP | Intel 320 120GB | WD Caviar Black 1TB | Enermax Revolution 1050W | Silverstone TJ08-E | Dell UltraSharp U2711

  5. #105
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by AbelJemka View Post
    Lol at your post, DilTech!
    You basically say that ATI would have to "invest" (i.e. pay Ubisoft) to make them correctly implement DX10.1 mode in A. Creed.

    They did the job for Oblivion, they did the job for R6: Vegas. Why do they have to do the job for A. Creed?

    To focus on Batman: AA, I have no problem seeing Nvidia give money for PhysX support (only they have it), but I do have a problem seeing Nvidia give money so that ATI cards can't use something in the game that they are able to use (AA).

    What's the link between ATI and the socket 939 dominance days?
    Ok, I see: Intel paid for manufacturer support and AMD didn't...
    I didn't say pay for DX10.1 in A. Creed, I said invest, as in invest time and workstations to help them get it working properly. Ubisoft had two companies they could go through for that help, and NVidia obviously wasn't going to do it. ATi wouldn't do it, so ATi didn't get it... That simple.

    Finally, let's wait until we hear the reasoning behind AA not being offered in the options for ATi cards. There might just be something there...

    Quote Originally Posted by Shadov View Post
    For example, Batman with the PhysX workaround on a CPU (note the word workaround):

    http://www.youtube.com/watch?v=AUOr4cFWY-s

    But guess what... it's a "The way it's meant to be played" game, so most features are gone once you disable proprietary PhysX.

    Talk about artificially limiting the gaming experience!
    A lot of CPUs choke with the workaround, sadly enough. Also, those who got it working complained it was rather buggy.

    Also, about that PhysX support situation: PhysX is built into Unreal Engine 3, and has been since WAY BEFORE NVidia bought out Ageia. As such, technically speaking, NVidia didn't have to do a thing to get that support added.

    Quote Originally Posted by Smartidiot89 View Post
    I can say it is partly AMD's fault that their support is bad when it comes to games, but disabling AA?

    Anyways... AMD have begun helping developers now; they have been giving them DX11 hardware for months, plus other kinds of support such as lending out programming staff, so I am hoping AMD will start their own "The way it's meant to be played" program. At least it's looking like that right now.
    AMD/ATi have been helping developers for quite some time (the Get In The Game program); the problem is they choose poorly which companies to work with, and those deals are VERY FEW and far between... CoJ was a huge failure, especially looking at some of the other titles that year. HL2 was a smart investment, but look how many copies of HL2 that deal cost ATi... Now they're going for Dirt 2? I find it funny how they randomly pick titles I have no intention of playing.

    Of course, here's hoping they do the smart thing and quickly snag AvP. If AMD/ATi really want to score some market share, the big step is going to be getting their name on some big games.
    Last edited by DilTech; 09-28-2009 at 12:12 PM.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  6. #106
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by Shadov View Post
    Talk about artificially limiting the gaming experience!
    +1
    Bring... bring the amber lamps.

  7. #107
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Vancouver
    Posts
    1,073
    Quote Originally Posted by Syn. View Post
    What do you expect from a company that considers opening a can of whoop ass a legitimate strategy?
    +1 LoL.

    If Nvidia invests in the development of a game and works with a developer to optimize code, it is only reasonable that they should expect some ROI.

    But this, to me, comes across less like spending time on development and more like purposeful sabotage. Whether it is or isn't doesn't matter much, because the perception is what will do the damage. Why would they gate the AA option on hardware detection at all? Surely it's ATI's responsibility, through their drivers, to meet the criteria of the API's specs, and if they do, the feature should be enabled? If it fails, it should reflect on ATI, not the developer. That would be the only argument I could see for validation, and it admittedly is a stretch.

    What seems more likely is you scratch my back, I scratch yours: Nvidia helps with development, and said development company helps with a few lines of code to slightly impact gameplay on the competition's card in an ambiguous way. Nothing will ever be proven, but it does seem anti-competitive in a way. I'd rather the money were spent on increasing performance through efficient coding and better hardware than on disabling the competition and blanket marketing... although I'm sure the latter has higher dividends and ROI.

    also +1 to this

    Quote Originally Posted by DilTech View Post
    AMD/ATi have been helping developers for quite some time (the Get In The Game program); the problem is they choose poorly which companies to work with, and those deals are VERY FEW and far between... CoJ was a huge failure, especially looking at some of the other titles that year. HL2 was a smart investment, but look how many copies of HL2 that deal cost ATi... Now they're going for Dirt 2? I find it funny how they randomly pick titles I have no intention of playing.

    Of course, here's hoping they do the smart thing and quickly snag AvP. If AMD/ATi really want to score some market share, the big step is going to be getting their name on some big games.
    Also lol at the metacritic shills.
    Last edited by villa1n; 09-28-2009 at 12:23 PM.
    " Business is Binary, your either a 1 or a 0, alive or dead." - Gary Winston ^^



    Asus rampage III formula,i7 980xm, H70, Silverstone Ft02, Gigabyte Windforce 580 GTX SLI, Corsair AX1200, intel x-25m 160gb, 2 x OCZ vertex 2 180gb, hp zr30w, 12gb corsair vengeance

    Rig 2
    i7 980x ,h70, Antec Lanboy Air, Samsung md230x3 ,Saphhire 6970 Xfired, Antec ax1200w, x-25m 160gb, 2 x OCZ vertex 2 180gb,12gb Corsair Vengence MSI Big Bang Xpower

  8. #108
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,247
    Quote Originally Posted by LowRun View Post
    Farinorco buddy, you've been incredibly patient and shown a lot of common sense when dealing with so much bad faith, and IMHO you summed it all up.
    I concur.

    However, I don't want to blame Nvidia alone for this mess. If Nvidia really plays such dirty games, they still need someone to go along with them; in this case, the game developers.
    Furthermore, there are way too many ifs involved in this whole discussion. We don't know anything for sure (yet).

    Even so, we all know Nvidia is anything but a blank slate... and neither is AMD/ATI, though not to the extent Nvidia is, IMO.
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  9. #109
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    393
    You're getting the exact same experience console users are getting, so why complain? It's not Nvidia's fault AMD doesn't have a marketing team.

    Quote Originally Posted by 216 View Post
    Seems like Fugger is on the payroll???
    You can see the Nvidia ad in the top corner!

    Rule No. 1 - Follow the money
    First of all, it's a Gigabyte banner, and second, to accuse Fugger of being on a payroll... Delusional AMD fanboys always give me a good laugh... always coming out with conspiracy theories and always being negative about anything that isn't positive news for AMD/ATI.
    Last edited by Clairvoyant129; 09-28-2009 at 12:34 PM.

  10. #110
    Xtreme Addict
    Join Date
    Dec 2005
    Location
    UK
    Posts
    1,713
    Quote Originally Posted by Clairvoyant129 View Post
    You're getting the exact same experience console users are getting, so why complain? It's not Nvidia's fault AMD doesn't have a marketing team.
    The fact that people know TWIMTBP as a marketing team shows the real problem behind it; in other words, it's not known for being an engineering team.
    TAMGc5: PhII X4 945, Gigabyte GA-MA790X-UD3P, 2x Kingston PC2-6400 HyperX CL4 2GB, 2x ASUS HD 5770 CUcore Xfire, Razer Barracuda AC1, Win8 Pro x64 (Current)

    TAMGc6: AMD FX, Gigabyte GA-xxxx-UDx, 8GB/16GB DDR3, Nvidia 680 GTX, ASUS Xonar, 2x 120/160GB SSD, 1x WD Caviar Black 1TB SATA 6Gb/s, Win8 Pro x64 (Planned)

  11. #111
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    393
    Quote Originally Posted by Syn. View Post
    The fact that people know TWIMTBP as a marketing team shows the real problem behind it; in other words, it's not known for being an engineering team.
    Do you think Nvidia just pays developers money to slap the TWIMTBP logo on? There is extensive validation and testing to ensure the best possible experience on Nvidia hardware. Just because AMD has horrible developer support, how does that somehow become Nvidia's fault?

    I do agree, however, that removing the AA option for ATI cards is a little extreme and should be patched out right away.

  12. #112
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Location
    Grimsby, UK
    Posts
    666
    Additionally, the in-game AA option was removed when ATI cards are detected.

    We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo.

    By tricking the application, we were able to get the in-game AA option, and our performance was significantly enhanced.

    This option is not available for the retail game as there is a secure rom.
    This about sums it all up; what more proof do you need?

    If it worked in the demo and not in the retail version, it must have been done on purpose so that Nvidia cards had an advantage.
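    For anyone wondering what "changing the ids" means in practice: a game can query the adapter's PCI vendor ID at startup and branch on it. Below is a minimal sketch of that kind of check, assuming a Direct3D 9 renderer; the vendor IDs are the real PCI assignments, but the function and its gating logic are hypothetical, not Batman: AA's actual code (which isn't public).

    Code:
        // Hypothetical sketch of vendor-ID gating; NOT the game's actual code.
        #include <windows.h>
        #include <d3d9.h>

        bool IsInGameAAOptionShown(IDirect3D9* d3d)
        {
            D3DADAPTER_IDENTIFIER9 ident = {};
            if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident)))
                return false;

            // Real PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD.
            // Spoofing the reported ID to 0x10DE is what the demo experiment
            // quoted above did to re-enable the AA option on Radeon cards.
            return ident.VendorId == 0x10DE;
        }

    If the gate sits in game code like this rather than in the driver, only a developer patch can remove it, which would also explain why the SecuROM-protected retail build can't be tricked the way the demo was.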

    It's probably the developer's doing, but they might come back and say, well, Nvidia paid us to have control of the features... now that will be interesting, as the EU will EAT Nvidia's money just like Intel's.
    i5 2500K @ 4.9GHz MSI Z77 MPower G.Skill Trident 8GB 2400C10
    EVGA GTX 1070 SC 8GB @ 1784/4004MHz Corsair HX 750 PSU
    Samsung 830 256GB Creative ZxR Thermalright Silver Arrow
    NEC 24WMGX3 24" TFT Fractal Design Define S Win 7/10 64bit

  13. #113
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Ottawa, Canada
    Posts
    2,443
    Vote with your wallet! It is as simple as that, really. I will not be buying Batman because of this, so they lose as well. What will end up happening is proprietary games, and that will suck if this is allowed to continue. ATI needs to step up a little with developers to keep from seeing red, and Nvidia needs to calm its tactics before people see the green-eyed monster it really is.

  14. #114
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    Quote Originally Posted by DilTech View Post
    Stuk, A. Creed had visual artifacts when running in DX10.1 mode, from lights bleeding through walls to an entire rendering pass missing that included dust particles, among other issues... ATi didn't want to invest in helping get it up to snuff, and what interest did NVidia have in doing the work, considering they had nothing to gain by spending the money to make the code work?

    It's a case of AMD/ATi refusing to put up or shut up, so they pushed the blame onto NVidia instead.
    http://www.youtube.com/watch?v=a7-_Uj0o2aI

    DX10.1 is faster because of missing rendering passes, eh? Then please explain how the rest of the DX10.1 titles run faster than in DX10.
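    For context on the rendering-pass point: the commonly cited reason the DX10.1 path was faster is that Direct3D 10.1 lets a multisampled depth buffer be bound as a shader resource, so the AA/post-processing step can reuse the depth data instead of regenerating it in an extra pass as D3D10.0 requires. A rough sketch of the resource setup involved (illustrative only; Assassin's Creed's actual code is not public, and the sizes here are arbitrary):

    Code:
        // Illustrative sketch: an MSAA depth buffer that shaders can read.
        // On a D3D10.1 device this combination is legal; on plain D3D10.0 a
        // multisampled depth resource can't also be a shader resource, which
        // forces an extra depth-only pass (or a resolve) before post-processing.
        #include <d3d10_1.h>

        ID3D10Texture2D* CreateReadableMsaaDepth(ID3D10Device1* device)
        {
            D3D10_TEXTURE2D_DESC desc = {};
            desc.Width            = 1920;                       // example size
            desc.Height           = 1080;
            desc.MipLevels        = 1;
            desc.ArraySize        = 1;
            desc.Format           = DXGI_FORMAT_R24G8_TYPELESS; // depth readable in shaders
            desc.SampleDesc.Count = 4;                          // 4x MSAA
            desc.Usage            = D3D10_USAGE_DEFAULT;
            desc.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

            ID3D10Texture2D* depthTex = nullptr;
            device->CreateTexture2D(&desc, nullptr, &depthTex);
            return depthTex;
        }

    That saved pass, not anything visual, is where the measured gains were supposed to come from; whether the missing dust and the light bleed were side effects of the same change is exactly what's disputed here.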

    So now GPU companies have to PAY developers to get games to work properly with their cards? Giving them free hardware and support isn't enough, huh?
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  15. #115
    Xtreme Mentor
    Join Date
    Sep 2005
    Location
    Netherlands
    Posts
    2,693
    Quote Originally Posted by Stukov View Post
    http://www.youtube.com/watch?v=a7-_Uj0o2aI

    DX10.1 is faster because of missing rendering passes, eh? Then please explain how the rest of the DX10.1 titles run faster than in DX10.

    So now GPU companies have to PAY developers to get games to work properly with their cards? Giving them free hardware and support isn't enough, huh?
    Read DilTech properly.
    He doesn't mean invest as in pure money; he means sending some people and some hardware over to fix the problem.
    Because at the moment ATI isn't giving any kind of real support to the game developers.

    You should see it from Ubisoft's point of view.
    They had a problem making DX10.1 work properly in their game. There is only one company that supports DX10.1, so of course they went to that one company for help, and the company refused to help.

    So instead of shipping a broken feature (DX10.1) in the game, which would surely have loads of people complaining, they took it out instead.
    Time flies like an arrow. Fruit flies like a banana.
    Groucho Marx



    i know my grammar sux so stop hitting me

  16. #116
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    I want to hear the real words from the horse's mouth.
    Are we there yet?

  17. #117
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    115
    Quote Originally Posted by Farinorco View Post
    Think about this hypothetical situation:

    Suppose AMD has put money (work, code, it's the same) into the developers of DiRT 2 to get DX11 code used, and they then do the same thing NVIDIA did with Batman: AA: the DX11 features (or some of them) can't be enabled on NVIDIA hardware (for when they have some).

    Even if that were possible (I don't think AMD is in a position to do something like that, as NVIDIA is)... wow, GREAT. Now we have fixed everything. Now not only can I not use AA in B:AA if I have ATi, I also can't use DX11 in DiRT 2 if I have NVIDIA. Excellent! My situation as a consumer has improved hugely now that AMD "has improved their relationships with developers to the degree of NVIDIA".

    The problem here is not the relationships with the developers, but what kinds of things should and shouldn't be allowed to be done through those relationships.

    It's too easy to lose patience when people simply refuse to see your viewpoint. Unless it's a hardware limitation, like the CSAA modes, I find it hard to believe that ATI cards can't do correct AA (or any AA at all) in B:AA.

    As for ATI's lack of developer relations, frankly, what do you guys expect?
    That they should be helping developers enable features that work only on their cards? Maybe trying to get them to use the tessellator on their cards? Or perhaps putting in an exclusive edge-detect AA mode and abolishing all other modes altogether? Would it be better if ATI included a vendor-check clause whenever they helped a developer with DX10.1 or DX11?

    On a similar note, Nvidia's DX10.1 chip showed improvement in BattleForge with the DX10.1 path.
    http://www.pcgameshardware.com/aid,6...iewed/Reviews/

    And how can you expect them to look after every game developer and every game on the PC scene, just so they can counter every feature their competitor got implemented exclusively on its own hardware? Blaming ATI is like blaming a girl for not seducing you while the hooker gave you an STD.

    Quote Originally Posted by Sadasius View Post
    Vote with your wallet! It is as simple as that, really. I will not be buying Batman because of this, so they lose as well. What will end up happening is proprietary games, and that will suck if this is allowed to continue. ATI needs to step up a little with developers to keep from seeing red, and Nvidia needs to calm its tactics before people see the green-eyed monster it really is.
    Unfortunately, common sense is not that common, and even less so in this thread.

  18. #118
    Xtreme Owner Charles Wirth's Avatar
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,653
    Quote Originally Posted by Luka_Aveiro View Post
    I want to hear the real words from the horse's mouth.
    I am working on it.
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GRX 1080ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  19. #119
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    115
    Quote Originally Posted by Starscream View Post
    Read DilTech properly.
    He doesn't mean invest as in pure money; he means sending some people and some hardware over to fix the problem.
    Because at the moment ATI isn't giving any kind of real support to the game developers.

    You should see it from Ubisoft's point of view.
    They had a problem making DX10.1 work properly in their game. There is only one company that supports DX10.1, so of course they went to that one company for help, and the company refused to help.

    So instead of shipping a broken feature (DX10.1) in the game, which would surely have loads of people complaining, they took it out instead.
    Surprisingly, the reviewers didn't find any glitches that would render the game unplayable; what they saw instead was improved framerates with better AA, at the cost of some dust.

    The company refused help? Got any official word on that? It's nonsensical to think that they would have refused to support a feature that would have improved performance on their cards. From the Rage3D article that reported on the DX10.1 in AC:
    Armed with this knowledge, we went and asked ATi about it, since they're probably in the best position to know whether or not the new API is actually implemented. Here's what they had to say:

    "Ubisoft are at the forefront of technology adoption, as showcased with the fantastic Assassin’s Creed title. In this instance our developer relations team worked directly with the developer and found an area of code that could be executed more optimally under DX10.1 operation, thus benefiting the ATI Radeon HD 3000 Series."

  20. #120
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by gamervivek View Post
    Surprisingly, the reviewers didn't find any glitches that would render the game unplayable; what they saw instead was improved framerates with better AA, at the cost of some dust.
    Define unplayable.

    Do lights shining through walls count as playable or unplayable?

    Quote Originally Posted by gamervivek View Post
    The company refused help? Got any official word on that? It's nonsensical to think that they would have refused to support a feature that would have improved performance on their cards. From the Rage3D article that reported on the DX10.1 in AC:
    Rage3D has gotta be the most biased ATI underground forum; w0mbat rings a bell?

    Quote Originally Posted by FUGGER View Post
    I am working on it.
    That's great to hear; tell us when we're there
    Are we there yet?

  21. #121
    Xtreme Owner Charles Wirth's Avatar
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,653
    Kinda silly, but adding to the mix.

    http://www.batmanarkhamasylum.com/?p.../view&AID=4144
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GRX 1080ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  22. #122
    Xtreme Member
    Join Date
    Nov 2006
    Location
    UK - Skirts of London
    Posts
    112
    I know very little in the software department, but this kind of thing really doesn't make sense to me.

    If I get this right, we have Nvidia with a development team that helps improve the performance of their hardware in games, but from the sounds of this they spend money on stopping features or suppressing the technology used as well.

    - Why not spend that money on bringing out a good competing card at the same time?

    With RE5, can someone explain to me how ATI had an issue with that to begin with? Yes, I know it was a driver-related issue, but ATI's R500 in the Xbox 360 ran it fine, and that was pretty much an X1900 chip with some eDRAM.

    Also, am I the only one thinking that since most games are built upon Direct3D or OpenGL, Microsoft should have a set development model the companies must agree to (think of the OSI model in networking)? Then we wouldn't get these silly issues.

    It's as if we have to buy a motherboard with dual 16x PCI-e slots so we can have one card from each company.

    Sorry about the rant, but I guess what I'm getting at is: why can't we have a set standard that has to be stuck to? - Paul
    Intel Q6600 (0738) @ 3.6Ghz (1.45v)
    Abit IX38 Motherboard
    4GB Memory (2x Team Elite / 2x OCZ Plat)
    2 x HD 4850 512MB Crossfire
    2 x 200GB Maxtor 10 Drives (RAID 0)
    LG IDE DVD-RW Lightscribe drive
    2 x LG SATA DVD-RW drives
    Custom waterloop - low end
    X-Fi XtremeGamer Soundcard
    Logitech Z-2300 Speakers
    Silverstone TJ09 Silver + Window Case
    Jeantech 700W Storm PSU
    Windows Vista Premium 64-Bit Retail

  23. #123
    Xtreme Owner Charles Wirth's Avatar
    Join Date
    Jun 2002
    Location
    Las Vegas
    Posts
    11,653
    Posted this:

    http://forums.eidosgames.com/showpos...58&postcount=1

    I also have an email into Eidos and Nvidia.
    Intel 9990XE @ 5.1Ghz
    ASUS Rampage VI Extreme Omega
    GTX 2080 ti Galax Hall of Fame
    64GB Galax Hall of Fame
    Intel Optane
    Platimax 1245W

    Intel 3175X
    Asus Dominus Extreme
    GRX 1080ti Galax Hall of Fame
    96GB Patriot Steel
    Intel Optane 900P RAID

  24. #124
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    It's always a one-way street with issues around here; maybe if ATI would simply pony up, the developer would have included support for them as well.

    From a different perspective, maybe Nvidia paid for and helped the developer to include that support for them, and if ATI wants the same, then they can pay and work with the developer to get the same options supported.

    What makes this solely Nvidia's fault? It's not Nvidia's game; they don't get to make executive decisions about what to support and what not to support unless they bankrolled the whole project themselves.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  25. #125
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Why not just disable the entire game on ATI hardware? After all, they can't possibly "validate" every other GPU feature that is being used.

    Pure nonsense. Nvidia is essentially saying: hey, we are not confident enough in our hardware to let it stand on its own merits, so we'll cripple features on our competitor's hardware instead. The absolute LAST thing PC gaming needs is "developer relations" coaxing game devs to favour certain hardware AND detecting and disabling features on competitors' cards. Where does the line get drawn?

    And to the people defending Nvidia's actions: there is a major difference between Nvidia working with game devs to squeeze the most out of their hardware, and outright turning off STANDARD features when a competitor's hardware is detected. I guess certain Nvidia fanboys have actually convinced themselves that Nvidia's actions are entirely innocuous.

