All the customers that have an ATI card and paid 50 bucks for the game deserve it in the best shape and form possible.
AA (like AF) is a common feature nowadays, so there's no excuse for this "omission".
Not sure if it causes issues in the game. A quick Google search shows that people disabled the hardware vendor ID check and changed the ATi card's device ID to an NV device ID to trick the game. They confirm that the NV-exclusive MSAA works on ATi cards, and also that the "PhysX" effects are protected by SecuROM but can still run on the CPU (and in multiple threads). However, doing that is very buggy, probably due to (as you said) poor developer support on ATi's part.
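For anyone wondering what "vendor ID check" means here: this is a minimal illustrative sketch (not the game's actual code) of the kind of gate being described. The PCI vendor IDs are real (0x10DE for NVIDIA, 0x1002 for ATI/AMD), but the function name and logic are hypothetical, just to show why spoofing the reported ID would unlock the feature.

```python
# Hypothetical sketch of a vendor-ID gate like the one being described:
# the game reads the GPU's PCI vendor ID and only enables a feature
# when it sees NVIDIA's ID. The hardware's actual capability is never
# tested, which is exactly the complaint in this thread.

NVIDIA_VENDOR_ID = 0x10DE  # real PCI vendor ID for NVIDIA
ATI_VENDOR_ID = 0x1002     # real PCI vendor ID for ATI/AMD

def msaa_allowed(reported_vendor_id: int) -> bool:
    """Gate MSAA on the reported vendor ID alone (hypothetical check)."""
    return reported_vendor_id == NVIDIA_VENDOR_ID

# An ATI card is refused even though the hardware supports MSAA...
print(msaa_allowed(ATI_VENDOR_ID))     # False
# ...while spoofing the ID to NVIDIA's, as the posters did, sails through.
print(msaa_allowed(NVIDIA_VENDOR_ID))  # True
```

This also explains why the trick works at all: a capability check against the driver or hardware would survive an ID swap, but a bare ID comparison will not.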
Bring... bring the amber lamps.
What seems to be lost here (judging from the first page, which I read, and the other two, which I skimmed) is that when someone says "ATI tried a TWIMTBP-style program and failed" or "ATI did x in the past", ATI no longer exists.
Only AMD exists now, with former ATI engineers working on AMD graphics cards under the ATI brand logo. AMD is making this comment, not the non-existent company once called ATI.
ATI may have tried many things in its past, but all of that is in the past now that AMD owns them. AMD has always been a company about open standards and trying to put the consumer first. Yes, they try to make a profit like every other company, but the tit-for-tat that ATI and Nvidia had is something AMD doesn't even get into.
What AMD is complaining about here is that Nvidia is basically paying developers to make AMD hardware run artificially worse than Nvidia's. Remember the Assassin's Creed DX10.1 BS?
What is going on is that you, the paying consumer, are being screwed by Nvidia's business tactics so that their marketing department can trick you out of your money.
It is one thing to get close to developers in order to get your hardware to run better with their software; it is entirely another to pay them to make your competitor's hardware perform worse.
What AMD is trying to say is that they are a company that embraces open standards. Yes, they want to make video cards faster than Nvidia's in order to pull in a good profit, but they aren't going to do it by playing dirty with ANTI-CONSUMER BS like Nvidia does all the time.
I know many of us, including myself, go with whichever video card offers the best price/performance/watt; it is simple and to the point. However, Nvidia is trying to trick/manipulate the system in order to gain an artificial upper hand. The things Nvidia has done are too numerous to list. Maybe you don't care what a company does, but personally I don't reward anti-consumerism.
I won't buy L4D2, I don't buy DRMed games, and I'm not going to buy Nvidia cards if these are the kinds of business practices Nvidia likes to pull.
--Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))
For chrissakes, the source is AMD. It's embarrassing how quick some people are to immediately take it as gospel truth. Nothing else has been offered from either Nvidia or the companies who have put out the games. No wonder so many companies out there consider 'smear tactics' a valid business strategy - threads like this show how many people are willing to immediately believe and loudly parrot the worst they hear.
That might be your problem right there. There are some concise, informed rebuttals to some of your points to be found, if you can spare a few minutes.
i7 2600K | ASUS Maximus IV GENE-Z | GTX Titan | Corsair DDR3-2133
Stuk, A. Creed had visual artifacts when running in DX10.1 mode, from lights bleeding through walls to an entire missing rendering pass that included dust particles, among other issues. ATi didn't want to invest in helping get it up to snuff, and what interest did Nvidia have in doing the work, considering they had nothing to gain from spending the money to make the code work?
It's a case of AMD/ATi refused to put up or shut up, so they pushed the blame on NVidia instead.
This thread is growing fast; I go out to lunch and I'm 2 pages behind.
Someone here should pay a moderator to condense all the arguments into simple factual points, but also ask them to only do a good job on the points they made and make their opponents sound wrong. (Remind you of something? lol)
Last edited by Vinas; 09-28-2009 at 10:18 AM.
Current: AMD Threadripper 1950X @ 4.2GHz / EK Supremacy / 360 EK Rad, EK-DBAY D5 PWM, 32GB G.Skill 3000MHz DDR4, AMD Vega 64 Wave, Samsung NVMe SSDs
Prior Build: Core i7 7700K @ 4.9GHz / Apogee XT / 120.2 Magicool rad, 16GB G.Skill 3000MHz DDR4, AMD Sapphire RX 580 8GB, Samsung 850 Pro SSD
Intel 4.5GHz LinX Stable Club
Crunch with us, the XS WCG team
Seems like Fugger is on the payroll???
You see the Nvidia ad in the top corner!
Rule No. 1 - Follow the money
How sad is it that companies like Intel & Nvidia are allowed to cheat their way to the top with the help of greedy and corruptible 3rd parties (probably Fugger included)?
And guess what, it will only get worse, because too many people will happily go along with getting screwed over.
I can see it now! Published correspondence between Fugger & Nvidia made available courtesy of the EU.
Bottom line: Nvidia is directly paying to harm consumers for their own greedy gain, with the help of some big-name 3rd parties.
Anyone remember the fairly recent GPU article Tom's did, which used grossly incorrect pricing to justify why Nvidia GPUs offered better price vs. performance??? You should have read the comments people left after reading that article, lol!
Wonder how long this post will last.
I wish I was on the payroll.
That is a Gigabyte competition in the top right.
Your post will remain, priceless as it is.
Last edited by Charles Wirth; 09-28-2009 at 10:35 AM.
Intel 9990XE @ 5.1Ghz
ASUS Rampage VI Extreme Omega
GTX 2080 ti Galax Hall of Fame
64GB Galax Hall of Fame
Intel Optane
Platimax 1245W
Intel 3175X
Asus Dominus Extreme
GTX 1080 Ti Galax Hall of Fame
96GB Patriot Steel
Intel Optane 900P RAID
Ah, 216 is banned Rhys aka Rhys216...
Let me show you to the door. /slam
Their GITG (Get in the Game) program was such a huge success... not. ATi/AMD still have the program up and running, but it's been turned towards CrossFire instead of making games better and getting more features into games. It's still a failed program as far as game development goes.
Completely agree. Nvidia works with the developers; ATi pays them lip service and then complains about Nvidia's TWIMTBP program. ATi/AMD should start working with developers more, in the "fair" way they like to preach, and stop wasting money on press releases moaning about things.
Some ATi fans are seriously becoming like the followers of one Mr Jobs, it's only going to end badly.
haha this thread is jokes...
(sorry for OT)
"Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
//James
I have to agree. All I hear is excuses, excuses, excuses. ATI can make good cards, so why not try to establish better relationships with the game developers? ATI should stop playing the same old "we're the underdogs, nobody likes us, so please support us" game. You are the second-largest manufacturer of graphics cards with thousands of customers; how hard is it for you to connect with the game developers and get a demo from them so you can work out the problems?
Why would people refuse to work with you?
Having said that, I hope ATI changes their strategies and stays a formidable competitor to Nvidia. Remember the Socket 939 dominance days? Conroe came out as a result of that.
NO ONE who really knows one card from another really cares about TWIMTBP, well, besides the usual delusional conspiracy-theorist green-haters in our midst. Most of the time in my life spent looking at the stupid Nvidia logo was probably on my 9700 Pro and X850XT anyway. If AMD cannot get ATI badges on games with the same ease Nvidia does, it's their own lack of effort.
Last edited by Dainas; 09-28-2009 at 10:46 AM.
Lol at your post, DilTech!
You basically say that ATI would have had to "invest" (i.e., pay Ubisoft) to make them correctly implement DX10.1 mode in A. Creed.
They did the job for Oblivion, they did the job for R6: Vegas. Why would they have to do the job for A. Creed?
To focus on Batman: AA, I have no problem with Nvidia paying for PhysX support (only they have it), but I do have a problem with Nvidia paying to stop ATI cards from using something in the game they are perfectly capable of using (AA).
What's the link between ATI and the Socket 939 dominance days?
Ok, I see: Intel paid for manufacturer support and AMD didn't...
Last edited by AbelJemka; 09-28-2009 at 10:57 AM.
I would love for ATI to get more involved with developers. They should hire me to play upcoming games on an ATI rig. I'm even willing to work for a guy who thinks he's in the Matrix, get sucker-punched by a kung-fu monkey, and lose my virginity to a woman who's had STDs since the Great Depression. No wait, somehow I don't think it would be worth it.
For example, Batman with the PhysX workaround running on a CPU (note the word workaround):
http://www.youtube.com/watch?v=AUOr4cFWY-s
But guess what... it's a "The Way It's Meant To Be Payed" game, so most features are gone once you disable proprietary PhysX.
Talk about artificially limiting the gaming experience!
I can say it is partly AMD's fault that their support is bad when it comes to games, but disabling AA?
Anyway... AMD have begun helping developers now; they have given them DX11 hardware for months and other kinds of support, such as lending out programming staff, etc., so I am hoping AMD will start their own "The Way It's Meant To Be Played" program. At least it's looking like that right now.
SweClockers.com
CPU: Phenom II X4 955BE
Clock: 4200MHz 1.4375v
Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
Motherboard: ASUS Crosshair IV Formula
GPU: HD 5770