That was Dx10 only. Dx11 has already reversed that, as Dx10 cards "already" support Dx11 sans the new functionality.
Read the whole thread, my friends; this has already been responded to!
A.Creed: TWIMTBP games. End of story. Exactly the same story as Batman:AA.
So people keep repeating again and again that AMD doesn't support developers, just because some folks said so. There are already tons of links in this thread showing that they do.
Or just go here and you will see: http://game.amd.com/us-en/play_games.aspx
It just seems that Nvidia pays much more to gain support!
If you needed more proof of Nvidia's business practices:
"...starting with the Release 186.00 of its graphics drivers - nVidia disabled PhysX if you have an ATI Radeon card in the same system."
http://www.brightsideofnews.com/news...evelopers.aspx
QFT
Has everyone forgotten that ATI just gave $1mil to Codemasters? They do the exact same thing, but I would consider them much less open about it. Nvidia announces their partners and they place a [nVIDIA TWIMTBP] logo on the box so the consumer knows Nvidia had a hand in its development. ATI does no such thing, but it gives them the ability to claim the high ground. The AA issue in Batman:AA is a bit sad, and should be mentioned in reviews, but otherwise I see no news here...
I'm not disputing that fact.
The fact is AA has been common for years, & if it has now become a problem & needed paying for its implementation, then the developers have taken a leap backwards & added another thorn in the side of PC gaming versus consoles, like the PC needs any help with gamers going in that direction.
Wow that is messy. If this kind of thing is allowed to go unregulated, the consumer ends up being screwed by companies.
I think his point is: since ATI gave resources to develop DX11 in Dirt 2, why can't they tell Codemasters to disable it on Nvidia cards even if those cards could use it?
It would be fair, since Rocksteady disabled AA in Batman:AA although ATI cards can use it (because someone gave them money to help develop it...) :rolleyes:
Actually Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.
You see, it's a little-known fact, but during the finalization of the DX9 specification NVidia was fighting for FP-32 to be the standard while ATi was fighting for FP-24. FP-32 required 33% more power than FP-24 but didn't bring enough to the table, when combined with the rest of the specification, for Microsoft to go for it, and as such Microsoft took FP-24. That was the first huge strike against the FX series, which naturally used 33% more horsepower to do the same effects.
Switching the FX series to the ATi code path took it out of FP-32 mode and put it into FP-16, which made it run DX9 100% fine in HL2, as no effect in HL2 required more than FP-16. It was pointed out to Valve on countless occasions before release, and again after release... They never did anything.
Same thing you're all complaining about now.
Well, about that whole AA thing...
Most UE3 titles have problems with AA. I've been playing back through Gears of War PC on my 4850, and turning on DX10 + AA results in the frame rate dipping down to less than 20 fps, sometimes even further. Playing with my old 8800GTX it stayed above 50, always, at 1920x1080p... Well, 1920x1080p on the NVidia; ATi still hasn't fixed their HDTV support, and it's been YEARS. It reverts back to 1080i in quite a few games.
Basically, I'm just saying, until we know all the facts here it's just random flaming and speculation. It may truly have issues in certain parts of the game on ATi hardware.
Wow, some people really seem to want to miss the point.
Blaming ATI because ATI does not spend enough time "helping" developers is ridiculous. ATI should help developers more, but that does not change the fact that it is unacceptable to not let a piece of code run on hardware based solely on a vendor ID check.
I helped code a few engines for small DirectX 9 games, and let me assure you that a developer is never supposed to base a feature check on just a hardware vendor ID check.
That is just stupidity.
The one and only reason a hardware vendor ID check would be used is when you want certain features to work only on hardware from company X.
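To put that in concrete terms: the correct check asks what the adapter can actually do, while a vendor-ID check only asks who made it. Here's a rough D3D9 sketch of the difference (my own illustration, NOT code from the game; the back-buffer format and sample count are just placeholder assumptions):Code:
#include <d3d9.h>

// The proper way: ask the runtime whether the adapter can do 4x MSAA
// on the back-buffer format you intend to use.
bool CanUse4xMsaa(IDirect3D9* d3d)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8, FALSE /* fullscreen */,
        D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}

// The complained-about way: expose the feature only when the VendorId
// matches one IHV (0x10DE = nVidia), no matter what the card can actually do.
bool CanUse4xMsaaVendorLocked(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;
    return id.VendorId == 0x10DE; // ATI (0x1002) gets rejected even if capable
}
The first function keeps working on any vendor's hardware that reports the capability; the second is exactly the kind of gate people are objecting to here.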
If Nvidia helped this developer get AA to work, that is very nice of Nvidia. This does not give either Nvidia or the developer the right to make it NOT work on different hardware. That is just screwing over your customer base.
So fugger, explain to me: what if ATI started spending more effort and money on developer relations and also resorted to nasty tricks like this? A world in which every game is crippled on either Nvidia or ATI hardware. Does that sound ideal to you?
As pointed out by another, it's AMD, not ATI. :up:Quote:
AMD prides itself on supporting open standards and our goal is to advance PC gaming regardless whether people purchase our products.
Quote:
Batman: Arkham Asylum
In this game, Nvidia has an in-game option for AA, whereas gamers using ATI Graphics Cards... *DON'T!!!*** :shakes::down::down::down::mad: ...are required to force AA on in the Catalyst Control Center.
So it works, but only Nvidia boys get this option in game. :shrug:
Nice way to decrease ATI consumers' performance, Nvidia.Quote:
The advantage of in-game AA is that the engine can run AA selectively on scenes whereas Forced AA in CCC is required to use brute force to apply AA on every scene and object, requiring much more work.
This is the golden quote, though; read it carefully and fully appreciate what has been done here, in particular to the innocent ATI users.Quote:
Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo. By tricking the application, we were able to get the in-game AA option, where our performance was significantly enhanced. This option is not available for the retail game as there is SecuROM.
Anybody who defends these business practices is ANTI-CONSUMER. <--- full stop
I logged into this account because correcting mods is fun, and there really are parts of what you said that are skewed beyond proportion, and then there are the uhh... challenged people who will just QFT you without even reading closely.
1-1. Wrong. MS took FP24 only because nVidia refused to license chip IP to Microsoft, thus forcing MS to BUY chips for the Xbox from nVidia, while costs stayed roughly the same and no die shrink could be done, aka no Xbox Slim ever. This was also partly why MS was in the red so badly on the original Xbox. Part of why nVidia didn't catch wind of it was miscommunication, AND MS taking a deserved revenge stance.
1-2. FP16 was NEVER on the board for DX9; it was only used for nVidia's proprietary Cg shading language. Thus developers could not target FP16 if they wanted to use Microsoft's HLSL. You're suggesting Valve waste more time compiling and unifying for a proprietary language for a new series of cards that weren't particularly going anywhere. Wow.
1-3. Do you seriously think Valve would spend time on an architecture that's so unbalanced in the first place just to make it slightly less of an epic fail? ATI or not, they would NOT have used a myriad of shader languages and tried to keep artistic cohesion. Would FP16 just make the nVidia cards fly anyway? There's little point in doing so compared to the DX8.1 path. Just when you thought the days of Glide vs. whatever were over, you expect them to do this BS?
2-1. Switching HL2 to ATI's codepath put the GeForce FX at FP32. Again, there is NO official FP16 codepath at all. The FXes running at FP32 were epic fails.
2-2. Who cares? nVidia, of course. If you didn't get the memo, illegal shader code swapping was the rageeeee.
So nVidia began swapping.
And swapping.
And swapping.
Oh, just in 3DMark03, by the way. The Nature test. They didn't have the balls to approach Valve and ask whether it was possible for THEM to code the Cg path with partial precision (like ATI did with whatever DX10.1 game: they analyze the code and give suggestions); they just kept silent. And kept swapping for benchmark scores. Cliff's notes version: cheaters.
2-3. MS chose FP24 for a reason. nVidia still recommended FP32 be used for a reason. I think I saw lots of texture color banding and such, although the game was playable. Claiming that you just need to code some alien shader code separately... is that really justified?
Now that ATI cards have every DirectX spec-compliant feature (and more, of course, with 11), there is no reason for such AA bull**** to happen. To try to justify the GeForce FX HL2 fiasco with this, AND sympathize with nVidia, is unbelievable (and of course, hilarious on my side. :rofl:)
3-1. Good God. Don't you know you're even hitting a RAM capacity bottleneck? Most non-CryEngine 2 engines use MORE vRAM under DX10; you're running at 1080p with 4xAA. And you call it an engine problem.
:rofl:
I wouldn't have an issue if you didn't act like you were speaking the ultimate truth. And to think that people would QFT you on this.
Oh, speaking of GT300, I presume that when the reviews come, the review thread itself will stay in the news section, eh? :rolleyes:
p/s: On a less hostile note, Catalyst 9.9 seems to have fixed HDTV modes. I can't claim 100% accuracy as I don't connect to one, but I think I remember some positive chatter on that over a supposedly bad driver release.
Your problems with HDTVs and ATI cards are not what this topic is about; it could be a specific ATI-card-to-HDTV-model compatibility problem, which does happen even with monitors.
I have no problem with my ATI card on my HDTV
Gears of War is a dog to run in DX10, and that's still not my point. As I have already said, there is a problem, but paying for AA support is not what the future should be, because it was not like that in the past.
Not the same thing, for a simple reason: why does HL2 use FP24/FP32? Because the minimum spec for DX9.0 is FP24! The FX series could only do FP16/FP32; that's Nvidia's fault! Valve used FP32 for HL2 with the FX series because the DX9.0 spec commands it and the FX series can't do FP24. End of rewriting history.
A quick Google search will verify what I speak of with the HDTV issue... Go ahead, fire up Stalker: Clear Sky or Gears of War PC on your ATi hooked to an HDTV via HDMI, and have your TV tell you what it's running at when you select 1080... Bet it says 1080i. Tested on 2 HDTVs, with 2 separate ATi cards, 2 different HDMI adapters, and Cat 9.9...
I know how to fix it, but I need a few ATi driver gurus to help, I have a thread going in the ATi section.
Either way, it's off topic; I'm just pointing out that ATi have more important issues to worry about, and Gears of War was pointed out because it's also a UE3 title with issues on ATi cards when it comes to running AA.
Also, half precision was allowed with DX9. Of course, who cares, the FX cards sucked anyway. The truth is, though, ATi have done the same as NVidia in said situation. It literally would've only taken 2 seconds to set the FX series to half precision, with no loss in quality.
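For what it's worth, here's roughly what "set the FX series to half precision" could look like with the stock DX9 tooling; D3DX's compiler has a flag that forces partial-precision (FP16) ops, and HLSL's half type does the same per-variable. This is a hedged sketch only, not Valve's actual code; the source file name and entry point are made up:Code:
#include <d3dx9.h>

// Compile a ps_2_0 pixel shader, optionally forcing partial precision.
// D3DXSHADER_PARTIALPRECISION makes the compiler emit FP16 (_pp) ops
// where the hardware allows it; without the flag you get full precision.
LPD3DXBUFFER CompilePixelShader(bool partialPrecision)
{
    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;
    DWORD flags = partialPrecision ? D3DXSHADER_PARTIALPRECISION : 0;

    D3DXCompileShaderFromFile(
        "water.psh",   // hypothetical HLSL source file
        NULL, NULL,    // no macros, no include handler
        "main",        // hypothetical entry point
        "ps_2_0",      // shader model 2.0 profile
        flags,
        &code, &errors, NULL);

    if (errors) errors->Release();
    return code;       // compiled bytecode, or NULL on failure
}
Whether the image quality actually holds up at FP16 is a separate argument, but the mechanism itself was sitting right there in the standard toolchain.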
Sure, a winter 2006 title is a much more important issue to fix than a winter 2009 title :rolleyes:
Lol! Your argument was HL2, and it's proven wrong. HL2 just respects the DX9.0 spec, and anyway you can use FP16 in HL2 via the command line for both ATI and Nvidia cards. What's the command line for using AA the same way Nvidia runs it in Batman:AA?
Sure the quality is the same? 2 secs of Google search! :yepp:
http://www.neowin.net/forum/index.ph...&pid=585020109
It happens in Stalker: Clear Sky as well, and other titles I'm sure. I haven't bought too many games as of late because most haven't been worth playing, period, so I can't test more, but it's not just ONE game. It even happens to some people when watching Blu-ray movies!!! Read up on the ATi under/overscan issue with HDTVs; it's caused by the video card reverting to 1080i. There are threads all over AMD's actual forum about the issue.
This has been an ongoing issue with ATi cards for quite some time. I've even told them how to fix it, yet they ignore it altogether.
Maybe, maybe not. See, I didn't use an FX card (I had a 9800 Pro in those days). I do remember the huge talk at Guru3D, and people there claiming no quality loss. If that's not the case then my mistake; obviously people on forums didn't know what they were talking about.Quote:
Lol! Your argument was HL2, and it's proven wrong. HL2 just respects the DX9.0 spec, and anyway you can use FP16 in HL2 via the command line for both ATI and Nvidia cards. What's the command line for using AA the same way Nvidia runs it in Batman:AA?
Sure the quality is the same? 2 secs of Google search! :yepp:
http://www.neowin.net/forum/index.ph...&pid=585020109
So, what's the last word on Macadamia vs. Diltech? I'm curious to know who is more right.