I'm not disputing that fact.
The fact is that AA has been common for years, and if it has now become a problem and needs paying for its implementation, then the developers have taken a leap backwards. It's another thorn in the side of PC gaming versus consoles, and it's not like the PC needs any help with gamers heading in that direction.
i7 920 @ 4.2GHz
Asus P6T6 Revolution
3x GTX260
6x2GB Corsair DDR3-1600
G.Skill Falcon 128GB SSD
G.Skill Titan 128GB SSD
Seagate 7200.11 1.5TB
Vista 64 Ultimate
Actually Vienna... The HL2 path for ATi cards ran at a different precision than the HL2 path on NVidia cards.
You see, it's a little-known fact, but during the finalization of the DX9 specification NVidia was fighting for FP-32 to be the standard while ATi was fighting for FP-24. FP-32 required 33% more power than FP-24 but, combined with the rest of the specification, didn't bring enough to the table for Microsoft to go for it, and so Microsoft took FP-24. That was the first huge strike against the FX series, which naturally used 33% more horsepower to do the same effects.
Switching the FX series to the ATi code path took it out of FP-32 mode and put it into FP-16, which made it run DX9 100% fine in HL2, as no effect in HL2 required more than FP-16. It was pointed out to Valve on countless occasions before release, and again after release... They never did anything.
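As a rough sanity check of that 33% figure, here is a toy calculation that simply assumes shader cost scales linearly with per-component bit width (a big simplification of real ALU behaviour):

[CODE]
# Toy comparison of per-component shader precision widths. Assumes cost scales
# linearly with bit width, which is a simplification of real hardware.
formats = {"FP16": 16, "FP24": 24, "FP32": 32}
baseline = formats["FP24"]  # the precision DX9 settled on

for name, bits in formats.items():
    overhead = (bits / baseline - 1) * 100
    print(f"{name}: {bits} bits per component, {overhead:+.0f}% vs FP24")

# FP32 comes out to +33% over FP24, matching the figure quoted above.
[/CODE]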
Same thing you're all complaining about now.
Well, about that whole AA thing...
Most UE3 titles have problems with AA. I've been playing back through Gears of War PC on my 4850, and turning on DX10 + AA results in the frame rate dipping below 20 fps, sometimes even further. With my old 8800GTX it stayed above 50, always, at 1920x1080p... well, 1920x1080p on the NVidia; ATI still hasn't fixed their HDTV support, and it's been YEARS, so it reverts back to 1080i in quite a few games.
Basically, I'm just saying, until we know all the facts here it's just random flaming and speculation. It may truly have issues in certain parts of the game on ATi hardware.
As pointed out by another, it's AMD, not ATI. "AMD prides itself on supporting open standards and our goal is to advance PC gaming regardless of whether people purchase our products."
DON'T!!!
"Batman: Arkham Asylum: In this game, Nvidia has an in-game option for AA, whereas gamers using ATI Graphics Cards are required to force AA on in the Catalyst Control Center."
So it works, but only the Nvidia boys get the option in-game.
Nice way to decrease ATI consumers' performance, Nvidia.
"The advantage of in-game AA is that the engine can run AA selectively on scenes, whereas forced AA in the CCC has to brute-force AA onto every scene and object, requiring much more work."
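A rough way to see why that matters: the toy cost model below uses made-up pass costs (not measurements) to show how selective, engine-controlled AA can skip passes that don't benefit, while driver-forced AA multisamples everything.

[CODE]
# Toy cost model with made-up numbers: in-game AA can skip passes that don't
# benefit from multisampling (post-processing, UI), while driver-forced AA
# multisamples every pass it can reach.
passes = {            # pass name: (base cost, benefits from MSAA?)
    "geometry":   (10.0, True),
    "post_fx":    (6.0,  False),
    "ui_overlay": (1.0,  False),
}
MSAA_FACTOR = 1.8     # assumed relative cost of rendering a pass with 4xAA

def frame_cost(force_all):
    total = 0.0
    for cost, wants_aa in passes.values():
        aa_applied = force_all or wants_aa
        total += cost * (MSAA_FACTOR if aa_applied else 1.0)
    return total

print("in-game (selective) AA:", frame_cost(force_all=False))  # 25.0
print("driver-forced AA:      ", frame_cost(force_all=True))   # 30.6
[/CODE]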
This is the golden quote though; read it carefully and fully appreciate what has been done here, in particular to the innocent ATI users.
"Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the IDs of ATI graphics cards in the Batman demo. By tricking the application, we were able to get the in-game AA option, where our performance was significantly enhanced. This option is not available for the retail game as there is a secure rom."
Anybody who defends these business practices is ANTI-CONSUMER. <--- full stop
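For context on what "changing the IDs" means: games can read the GPU's PCI vendor ID (0x10DE for Nvidia, 0x1002 for ATI/AMD) and branch on it. The sketch below is only an illustration of that kind of gating, not the game's actual code, and get_gpu_vendor_id() is a hypothetical helper.

[CODE]
# Illustrative sketch of vendor-ID gating; not taken from the game's code.
VENDOR_NVIDIA  = 0x10DE   # real PCI vendor ID for Nvidia
VENDOR_AMD_ATI = 0x1002   # real PCI vendor ID for ATI/AMD

def get_gpu_vendor_id():
    """Hypothetical helper; a real engine would query this via the graphics API."""
    return VENDOR_AMD_ATI  # pretend an ATI card is installed

def build_video_options(vendor_id):
    options = ["Resolution", "V-Sync", "Detail Level"]
    if vendor_id == VENDOR_NVIDIA:
        # The behaviour described above: the AA toggle only appears on Nvidia.
        options.append("Anti-Aliasing")
    return options

# Spoofing the reported vendor ID is what brings the menu entry back:
print(build_video_options(get_gpu_vendor_id()))   # no AA option
print(build_video_options(VENDOR_NVIDIA))         # AA option appears
[/CODE]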
CoolerMaster Stacker 830SE|Antec Signature 850W|Gigabyte X58A-UD5 F5 slic2.1
Intel Core i7 930 16x200@3,200MHz|vcore 1.14|Intel Stock CPU Cooler
GSKILL DDR3 Perfect Storm 2000 @6-6-6-16-1T 1600MHz|ATI 5870 1024MB 850/1200
Windows 7 Ultimate x64 bootdisk: Crucial RealSSD-C300 128GB SATA-III
I logged in to this account because correcting mods is fun, and there really are parts of what you said that are skewed beyond proportion, and then there are the, uhh... challenged people who will just QFT you without even reading carefully.
1-1. Wrong. MS took FP24 only because nVidia refused to license chip IP to Microsoft, forcing MS to BUY chips for the Xbox from nVidia, while costs stayed roughly the same and no die shrink could be done, aka no Xbox Slim ever. That was also partly why MS was so badly in the red on the original Xbox. Part of why nVidia didn't catch wind of it was miscommunication, AND MS taking a well-deserved revenge stance.
1-2. FP16 was NEVER on the board for DX9; it was only used in nVidia's proprietary Cg shading language. Thus developers could not target FP16 if they wanted to use Microsoft's HLSL. You're suggesting Valve should waste more time compiling and unifying for a proprietary language for a new series of cards that weren't particularly going anywhere. Wow.
1-3. Do you seriously think Valve would spend time on an architecture that was so unbalanced in the first place just to make it slightly less of an epic fail? ATI or not, they would NOT have juggled a myriad of shader languages and still tried to keep artistic cohesion. Would FP16 have made the nVidia cards fly anyway? There's little point in doing so compared to the DX8.1 path. Just when you thought the days of Glide vs. whatever were over, you expect them to pull this BS?
2-1. Switching HL2 to ATI's codepath put the GeForce FX at FP32. Again, there was NO official FP16 codepath at all. The FXes running at FP32 were epic fails.
2-2. Who cares? nVidia, of course. If you didn't get the memo, illegal shader code swapping was the rageeeee.
So nVidia began swapping.
And swapping.
And swapping.
Oh, just in 3DMark03, by the way. The nature test. They didn't have the balls to approach Valve and ask whether it was possible for THEM to code the Cg path with partial precision (like ATI does with whatever DX10.1 game: they analyze the code and give suggestions); they just kept silent. And kept swapping for benchmark scores. Cliff's Notes version: cheaters.
2-3. MS chose FP24 for a reason. nVidia still recommended FP32 be used for a reason. I think I saw lots of texture color banding and such, although the game was playable. Claiming that you just need to code some alien shader code separately: is that really a justification?
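To illustrate why dropping shader precision can show up as banding, here is a toy numpy demo. It uses IEEE float16 purely as a stand-in; the DX9-era FP16/FP24 hardware formats differ in detail, so treat it as indicative only.

[CODE]
import numpy as np

# Toy demo: accumulate many tiny lighting contributions in 16-bit vs 32-bit
# floats. Small steps are absorbed at low precision, which on screen shows up
# as flat patches and banding instead of smooth gradients.
base = 1.0
contribution = 5e-5    # e.g. a faint light added over many passes
steps = 2000

acc16 = np.float16(base)
acc32 = np.float32(base)
for _ in range(steps):
    acc16 = np.float16(acc16 + np.float16(contribution))
    acc32 = np.float32(acc32 + np.float32(contribution))

print("float32 result:", float(acc32))  # ~1.1, as expected
print("float16 result:", float(acc16))  # stays at 1.0: every step is rounded away
[/CODE]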
Now that ATI cards have every DirectX spec-compliant feature (and more, of course, with 11), there is no reason for such AA bull**** to happen. Trying to justify the GeForce FX HL2 fiasco with this, AND sympathizing with nVidia, is incredible (and of course hilarious from where I sit).
3-1. Good God. Don't you realize you're also hitting a VRAM capacity bottleneck? Most non-CryEngine 2 engines use MORE VRAM under DX10, you're running at 1080p, with 4xAA. And you call it an engine problem.
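Rough numbers for the main framebuffer alone at that resolution, a back-of-the-envelope estimate assuming 4xMSAA with RGBA8 color and a 32-bit depth/stencil buffer (a real UE3 frame allocates plenty of additional render targets on top of this):

[CODE]
# Back-of-the-envelope framebuffer memory at 1920x1080 with 4xMSAA.
# Assumes RGBA8 color (4 bytes/px) and D24S8 depth/stencil (4 bytes/px).
width, height, samples = 1920, 1080, 4
bpp_color, bpp_depth = 4, 4
MB = 1024 * 1024

color_msaa = width * height * samples * bpp_color / MB   # ~31.6 MB
depth_msaa = width * height * samples * bpp_depth / MB   # ~31.6 MB
resolve    = width * height * bpp_color / MB             # ~7.9 MB

total = color_msaa + depth_msaa + resolve
print(f"Total: {total:.1f} MB")   # ~71 MB before textures and the extra
                                  # buffers a DX10 path typically adds; a real
                                  # chunk of a typical 512 MB 4850's memory.
[/CODE]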
I wouldn't have an issue if you didn't act like you were speaking the ultimate truth. And to think that people would QFT you on this.
Oh, speaking of GT300, I presume that when the reviews come out, the review thread itself will stay in the news section, eh?
P.S.: On a less hostile note, Catalyst 9.9 seems to have fixed the HDTV modes. I can't claim 100% accuracy as I don't connect to one, but I think I remember some positive chatter about that for a supposedly bad driver release.
Your problems with HDTVs and ATI cards are not what this topic is about; it could be a compatibility problem between a specific ATI card and a specific HDTV model, which does happen even with monitors.
I have no problem with my ATI card on my HDTV.
Gears of War is a dog to run in DX10, and that's still not my point. As I have already said, there is a problem, but paying for AA support is not what the future should be, because it was not like that in the past.