Wow, I would swear everyone is joking ("ATI wouldn't do this", "Nvidia did this") but it sounds like everyone is serious.... wow. Just wow.
There are so many funny reactions here :clap:
I don't see any reason to raise a stink until we can compare the performance once this "bug" is fixed. If at that point frame rates drop substantially, then I'd say ATI deserves a good slapping.
Perhaps someone could try renaming the EXE?
Complete bullbanana...
The pictures have different contrast. Open ATI's in Photoshop and add contrast to equalize them, and uh-oh... the Nvidia picture has less detail but hides it behind increased contrast!?
So, who is the cheater?
And also, who prepared the pictures?
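For anyone who wants to check that without eyeballing it in Photoshop, here's a rough sketch (filenames are made up) that normalizes the contrast of both screenshots with Pillow and then diffs them, so a contrast difference isn't mistaken for missing texture detail:
Code:
from PIL import Image, ImageChops, ImageOps

# Hypothetical filenames for the two screenshots from the article.
ati = ImageOps.autocontrast(Image.open("hd5870_af.png").convert("L"))
nv  = ImageOps.autocontrast(Image.open("gtx480_af.png").convert("L"))

# After equalizing contrast, bright areas in the diff mark where the
# filtering output actually differs, not just the gamma/contrast.
diff = ImageChops.difference(ati, nv)
diff.save("af_diff.png")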
The "brilinear" optimization isn't a new thing at all.
It has been used by both AMD & nVIDIA since the Radeon 8xxx series and the RivaTNT series :D
The only difference between the camps at the moment is that nVIDIA lets you turn it off.
AMD, on the other hand, doesn't let you turn it off on its own: you can disable Catalyst AI to kill the "brilinear" optimization, but that disables other stuff too (some of it good, in that it doesn't noticeably harm IQ yet raises performance).
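For anyone unfamiliar with the term, here's a minimal sketch of the idea behind "brilinear" (the blend-band width is a made-up example, not any vendor's actual value): full trilinear filtering blends between the two nearest mip levels across the whole fractional LOD range, while the optimization only blends in a narrow band around the mip transition and samples a single mip (plain bilinear) everywhere else, saving texture fetches at the cost of possible mip banding.
Code:
def trilinear_blend(lod_frac):
    # Full trilinear: the blend weight between mip N and mip N+1
    # follows the fractional LOD directly over its whole [0, 1] range.
    return lod_frac

def brilinear_blend(lod_frac, band=0.25):
    # "Brilinear": only blend inside a narrow band around the mip
    # transition; outside it, sample a single mip level (bilinear only),
    # which skips half the texture fetches for most pixels.
    lower, upper = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lower:
        return 0.0   # pure mip N
    if lod_frac >= upper:
        return 1.0   # pure mip N+1
    return (lod_frac - lower) / (upper - lower)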
Convenient? How is it convenient? A driver glitch that potentially lowers IQ without any evidence of enhancing performance? Sounds more like a PR disaster.
And since when was timing irrefutable proof that something nefarious is going on?
How about we look at this logically and objectively, for a change?
What happened to that perfect IQ we saw in the charts from August, the ones with the colored circles that would look square or star-shaped if the AF were weaker? Has IQ really changed at all between the 4xxx and 5xxx?
I think this is being blown out of proportion and I also seem to remember nvidia doing something exactly like this in the original crysis.
I did notice last night though in BC2 that my AF looked strange. If it's a bug hopefully it's fixed and the increased performance stays.
The pictures are tagged with:
HD 5870, A.I. Standard
GTX 480, Quality + Tri-Opt.
GTX 285, Quality + Tri-Opt.
I think a key point in this is what settings were used in Crysis. I don't know about the Nvidia cards + Tri-Opt, but the A.I. Standard setting is in CCC.
He mentions this (Google translated): "According to our benchmarks, a Radeon HD 5870 loses only about five percent of its fps when '16:1' AF is used in place of linear filtering." So I assume filtering is set to max in Crysis Warhead?
Is it the same in all modes of Catalyst AI?
It's a bug. I don't know why AMD would resort to cheating when their hardware is clearly capable enough. Besides, the days of old-fashioned cheating are long over, imo...
It requires less processing?
http://www.xtremesystems.org/forums/...1&postcount=42
From the same article:
ATI disables FP16 rendering when Catalyst A.I. is enabled. Quote:
During the preparations for the GeForce GTX 480/470 launch, Nvidia informed us that they had discovered irregularities in AMD's Catalyst drivers in the following games: Dawn of War 2, Empire: Total War, Far Cry, Need for Speed: Shift, TES IV: Oblivion and Serious Sam 2. More precisely, AMD's drivers are said to use the "Catalyst A.I." function to replace FP16 render targets in the games mentioned with ones in R11G11B10 format, thereby potentially increasing performance at the expense of image quality. According to Nvidia, this amounts to an unfair competitive advantage. We can understand that assessment, which is why we followed up on the topic.
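To put a rough number on why that substitution helps (back-of-the-envelope only, not from the article): an FP16 RGBA render target stores 8 bytes per pixel, while R11G11B10 packs three float channels into 4 bytes, so every read and write of that target moves half the data. The resolution below is just an example.
Code:
# Rough render-target size comparison.
width, height = 1920, 1200

fp16_bytes      = width * height * 8   # RGBA16F: 4 channels x 16 bit
r11g11b10_bytes = width * height * 4   # R11G11B10F: packed into 32 bit

print(f"FP16 target:      {fp16_bytes / 2**20:.1f} MiB")
print(f"R11G11B10 target: {r11g11b10_bytes / 2**20:.1f} MiB")
# Halving the bytes per pixel halves the bandwidth spent on that
# target, at the cost of losing the alpha channel and some precision.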
That's not proof. That's an explanation of why it might improve performance, but without actual data, we don't know if this is actually the case.
This might simply be an issue with Catalyst AI, as cirons pointed out, in which case the optimization supposedly gains them 5% but can be disabled. It might not even be a bug, then. If that's all there is to this, it makes this thread all the more stupid.
Why are we making so much fuss over a driver bug in one game? Even setting the bug part aside, do I need a magnifying glass to notice a difference?
Besides that, I remember Nvidia asking in their "reviewer's guide" to turn off Catalyst AI, and it seems PCGH is happy to follow along. Weak and biased...
Not cool if it's not a bug. BTW, 24x AA is kicking 32x AA's ass badly, according to the same article.
Did you realize you are talking to someone with an NVIDIA 8800GTS in their sig? You are no better than the people you are whining about. Besides, nvidia deserves every dig they get right now. :rolleyes:
Anyways... It looks like an unintentional bug to me, but needs to be fixed soon if it is.
No, but I'm rather sure you don't fully understand the term "unconfirmed driver bug" in 10.3a, which doesn't even have WHQL status, or in 10.3.
Do I need to remind you of the last Nvidia drivers that grilled cards, or the ones before that which disabled OC? I'm not even going into the various game bugs present on both sides. So again, why is there so much fuss about this?
The timing of the driver release is suspicious, but it's not enough to prove anything. And proving intent isn't easy either.
Either way, if it was intentional or just an accident, it is a bug that needs to be fixed in future drivers. Lowering IQ is not acceptable.
I don't really understand your post. I said it's convenient that this bug appears right now; I never said that that is proof of ATI doing it on purpose.
Further, as I said in my post, we still don't know how much extra performance this bug gives.
Seeing as this bug means less processing, it would only be logical to assume that it increases performance.
ATI/AMD did promise us that 10.3 would have the performance gains, and the fact that they let us have 10.3a and we had to wait an extra week for the official ones shows they were working hard to get everything in. It's obviously to be expected that there would be a bug or two.