Wow, I would swear everyone is joking ("ATI wouldn't do this", "Nvidia did this") but it sounds like everyone is serious.... wow. Just wow.
There are so many funny reactions here!
I don't see any reason to raise a stink until we can compare the performance once this "bug" is fixed. If frame rates drop substantially at that point, then I would say ATI deserves a good slapping.
Perhaps someone could try renaming the EXE?
System: Core I7 920 @ 4200MHz 1.45vCORE 1.35VTT 1.2vIOH // EVGA x58 Classified E760 // 6GB Dominator GT 1866 @ 1688 6-7-6-18 1T 1.65V // Intel X25 80GB // PCP&C 750W Silencer
Cooling: Heatkiller 3.0 LT CPU block // 655 Pump // GTX360 Radiator
Sound: X-FI Titanium HD --> Marantz 2265 --> JBL 4311WXA's
Display: GTX480 // Sony GDM-FW900
Complete bullbanana...
The pictures have different contrast. Open ATI's in Photoshop and add contrast to equalize it, and uh-oh... the Nvidia picture has less detail but hides it with increased contrast!?
So, who is the cheater?
And also, who prepared the pictures?
Windows 8.1
Asus M4A87TD EVO + Phenom II X6 1055T @ 3900MHz + HD3850
APUs
The "brilinear" optimization isn't a new thing at all.
It has been used by both AMD and nVIDIA since the Radeon 8xxx series and the RivaTNT series.
The only difference between the camps at the moment is that nVIDIA lets you turn it off.
AMD, on the other hand, doesn't let you turn it off on its own: you can disable Catalyst AI to disable the "brilinear" optimization, but that disables other stuff too (some of it is good, raising performance without noticeably harming IQ).
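For anyone who hasn't run into the term before, here's a minimal sketch of the general idea (my own illustration in plain C, not vendor code; the 0.25 band width is an arbitrary number picked for the example). Full trilinear blends the two nearest mip levels across the whole fractional LOD range, while a "brilinear" scheme snaps to a single mip level outside a narrow band, so the second set of texture fetches can be skipped most of the time:

```c
/* A minimal sketch (not vendor code) of how a "brilinear" optimization
 * narrows the trilinear mip-blend band. The band width is a made-up
 * tunable; real drivers pick their own thresholds. */
#include <stdio.h>

/* Full trilinear: the blend weight is simply the fractional LOD. */
static float trilinear_weight(float lod_frac)
{
    return lod_frac;
}

/* "Brilinear": outside a narrow window around 0.5 the weight is snapped
 * to 0 or 1, so only one mip level is sampled (plain bilinear) and the
 * second set of texel fetches can be skipped. Blending only happens
 * inside the window, which is remapped to the full 0..1 range. */
static float brilinear_weight(float lod_frac, float band /* e.g. 0.25 */)
{
    float lo = 0.5f - band * 0.5f;
    float hi = 0.5f + band * 0.5f;

    if (lod_frac <= lo) return 0.0f;      /* lower mip only   */
    if (lod_frac >= hi) return 1.0f;      /* upper mip only   */
    return (lod_frac - lo) / (hi - lo);   /* narrow blend band */
}

int main(void)
{
    for (float f = 0.0f; f <= 1.001f; f += 0.125f)
        printf("lod_frac %.3f  trilinear %.3f  brilinear %.3f\n",
               f, trilinear_weight(f), brilinear_weight(f, 0.25f));
    return 0;
}
```

The printout makes it easy to see how much of the LOD range collapses to pure bilinear sampling.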
Coding 24/7... Limited forums/PMs time.
-Justice isn't blind, Justice is ashamed.
Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P.), Juan J. Guerrero
Convenient? How is it convenient? A driver glitch that potentially lowers IQ without any evidence of enhancing performance? Sounds more like a PR disaster.
And since when was timing irrefutable proof that something nefarious is going on?
How about we look at this logically and objectively, for a change?
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
What happened to that perfect IQ we saw in the charts from August? The ones with the colored circles that would be square or star shaped if the AF was weaker. Has IQ really changed at all between the 4xxx and 5xxx?
I think this is being blown out of proportion, and I also seem to remember Nvidia doing something exactly like this in the original Crysis.
I did notice last night in BC2, though, that my AF looked strange. If it's a bug, hopefully it's fixed and the increased performance stays.
Main: 900D - Prime 1000T - Asus Crosshair VI Extreme - R7 1700X @ 4.0ghz - RX Vega 64? - 32GB DDR4 3466 - 1TB 960 Pro -
--- XSPC AX360 x3 - HK IV Pro - HK RX480 - HK 200 D5 - BP Compression ---
HTPC: 250D - Prime 850T - Gigabyte G1 ITX - i7 6700K @ 4.5ghz - GTX 1080 Ti - 16GB 3200 - 1TB 960 Pro -
--- ST30 x UT60 - Kyros HF - KryoGraphics 1080 - HK100 DDC - Monsoon Compression ---
HV01: Define XL R2 - Prime 1200P - Asus Zenith Extreme - TR 1950X - RX580CF - 128GB DDR4 ECC - 512GB 960P - 4x 2TB RE
HV02: Node 804 - Prime 850T - SuperMicro X1SSH - E3-1230 v6 - Vega FE - 64GB ECC - 512GB 960 Pro - 4x 6TB Gold -
The pictures are tagged with:
HD 5870, A.I. Standard
GTX 480, Quality + Tri-Opt.
GTX 285, Quality + Tri-Opt.
I think a key point in this is what settings were set in Crysis. I don't know about the Nvidia cards + Tri-Opt, but the A.I. Standard setting is in CCC.
He mentions this (Google translated): "According to our benchmarks, a Radeon HD 5870 loses only about five percent of its fps when '16:1' AF is selected instead of linear filtering." So I assume filtering is set to max in Crysis Warhead?
Is it the same in all modes of Catalyst AI?
cpu: AMD Phenom II 1090T
motherboard: MSI 890FXA-GD70
ram: A-DATA 4GB(2 x 2GB) DDR3 1333
videocard: Sapphire Radeon HD6870
powersupply: Silverstone DA1000 1000W
It's a bug. I don't know why AMD would resort to cheating when their hardware is clearly capable enough. Besides, the days of old-fashioned cheating are long over imo...
Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
Super silent cooling powered by (((Noiseblocker)))
It requires less processing?
http://www.xtremesystems.org/forums/...1&postcount=42
i7 920 D0 / Asus Rampage II Gene / PNY GTX480 / 3x 2GB Mushkin Redline DDR3 1600 / WD RE3 1TB / Corsair HX650 / Windows 7 64-bit
From the same article:
ATI disables FP16 rendering when Catalyst A.I. is enabled.
During preparations for the launch of the GeForce GTX 480/470, Nvidia informed us that they had discovered irregularities in AMD's Catalyst drivers in the following games: Dawn of War 2, Empire: Total War, Far Cry, Need for Speed: Shift, TES IV: Oblivion and Serious Sam 2. More precisely, AMD's drivers are said to use the "Catalyst A.I." feature to replace FP16 render targets in those games with ones in R11G11B10 format, potentially increasing performance at the cost of image quality. According to Nvidia, this amounts to an unfair competitive advantage. We can understand that assessment and have therefore followed up on the topic.
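For context on why such a substitution would help at all: an FP16 RGBA render target is 64 bits per pixel, while R11G11B10F packs three floats into 32 bits, so every read or write of the target moves half the data, at the cost of dropping the alpha channel and some mantissa precision. A rough back-of-the-envelope sketch (resolution picked arbitrarily for illustration, not taken from the article):

```c
/* A back-of-the-envelope illustration (my own, not the driver's code)
 * of the bandwidth saved by swapping an FP16 render target for an
 * R11G11B10 one: half the bytes per pixel, minus the alpha channel
 * and some precision. */
#include <stdio.h>

int main(void)
{
    const int width  = 1920;
    const int height = 1200;           /* example resolution */

    const int bpp_fp16_rgba  = 8;      /* 4 channels x 16-bit float  */
    const int bpp_r11g11b10f = 4;      /* packed 11+11+10-bit floats */

    double mib_fp16 = (double)width * height * bpp_fp16_rgba  / (1024.0 * 1024.0);
    double mib_pack = (double)width * height * bpp_r11g11b10f / (1024.0 * 1024.0);

    printf("FP16 RGBA target:  %.1f MiB per surface\n", mib_fp16);
    printf("R11G11B10F target: %.1f MiB per surface\n", mib_pack);
    printf("Savings per read/write pass: %.0f%%\n",
           100.0 * (1.0 - mib_pack / mib_fp16));
    return 0;
}
```

Whether that translates into a measurable fps gain obviously depends on how bandwidth-bound the HDR passes in a given game are.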
That's not proof. That's an explanation of why it might improve performance, but without actual data, we don't know if this is actually the case.
This might simply be an issue with Catalyst AI, as cirons pointed out, in which case the optimization supposedly gains them 5% but can be disabled. It might not even be a bug, then. If that's all there is to this, it makes this thread all the more stupid.
Why are we making so much fuss over a driver bug in one game? Even setting the bug part aside, do I need a magnifying glass to notice a difference?
Besides that, I remember Nvidia asking in their "reviewer's guide" to turn off Catalyst AI, and it seems PCGH is happy to follow. Weak and biased...
INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"
Has anyone really been far even as decided to use even go want to do look more like?
Did you realize you are talking to someone with an NVIDIA 8800GTS in their sig? You are no better than the people you are whining about. Besides, nvidia deserves every dig they get right now.
Anyways... It looks like an unintentional bug to me, but needs to be fixed soon if it is.
No, but I'm rather sure you don't fully understand the term "unconfirmed driver bug" in 10.3a, which doesn't even have WHQL status, or in 10.3.
Do I need to remind you of the last Nvidia drivers that grilled cards, or the ones before that that disabled OC? I'm not even going into the various game bugs present on both sides. So again, why is there so much fuss about this?
The timing of the driver release is suspicious, but it's not enough to prove anything. And proving intent isn't easy either.
Either way, whether it was intentional or just an accident, it is a bug that needs to be fixed in future drivers. Lowering IQ is not acceptable.
I don't really understand your post. I said it's convenient that this bug appears right now; I never said that is proof of ATI doing it on purpose.
Further, as I said in my post, we still don't know how much extra performance this bug gives.
Seeing as this bug forces less processing, it would only be logical to assume that it increases performance.
Time flies like an arrow. Fruit flies like a banana.
Groucho Marx
i know my grammar sux so stop hitting me
ATI/AMD did promise us that 10.3 would have the performance gains, and the fact that they let us have 10.3a while we had to wait an extra week for the official release shows they were working hard to get everything in. It's obviously to be expected that there would be a bug or two.