From Catalyst 9.12 to 10.3 WHQL it's about 2 fps or less in Crysis: http://www.techpowerup.com/reviews/N..._Fermi/10.html
Is the bug still present in WHQL 10.3s?
You're making the very common mistake of replying to a thread that might be bad for ATI by saying "BUT NVIDIA DID THIS, THIS, AND THIS SOME TIME AGO!!"
I've been saying this all along. The logic seems to be: "This thread says something bad about ATI. So it must be saying something good about Nvidia, and for some reason I need to counter that, so if I say something bad about Nvidia it will cancel out whatever bad things might be said about ATI/AMD."
No. Saying something bad about Nvidia counters nothing related to ATI. Saying something bad about Nvidia isn't saying something good about ATI. Saying something bad about Nvidia is just that: saying something bad about Nvidia.
Why can't any of you fire up Crysis and grab some numbers/screenshots, then rename the exe and do it again?
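In case the procedure isn't obvious, a minimal sketch (C++17; the copy target name "NotCrysis.exe" is just an example, and a plain file-manager copy works just as well):

Code:
#include <filesystem>

// Copy the game binary under a neutral name so any per-application
// driver profile no longer matches, then benchmark both copies with
// identical settings. A quality or fps delta between the two points
// at app-specific driver behavior.
int main()
{
    namespace fs = std::filesystem;
    fs::copy_file("Crysis.exe", "NotCrysis.exe",
                  fs::copy_options::overwrite_existing);
    return 0;
}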
sigh..
I don't see the big deal with this. Can't you crank up the settings manually with some tools? It's probably not a bug; big games and benchmarks that are standard in reviews, like Vantage or Crysis, get the most attention from the driver teams. That's probably why Nvidia does well on them. If they throw in a couple of optimizations to make it run a little faster, who cares?
April Fools?! lol, j/k
But yeah, can't anyone here with a 5870 confirm this?? Just post one screenshot with AF and another without.
To be honest, I've felt like my 5870 has had weak AF in a number of games.
The responses in this thread crack me up. If the roles were reversed, I am sure the responses would be FAR different. Just shows how long so many here have been smoking the good stuff.
Also, the substitution of FP16 render targets by ATI has been confirmed in quite a few games. DoW 2, NFS: Shift and a few others are the most obvious candidates.
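For context on what that substitution means: the game asks for a 64-bit FP16 render target and a demoting driver hands back a cheaper 32-bit format instead. A minimal D3D11-style sketch of the two formats (illustrative only; the affected titles are DX9/DX10, and the exact substitute format is my assumption based on the usual FP16-to-R11G11B10 demotion story):

Code:
#include <d3d11.h>

D3D11_TEXTURE2D_DESC MakeHdrTargetDesc()
{
    // What the game requests: a full FP16 HDR render target, 64 bits/pixel.
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = 1920;
    desc.Height           = 1080;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R16G16B16A16_FLOAT; // FP16 per channel
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

    // What a demoting driver could quietly substitute: 32 bits/pixel, so
    // roughly half the bandwidth, but fewer mantissa bits and no alpha.
    // desc.Format = DXGI_FORMAT_R11G11B10_FLOAT;
    return desc;
}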
Why does it have to get so personal? This is a thread about a bug ATI has in their drivers; that's not something to get personal over.
I just hope they fix this as soon as they possibly can, although as soon as they can probably means Catalyst 10.5.
They ate my imaginary puppy :(
I have to concur. Granted, these optimizations are nothing next to some of the stunts pulled by XGI/Nvidia or ATI 6+ years ago, but if this were Nvidia rather than ATI, the forum would be lit up like a Christmas tree with the DAAMIT fanatics burning effigies of Nvidia.
I find this thread humorous, as well as all the complaints about "if it was NV...", so I went ahead and checked it for myself for the lulz. Between using the original exe and renaming it to lulz, I found no difference in the ground textures. :shrug: Perhaps it does stay crunchy in milk :eek:.
Edit:
It was never mentioned in the article that anyone was cheating, either, which makes this all the more funny.
Actually, it looks pretty much the same. The fanboy roles are reversed, but the arguments sound about the same - "the other guys did it too", "I'd rather have the extra performance", etc.
http://www.xtremesystems.org/forums/...d.php?t=171677
http://www.xtremesystems.org/forums/...d.php?t=164515
Why settle for less? :cool:

Quote:
Just shows how long so many here have been smoking the good stuff.
Well, a couple of years ago Nvidia was infinitely more popular than it is today, but it's an interesting comparison nonetheless. I can personally see texture noise in a lot of games that I couldn't with my 8800GTS, though I think it has more to do with this. If I'm able to enable supersampling, though, the textures appear perfect.
I thought I explained the difference between the filtering methods in my earlier post: bi/trilinear is considerably faster than AF! The fact that you're asking for proof that filtering can affect performance is kinda ridiculous.
Filtering can greatly affect performance, and most people wouldn't notice the difference between the various modes, nor do they know the difference between a texel and a pixel, so a driver can easily get away with dialing filtering down in some games for a perf boost. Like I said, it might account for the nice gains the new drivers bring.
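To make that concrete, here's roughly what the expensive and cheap paths look like at the API level (a minimal D3D9 sketch; the sampler index and anisotropy level are arbitrary examples):

Code:
#include <d3d9.h>

void SetTextureFiltering(IDirect3DDevice9* dev, bool fullAniso)
{
    if (fullAniso) {
        // What the user asked for: 16x anisotropic filtering, up to
        // 16 trilinear probes per pixel on steeply angled surfaces.
        dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 16);
    } else {
        // The cheap path: plain trilinear, a fixed 8 texel reads per
        // pixel. Most people won't spot it, but the GPU sure does.
        dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 1);
    }
    dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
}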
That AlienBabelTech article is ridiculous; the author has absolutely NO clue about anisotropic filtering or how the texturing pipeline works.
Soo... has anybody confirmed it? And then run a comparison bench between 10.2 and this 10.3 with the supposed cheat/bug? I would, but I have to go to sleep now; if nobody else does, I'll do a comparison tomorrow.
The difference between the two images that saaya posted is that the Nvidia card is rendering more detail, which in turn makes it work overtime and drags down performance. The ATI card's image looks much cleaner and better optimized even though it's rendering less detail.
Looking at other images, I'd say the GeForce card is better at rendering detail at far distances and the ATI card at close range.
ATI
http://www.pcgameshardware.de/screen...AA_HD_5870.png
http://www.pcgameshardware.de/screen...._Standard.png
Nvidia
http://www.pcgameshardware.de/screen...4x4_OGSSAA.png
http://www.pcgameshardware.de/screen...100_DX10_Q.png
The 4x4 ordered-grid SSAA in NV's output probably pushes the texture LOD scale way back, which additionally sharpens the surface detail. The same happens with ATI's 8x RGSS mode, but to a lesser extent (there the LOD scale is adjusted "manually" by the driver).
This makes it a rather unfair comparison.
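For anyone who wants the math behind that LOD shift: NxN ordered-grid supersampling gives N times the sample density per axis, so mip selection can be pushed back by log2(N) levels without adding aliasing, i.e. a bias of -2 for 4x4 OGSSAA. A minimal sketch of how a driver or tweak tool might apply it (D3D9, sampler 0 only, purely illustrative):

Code:
#include <cmath>
#include <cstring>
#include <d3d9.h>

// For NxN OGSSAA the per-axis sample density is N times higher, so the
// texture LOD can be biased by -log2(N). ATI's 8x RGSS is not a square
// grid, which is why its "manual" driver-side bias ends up smaller.
void ApplySupersamplingLodBias(IDirect3DDevice9* dev, int gridN)
{
    float bias = -std::log2(static_cast<float>(gridN)); // 4x4 grid -> -2.0
    DWORD bits;
    std::memcpy(&bits, &bias, sizeof(bits)); // state takes raw float bits
    dev->SetSamplerState(0, D3DSAMP_MIPMAPLODBIAS, bits);
}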
You need to post links to the images and not the embedded images. The performance didn't go up, so I don't think it's cheating, but flickering textures in a Crytek game seem to crop up every 6-9 months or so.
Sounds like a bug; they got too eager with their optimisation spree. I'm sure it will be fixed in 10.4.
But we will see. Get your torches ready if it's not! :D
PCGH: Raaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaawr microstuttering
It's ATI; they get a free pass. And I bet Jen-Hsun Huang had a hand in this.
Seriously though, I'd wait for more info... does it happen in other games? If not, why Crysis, and for how much gain? I'm thinking it's an optimization for regular users, one that reviewers can and should disable when they do reviews.