So does this mean wizzard was smart to use 9.12? Can't imagine the crapstorm that would follow if it turns out that way.
Honestly, and given that I have no brand affiliation here and am just as prepared to believe in dirty tricks from either side...
...
I don't see it. Those piccies look identical.
Just another of nvidia's underhanded practices.
The ATI screenshot actually looks better to me too. The nVidia ones look a bit "washed out", whereas the ATI ones show more detail. Anyway, it could just be placebo, as I currently use an ATI card. Then again, it's very hard to spot any differences while focusing on a _very_ small part of the entire screen. If it means more performance, I'm happy with that, as I wouldn't notice the difference in quality anyway.
Does that mean benching Crysis on Nvidia should be done at lower settings for a fair comparison? If choosing 16x anisotropic on ATi doesn't actually give you 16x anisotropic... :shrug:
I don't like cheating in benchmarks from any company; if they are going to optimize like this, they need to be up-front about it. Seeing how little the visual quality was affected, they should have advertised this as a feature. Now they are stuck with fodder for nvidia fanboys to make snide comments over, even if they don't understand the changes.
Who on earth benches Crysis? Or better, who on earth plays Crysis or CryEngine 2 games?
I don't see a difference in the pics....?
But still, if it is supposed to do one thing and is doing another, then shame on them... but I don't see a difference worth mentioning, really.
@saaya
I do not know whether you are extremely brave or rather silly for posting this thread. However, I also hope ATi fixes this bug. It is not uncommon for nVidia or ATi to have driver bugs from time to time; I recall nVidia recently had an AF issue in their drivers, so no doubt this is also a driver bug.
IMHO the days of cheating (from both sides) are over, as the sheer fallout from the GeForce FX driver cheats... sorry, optimisations... was a lesson to learn from. (I recall ATi also having a similar misdemeanour with Quake and "Quack" or something.)
Fingers crossed we see no deliberate driver cheats EVER again, because when companies start cheating, and those cheats degrade image quality, the only loser is the consumer... nobody wins.
John
Just use the last pic as a guide. Basically, if you look at the 5870 pic, you will see three distinct zones that follow the pattern in that last pic. You can almost make out a line where they switch from trilinear to bilinear.
The NV-based cards don't have these zones; they texture the same way throughout.
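For anyone wondering what kind of shortcut would produce zones like that, here's a minimal Python sketch of a "brilinear" filter (purely my own illustration; the function name and the window size are made up, not AMD's actual driver logic). Instead of blending two mip levels everywhere, you only blend inside a narrow band around each mip transition, and everything outside that band falls back to cheap bilinear. The snap points are exactly where you'd see the lines between zones on screen.

Code:
import math

def brilinear_weight(lod: float, window: float = 0.25) -> float:
    """Blend fraction between two adjacent mip levels.

    Full trilinear would simply return frac(lod). A 'brilinear'
    shortcut snaps the fraction to 0 or 1 outside a narrow window
    around the mip transition, so most pixels get a cheap bilinear
    lookup and only a thin band still pays for the full blend.
    The window size here is illustrative only.
    """
    f = lod - math.floor(lod)        # fractional distance between mips
    lo, hi = 0.5 - window, 0.5 + window
    if f <= lo:
        return 0.0                   # pure bilinear, lower mip only
    if f >= hi:
        return 1.0                   # pure bilinear, upper mip only
    return (f - lo) / (hi - lo)      # trilinear blend inside the band

# The hard snaps at the band edges are what shows up as distinct zones:
for lod in (1.1, 1.4, 1.5, 1.6, 1.9):
    print(lod, brilinear_weight(lod))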
Exactly; it's essentially the same effect as dithering, so there is less banding and gradients are smoother.
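A quick sketch of that dithering point, with purely illustrative numbers (nothing from the drivers): quantizing a smooth gradient to a few levels produces hard bands, while adding a small ordered-dither offset before quantizing breaks the band edges up, so the average brightness still tracks the gradient.

Code:
# Quantize a 0..1 gradient to 4 levels, with and without ordered dither.
LEVELS = 4
BAYER = [0, 2, 3, 1]  # simple 1D ordered-dither threshold pattern

def quantize(v: float) -> int:
    return min(LEVELS - 1, max(0, round(v * (LEVELS - 1))))

def dithered(v: float, x: int) -> int:
    # Offset by less than one quantization step, varying with position.
    offset = (BAYER[x % 4] / 4 - 0.5) / (LEVELS - 1)
    return quantize(v + offset)

gradient = [x / 15 for x in range(16)]
print([quantize(v) for v in gradient])                   # hard bands
print([dithered(v, x) for x, v in enumerate(gradient)])  # edges broken up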
What if Nvidia allowed a lower quality mode and could increase performance by "x" % ?
The point is the tests should be done fairly.
If the tables were turned and Nvidia skimped on rendering, do you think the images would suddenly look different and this would suddenly become important? LOL
Go to the source website; there you can switch between the images directly on top of each other, and the difference is very evident.
There is also an image of 16x AF quality, and strangely, the 480 looks better than the 5870 there as well, despite the "angle independent" AF on the HD5000 cards...
For all the people saying there is no difference so it's not an issue: if nvidia did this, everyone would go on and on about cheating...
Bilinear filtering does 4 texel lookups per pixel (a 2x2 footprint). Trilinear does 8, sampling a 2x2 footprint on each of two mip levels, then linearly interpolating between the two results. Anisotropic filtering uses the view angle to vary the sampling range of the texels: each anisotropic level enlarges the sampling footprint (non-uniformly), so the number of texels sampled increases, and trilinear filtering is then applied over the new ranges. This keeps textures looking smooth even when moving through a scene, heavily anti-aliasing them and reducing pixel swimming, but it comes at a heavy cost.
For example, in a standard 1680x1050 frame (1,764,000 pixels), the texel lookup counts are as follows:
bilinear: ~7.06 million texel lookups
trilinear: ~14.11 million texel lookups
anisotropic: potentially many times the texels of trilinear, depending on the AF level (up to 16 trilinear probes per pixel at 16x)
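To put those numbers together, a back-of-the-envelope sketch (the per-pixel costs are the textbook ones from above; real hardware's sampling patterns vary, and the 16x worst case assumes one trilinear probe per anisotropy step):

Code:
# Rough texel-lookup counts for one 1680x1050 frame, one texture layer.
PIXELS = 1680 * 1050  # 1,764,000 pixels

def texel_lookups(mode: str, af_level: int = 16) -> int:
    per_pixel = {
        "bilinear": 4,                # one 2x2 footprint
        "trilinear": 8,               # 2x2 footprint on two mip levels
        "anisotropic": 8 * af_level,  # worst case: af_level trilinear probes
    }
    return PIXELS * per_pixel[mode]

for mode in ("bilinear", "trilinear", "anisotropic"):
    print(f"{mode}: ~{texel_lookups(mode) / 1e6:.2f} million lookups")
# bilinear: ~7.06 million, trilinear: ~14.11 million,
# anisotropic (16x worst case): ~225.79 million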
In a still image there is no problem; the major filtering artifacts are movement-based, such as moiré patterns, pixel swimming, and mip-level pops. Static images will never really show the full picture when it comes to filtering. I can't believe ATI would do this!
I'd love for someone to test the 10.3s against the previous versions and see if there is a difference; that nice 10% increase across the board the new drivers bring might be this "bug"!
ATI is good...mkay....Nvidia is baaad.... :)
Yep I can easily see the difference, if this was done on purpose then shame shame on them :shakes:
The difference is VERY obvious in motion, but you can still see it in the screenshot. The textures will vibrate, and you will see the lines in them as the camera moves; it's very distracting.
Well, that's the problem: people take a screenshot of stuff and go "see?! see?!", when most of this stuff is only really noticeable in motion. In cases like this you have to study the screenshot quite a bit to begin to see anything.
Good ol' XS, where every bug is a cheat.
Is there any actual proof this results in some sort of performance boost? What incentive would AMD have to do something like this, when they've got so much extra texturing performance that they're pushing for angle-independent AF?
Good job jumping to conclusions and sensationalizing, especially saaya.
Lots of people, obviously?
Pretty amazed by some of the reactions here.
If it was Nvidia lowering IQ slightly to get more FPS, we would have 9 people complaining about Nvidia's dirty tactics and 1 guy saying he barely sees a difference and prefers Nvidia's IQ.
Which would then result in the 9 accusing the 1 of being an Nvidia fanboy.
@Cybercat
Like Saaya says.
It's very convenient that this bug appears in a game used by many reviewers, just at the time Nvidia releases their new high-end for reviews.
Even if the bug gets fixed in the next driver version, the reviews using the bugged drivers will still be out there.
So in a few months, when someone who doesn't know about all of this wants to buy a new high-end card and googles some reviews, he will mostly find these old ones using the bugged drivers.
Then again, we still need to know how much extra performance this bug gives.
But even if most people can't see the difference, in a review all settings should be the same for a fair result.