http://www.pcgameshardware.com/aid,6...eviews/?page=4

The last pic: was that a bug, or a cheat?

Quote Originally Posted by Coldon View Post
There is a standard: it's called the API spec, and drivers are supposed to follow it. Now let's say developer A creates custom mipmaps and writes the rendering engine assuming no LOD bias, as per the SPECS, and the guys at ATI, who obviously know better, introduce a driver-level LOD bias after the game has shipped. Whose responsibility is it to fix the problem? The developer who followed the API specs, or the driver team that knew better?

3D rendering is not the Wild West! The graphics pipeline is very simple: the developer has control over most key stages, except for the finer points of triangle setup/traversal, texture filtering, clipping, and, to an extent, blending. The driver team just needs to ensure that the driver behaves according to the API specs.

From your post, I'm assuming you don't really know what LOD bias does or where the supposed sharpness comes from.
But the algorithms that ATI and NVIDIA use differ, and there is no standard for them. It's entirely possible that one looks better in simple 3D tests while the other looks better in a game, or that both look better or worse in a game depending on the scenario. The same goes for AA comparisons.
The texture/mipmap pop-in you referred to is a game problem. I had it more in Warhead than in Crysis; tweaking the settings a bit or using CCC v2 reduced it.
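For anyone following the LOD bias argument above, here is a minimal sketch of what a bias actually does to mip selection. This is a simplified model (the function name and the texel-footprint input are illustrative, not any vendor's actual filtering algorithm): the hardware roughly computes lod = log2(texel footprint) + bias and clamps it to the available levels, so a driver-injected negative bias pushes the sampler toward a larger, sharper mip than the developer planned for.

```python
import math

def select_mip_level(texels_per_pixel: float, lod_bias: float, max_level: int) -> int:
    """Simplified mip selection: lod = log2(texel footprint) + bias,
    clamped to the available levels. A negative bias picks a larger
    (sharper) mip; a positive bias picks a smaller (blurrier) one."""
    lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return max(0, min(max_level, round(lod)))

# A surface where ~4 texels map to one screen pixel normally uses level 2.
print(select_mip_level(4.0, 0.0, 10))   # 2
# A driver-injected bias of -1 forces level 1: sharper, but prone to shimmer,
# and it breaks any custom mipmap chain authored assuming zero bias.
print(select_mip_level(4.0, -1.0, 10))  # 1
```

This is why the "extra sharpness" in a biased driver is not free: the game's artists authored each mip level for a specific footprint, and shifting the selection behind their backs trades texture aliasing for perceived detail.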