Quote Originally Posted by SKYMTL View Post
I still think it is a valid argument no matter what the source is. With the power of today's GPUs versus the requirements of current and even upcoming games, downsampling issues should have been put to bed by now. People can go on and on about a card's AA capabilities and anisotropic filtering, but (IMO) all of that is for nothing if there are issues with the textures themselves.

I've noticed the issue in nearly every game I've played over the past year.

Fallout 3: Rocks in the Capital Wasteland and the Washington Monument are the worst
HAWX: Buildings in some levels
Company of Heroes: Rubble and some infantry weapons on the night levels

I could go on and on. However, there are some games where I haven't noticed it. Batman, Dawn of War II, and Call of Duty: WaW don't seem to have any problems that I've seen.
Yes, what's the point of being able to run 8xAA and 16xAF with max details at huge resolutions if the textures flicker and are blurry...

Batman is based on UT3, and yes, it's hard to see it in there. Actually it's even difficult to see aliasing in UT3-based games, because they used polygons in a very smart way... in the geometry they avoid sharp edges and certain angles... pretty smart.
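For what it's worth, the flickering people keep describing is basically undersampling: when there are fewer screen pixels than texels under them, point samples land on different texels every frame. Here's a toy sketch in plain Python (all names are mine, nothing from any real driver or engine) of why a pre-averaged, lower-resolution copy of the texture, like a mip level, samples stably while the full-resolution one shimmers:

```python
# Toy model of texture shimmer (hypothetical names, 1-D for simplicity).

def point_sample(texture, positions):
    """Nearest-neighbour sample: pick the texel under each position in [0, 1)."""
    n = len(texture)
    return [texture[int(p * n) % n] for p in positions]

def mip_level(texture, factor):
    """Box-filter the texture down by `factor` (a crude mip level)."""
    return [sum(texture[i:i + factor]) / factor
            for i in range(0, len(texture), factor)]

# A 1-texel-wide checkerboard: the finest detail a texture can hold.
texture = [0.0, 1.0] * 32                # 64 texels
positions = [i / 7 for i in range(7)]    # only 7 samples across it

# Point-sampling the full-res texture: results depend on sub-texel
# alignment, so a tiny camera pan flips samples between 0 and 1 -> shimmer.
frame_a = point_sample(texture, positions)
frame_b = point_sample(texture, [p + 0.008 for p in positions])  # slight pan

# Sampling a pre-averaged mip level: every texel is already 0.5,
# so the result is identical no matter how the camera moves.
mip = mip_level(texture, 8)
stable_a = point_sample(mip, positions)
stable_b = point_sample(mip, positions_b := [p + 0.008 for p in positions])

print(frame_a != frame_b)     # True: full-res samples jump frame to frame
print(stable_a == stable_b)   # True: mip samples stay put
```

Real GPUs pick the mip level (and blend between levels, plus take extra samples for anisotropic filtering) automatically, but the trade-off is the same one in the thread: too little filtering flickers, too much looks blurry.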

Quote Originally Posted by Eastcoasthandle View Post
^^This
When asked to show proof of his statement, he couldn't provide it and said that he had been told that was the problem by someone else. All this after he chimed in. So I would wait and see first instead of fueling what looks like FUD so far.
Hmmm, so it's a case of he-said-she-said then...
Well, thanks for the heads-up... nobody would be happier to hear it's not true than me, trust me.

The day a new card comes out that has no texture flickering at all and notably improved texture sampling, possibly even upsampling like in video and texture apps, I'd upgrade immediately...