I would start to worry about the state of the space-time continuum if that did not happen. We would need a quantum physicist to help us understand the damage and repair the distortion if it did. :)
well it's not really trolling if I called their BS and it turns out 64-sample AA doesn't work at all... is it?
seriously, anything more than 8xAA is a waste of time, especially on LCD displays with high pixel density.
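(A rough way to see why the returns flatten out: with N coverage samples, an edge pixel can only take N+1 distinct blend levels, so every doubling past 8x is refining steps that are already tiny. A throwaway Python sketch of the arithmetic, nothing more:)
Code:
# N coverage samples per pixel -> N+1 distinct edge shades; print the
# step size between adjacent shades on a 0-255 scale for each AA mode.
for samples in (2, 4, 8, 16, 64):
    step = 255 / samples
    print(f"{samples:>2}x AA: {samples + 1:>2} shades, ~{step:.1f}/255 per step")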
nvidia and ati had better focus on making better SLI and CF drivers instead of marketing crap like this (not that CF drivers are bad, but they could be better)
This is as useful as a 9400GT with 1GB of frame buffer. And people actually buy 9400GTs with 1GB of RAM... :shakes:
He voiced the sentiments of almost everyone out there; what's wrong with what Snipe said?
Even 8xCSAA / Q / whatever it's called is not noticeably different from 4x at 1440 x 900, and it's at low resolutions that AA matters.
well here they are...
Quote:
http://i377.photobucket.com/albums/o...s/IMG_2254.jpg
complete PC with GTX 295 consuming only 120 to 130 W at idle, 138 W while installing Vista
Vista 64 with drivers 185.20
http://i377.photobucket.com/albums/o...s/IMG_2255.jpg
Quote:
The difference is that while the framerate people can see can't be summarily quantified scientifically with any sort of consistency or credibility, anti-aliasing can be analyzed pixel-by-pixel within each frame using simple software techniques available to anyone, effectively discarding the fallible human element of perception from the equation.
You're forgetting that analyzing any form of video using individual screenshots is useless. Even if you compare them one by one in sequence, human perception changes with movement and focus, something a machine can't replicate. And that's where the real, visible (to humans) difference lies.
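(To be fair to the quote, the "simple software techniques available to anyone" part is true as far as it goes: a per-pixel screenshot diff is a few lines of Python. The filenames below are made up; any two same-size captures will do. It just doesn't tell you anything about motion, which is the point being made here.)
Code:
# Per-pixel comparison of two AA screenshots, as the quoted post describes.
# The filenames are hypothetical placeholders.
from PIL import Image
import numpy as np

a = np.asarray(Image.open("shot_8xaa.png").convert("L"), dtype=np.int16)
b = np.asarray(Image.open("shot_64xaa.png").convert("L"), dtype=np.int16)

diff = np.abs(a - b)
changed = np.count_nonzero(diff)
print(f"pixels that differ: {changed} ({100 * changed / diff.size:.2f}%)")
print(f"mean |delta|: {diff.mean():.2f}/255  max: {diff.max()}")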
I'm not sure if you're being sarcastic, but in fact he is right. It's the same as when you compare different video codecs: individual images may look sharper with one codec, but it's really about fluid motion. That's also one of the reasons why some exotic forms of AA look no better than plain old 4xAA when they're compared in reviews using screenshots.
Anyway, 64xAA is ridiculous ;)
I have a feeling this 64x/64xQ mode is more about bragging rights than any real gain in either quality or performance.
Most people were saying the same thing when I was running 16xFSAA on a GeForce 2 Pro. V-Rally 2 worked pretty well with that setting (by "pretty well" I mean the framerate was actually still playable), and the edge quality was insane back then. Not a single jaggie. Not sure what resolution I used; probably 800x600 or 1024x768. Unfortunately, they removed support for 16xFSAA in later Detonator drivers.
For example, on an HD4850 I used 8xFSAA with edge detect and 16xFSAA in Max Payne 2, and the framerate was still so high I could have used three times the sampling rate plus anisotropic filtering and it would probably still be far over 60fps.
So yeah, it's cool to use insane FSAA on older games and finally get the graphics you could only dream of when they came out.
the only reason for the 64xAA, and I know nvidia knows it, is that they will sell a few more units to people with too much money and no understanding of video cards who somehow believe it means it's better. It's a sales tactic, nothing more.
ATI's Edge-detect AA at its minimum setting of 12x is plenty fine for me. It looks great even on a 90" monitor.
Well, I know a difference between 12x and 64x would be much easier to see on a 90" screen, but 12x edge-detect still looks great. So why would anyone need to run 64x? It's completely pointless, and as others have said, it probably wouldn't even be able to run a game at a decent resolution at those levels.
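(For anyone wondering what "edge-detect" means here, this is my reading of the idea, a toy sketch and not AMD's actual resolve: do a cheap box resolve everywhere, run an edge detector over the result, and only spend the wider filter on pixels where it fires, which is where the jaggies actually live. Something like:)
Code:
# Toy model of an edge-detect AA resolve (a sketch of the concept, not
# AMD's implementation): plain box resolve everywhere, wider filter only
# where a Sobel edge detector fires.
import numpy as np
from scipy.ndimage import sobel, uniform_filter

def edge_detect_resolve(samples, threshold=0.1):
    # samples: (H, W, N) float array, N coverage samples per pixel
    box = samples.mean(axis=2)                  # standard box resolve
    grad = np.hypot(sobel(box, 0), sobel(box, 1))
    wide = uniform_filter(box, size=3)          # stand-in for a wider tent filter
    return np.where(grad > threshold, wide, box)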