Intel Core i7 6900K
Noctua NH-D15
Asus X99A II
32 GB G.Skill TridentZ @ 3400 CL15 CR1
Nvidia Titan Xp
Creative Sound BlasterX AE-5
Sennheiser HD-598
Samsung 960 Pro 1TB
Western Digital Raptor 600GB
Asus 12x Blu-ray Burner
Sony Optiarc 24x DVD Burner with NEC chipset
Antec HCP-1200W Power Supply
Viewsonic XG2703-GS
Thermaltake Level 10 GT Snow Edition
Logitech G502 gaming mouse w/Razer Exact Mat
Logitech G910 mechanical gaming keyboard
Windows 8 x64 Pro
20/10 is about the best vision a human can have, so mine isn't far off from that. When I have my eyes checked, I can read all but the finest letters on the bottom row.
And I can certainly tell the difference between HD and SD. I can even see the difference between 720p and 1080p.
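For a rough sanity check on that last claim, you can work the numbers: 20/20 vision resolves detail of roughly one arcminute, and the angular size of a pixel depends on resolution, panel size, and viewing distance. A minimal Python sketch, assuming an illustrative 27-inch 16:9 panel viewed from 36 inches (both numbers are assumptions, not measurements):

```python
import math

# Back-of-the-envelope check, not a vision test: 20/20 acuity resolves detail
# of roughly 1 arcminute (20/10 roughly 0.5 arcminute). The 27" panel and 36"
# viewing distance are assumed example numbers.

def pixel_arcminutes(diagonal_in, width_px, distance_in, aspect=(16, 9)):
    """Angular size of one pixel, in arcminutes, on a flat 16:9-style panel."""
    panel_width_in = diagonal_in * aspect[0] / math.hypot(*aspect)
    pixel_pitch_in = panel_width_in / width_px
    # Small-angle approximation: angle (radians) ~ size / distance.
    return math.degrees(pixel_pitch_in / distance_in) * 60

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    arcmin = pixel_arcminutes(27, w, 36)
    print(f"{name}: {w * h:,} pixels, {arcmin:.2f} arcmin per pixel")
```

With those assumed numbers, a 720p pixel subtends about 1.8 arcminutes and a 1080p pixel about 1.2, both near or above the one-arcminute limit, so a sharp-eyed viewer telling 720p from 1080p at that distance is entirely plausible.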
Anyway, I think it's a bogus claim that Nvidia has inferior 2D image quality... unless Blkout has some evidence.
You can't examine what someone else's eyes see, and having 20/15 vision has nothing to do with how your brain processes and interprets what your eyes see. Be thankful you can't see the difference, dude; I wish I couldn't, because it really annoys me when I have to use Nvidia video cards.
Some people can see 60 Hz flicker on an analog monitor and some can't; once again, it's an example of how some people see what others can't. That doesn't mean the 60 Hz flicker isn't there, just that some people can't see it.
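The arithmetic behind that point is simple: the flicker is a physical property of the refresh cycle, while noticing it depends on where a viewer's flicker-fusion threshold falls. A minimal Python sketch; the ~50-90 Hz fusion range in the comments is an assumed ballpark, not a measured figure:

```python
# A 60 Hz display redraws every 1/60 s whether or not a given viewer notices.
# Flicker-fusion thresholds vary roughly between 50 and 90 Hz depending on the
# person, brightness, and viewing angle (an assumed illustrative range).

def refresh_period_ms(refresh_hz: float) -> float:
    """Length of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 85, 100):
    print(f"{hz} Hz refresh -> one cycle every {refresh_period_ms(hz):.1f} ms")
```

60 Hz sits inside that range, which is consistent with some people noticing CRT flicker at 60 Hz while most stop noticing by around 85 Hz.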