A friend of mine has a Samsung 40" 1080p TV. Currently he has a standard VGA cable connecting the PC to the TV, so while it works, the picture is a bit soft. He got a DVI-to-HDMI cable (his 8800GS doesn't have an HDMI port, only DVI), but when he uses it, the image overscans on the TV. All the text also looks like utter crap because the TV isn't doing proper 1:1 pixel mapping.
On a box I set up with an ATI card recently, it was doing the opposite and underscanning, but the Catalyst Control Center had an adjustment to correct that. After the correction, it was beautiful. But no matter how much I looked, the Nvidia control panel on my friend's computer doesn't have anything like that at all.
Am I just missing something? Or are his card and drivers just bad at HDMI output to a TV?
He has an 8400GS with 196.21 drivers running Windows 7. Yes, they're old drivers; I want to update them, but haven't gotten to it yet since he's busy with work and the VGA works for now. Google searches showed that a lot of people had issues like this with newer drivers too, but nothing conclusive as to any fixes.
Pretty messed up when the analog VGA connection looks better than the digital HDMI one.