Absolute 0
09-17-2009, 08:15 AM
This is one of the most frustrating problems I have ever had.
I have a 19" 1440x900 monitor. I have tried 4 different nVidia cards on this monitor, on three different MBs and two operating systems. My conclusion is that nVidia does not like 1440x900 and I would like to know why.
Instance #1
Old Pentium 4 on an old OEM motherboard, with a 7800GS AGP card and WinXP 32. Upon booting, the monitor would only display at 1360x768 max, with no option for 1440x900, so I had black strips at the top and bottom of the screen. I had to install nTune and create a custom resolution to force the setting. It finally worked and displayed at 1440x900.
Instance #2
Similar to instance #1 in every way, except with an AMD Phenom II, a Gigabyte 790FX-UD4P, and a 7900GT. The software was again able to force 1440x900.
Instance #3
Foxconn Bloodrage MB, i7, and 9600GT. First tried WinXP 32 before the reformat. Booting the PC takes literally forever before anything appears on screen; I thought something was seriously wrong, but after about 4 minutes it finally got into Windows. Very strange. OK, I'm in Windows, but only at 800x600. WTF? Yes, the latest nVidia drivers are installed. I go back to the fix from instance #1 and try to force the resolution with nTune. I install nTune, and upon attempting to run it, I get an error that an nVidia chip was not found, so apparently the software cannot work with the card. It closes and will not run. Swap the monitor out for a 1680x1050, and the PC boots up fast as usual and displays just fine at full res.
Instance #4
Win 7 64-bit is now installed. I try installing nTune again, and even swap the 9600GT for an 8500 to get one more shot at 1440x900. Of course, the res is not available, and my only option is once again to force it with software. nTune installs, but now it crashes Win7 when I try to open it. This is a known problem; I have found other stories on the net about nTune BSODing Windows. I even tried another program that had successfully forced 1440x900 on the old MB (I believe PowerStrip), and it gave the same error about a certain nVidia chip not being found on the MB.
So why can an old OEM MB equivalent to a Dell with a Pentium 4, and an AMD board, successfully force these cards to run 1440x900, while a new Bloodrage with a PCIe card cannot display that res? This is total crap. Piss on you, nVidia.
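For anyone who wants to check whether the monitor itself even advertises 1440x900: the driver usually only offers modes listed in the monitor's EDID, and if the mode isn't in there you end up forcing it like I did. The EDID's standard-timing bytes can be decoded by hand. A rough sketch in Python (assuming EDID 1.3 aspect-ratio codes; `decode_standard_timing` is my own helper name, and the byte pair below is an illustrative example, not a dump from my monitor):

```python
# Decode one EDID "standard timing" byte pair. Bytes 38-53 of the base
# EDID block hold up to eight such pairs. Assumes EDID 1.3 aspect codes.
def decode_standard_timing(b1, b2):
    if b1 == 0x01 and b2 == 0x01:
        return None  # the pair 0x01 0x01 marks an unused slot
    h_active = (b1 + 31) * 8                      # horizontal pixels
    # Bits 7-6 of the second byte select the aspect ratio (EDID 1.3).
    aspect = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}[(b2 >> 6) & 0x3]
    v_active = h_active * aspect[1] // aspect[0]  # vertical pixels
    refresh = (b2 & 0x3F) + 60                    # refresh rate in Hz
    return (h_active, v_active, refresh)

# 1440x900 @ 60 Hz would appear in the EDID as the pair 0x95, 0x00:
print(decode_standard_timing(0x95, 0x00))  # (1440, 900, 60)
```

If no decoded pair (and no detailed timing descriptor elsewhere in the EDID) comes out to 1440x900, the monitor never told the card the mode exists, which would explain why only forcing tools could get there.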