RPGWiZaRD you say you can see the difference between refresh rates up to 100 Hz. I haven't had a CRT for a long time, but some of my friends still have those, and to me the flicker is very obvious below 85 Hz and puts incredible strain on my eyes. (I haven't really compared 85 Hz against 100 Hz, so I'm not sure whether I'd notice that difference.) If I were still using CRTs today, I would definitely have a 100 Hz one at the very least.

However, there is no inherent flicker in LCD panels. And AFAIK there are no native 75 Hz LCDs (the 75 Hz ones I'm aware of are really just 60 Hz panels that convert the 75 Hz input signal to 60 Hz), so how can you say the same for LCDs? Are you really sure you would notice going from a 60 Hz LCD to, say, a 75 Hz one? Not to mention going from 85 Hz to 100 Hz?

Even watching 85 FPS material on a 100 Hz CRT is not remotely the same as comparing a theoretical 85 Hz LCD at 85 FPS to a theoretical 100 Hz LCD at 100 FPS, because "telecine judder" or stuttering will be present on the CRT: whenever the FPS is below the refresh rate, some whole or partial frames get shown longer, or more times, than others. So, I'm not convinced.
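To illustrate the judder point, here's a quick sketch (my own toy model, not anyone's actual scaler logic) that maps each of a 100 Hz display's refreshes to the most recent frame of an 85 FPS source and counts how long each frame stays on screen:

```python
from collections import Counter

def frame_display_counts(fps: int, hz: int) -> Counter:
    """For each display refresh, find the source frame visible during it,
    then count how many refreshes each frame stays on screen."""
    # Refresh r happens at time r/hz; the frame on screen is the latest
    # source frame that has started by then, i.e. floor(r * fps / hz).
    return Counter(r * fps // hz for r in range(hz))

counts = frame_display_counts(85, 100)
shown_once = sum(1 for c in counts.values() if c == 1)
shown_twice = sum(1 for c in counts.values() if c == 2)
print(shown_once, shown_twice)  # 70 frames held for 1 refresh, 15 held for 2
```

So 15 of every 85 frames linger twice as long as the rest, and that uneven cadence is exactly the stutter you'd see on the CRT but not in a true 85 Hz vs 100 Hz comparison.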

Hopefully we'll be able to tell in the future (once SEDs finally come around). Until then, unless you have access to very special hardware (research lab staff or something), it will be impossible to know for sure, as we're limited to 60 Hz LCD screens. Even if native 75 Hz LCD screens were readily available, the major problem of LCDs today (imo) is still latency: input lag plus response time can easily put even the newest LCDs a couple of frames behind a CRT, and it varies wildly between models (anything from sub-10 ms to 40 ms+), so fair, definitive comparisons would still be hard to make.
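For some back-of-the-envelope numbers on that "couple of frames behind" claim (the 40 ms figure is just the upper end I mentioned, not a measurement of any specific model):

```python
def lag_in_frames(lag_ms: float, refresh_hz: float) -> float:
    """Convert a display's total latency into equivalent refresh periods."""
    frame_time_ms = 1000.0 / refresh_hz  # one refresh at 60 Hz is ~16.7 ms
    return lag_ms / frame_time_ms

print(round(lag_in_frames(40, 60), 1))  # a 40 ms LCD trails by ~2.4 frames
print(round(lag_in_frames(10, 60), 1))  # a 10 ms one by ~0.6 frames
```

That spread alone (half a frame vs two and a half) is enough to swamp whatever difference a 60-vs-75 Hz test is trying to isolate.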