Quote Originally Posted by Sr7 View Post
Well the other part of my question was "do you see a major benefit in everyday use for anything over current processors?" I don't think it's a noticeable performance increase at this point, because things are so fast as it is. Granted certain operations in workstation apps might process some data faster, but I can't think of much else that you'd benefit from by having one of these processors when the GPU already does some of the things better. Granted there aren't many GPGPU consumer apps out there right now, but I'm just speaking in theoreticals.. in terms of "the device best suited for workload x."

As for your claim that a monitor is unable to display more than 60 frames a second, that's actually wrong.

It's true that with v-sync on you'll see exactly 60 fps at maximum (1 frame per vertical blank period).

But with v-sync off, if you have 120fps on a 60Hz monitor, you see parts of multiple frames.

By this I mean that the vertical refresh is not instantaneous: each pixel shows whatever is in the front buffer of the swap chain at the moment that pixel is lit on that refresh pass.

So let's say you have 1 frame ready to display. Your monitor starts displaying that frame 1 pixel at a time, working from left to right and moving down row by row, displaying what's in the buffer. If, halfway down that refresh, the next frame is done and presented, the bottom half of the monitor's pixels will show the contents of this new frame. You get actual visual feedback on your position/environment in-game faster, instead of having to wait for the next refresh cycle to see *any* of that frame.

Now whether you can turn this faster feedback into a meaningful response/reaction is a different story, since it's all happening in a very short period.

If you had 180 FPS you'd see roughly the top third of your monitor as frame 1, the middle third as frame 2, and the bottom third as frame 3. With v-sync on you would've only seen frame 1 and had to wait until the whole thing was done displaying before the refresh moved back to the top of the screen, at which point it would've displayed frame 4.
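The 120 and 180 FPS cases above can be sketched numerically. This is a rough model, not tied to any real graphics API: it assumes scanout starts at t=0, buffer flips are instant, and frames complete at a perfectly fixed rate (the 1080-row resolution is just an illustrative choice). For each scanline it reports which frame is newest at the moment that row is drawn with v-sync off.

```python
REFRESH_HZ = 60
ROWS = 1080  # hypothetical vertical resolution, for illustration only

def frame_shown_per_row(render_fps, refresh_hz=REFRESH_HZ, rows=ROWS):
    """Return, for each scanline, the index of the frame visible on that
    row during one refresh pass with v-sync off."""
    refresh_period = 1.0 / refresh_hz
    frame_period = 1.0 / render_fps
    shown = []
    for row in range(rows):
        t = (row / rows) * refresh_period      # time this row is scanned out
        shown.append(int(t // frame_period))   # newest completed frame at time t
    return shown

# 120 fps on a 60 Hz panel: top half is frame 0, bottom half is frame 1.
print(sorted(set(frame_shown_per_row(120))))   # [0, 1]
# 180 fps: the screen splits into thirds, one frame each.
print(sorted(set(frame_shown_per_row(180))))   # [0, 1, 2]
```

The boundary rows where the frame index changes are exactly where the tear lines appear on screen.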

My point is that your statement about not seeing more than 60 frames in a second is false.
Wow ... you are talkative ... this is my last post tonight, gotta get in bed.

In terms of noticeable difference ... again, that is in the eye of the beholder. Ironically, just a few days ago I was using Lightroom on a dual-core rig (an X6800) for some quick touch-ups and got very annoyed ... at the time that rig was darn fast, but it wasn't as snappy or responsive ... my quad does it so much faster, and I notice it. Also, when I am transcoding or importing from a different video format into Premiere, or doing simple routine stuff in Pinnacle Studio 11, it is very noticeable, especially when it takes 2x longer to build the DVD ... but see, this is me, I prefer it faster ... you may not be doing this level of computing.

In terms of GPGPU, this is really a different topic, a different thread for debate -- but I don't see CUDA, for example, really taking off, for a few reasons which I will not elaborate on ... my personal opinion is that CUDA/nVidia will be victimized very much like AMD victimized Itanium.

For your video monitor commentary, I am correct on this; you should do some more research ... google is your friend ... -- video monitors refresh the entire screen at typically 60 Hz, which means each pixel within the field is updated 60 times per second (progressive scan), i.e. 60 frames in one second. Some monitors support higher rates, but the human brain cannot distinguish individual frames beyond about 20 FPS ... if you run the game full bore, your 60 Hz refresh will often capture the frame buffer mid-refresh since they are not synced -- this is what creates the lines and tearing in the image ... Vsync gives the smoothest, most enjoyable gameplay because each refresh of the monitor coincides with a completed frame buffer, i.e. that is what it means to be sync'ed. We may be saying the same thing but differently ...
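The mid-refresh capture described above can be put in numbers. This is a minimal sketch under simplifying assumptions of my own (scanout starts at t=0, proceeds top to bottom at a constant rate, flips are instant, 1080 rows): an unsynced buffer flip lands partway through the refresh, and the tear row is simply the fraction of the refresh period that had elapsed at flip time.

```python
def tear_row(flip_time_s, refresh_hz=60, rows=1080):
    """Row at which a buffer flip at `flip_time_s` becomes visible,
    assuming scanout began at t=0 and moves top to bottom."""
    refresh_period = 1.0 / refresh_hz
    phase = (flip_time_s % refresh_period) / refresh_period
    return int(phase * rows)

# A flip exactly halfway through the ~16.7 ms refresh tears mid-screen:
print(tear_row(1 / 120))   # 540
# V-sync holds the flip until the vertical blank (phase 0), so no visible tear:
print(tear_row(1 / 60))    # 0
```

This is also why the tear line wanders up and down the screen in practice: the flip time drifts relative to the refresh when the two clocks aren't synced.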