Quote Originally Posted by T_Flight
Wanna put money on that? I will NOT EVER (not today, not a week from now, not a month from now, not a year from now, not ten years from now, or 20, or at any point in the future) fall into that fad. It is stupidly expensive, stupidly expensive to watercool, horribly inefficient, eats up a ridiculous amount of wattage, and I won't be buying into the bugs they come pre-equipped with. On top of that, they usually downclock the GPUs, which is the opposite of what I do with my systems. I OC; I don't do UCs.

One powerful GPU per card is all I need. It's all I will ever need, because if there is ever a need for more than that, then it's time to buy a new video card. There will never be a use for dual GPUs on one card. That's what SLI is for. One heat producer per card is enough.
Multi-GPU is going to become the norm for the high end soon. And it is going to follow a similar progression as CPUs:

separate sockets (i.e., separate chips and/or boards) with poor software compatibility (ever try to install Win98 on a 4-socket PPro server? Lol) ---> better multisocket solutions with good software support (WinNT, 2000, XP, Linux, etc.) ---> multiple dies on the same chip package ---> native multicore

The thing holding multi-GPUs back is the lack of a shared memory pool: each GPU has to keep its own copy of the scene data. But you can see that issue is being tackled by Intel with Larrabee, and you can bet that the other two are chasing the same goal.
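To see why the memory pool matters, here is a toy back-of-the-envelope model (all names and numbers are hypothetical, not from any real driver): without sharing, every GPU in the group duplicates the textures and geometry, so adding cards adds heat and cost without adding usable capacity; with a shared pool, only the per-GPU render targets multiply.

```python
# Hypothetical illustration (not real driver code): memory footprint of a
# multi-GPU setup with and without a shared memory pool.

def vram_needed(assets_mb, framebuffer_mb, num_gpus, shared_pool):
    """Total VRAM footprint across all GPUs for one scene.

    assets_mb:      textures/geometry every GPU must be able to read.
    framebuffer_mb: per-GPU render targets (always private to each GPU).
    shared_pool:    True if all GPUs can read one shared copy of the assets.
    """
    if shared_pool:
        # Assets stored once; each GPU still needs its own render targets.
        return assets_mb + framebuffer_mb * num_gpus
    # Without sharing, every GPU holds a full duplicate of the assets.
    return (assets_mb + framebuffer_mb) * num_gpus

# Two GPUs, 400 MB of scene assets, 100 MB of render targets each:
print(vram_needed(400, 100, 2, shared_pool=False))  # 1000 MB total
print(vram_needed(400, 100, 2, shared_pool=True))   # 600 MB total
```

The point of the sketch: in the duplicated case, doubling the GPU count doubles the memory spent but the usable working set stays the size of one card's VRAM, which is exactly the inefficiency the shared-pool designs are chasing.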