Quote Originally Posted by DarthShader
You should say that to EVGA. They listened to you with their warranties; maybe they will agree with you here too and drop their dual card.
Remember that the dual chip card you are talking about was simply a proof of concept for the time being. It was never planned for release.


Joking aside, I am not sure I can agree with you. Part of what currently makes us perceive dual cards as useless is the 32nm cancellation and the 28nm slips, which pushed single cards close to the 300W wall. At that point SLI/Crossfire makes for both better performance and easier cooling. If NVIDIA were able to, I can guarantee they'd have a dual card already, 100%.
Not necessarily. If NVIDIA was willing to push past the 300W envelope they would do so by simply handing off a basic reference design to their board partners and letting them handle the fallout. Personally, I think they see it the way I do: why bother with a 300W, dual GPU card when thermal and efficiency limitations mean it will offer relatively minimal performance increases over the flagship single chip products while retailing for significantly more?

The questions above lead me directly to another point: the issues AMD is having with Antilles. They are likely grappling with the fact that Cayman runs hot and can consume significantly more power than the previous HD 5800 series. Meeting power consumption targets means scaling back the architecture, but performance will suffer along with it. Considering the GTX 580's performance against a single fully-endowed HD 6970, AMD has to strike a very delicate balance with Antilles in order to make it a worthwhile purchase.

A dual card can still have its uses. In the ultra high-end, it's easier to do quadfire with two dual cards. In the lower regions, depending on pricing, it might be cheaper to get an inexpensive P67 board without two x8-or-better slots, a 2500K and a dual card, than an expensive board and two regular cards. There's also the mATX and ITX form factors, where Crossfire/SLI is troublesome or impossible. See the Sugo SG07 for example. Finally, there are OpenCL/CUDA applications, where packing twice as many chips into the same space is going to give more performance.
Unfortunately, I can't agree with any of these points. I have yet to see a game that scales well with more than two GPUs. Tri-SLI and quad Crossfire have never performed up to expectations.

You are also talking about price here. Every P67 board I have seen, and will likely see, has at least two x16 PCI-E slots that operate at x8/x8 when dual GPUs are detected.

Going further down the list, we get into H67 territory, which opens up a whole new can of worms. Most H67 boards lack the typical dual GPU capability of their P67 siblings, and they also come without the overclocking features people are looking for, though some will still offer dual GPU support. In addition, why would someone cheap out on a motherboard and then spend mega bucks on a $600+ dual GPU card, a suitably high end monitor for it to work on AND a bleeding edge CPU?

You and I both know that dual GPU cards are loud, hot and power hungry, so why would anyone want one in an HTPC or SFF case? As it stands, there isn't a single ITX sized power supply with enough capacity to power a GTX 580, let alone a dual GPU monster. mATX brings us into another area altogether, since there will be plenty of P67 mATX boards introduced in the coming months.

So I will repeat what I said: dual GPU cards may look great and allow a company to plant the flag in a dramatic way, but past that, I fail to see much use for them in most cases.