Honestly, you could use your comp setup as a Christmas tree with all those lights and no one would notice. :D *j/k* Looks good
omg, it's going to get soo hot, especially when OCed
maybe it'll void the warranty if you run FurMark on this (in microscopic print, buried in a line like: "warning: this product is not suitable for running at >70% load for longer than 1 min at a time")
I think the main reason a single PCB is so impractical is the number of chips you have to put on there. You would need to fit 28 memory chips and 2 huge GPUs. The 4870 X2 only has 16 memory chips, which saves a lot of space.
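A quick back-of-the-envelope check on where the 28 vs. 16 figure comes from, assuming the usual 32-bit-wide GDDR chips and the commonly reported bus widths (448-bit per GPU on this card, 256-bit per GPU on the 4870 X2); the helper name below is just for illustration, not from any spec sheet:

def chips_per_gpu(bus_width_bits, chip_width_bits=32):
    # each memory chip supplies a 32-bit slice of the GPU's memory bus
    return bus_width_bits // chip_width_bits

# dual-GPU totals
gtx295_chips = 2 * chips_per_gpu(448)    # 2 * 14 = 28 chips
hd4870x2_chips = 2 * chips_per_gpu(256)  # 2 * 8  = 16 chips
print(gtx295_chips, hd4870x2_chips)      # 28 16

With 64 MB per chip, 14 chips per GPU would also line up with the 896 MB per GPU figure being quoted for this card.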
BTW, could someone explain to me why quite a few people clearly don't like DisplayPort? I don't see what's wrong with the standard. HDMI and DisplayPort will probably live next to each other in the future, and there is not much we can do to stop that. Pretty much all future notebooks will have DisplayPort, because it is a better connection to the internal panel than what is common now, and HDMI or DVI can't drive a screen directly. So for notebook manufacturers it is only a small step to also provide an external DisplayPort connector. More future desktop LCD screens will get DisplayPort connections as well; for TVs, HDMI will probably continue to reign supreme.
Am I the only one who figures they went with a sandwich design again not because of the size of the die, but because 2 sets of 448-bit / 896 MB memory would be impossible to fit on a single PCB?
EDIT: Ah, Helmore had the right idea as well.
Remember that SidePort is still disabled on the HD 4870 X2. Enabling Sideport could now become a priority for ATI (before it wasn't, because ATI had a huge advantage, so there was no need to activate it) and give a new magic speed bump ;)
Here we go again with the Sideport thing...
Just a guess, don't take it too seriously :)
If it doesn't happen, it's Christmas and Santa Claus' fault :D
Anyway, Sideport is there waiting to be used. My theory is that ATI is keeping it unused and will bring a magical boost around January.
It's very unusual that they have that feature and don't use it... I remember ATI saying there would be 3 phases for the HD 4800 series; 2 of them have already rolled out, but they never talked about phase number 3. Maybe it's Sideport, or maybe they simply forgot about it.
What were the first and second phases?
1st: RV770
2nd: R700
3rd: either HD4830 (scavenged&cut RV770) or the rumoured RV770 refresh (TSMC 45nm GS-process?)
So how will the gtx285 compare to the gtx295?
Is it the same thing as this:
GTX 280 x2 --------------> GTX 295
What I am trying to say is: will the die shrink provide an even faster high-end single-die/single-PCB card than the current GTX 280 (not just a shrink), and will that chip also be faster than a single chip on the GTX 295?
That would seem more ideal since SLI still has microstuttering problems.
That card looks soo 3-slot. Someone should run benchies with SLI 216-core GTX 260s and downclock them so we can get an idea of performance.
wait 2-3 more days and we will know everything ;)
WELCOME TO THE FUTURE OF GAMING......:rofl::rofl::rofl:
http://www.hardware.info/images/news...ds_sli_550.jpg
THIS WILL BE YOUR COMPUTER
http://www.collegebeing.com/uploads/...g-computer.jpg
oh noes
Ain't that picture a bit outdated?
The same picture with 9800 GX2s would be something... a massive block (and massive heat problems lol)
lol, Crysis. Hopefully we will be playing it at a normal framerate of 60 fps soon, in 2 years' time :)
Something tells me Crysis 2 and/or Crysis 3 will be out by then and up the requirements once again.
So this still has the ridiculous hole in the PCB? I can't really tell from the first picture.