We know nothing about the 55nm shrink; the best option is to wait a few days and see what's going on!
Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)
They must want to commit commercial suicide. Not that I'd ever even consider a dual-GPU card, but even if I did, I wouldn't consider this abomination. If these rumors are true, this is going to put them in the red if they attempt to mass-produce these things. I seriously hope they don't.
Heck, I'm an nVidia user and wouldn't get one of those things near my system. Like was said earlier, I guess they're looking at joining ATi in the microstuttering club. Bad move, nVidia. Until this you had better engineering; that just went out the window if this thing is true. I'm seriously hoping it's another one of those Inq rumors that will die quickly, but it sure isn't sounding like it.
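(Side note for anyone who'd rather measure microstutter than just argue about it: here's a minimal sketch, assuming you've already logged per-frame times in milliseconds with FRAPS or similar. The frame-time lists and the function name are made up for illustration, not from any real benchmark.)

```python
# Minimal sketch: quantify "microstutter" from a list of frame times (ms).
# AFR dual-GPU setups tend to alternate short/long frame intervals even when
# the average FPS looks fine, so we compare consecutive frame times.

def microstutter_ratio(frame_times_ms):
    """Average ratio between consecutive frame times; 1.0 = perfectly even pacing."""
    ratios = []
    for prev, cur in zip(frame_times_ms, frame_times_ms[1:]):
        longer, shorter = max(prev, cur), min(prev, cur)
        if shorter > 0:
            ratios.append(longer / shorter)
    return sum(ratios) / len(ratios) if ratios else 1.0

# Hypothetical example: both runs average roughly 60 FPS, but one alternates
# 8 ms / 25 ms frames while the other delivers them evenly.
stuttery = [8.0, 25.0] * 30
smooth   = [16.6] * 60

print(microstutter_ratio(stuttery))  # ~3.1 -> visibly uneven pacing
print(microstutter_ratio(smooth))    # 1.0  -> even pacing
```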
Hey, I actually fully agree with you for once.
Unless this card has a better dual GPU solution than ATI, it's just kinda pointless if you ask me. Now if it has some sort of amazing new hardware "load splitting" and scales 99.99% all the time and is just godlike, maybe that would be different.
From what I picture in my head so far though, this card will be nothing but a little oven for your computer.
Laser-locking only the bus and not the shaders seems a little weird to me. I'd speculate that they got the shader count wrong... 2x 216 shaders sounds more feasible to me.
Let's wait and see!
Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)
They need something to go against the x2....
Having the flagship crown or at least being very competitive in the high end is important for name recognition.
It will be very low volume compared to mainstream card sales, so not such a big impact on their pockets.
Smart move imo, and they should have released it a long time ago.
They already lost a lot of the flagship sales... I don't see people who have an HD 4870 X2 changing for this one. They will trade blows here and there, but there's no reason to upgrade.
Will this be a new PCB and new parts? I'm worried about buying any nVidia cards while there are reports of card problems due to heat, although the main place I saw the failure reports was the Inquirer. But I prefer nVidia cards for the historically better Linux driver support.
LOL, the moment ATi went the X2 way, nVidia had to go that way too; it all started with the 3870 X2.
But if we look a little back in time, nVidia was the real starter of the X2 style with the 7900 GX2 to counterattack the X1950 XTX, so I don't understand how some talk about engineering leadership; it has been like this for some time now.
And it will be like this from now on ----------->
In actual games GTX 260 SLI beats the 4870 X2 almost everywhere. GTX 260-216 SLI or GTX 280 SLI easily beat it.
Unfortunately for ATI, SLI scales better than Crossfire almost universally. This is especially true in the games where it matters, like Crysis.
I'd like to see nVidia continue to develop single-GPU solutions, but you can't blame them for going the multi-GPU route when ATI has done it so successfully with the 4870 X2.
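(For anyone who wants to check the scaling claims against actual review numbers rather than take anyone's word for it, here's a trivial sketch; the FPS figures in it are made-up placeholders, not quotes from any benchmark.)

```python
# Scaling efficiency of a dual-GPU setup vs. a single card of the same type:
# (dual_fps / single_fps - 1) is the extra performance the second GPU actually
# delivers; 1.0 would be perfect 100% scaling.

def scaling_efficiency(single_fps, dual_fps):
    return dual_fps / single_fps - 1.0

# Hypothetical numbers for illustration only:
print(scaling_efficiency(40.0, 72.0))  # 0.8 -> second GPU adds 80%
print(scaling_efficiency(40.0, 52.0))  # 0.3 -> second GPU adds only 30%
```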
Intel Core i7 920 @ 3.8GHz - Asus P6T Deluxe X58 - 6GB (2GBx3) G. SKILL DDR3-1600 @ 8-8-8-20 - 2 x EVGA GTX 280 1GB SLI - Corsair TX750 PSU - Windows Vista HP 64-bit
So I'm guessing the GTX 295 is going to be very short-lived, just like the 9800 GX2, until they move to another single-PCB architecture that'll match the GTX 295.
QFT.
It will run too hot, have driver problems, clocks will be scaled down, it will be beaten by 260s in SLI, it will have ungodly power consumption (not a problem), and it will have scaling problems galore in future games. You heard it here first.
It's a win-win for me, because if they prove me wrong (highly unlikely) then we have a pretty kickass card, and if they don't prove me wrong then I was right all along!
Yeah, and so do the Forceware 180.xx drivers for NV too.
Anyway, this GTX 295 is basically only a slightly castrated GTX 280 SLI setup, so theoretically it would probably be around 70~80% faster than a single GTX 280. Of course it will vary a bit from game to game, as usual with these dual-PCB constructions, so some games will show a quite a lot lower benefit too, maybe 50%.
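(Just to show where a figure like 70~80% comes from, here's a back-of-the-envelope sketch; the single-card FPS, the clock factor, and the scaling values are my own assumptions for illustration, not leaked specs.)

```python
# Back-of-the-envelope projection for a dual-GPU card built from (possibly
# down-clocked) single GPUs: dual_fps ≈ single_fps * clock_factor * (1 + scaling),
# where "scaling" is how much the second GPU actually adds (1.0 = perfect SLI).

def project_dual_fps(single_fps, scaling, clock_factor=1.0):
    return single_fps * clock_factor * (1.0 + scaling)

gtx280_fps = 50.0  # hypothetical single GTX 280 result in some game

for scaling in (0.5, 0.7, 0.8):  # poor / typical / good AFR scaling
    fps = project_dual_fps(gtx280_fps, scaling)
    print(f"scaling {scaling:.0%}: ~{fps:.0f} FPS "
          f"({fps / gtx280_fps - 1:+.0%} vs a single GTX 280)")
```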
I've never liked these kinds of solutions; it's just an easy way for NV to release a faster card, it's very short-lived, and it won't get proper driver support for games in the future after the next-gen series is released, so some games will be buggy on it, etc. It's both expensive and not very "future-safe" due to lacking driver support.
Last edited by RPGWiZaRD; 12-15-2008 at 07:59 AM.
Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place