The name: GeForce GTX 295
GPU: two 55nm GT200
Shader: 480 (240 x2)
Memory: 896-bit (448-bit x2), 1792MB GDDR3
TDP: <300W
Announce date: Jan 8 (CES 2009)
http://en.expreview.com/2008/12/09/g...in-ces-09.html
:shocked:
and the price?? €600+?!
Those specs don't seem right. It has the full shader count of a GTX 280 but the bus of a GTX 260? What sense does that make? Why not something like:
The name: GeForce GTX 295
GPU: two 55nm GT200
Shader: 480 (240 x2)
Memory: 1024-bit (512-bit x2), 2048MB GDDR3
or this....
GPU: two 55nm GT200
Shader: 432 (216 x2)
Memory: 896-bit (448-bit x2), 1792MB GDDR3
That would make more sense, but oh well.
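For reference, the memory sizes those bus widths imply can be sanity-checked. This is a minimal sketch assuming 32-bit-wide GDDR3 chips of 64MB (512Mbit) each, as on the shipping GTX 260/280 boards, so the frame buffer follows directly from the bus width:
Code:
# GDDR3 frame-buffer size implied by a given bus width, assuming
# 32-bit-wide chips of 64MB each as on the GTX 260/280 boards.
CHIP_WIDTH_BITS = 32
CHIP_SIZE_MB = 64

def vram_per_gpu(bus_width_bits: int) -> int:
    chips = bus_width_bits // CHIP_WIDTH_BITS
    return chips * CHIP_SIZE_MB

for bus in (448, 512):
    per_gpu = vram_per_gpu(bus)
    print(f"{bus}-bit -> {per_gpu}MB per GPU, {2 * per_gpu}MB dual-GPU")
# 448-bit -> 896MB per GPU, 1792MB dual-GPU (the rumored GTX 295 spec)
# 512-bit -> 1024MB per GPU, 2048MB dual-GPU (a full dual-280)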
January 9 is a bit too late to rival the HD4870X2. RV870 should be on the doorstep by then.
What doesn't make sense is the specs. It has all the shaders (240) of a GTX 280 and the bandwidth of a GTX 260? Why are they cutting the bandwidth? To save PCB space?
Hm... Looks like a monster to me. :up:
WOW :rocker::rocker::rocker::rocker::rocker::rocker:
haha there's always a hater in every nvidia thread
wonder if the 4870x2 will still be faster....
hahah
Multi GPU -> boring.
And it's definitely a dual-PCB solution.
*yawn*
lol i cant wait till people here are doing 4-way SLI with these cards.. :rocker: im either going to get this or a 4870 if the price drops
What happened to the 'dual gpu, screw it' mentality?
http://vr-zone.com/articles/geforce-....html?doc=6259
Quote:
NVIDIA is preparing the GeForce GTX 295 and GTX 285 cards for launch on January 8th at CES 2009. According to the official data VR-Zone has seen, the GeForce GTX 295 card has dual 55nm GT200 GPUs and 1792MB of GDDR3 memory on an 896-bit memory interface. The rest of the specs, like the number of shader processors and clock speeds, are listed as TBA. As for the GeForce GTX 285, it is a replacement for the current GeForce GTX 280 card, with a 55nm GT200 GPU at higher clocks. Therefore, GeForce GTX 280 cards will reach EOL in a month's time. As for the GeForce GTX 260, Nvidia has released design kits to the card makers and you should be seeing some self-designed cards in Q1 next year.
I wonder if someone could use one of these dual 280's with another single 280? Anyone know the answer to that one?
So Nvidia want a slice of the microstuttering market. :ROTF::ROTF::ROTF:
yes, we already know ... some people have it at home NOW! hahahahaa :wasntme:
I might try one of these out.
My step up window expires in exactly 30 days. Nvidia better hurry up and get these 55nm parts out the door! :mad:
http://www.theinquirer.net/gb/inquir...m-parts-update
It's the Inquirer so I'm not sure if the points made in the article are valid :shrug:
It's not 2GB 1024-bit because there isn't enough room for all the traces. GDDR3 needs lots of extra space for 'timing' traces. Read: http://theovalich.wordpress.com/2008...ule-the-world/
The extra 4 memory chips required for 2GB 1024-bit wouldn't provide enough of a performance boost to justify the extra heat either, especially since those GPUs will be undervolted/underclocked just to keep them stable. (GDDR3 outputs more heat than GDDR5 too.)
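That four-chip figure checks out under the same assumption of 32-bit-wide GDDR3 chips:
Code:
# Extra GDDR3 chips needed to widen each GPU's bus from 448 to 512 bits.
chips_448 = 448 // 32               # 14 chips per GPU
chips_512 = 512 // 32               # 16 chips per GPU
print(2 * (chips_512 - chips_448))  # 4 extra chips on a dual-GPU board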
and where do ppl get info about ATI launching their new cards in mid 09? lol
New cards = RV870 = mid 09.
But there will be some RV770 refresh meanwhile.
who says that? any source, or are you just telling me rumours?
I'm sure it's all rumors, but pretty strong rumors; there are ppl with info about it but they're not allowed to speak about it. Logically it makes sense for a mid-09 launch too.
Wonder what the prices for the NV 55nm parts will be? I'd guess $399 (read: $399 ~ $429 on sites like Newegg) for the GTX 285 and $599 for the GTX 295. I'm more interested in a 55nm GTX 260 card, and I wouldn't be surprised if the launch price is $299. This should push the GTX 280 to $349 ~ $379, the GTX 260 Core 216 to $249 (not talking MIR) and the normal GTX 260 to ~$199, I think. At least INQ was wrong about the "silent" 55nm replacement of the 65nm cards, which I also doubted heavily; a GTX 285 and 295 is so much more likely to happen.
No doubt this card will be ideal for those in colder climates! :D
If this can be set up as Quad SLI, will it work on X58 MBs?
It could be for some strange reason that I do not know, but even 1x 9800GX2 does not want to work on my EVGA X58, forget Quad. I was told there was an NVIDIA chip missing on the MB to support Quad.
Maybe it is just me, but isn't it unusual for graphics card companies to release a card straight after New Year, in early Jan? I mean, normally they have a timescale for when they bring cards to market, don't they? So just look at the previous few years of releases; that would prolly give you an idea. Lol, maybe I'm completely wrong, I dunno. I would be very surprised if it was straight after NY.
Bring it on, I need to get myself a second 280 so I want them to be cheap :p:
they are cheap already, $350 and $314 with rebate, and it's OCed :D
http://www.newegg.com/Product/Produc...82E16814143142
+ you get Far Cry 2 :)
Cannot wait! I need another space heater for this room :up:
Ugh enough with dual pcb already
Bring on the sandwitch :rofl::ROTF:
too bad it's gonna turn into a glass-witch with all the heat from this thing.
nvidia fanboi retards before this announcement:
OMG GUYS DUAL GPU IS NOT A SINGLE CARD!!!!!!! I SWEAR IT SUCKS I WOULD NEVER USE IT!!!!!!
After:
http://www.vtaide.com/png/images/cricket-m3.jpg
man... the power consumption will kill..
Yea, I'm loving how suddenly dual GPU is God's gift to the world.
One month ago you weren't allowed to mention 4870X2 and the GTX 280 in the same sentence without being told it wasn't sensible. Oh how times change when it's Nvidia glueing two cards together. Never change guys, never change.
I hope these will be worth upgrading to. I just got a GTX260 core 216 that I need to register soon.
I wonder how the step-up will work and the cost to step up for EVGA, like others are asking. I got a GTX 260 just to hold me over. I might even go dual GPU to hold me for a while, plus the folding opportunities. :D:up:
stream processors lol. Can anyone verify if this is real? There have been so many bogus entries for cards being released, I just don't know anymore.
Just like I Nostradamus'ed, 2x shrunken and undervolted GTX 280 cores are doable in one card (be it 1 or 2 boards, I don't care, it's a one-slot solution) without exploding.
Yeah it's doable but lower voltage = lower clocks = lower performance.
This has to beat the 4870x2 by a good margin or they'll be in trouble.
Ugh. I hate this Dual GPU madness. I'm glad to see nvidia putting up a fight for the top of the chart on performance, but this isn't the way I'd like to see them do it. Single GPU would have been better, micro-stuttering is terrible. I love my GTX280, single GPU FTW.
They're probably not even going to be on the same PCB, they'll probably make this card horrifically long with a dual PCB set up like the 9800GX2. Which might I add is a pain in the *** to watercool. :down:
These specs don't make much sense to me, I would've assumed they would put out the most powerful card that they could make. If this is true, thumbs down nvidia..
why are we wasting our time....like the gtx295 will max out Gta4 or something :D :D
:rofl:
the shader count is like 2x GTX 280 (2x 240 shaders) but the bus and memory are like 2x GTX 260 (2x 448-bit and 2x 896MB VRAM) :confused:
I kinda partially phailed indeed lol, care to enlighten me ?
Maybe they got the shader count wrong and it's just 2x shrunken 260 cores? 2x 216 or 2x 192 shaders? Or is it a new core, a mix of 280 and 260?
Thanks in advance :)
We know nothing about the 55nm shrink, the best option is to wait a few days and see what's going on :)
:shakes: They must be wanting to commit commercial suicide. Not that I'd ever even consider a dual-GPU card, but even if I did, I wouldn't consider this abomination. If these rumors are true, this is going to put them in the red if they attempt to mass produce these things. I seriously hope they don't.
Heck, I'm an nVidia user and wouldn't get one of those things near my system. As was said earlier, I guess they are looking at joining ATi in the microstuttering club. Bad move, nVidia. Until this you had better engineering; that just went out the window if this thing is true. I'm seriously hoping it's another one of those Inq rumors that will die quickly, but it sure isn't sounding like it.
Hey, I actually fully agree with you for once.
Unless this card has a better dual GPU solution than ATI, it's just kinda pointless if you ask me. Now if it has some sort of amazing new hardware "load splitting" and scales 99.99% all the time and is just godlike, maybe that would be different.
From what I picture in my head so far though, this card will be nothing but a little oven for your computer. :rofl:
Laser locking only the bus and not the shaders seems a lil weird to me. I'd speculate that they got the shader count wrong... 2x 216 shaders sounds more feasible to me.
Let's wait and see :)
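For what it's worth, the shader clusters and memory partitions on GT200 are fused off independently (shaders come in TPCs of 24, the bus in 64-bit partitions), so a 240-shader part on a 448-bit bus isn't self-contradictory. A quick sketch; the SKU configs are from the shipping cards, and the GTX 295 row is just the rumor:
Code:
# GT200 SKUs: shaders = TPCs x 24 SPs, bus = memory partitions x 64-bit.
# The two are disabled independently, so any mix is possible in principle.
def gt200_config(tpcs: int, partitions: int) -> tuple[int, int]:
    return tpcs * 24, partitions * 64  # (shader count, bus width in bits)

for name, tpcs, parts in [
    ("GTX 280",          10, 8),  # 240 SPs, 512-bit
    ("GTX 260 Core 216",  9, 7),  # 216 SPs, 448-bit
    ("GTX 260",           8, 7),  # 192 SPs, 448-bit
    ("GTX 295 (rumor)",  10, 7),  # 240 SPs, 448-bit per GPU
]:
    sps, bus = gt200_config(tpcs, parts)
    print(f"{name}: {sps} SPs, {bus}-bit")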
They need something to go against the x2....
Having the flagship crown or at least being very competitive in the high end is important for name recognition.
It will be very low volume compared to mainstream card sales, not such a big impact in their pockets.
Smart move imo, and they should have released it a long time ago.
They already lost a lot of the flagship sales.... I don't see people who have an HD X2 changing for this one. They will trade blows here and there, but no reason to upgrade.
Well for synthetics, 4870X2 is beating GTX260 SLI in the top category scores......
Will this be a new PCB and new parts? I'm worried about buying any nVidia cards while there are reports of card problems due to heat, although the main place I saw the failure reports were on Inquirer. But I prefer nVidia cards for the historically better Linux driver support.
LOL, the moment ATI went the X2 way, NVIDIA had to go that way too; it all started with the 3870X2.
But if we look a little back in time, NVIDIA was the real starter of the X2 style with the 7900GX2, to counterattack the X1950XTX, so I don't understand how some talk about engineering leadership; it has been like this for some time now.
And it will be like this from now on ----------->
In actual games GTX 260 SLI beats the 4870 X2 almost everywhere. GTX 260-216 SLI or GTX 280 SLI easily beat it.
Unfortunately for ATI, SLI scales better than Crossfire almost universally. This is especially true in the games where it matters, like Crysis.
I'd like to see nVidia continue to develop single-GPU solutions, but you can't blame them for going the multi-GPU route when ATI has done it so successfully with the 4870 X2.
Video drivers: Catalyst 8.7, ForceWare 177.34
Those drivers are too old to conclude anything ;)
So I'm guessing the GTX 295 is going to be very short-lived, just like the 9800GX2, until they move to another single-PCB architecture that'll match the GTX 295.
QFT x48
It will run too hot, have driver problems, clocks will be scaled down, it will be beaten by 260s in SLI, it will have ungodly power consumption (not a problem), and it will have scaling problems galore in future games. You heard it here first.
It's a win-win for me, because if they prove me wrong (highly unlikely) then we have a pretty kickass card, and if they don't prove me wrong then I was right all along :rofl:
Yeah, and so do the ForceWare 180.xx drivers for NV too. :)
Anyway, this GTX 295 is only a slightly castrated GTX 280 SLI setup, so theoretically it would probably be 70~80% faster than a single GTX 280. Of course it will vary a bit from game to game, as usual with these dual-PCB constructions, so some games will show quite a lot lower benefit too, maybe 50%.
I've never liked this kind of solution; it's just an easy way for NV to release a faster card, and it's very short-lived. It won't get proper driver support for games after the next-gen series is released, so some games will be buggy on it, etc. It's both expensive and not very "future-safe" due to lacking driver support.
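Rough numbers behind that 70~80% estimate, as a back-of-the-envelope only; the scaling efficiencies are the poster's guesses and the 60fps baseline is hypothetical:
Code:
# SLI back-of-the-envelope: the second GPU contributes 'efficiency'
# of a full GPU. All numbers here are guesses, not benchmarks.
baseline_fps = 60.0  # hypothetical single GTX 280 result

for efficiency in (0.5, 0.7, 0.8):
    dual = baseline_fps * (1 + efficiency)
    print(f"{efficiency:.0%} scaling -> {dual:.0f} fps ({1 + efficiency:.1f}x)")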
There were dual Voodoo 2 and dual ATI Rage 128 Pro cards too..