Looks like this is gonna be a slooooow year for Nvidia; hopefully ATI will make a killing in the midrange with the 740 and ride its upward trend to pull even with the GT300 in October.
Well, this kinda sucks but at least it vindicates my buying a GTX-260+ a few weeks ago...
oh the humanity
Dur... it's not a new product! It's purely a cost-savings move, as evidenced by the lack of rebranding.
I don't know why anyone expected anything more.
I just wanna know if it will clock better than the 65nm GTX 260 216SP...
I think most people expected the same cooling solution, lower temperatures, and less power consumption => better overclockability. Somehow it failed in at least three of those categories: if you read this specific review, it's got a cut-down cooling solution, slightly higher temps, and about the same or higher load power consumption. xD
Overclockability might still be slightly better, but it doesn't look like it will be as huge an improvement as I had hoped for. I was hoping for something like a ~750MHz core overclock on average, seeing how the good 65nm GTXs get close to or reach that.
If the 55nm is really performing like that, there wouldn't be a GX2 to follow.
Either that, or somehow nV thought a GX2 wasn't feasible until now due to cost, not thermal envelope + associated cooling....
People treat these things as though they are iron laws, and yet they are not. You *tend* to see things correlate as you move further and further to smaller process nodes, but one hop doesn't guarantee anything of the sort.
Not to mention it seems they're comparing both GPUs at the same clocks, where they will obviously perform the same, and 4% is within noise. Attributing a few percent of performance improvement to anything is absurd; 1-4% is going to be within the margin of error/noise.
You guys know that both ATI and NVIDIA bin the lower leakage chips in order to put together dual GPU products with just 2 power connectors, right?
To look at power usage of these chips and comment on dual-gpu scenarios is not really accurate because these chips are not representative.
Parallels can be drawn. Even for well binned cores, the production line still has to be "in the zone" for the idea to work.
PCI-E slot + 2 power cables per card -> 1 power cable per card + the PCI-E slot is a BIG difference in available power, even with a voltage and MHz drop.
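For a rough sense of scale, here's a sketch of the board power budgets that implies, using the standard PCI-E limits (75 W from the slot, 75 W per 6-pin cable); the connector configurations in the sketch are illustrative assumptions, not actual board specs:

```python
# Rough PCI-E power budget comparison, using the standard limits:
# slot = 75 W, 6-pin connector = 75 W (an 8-pin would add 150 W).
# The connector configurations below are illustrative assumptions,
# not actual board specs.
SLOT_W, SIX_PIN_W = 75, 75

# Single-GPU card fed by the slot plus two 6-pin cables
single_gpu_budget = SLOT_W + 2 * SIX_PIN_W      # 225 W for one GPU

# Hypothetical dual-GPU board fed by the slot plus one 6-pin per GPU
dual_board_budget = SLOT_W + 2 * SIX_PIN_W      # 225 W shared by two GPUs
per_gpu_budget = dual_board_budget / 2          # 112.5 W per GPU

print(single_gpu_budget, per_gpu_budget)
```

So even before any voltage/clock drop, each GPU on a dual board would have to live within roughly half the budget of a single card.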
Of course these things aren't iron laws, but it's much more likely to be a success than a failure, since NVIDIA is aiming for exactly those achievements: lower cost (most important) and lower power consumption & heat dissipation => base clocks can be raised if the process goes well, so prices can be raised on the products too. With NVIDIA being such an advanced manufacturer with all kinds of advanced equipment, you'd just expect the success ratio to be rather high. Of course that doesn't mean it couldn't be a failure, but the risk is still rather small, which is why most people expect a success rather than a failure.
oh my, nvidia is ridin teh failboat again...
Will there at least be a way to differentiate the old 260 from the new one? Like, does it say on the box "55nm can of whoop-ass edition"? Or will the boxes be the same as the old ones?
Well this is rather unexpected.
I knew not to expect miracles per se, but I was figuring we'd at least get slightly lower temps and power consumption... This is just pitiful.
The can of whoopass remains broken and cracked... Another year of complete Nvidia dominance is nigh!
Looks like the 55nm process is giving Nvidia leakage problems. It makes sense to dump this stepping into the GTX 260 and not the 270/290. My guess is they're hoping a new stepping will have less leakage, but you never know till the new stepping is tested.
You're absolutely right, and so is everyone else.
I mean, I "thought" the process transition from 65nm to 55nm was going to completely revolutionize the GPU industry and send everything else into chaos.
Who would've thought that going from 65nm to 55nm wasn't going to change anything at all? I'm extremely puzzled, and I'm tired of getting my hopes up so high that I'd go and believe something of that magnitude would do anything at all.
______________________
But on the other hand, it's better than the 4870.
I would buy it, but... man! I thought the 55nm process was going to be such amazing technology compared to 65nm. Oh well, let's see how 40nm does.
Why would you think it would revolutionize everything? RV670 and RV770 are at 55nm already and there's nothing magical about them. A 4870 draws about as much power as a 65nm GTX 260.
Well... I'm still happy with my GTX 260-216 overclocked to 750MHz. It competes with the 280, and Crysis load temps are 62°C - cool enough for me.
Agreed, this card has to adapt to a new competitive environment against a cheap-to-make HD 4870 1 GB card, so nVidia is basically aiming for the best yield by staying with the same clock specification.
Those three hopes could only happen if this were still a 350 US$ card, which it isn't; that's why the situation is very understandable.
I think nVidia set the lowest clock specs for this card so they can get the best yield, and the better chips can go to GTX 285 and 295 cards. Clearly they will put the GT200b chips through a heavy binning process, with mass production for a high-mainstream card in mind (GTX 260), using the smallest die size you can get and the cheapest board and RAM you can put together.
With golden chips available through the binning process, nothing is impossible - but availability might be affected.
Hmmm, I may step up to this from my 4850. Can get the 675MHz card today for £220 - not a bad price at all.
Does anyone know how much the GTX 285 is going to cost in the UK / EU when it comes out in January?
I can't believe someone hasn't overnighted one of those 55nm cards from EVGA's site. This review is lacking big time.
at least it's something
Remember that going from 65nm to 55nm is a MUCH smaller step than going from 80nm to 55nm like ATI did with the R600 -> RV670.
It may consume as much power as the 65nm ones, but it's supposed to be much cheaper for NV to make and clock a little bit better than the 65nm ones.
I don't understand why a lot of people are upset and bash Nvidia just because it consumes as much power :shrug:
Expectations are the name of the game. People expected lower prices, higher clocks, lower temps, and lower power consumption. None of those things have materialized, hence the disappointment. But we really need to wait for full reviews of retail cards. I also find it strange that nobody has just bought one and reviewed it.
If you think those power consumption figures are right, then you're very gullible. Remember that the GTX 295 uses 285 watts, and that's two G200 chips, compared to 2 GTX 260s which consume 450 watts. The 55nm process has cut power consumption a lot as far as the 295 goes. I don't see why the GTX 265 would be much different.
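Taking the figures quoted above at face value (claimed numbers, not measurements), a quick per-GPU back-of-envelope looks like this:

```python
# Per-GPU power, taking the figures quoted in the post above at face
# value (claimed numbers, not measurements).
gtx295_board_w = 285          # claimed draw for the dual-GPU GTX 295
two_gtx260_w = 450            # claimed draw for two 65nm GTX 260s

per_gpu_295 = gtx295_board_w / 2   # ~142.5 W per GPU
per_gpu_260 = two_gtx260_w / 2     # 225 W per GPU

print(f"~{per_gpu_295:.0f} W vs {per_gpu_260:.0f} W per GPU")
```

Of course, part of that gap would come from binning and the lower clocks/voltage on the dual-GPU card (as mentioned earlier in the thread), not just the 55nm shrink.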