why all the 55nm hate? I have 3; they idle at 35C and load in the high 50s/low 60s running 783/1566/2448 all day long. Doesn't seem too bad to me :(
Given that 55nm chips consume considerably less power, I don't see how the temps of the 65nm cards would be a valid reason.
damn you nvidia.. you guys spend billions and billions on top of billions and billions on GPU R&D, and when it comes to critical components you skimp out
how about skimping out on the boxes/packaging/crap included in the box instead?
how about no cables/adapters? those can easily be acquired separately
I'll be choosing nvidia cards more carefully from now on
because of the amount of copper in the factory cooling solution, power management, voltage regulation, and overall quality of components.
compare these two OC's - check out texture fillrate. one costs $195, the other $355, both on stock cooling
original 65nm 260
http://img.techpowerup.org/080801/bios.jpg
55nm evga GTX285 SSC edition
http://images2.hiboox.com/images/060...6033ecb46c.jpg
Amount of copper in the HSF? Heh, I sometimes forget people actually use stock cookers (sic).
Quote:
Originally Posted by jaredpace
Power management? Isn't that where 55nm chips dominate? 55'ers draw 25% less power at load and 0-30% less at idle. (link)
Voltage regulation? Are VRs themselves a value of a sort? How they do the job would be... In fact, the reliability of traditional-cooler mosfets might just be better than Volterra's ultra-hot-running chips.
Overall quality of components? Such as? Polymer caps are better in the "cheapo" design. That is because Volterra regulation doesn't work with polymer caps; nV's application of Volterra resulted in a squealer due to fubar capacitances.
GPU-Z 0.2.6 had a bug that resulted in glitched texture fillrates for GTX260 192s; the card has 64 TMUs, not 70 like the app wrongfully assumed. Thus, the fillrate for the GTX260 in the screenshot is wrong. The correct texture fillrate for a GTX260 192 @ 820MHz is 52.48GTexels/s.
Quote:
compare these two OC's - check out texture fillrate. one costs $195, the other $355, both on stock cooling
original 65nm 260
(57.4GTexels/s for 65nm GTX260 192 @ 820MHz using GPU-Z 0.2.6)
GPU-Z 0.3.1 is bugged, as well!
Quote:
55nm evga GTX285 SSC edition
(56.6GTexels/s for 55nm GTX285 @ 773MHz using GPU-Z 0.3.1)
GTX280/285 has 80 TMUs, no matter the process (65/55nm). For the GTX280, GPU-Z calculates texture fillrates correctly, but with the GTX285 it thinks the chip has some odd count of 73.2 TMUs. :D
The result is a too-low displayed texture fillrate for GTX285s. The correct fillrate for a GTX285 @ 773MHz is 61.84GTexels/s.
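For anyone who wants to sanity-check those numbers, the math is just TMU count × core clock. A minimal Python sketch using only the figures quoted above:
Code:
# Texture fillrate = TMUs * core clock. All figures are from this thread.
def fillrate_gtexels(tmus, core_mhz):
    return tmus * core_mhz / 1000.0

# GTX260 192 @ 820MHz: GPU-Z 0.2.6 wrongly assumed 70 TMUs; the card has 64.
print(fillrate_gtexels(70, 820))    # 57.40 - what the buggy GPU-Z showed
print(fillrate_gtexels(64, 820))    # 52.48 - correct

# GTX285 @ 773MHz: GPU-Z 0.3.1 effectively assumes 73.2 TMUs; the chip has 80.
print(fillrate_gtexels(73.2, 773))  # ~56.58 - roughly what GPU-Z 0.3.1 shows
print(fillrate_gtexels(80, 773))    # 61.84 - correct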
@ jaredpace - that's 260 vs 285 SLI :)
the whole problem with the 55nm GTX285/260 is the 1.26V vcore limit
once past this limit.. clocks will take off
I'm still happy with my 8800GTS, and I play a lot of new games lol. I dunno what all the fuss is about; I'm sick of these refreshes, I want to see some new hardware. Come on ATI, force Nvidia to do something.
If it wasn't for the HD4830 @ <$120, the HD4850 @ <$150, and the HD4870 @ <$200, I don't see how nVidia would even THINK of dropping the prices on GTX260/280/285. Face it--ATI dominates the value-conscious market. 90% performance for 60% the price. For me, that works. Heck--remember back to 2006/2007--nVidia didn't drop the price. They banked on all the profits until ATI brought out the 2900XT and eventually the cheap HD3850 and 3870. That's when the price of 8800's dropped like a solid lead balloon.
Anyways--my HD3850 still has plenty of kick to it. It was purchased for ~$160 a few weeks after launch. Only a 256MB version, but it still flies. I've got but a cheap 19" LCD, 1680x1050, so I'm all set up for cheap gaming. So now--I've had this computer up & running for ~16 months. I'm already plenty upgraded. I'll see about more later. Heck--I can get an E5200 for cheap and a HD4830 also for cheap. Everything else works plenty fine.
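To put that "90% performance for 60% the price" line above in performance-per-dollar terms (the percentages are the poster's rough estimate, not benchmarks; this is just a toy calculation):
Code:
# Toy perf-per-dollar comparison using the post's rough "90% for 60%" figures.
nv_perf, nv_price = 1.00, 1.00     # nVidia card as the baseline
ati_perf, ati_price = 0.90, 0.60   # "90% performance for 60% the price"

print(nv_perf / nv_price)          # 1.0 - baseline performance per dollar
print(ati_perf / ati_price)        # 1.5 - i.e. 50% more performance per dollar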
WRT the higher cost... in those cases they created a faster product, so the price is higher. So be it.
WRT the same cost... I never said that this will IMMEDIATELY give lower prices to the end user. I said it gives more HEADROOM for lowering prices when the competition forces them to, hence everyone wins.
This is a response I received from somebody much more knowledgeable about this sort of thing on this topic.
Quote:
"I like the new design if it is what I think it is. NV has gone to higher density memory chips and
put them all on the front of the card instead of 1/2 of the front and 1/2 on the back. That makes
trace length and chip to chip timing differences due to those trace lengths difference smaller
and easier to deal with. They have also moved the chip closer to the GPU too. All this is to
deal with ever higher higher memory clock speeds. Moving all memory chips to the front
also allows PCB layers to be reduced. Once you go over 7 layers the cost of the PCB goes
though the roof as more layers are added. That is why MB makers try like hell to keep layers to
a max of six. All the above are good things.
There has never been a video card made that needed more than a single hi/lo Vddq/Vdd
phase. That is just memory buffer I/O voltage, which has milliamp-per-chip current requirements
with a total memory current of about 1 amp. You could do that with a single "D" cell battery.
No big deal there at all.
Reduction in Vcore phases I have already talked about, and it is the way of the industry as a whole.
This is not a bad thing either, as long as final filtration is increased accordingly.
Reducing the EEPROM (bios) capacity is easily understood. These new cards no longer
support now-obsolete DOS ANSI graphics, which were bios functions. Much of the code and routines
that used to have to be in the bios, mostly to support DOS, ANSI and other non-Windows functions,
have been removed from the bios altogether and/or moved to Windows drivers and supported
there. The simple fact is that a 1Mb EEPROM on a GFX card is needed like 100+ octane in a
91-octane-max car, or another hole in your head.
People always make cost reduction sound like a bad thing when in most cases, especially on high
end products where there is often overkill in initial designs, it is not. On low end cards, cost
reductions are often at the complete expense of performance, life, and quality, but it is different on the
high end."
Nvidia wishes they could spend billions on R&D.
Further, the things that come with the card and raise the product price (cables, games, special services etc.) are all done by the partners.
Nvidia just sells them the GPUs and a board design that will properly work.
I'm pretty sure though that if the partners didn't supply the cables, a lot of people would moan about them not being included and about being forced to buy the cable + shipping.
And Nvidia will make sure (and is responsible) that the GPU and the board design they supply will work at the given spec.
If you as a customer then choose to OC the card, that's your choice and not really Nvidia's problem (at least from Nvidia's point of view).
I'm sure though that the brands that are known for OCing will either stick with the old design or will put higher quality parts in the new design.
I guess Nvidia's problem atm is that the chip's fab price is pretty high and they are trying to cut the cost price as much as possible to reduce losses when lowering prices.
Will they be naming it a GTX265?
This is why I've always liked Nvidia cards. I don't have any proof, but they always felt kinda over-engineered, just in case. I've given them hell and never had one fail. Too bad they are going this route, but the economy is tough. It might turn out for the best.
I've thought the exact opposite for a long time. IMO, nV boards generally appear overly complex, and that makes them look sloppy.
For example:
Reference G200
Attachment 94535
5 vGPU phases and 1+1 for memory; all phases are made of Volterra 1100-series parts. The VRM area is a mess, as one can see. 12V source coils here and there. And there's what looks like general chaos at both ends of the card.
PCB is 10.35 inches.
Reference R600 (OEM)
Attachment 94536
This card uses the exact same Volterra chips as the G200 above, yet it has 6 vGPU phases and 1+1 for memory. They are all in a clean row, except for the 2nd memory phase (vDDQ, I believe), which is tucked in the corner by the fan connectors. 12V coils sit neatly on the backside. Very clean board compared to the G200.
PCB is 8.25 inches - this thing is a whopping 2.1 inches shorter than the G200's!
Too bad the GPU (R600) on this card is nothing but a waste of good silicon. If only they had used the same basic PCB design for RV670, RV770 and so on... Of course, they would/could have cut 2 vGPU phases - but no more than 2; RV770 with 3 is *tsk-tsk* - and half of the RAM bus for post-R600 chips. But slap some 256 bits worth of high-bin GDDR5 on this board and a GPU with 1½-2× the logic of RV770 at 40nm, and the results are -> !!!!!!!
But da-ymn, is this the perfect video card PCB...
It is a masterpiece.
http://largon.wippiespace.com/smilies/1242.gif
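Condensing that G200 vs OEM R600 comparison into data (all figures are from the post above, same Volterra 1100-series parts on both boards):
Code:
# The two reference boards as described above.
boards = {
    "Reference G200":     {"vgpu_phases": 5, "mem_phases": "1+1", "pcb_in": 10.35},
    "Reference R600 OEM": {"vgpu_phases": 6, "mem_phases": "1+1", "pcb_in": 8.25},
}

delta = boards["Reference G200"]["pcb_in"] - boards["Reference R600 OEM"]["pcb_in"]
print(round(delta, 2))  # 2.1 - the R600 board is 2.1" shorter despite an extra phase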
G80 (GTX not GTS) was the first solidly built nVidia card. The other would be the original GT200. They are solid and definitely a nice step over the cheapenings of the past, but not overly impressive.
R600 is still something on the verge of untouchable. I still have it (although dead), and most Xtreme benchers can attest to the goodness of a really good design and components, especially in the LN2 runs thereafter.
The other "overkill" built card was the RV630 XT with GDDR4. Damn, single-slot fappage. Too bad that after that the 635 and 730 skimped pretty badly.
Well, if the Volterra chipset is the main reason for the high-pitched squealing (which makes sense, since the 4000 series also has the squealing), then I approve this move. :up: Fan noise I have no problem with, but I can't stand squealing components, especially if the tone also varies a bit.
I would buy one of those R600-based FireGL V860/50s just to have it (and gently fondle it whenever I please) if they were available at sensible prices. Oh man, I think I have some kind of an unnatural relationship with the OEM R600 board... But it's so pretty!
http://largon.wippiespace.com/smilies/lol.png
RPGWiZaRD,
Volterra chips aren't the reason for the squealing; the underspec/wrong vGPU capacitance and those flimsy standalone chokes on the G200 are the reason.
^Ouch, it actually is 6+1 phases and not 5+1+1.
Say goodbye to memory OC'ing.
But it's a clean, good-looking, neatly laid out card, unlike the jumbled mess of the "older" designs. Say what you will about ATI cards and their performance, but the cards have pretty much always been laid out neatly.
Lol I get all kinds of noise out of my 285. Is the 295 any better? Waiting for BFG to list it as a "trade up" item.
Gross.
I don't mind them creating a stripped-down version, but FFS they should re-label it. GTX 259 or something. Not re-labelling it does seem to follow the general GPU market trend of relabelling upwards at all times though, so at least they're consistent.