
Thread: GTX260 55nm Preview/Review

  1. #26
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
Looks like this is gonna be a slooooow year for Nvidia; hopefully ATI will make a killing in the midrange with the RV740 and ride its upward trend to parity when the GT300 arrives in October.

  2. #27
    Xtreme Addict
    Join Date
    Jan 2004
    Location
    somewhere in Massachusetts
    Posts
    1,009
    Well, this kinda sucks but at least it vindicates my buying a GTX-260+ a few weeks ago...

  3. #28
    Banned
    Join Date
    Dec 2008
    Posts
    63
    oh the humanity
    Last edited by slapmehard; 12-28-2008 at 02:29 AM.

  4. #29
    Banned
    Join Date
    Feb 2008
    Posts
    696
Dur.. it's not a new product! It's purely a cost-savings move, as evidenced by the lack of re-branding.

    I don't know why anyone expected anything more.

  5. #30
    Xtreme Member
    Join Date
    Jun 2006
    Location
    Banja Luka, Bosnia and Herzegovina
    Posts
    423
I just wanna know if it will clock better than the 65nm GTX260 216SP...

  6. #31
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by Sr7 View Post
Dur.. it's not a new product! It's purely a cost-savings move, as evidenced by the lack of re-branding.

    I don't know why anyone expected anything more.
I think most people expected the same cooling solution, lower temperatures and less power consumption => better overclockability. Somehow it failed in at least 3 of those categories if you read this specific review: it's got a cut-down cooling solution, slightly higher temps and about the same or higher load power consumption. xD

Overclockability might still be slightly better, but it looks like it won't be as huge an improvement as I had hoped for. I was hoping for like a ~750MHz core overclock on average, seeing how the great 65nm GTXs get close to or reach that.
    Last edited by RPGWiZaRD; 12-28-2008 at 05:19 AM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  7. #32
    Wanna look under my kilt?
    Join Date
    Jun 2005
    Location
    Glasgow-ish U.K.
    Posts
    4,396
If the 55nm is really performing like that, there wouldn't be a GX2 to follow.

Either that or somehow nV thought a GX2 wasn't feasible until now due to cost, not thermal envelope + associated cooling....
    Quote Originally Posted by T_M View Post
    Not sure i totally follow anything you said, but regardless of that you helped me come up with a very good idea....
    Quote Originally Posted by soundood View Post
    you sigged that?

    why?
    ______

    Sometimes, it's not your time. Sometimes, you have to make it your time. Sometimes, it can ONLY be your time.

  8. #33
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by RPGWiZaRD View Post
I think most people expected the same cooling solution, lower temperatures and less power consumption => better overclockability. Somehow it failed in at least 3 of those categories if you read this specific review. xD
    People treat these things as though they are iron laws, and yet they are not. You *tend* to see things correlate as you move further and further to smaller process nodes, but one hop doesn't guarantee anything of the sort.

Not to mention it seems they're comparing both GPUs at the same clocks, where they will obviously perform the same, and 4% is within noise. Attributing a few percent of performance improvement to anything is absurd; 1-4% is going to be within the margin of error/noise.
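
A quick sketch of that point (the run numbers below are invented for illustration; they are not from the review):

Code:
# Illustration: why a few-percent delta can vanish into run-to-run noise.
# All figures here are made up for the example, not taken from the review.
from statistics import mean, stdev

runs_65nm = [61.8, 63.1, 62.4, 60.9, 62.7]  # hypothetical FPS runs
runs_55nm = [63.9, 62.2, 64.4, 63.0, 61.8]

m65, m55 = mean(runs_65nm), mean(runs_55nm)
delta_pct = (m55 - m65) / m65 * 100
noise_pct = max(stdev(runs_65nm), stdev(runs_55nm)) / m65 * 100

print(f"measured delta: {delta_pct:+.1f}%")       # a small apparent gain
print(f"run-to-run spread: ~{noise_pct:.1f}%")    # of the same order
print("within noise" if abs(delta_pct) <= 2 * noise_pct else "likely real")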

  9. #34
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by K404 View Post
If the 55nm is really performing like that, there wouldn't be a GX2 to follow.

Either that or somehow nV thought a GX2 wasn't feasible until now due to cost, not thermal envelope + associated cooling....
You guys know that both ATI and NVIDIA bin the lower-leakage chips in order to put together dual-GPU products with just 2 power connectors, right?

Looking at the power usage of these chips and commenting on dual-GPU scenarios is not really accurate, because those chips are not representative.
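
A toy model of what that binning means for the numbers (the leakage distribution and the 10% cut are invented for illustration; neither vendor publishes its binning criteria):

Code:
# Toy model: leakage power varies die-to-die; dual-GPU boards get the low tail.
import random
random.seed(1)

dies = sorted(random.gauss(40, 8) for _ in range(10000))  # hypothetical watts
best_bin = dies[: len(dies) // 10]  # lowest-leakage 10% -> dual-GPU boards

typical = sum(dies) / len(dies)
binned = sum(best_bin) / len(best_bin)
print(f"typical die: ~{typical:.0f} W leakage, binned die: ~{binned:.0f} W")
# So the single retail card in a review says little about the binned parts.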

  10. #35
    Wanna look under my kilt?
    Join Date
    Jun 2005
    Location
    Glasgow-ish U.K.
    Posts
    4,396
Parallels can be drawn. Even for well-binned cores, the production line still has to be "in the zone" for the idea to work.

Going from PCI-E slot + 2 power cables per card to 1 power cable per card + the PCI-E slot power is a BIG difference, even with a voltage and MHz drop.
    Quote Originally Posted by T_M View Post
    Not sure i totally follow anything you said, but regardless of that you helped me come up with a very good idea....
    Quote Originally Posted by soundood View Post
    you sigged that?

    why?
    ______

    Sometimes, it's not your time. Sometimes, you have to make it your time. Sometimes, it can ONLY be your time.

  11. #36
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by Sr7 View Post
    People treat these things as though they are iron laws, and yet they are not. You *tend* to see things correlate as you move further and further to smaller process nodes, but one hop doesn't guarantee anything of the sort.
Of course these things aren't iron laws, but a success is much more likely than a failure since those are exactly the things NVIDIA is aiming for: lower cost (most important) and lower power consumption & heat dissipation => base clocks can be raised if the process goes well, so prices can be raised on the products too. With NVIDIA being such an advanced manufacturer with all kinds of advanced equipment, you'd just expect the success ratio to be rather high. Of course that doesn't mean it couldn't turn out a failure, but the risk is rather small, which is why most people expect a success rather than a failure.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  12. #37
    Xtreme Member
    Join Date
    Sep 2008
    Location
    Romania
    Posts
    157
oh my, nvidia is ridin teh failboat again...

Will there at least be a way to differentiate the old 260 from the new one? Like, does it say on the box "55nm can of whoop-ass edition"? Or will the boxes be the same as the old ones?
    the state is universally evil, there is no good country only good people

  13. #38
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by azraeel101 View Post
oh my, nvidia is ridin teh failboat again...

Will there at least be a way to differentiate the old 260 from the new one? Like, does it say on the box "55nm can of whoop-ass edition"? Or will the boxes be the same as the old ones?
I think whether it will be recognisable will depend on the companies doing the packaging. At least EVGA clearly states 55nm and uses a different appearance for the boxes of their 55nm counterparts.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  14. #39
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
Well this is rather unexpected.

I knew not to expect miracles per se, but I was figuring we'd at least get slightly lower temps and power consumption... This is just pitiful.

The can of whoopass remains broken and cracked... Another year of complete Nvidia dominance is nigh!

  15. #40
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Posts
    684
Looks like the 55nm for Nvidia has leakage problems. It makes sense to dump this stepping into the GTX260 and not the 270/290. My guess is they're hoping a new stepping will have less leakage, but you never know till the new stepping is tested.

  16. #41
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Location
    Strive for peace w/Acts of War
    Posts
    868
    Quote Originally Posted by Sly Fox View Post
Well this is rather unexpected.

I knew not to expect miracles per se, but I was figuring we'd at least get slightly lower temps and power consumption... This is just pitiful.
You're absolutely right, and so is everyone else.

I mean, I "thought" the process transition from 65nm to 55nm was going to entirely revolutionize the GPU industry and send everything else into chaos.

Who would've thought that going from 65nm to 55nm was not going to change anything at all? I'm extremely puzzled, and I'm tired of getting my hopes up so high that I'd go and believe something of that magnitude wouldn't do anything at all.

    ______________________


But on the other hand, it's better than the 4870.

I would buy it, but.....mann! I thought that the 55nm process was going to be such an amazing technology compared to 65nm. Oh well, let's see how 40nm does.
    ASUS P5B Deluxe P965 BIOS 1236 | Intel Core 2 Quad Q6600 G0 8MBL2 @ 3.15GHZ | G.Skill DDR2 800 F2-6400PHU2-2GBHZ & XTreem DDR 800 D9GMH - 4GB RAM Total | 4:5 Ratio @ 350fsbx9 | Tuniq Tower 120 | BFG GeForce 9800GTX | Seagate 2x 250GB Perpendicular HDDs RAID-0 | PC Power & Cooling Silencer 750W EPS12V | Samsung TOC T240 24" LCD Monitor |

  17. #42
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
Why would you think it would revolutionize everything? RV670 and RV770 are on 55nm already, and there's nothing magical about them. A 4870 draws about as much power as a 65nm GTX 260.

  18. #43
    Xtreme Enthusiast
    Join Date
    Jul 2008
    Posts
    950
Well.... I'm still happy with my GTX260-216 overclocked to 750MHz; it competes with the 280. Crysis load is 62°C, cool enuf 4 me.

  19. #44
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Jakarta, Indonesia
    Posts
    924
    Quote Originally Posted by Sr7 View Post
Dur.. it's not a new product! It's purely a cost-savings move, as evidenced by the lack of re-branding.

    I don't know why anyone expected anything more.
Agreed, this card has to adapt to a new competitive environment against a cheap-to-make HD 4870 1 GB card, so nVidia is basically trying to aim for the best yield by staying with the same clock specification.

    Quote Originally Posted by RPGWiZaRD View Post
I think most people expected the same cooling solution, lower temperatures and less power consumption => better overclockability. Somehow it failed in at least 3 of those categories if you read this specific review: it's got a cut-down cooling solution, slightly higher temps and about the same or higher load power consumption. xD

Overclockability might still be slightly better, but it looks like it won't be as huge an improvement as I had hoped for. I was hoping for like a ~750MHz core overclock on average, seeing how the great 65nm GTXs get close to or reach that.
Those three hopes could only happen if the card were still a US$350 card, which it is not, so the situation is very understandable.

I think nVidia set the lowest common denominator for this card's clock specs so they can have the best yield, and the better chips can go to GTX 285 and 295 cards. Clearly they will put the GT200b chips through a heavy binning process, with mass production of a high-mainstream card (GTX 260) in mind, using the smallest die size you can get and the cheapest board and RAM you can put together.

    Quote Originally Posted by K404 View Post
If the 55nm is really performing like that, there wouldn't be a GX2 to follow.

Either that or somehow nV thought a GX2 wasn't feasible until now due to cost, not thermal envelope + associated cooling....
With golden chips available through the binning process, nothing is impossible - but availability might be affected.

  20. #45
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by SnipingWaste View Post
Looks like the 55nm for Nvidia has leakage problems. It makes sense to dump this stepping into the GTX260 and not the 270/290. My guess is they're hoping a new stepping will have less leakage, but you never know till the new stepping is tested.
    I was wondering the same thing. I can only assume that in time more information will be revealed.

  21. #46
    Registered User
    Join Date
    Jun 2007
    Location
    Wolverhampton
    Posts
    34
Hmmm, I may step up to this from my 4850. I can get the 675MHz card today for £220 - not a bad price at all.

Does anyone know how much the GTX 285 is going to cost in the UK/EU when it comes out in January?
    Last edited by jmase; 12-28-2008 at 11:30 AM.

  22. #47
    Xtreme Enthusiast
    Join Date
    Oct 2008
    Location
    Campbellsville, Kentucky
    Posts
    896
I can't believe someone hasn't overnighted one of those 55nm cards from EVGA's site. This review is lacking big time.
    Main Rig
    • Intel Core i7 4790K CPU Stock @ 4.4Ghz
    • Asus Maximus VI Extreme Motherboard
    • 32GB GSKILL Trident X 2400MHZ RAM
    • EVGA GTX 980 Superclocked 4GB GDDR5
    • Corsair TX850W v2 TX Power Supply 70A 12V Rail
    • Swiftech Apex Ultima w/ Apogee Drive II & Dual 120 RAD w/integrated res
    • 2X Seagate 333AS 1TB 7,200 32MB HD's in RAID 0
    • 2X Samsung 830's 128GB in RAID 0
    • Windows 8.1 Pro x64
    • Coolermaster HAF-XB
    • Dual Asus ProArt PA248Q 24" IPS LED Monitors
    • Samsung 46" 5600 Series Smart HDTV
    • iPhone 6 Plus 64GB AT&T & Xbox One


    UNOFFICIAL Rampage II Extreme Thread

  23. #48
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    Quote Originally Posted by jasonelmore View Post
I can't believe someone hasn't overnighted one of those 55nm cards from EVGA's site. This review is lacking big time.
At least it's something.

Remember that going from 65nm to 55nm is a MUCH smaller step than going from 80nm to 55nm like ATI did with the R600 -> RV670.

It may consume as much power as the 65nm ones, but it's supposed to be much cheaper for NV to make and clock a little bit better than the 65nm ones.

I don't understand why a lot of people are upset and bash nvidia just because it consumes as much power.
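
To put rough numbers on that shrink comparison (an idealized linear-shrink estimate; real shrinks never scale perfectly):

Code:
# Idealized die-area scaling from a pure linear shrink (optical shrink math;
# actual savings depend on layout, libraries and analog blocks).
for old, new in [(65, 55), (80, 55)]:
    area_ratio = (new / old) ** 2
    print(f"{old}nm -> {new}nm: ~{area_ratio:.0%} of the original die area")
# 65nm -> 55nm: ~72% of the original area (the step NV just took)
# 80nm -> 55nm: ~47% of the original area (R600 -> RV670)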
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  24. #49
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
Expectations are the name of the game. People expected lower prices, higher clocks, lower temps and lower power consumption. None of those things have materialized, hence the disappointment. But we really need to wait for full reviews of retail cards. I also find it strange that nobody has just bought one and reviewed it.

  25. #50
    Xtreme Enthusiast
    Join Date
    Oct 2008
    Location
    Campbellsville, Kentucky
    Posts
    896
If you think those figures about power consumption are right, then you're very gullible. Remember that the GTX 295 uses 285 watts, and that's two G200 chips, compared to 2 GTX 260s which consume 450 watts. The 55nm process has cut a lot of power consumption as far as the 295 goes. I don't see why the GTX 265 would be much different.
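
Running those numbers per chip (taking both board-level figures at face value, and remembering the 295 uses binned low-leakage dies, as noted earlier in the thread):

Code:
# Per-chip power from the figures quoted above, taken at face value.
gtx295_total = 285       # watts, two 55nm G200b chips on one board
two_gtx260_total = 450   # watts, two 65nm GTX 260 cards

per_chip_55 = gtx295_total / 2       # ~142.5 W
per_chip_65 = two_gtx260_total / 2   # ~225 W
saving = 1 - per_chip_55 / per_chip_65
print(f"~{saving:.0%} less power per chip")  # ~37% less, by these figures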
    Main Rig
    • Intel Core i7 4790K CPU Stock @ 4.4Ghz
    • Asus Maximus VI Extreme Motherboard
    • 32GB GSKILL Trident X 2400MHZ RAM
    • EVGA GTX 980 Superclocked 4GB GDDR5
    • Corsair TX850W v2 TX Power Supply 70A 12V Rail
    • Swiftech Apex Ultima w/ Apogee Drive II & Dual 120 RAD w/integrated res
    • 2X Seagate 333AS 1TB 7,200 32MB HD's in RAID 0
    • 2X Samsung 830's 128GB in RAID 0
    • Windows 8.1 Pro x64
    • Coolermaster HAF-XB
    • Dual Asus ProArt PA248Q 24" IPS LED Monitors
    • Samsung 46" 5600 Series Smart HDTV
    • iPhone 6 Plus 64GB AT&T & Xbox One


    UNOFFICIAL Rampage II Extreme Thread
