Page 3 of 4
Results 51 to 75 of 89

Thread: GTX260 55nm Preview/Review

  1. #51
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Location
    Strive for peace w/Acts of War
    Posts
    868
    Quote Originally Posted by trinibwoy View Post
    Why would you think it would revolutionize everything? RV670 and RV770 are at 55nm already, and there's nothing magical about them. A 4870 draws about as much power as a 65nm GTX 260.
    Well, if you read the previous posts before mine, you'd see that EVERYONE "thought" nVidia would revolutionize the GPU industry by moving from 65nm to 55nm.

    It's rather amusing that everyone believes such a major improvement "should" occur from a mere process shrink.

    I for one don't see anything magical about shrinking the GPU, but everyone else "expects" this magical accomplishment from a mere change.

    ___________________________________

    And something tells me you did not look at the 2nd picture, where a 260 draws 265W vs the 4870's 312W, and even at idle the 4870 draws more. So there's no such thing as "as much" from what you can see.
    ASUS P5B Deluxe P965 BIOS 1236 | Intel Core 2 Quad Q6600 G0 8MBL2 @ 3.15GHZ | G.Skill DDR2 800 F2-6400PHU2-2GBHZ & XTreem DDR 800 D9GMH - 4GB RAM Total | 4:5 Ratio @ 350fsbx9 | Tuniq Tower 120 | BFG GeForce 9800GTX | Seagate 2x 250GB Perpendicular HDDs RAID-0 | PC Power & Cooling Silencer 750W EPS12V | Samsung TOC T240 24" LCD Monitor |

  2. #52
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Ah, gotcha. Well "everyone" needs to temper their expectations so as to avoid being disappointed over and over. G80 and 4870 are two recent examples of us being pleasantly surprised though. But that can't happen every time.
    Last edited by trinibwoy; 12-28-2008 at 12:42 PM.

  3. #53
    Xtreme Enthusiast
    Join Date
    Apr 2005
    Posts
    757
    Quote Originally Posted by largon View Post
    Kinda like "G200-103-B2" is faster & more efficient than "G200-103-A2"?
    Oh, wait?


    G200-103-B2 = 55nm
    G200-103-A2 = 65nm
    A 55nm chip will consume less power and produce less heat than the 65nm version if all settings are equal. This isn't even debatable; it's simple physics.
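
    To spell out the "simple physics", here's a minimal first-order sketch of CMOS switching power. The capacitance scaling factor, activity factor, and voltage below are illustrative assumptions rather than measured GT200 values; only the 576 MHz clock matches a stock GTX 260:

```python
# First-order CMOS dynamic (switching) power: P = a * C * V^2 * f.
# A 65nm -> 55nm optical shrink cuts switched capacitance C roughly in
# proportion to the shrink. Every value below except the 576 MHz stock
# clock is an illustrative assumption, not a GT200 datasheet number.

def switching_power(a, c, v, f):
    """Dynamic power in arbitrary units: activity * capacitance * V^2 * frequency."""
    return a * c * v**2 * f

p_65nm = switching_power(a=0.2, c=1.00, v=1.12, f=576e6)  # 65nm baseline
p_55nm = switching_power(a=0.2, c=0.85, v=1.12, f=576e6)  # assume ~15% less C, same V and f

print(f"55nm at equal settings: {p_55nm / p_65nm:.0%} of 65nm switching power")
# -> 85%. Leakage is a separate term, though, and an immature process
# can leak more at first, which is what this preview's numbers hint at.
```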

  4. #54
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
    Quote Originally Posted by trinibwoy View Post
    Ah, gotcha. Well "everyone" needs to temper their expectations so as to avoid being disappointed over and over. G80 and 4870 are two recent examples of us being pleasantly surprised though. But that can't happen every time.
    Very true. A great example of a 65nm-to-55nm shrink that wasn't a G80 or a 4870 is the 3870: a low-power card for sure, but it set the 4870 up for greatness.

    I think nVidia would have been better off taking the risk of making the 280/260 55nm in the first place, rather than leaving the midrange with nothing to build on.

  5. #55
    Xtreme Enthusiast
    Join Date
    Apr 2005
    Posts
    757
    Quote Originally Posted by SnipingWaste View Post
    Looks like the 55nm for nVidia has leakage problems. It makes sense to dump this stepping at GTX260 and not the 270/290. My guess is they're hoping a new stepping will have less leakage, but you never know till the new stepping is tested.
    You sure are drawing a lot of conclusions from ONE half-assed preview.

  6. #56
    Xtreme Enthusiast
    Join Date
    May 2005
    Location
    Montana, USA
    Posts
    503
    Something that everyone should keep in mind when judging measured heat on these 55nm GPUs is thermal density. I don't know whether nVidia lowered vGPU on these new cards, but assuming they did not, a similar amount of heat is being dissipated through a smaller contact area with the HSF, and therefore the measured temperature is equal or higher. (Thermal density could be higher even with a decrease in vGPU...)
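
    A quick back-of-the-envelope on the contact-area point, as a sketch: the die areas are the commonly cited ~576 mm^2 for the 65nm G200 and the ~470 mm^2 figure mentioned later in this thread for the 55nm shrink, while the 150 W of chip-level heat is an assumed round number, not a measurement:

```python
# Same heat pushed through a smaller die means higher thermal density.
# Die areas: ~576 mm^2 (65nm G200, commonly cited) and ~470 mm^2 (55nm
# shrink, per the figure mentioned in this thread). The 150 W of
# chip-level heat is an assumed round number, not a measurement.

gpu_heat_w = 150.0
die_areas_mm2 = {"65nm": 576.0, "55nm": 470.0}

for process, area_mm2 in die_areas_mm2.items():
    print(f"{process}: {gpu_heat_w / area_mm2:.2f} W/mm^2")

# 65nm: 0.26 W/mm^2 vs 55nm: 0.32 W/mm^2, roughly 23% denser at equal
# heat, hence equal or higher core temps through the same HSF.
```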

    This being said, I think what everyone is still hoping to see is a decrease in net power consumption.

    I also subscribe to the theory that nVidia is closely binning these first 55nm chips for use in the GX2s. Releasing a "new" 260 216 is a great way to use up chips that leak more current than they should on the new process. They are also more than likely binning current chips for the upcoming 285 as well. If I had to guess, these 285 bins probably fall somewhere between the GX2 chips and the "recycled" 55nm 216 parts for leakage.
    i7-2600K @ 4806Mhz 102.3x47 1.368v LinX stable
    MSI P67 GD-80
    16Gb Corsair 8-8-8-24 1T 818.2Mhz
    MSI GTX560Ti 1005/2250 1.062v
    Crucial m4 256Gb SSD
    Corsair TX850 | 64bit WIN7 Pro
    Custom watercooling
    47" 1080p LCD | Onkyo 876/ Polk 5.1 surround

  7. #57
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Jakarta, Indonesia
    Posts
    924
    Given that a card with a 470 mm^2 chip, 896 MB of memory, and a 448-bit bus has to sell for around US$250, I think nVidia will make sure every darn chip counts. But I also think that people who get lucky with this card can land a golden chip that clocks amazingly, because I don't think sales of the GTX 295 and 285 will be that high in this kind of economic environment, while demand for the nicely priced GTX 260 Core 216 should be quite high.

  8. #58
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    Quote Originally Posted by SNiiPE_DoGG View Post
    Looks like this is gonna be a slooooow year for nVidia. Hopefully ATI will make a killing in the midrange with the 740 and ride its upward trend to arrive level with GT300 in October.
    When's this going to happen? The 740?
    i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
    MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
    Corsair HX 620, CM 690, Win 7 Ultimate 64bit.

  9. #59
    Xtreme Member
    Join Date
    Dec 2007
    Location
    CR:IA
    Posts
    384
    The only one I could find online was the eVGA, and the dang thing's sold out almost everywhere it's offered.
    PC-A04 | Z68MA-ED55 | 2500k | 2200+ XPG | 7970 | 180g 520 | 2x1t Black | X3 1000w

  10. #60
    Xtreme Cruncher
    Join Date
    Oct 2006
    Location
    1000 Elysian Park Ave
    Posts
    2,669
    Wow, I was expecting another 9800GTX+: slightly lower power consumption but higher OCs. Once you OC it, the power consumption is actually the same as or more than the old one's.
    i3-8100 | GTX 970
    Ryzen 5 1600 | RX 580
    Assume nothing; Question everything

  11. #61
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
    Quote Originally Posted by Glow9 View Post
    When's this going to happen? The 740?
    Well, since it's already taped out, it should be out before the end of Q1 '09.

    nVidia's new midrange doesn't come until at least Q3.

  12. #62
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Location
    Old Vizima
    Posts
    952
    I want to see max core and shader clock for these.

  13. #63
    Xtreme Addict
    Join Date
    Oct 2005
    Location
    MA/NH
    Posts
    1,251
    I've got a stock-clocked eVGA 55nm coming midweek...
    Mpower Max | 4770k | H100 | 16gb Sammy 30nm 1866 | GTX780 SC | Xonar Essence Stx | BIC DV62si | ATH AD700 | 550d | AX850 | VG24QE | 840pro 256gb | 640black | 2tb | CherryReds | m60 | Func1030 |
    HEAT

  14. #64
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Quote Originally Posted by Nasgul View Post
    And something tells me you did not look at the 2nd picture, where a 260 draws 265W vs the 4870's 312W, and even at idle the 4870 draws more. So there's no such thing as "as much" from what you can see.
    Expreview's power consumption measurements were done with Furmark, which is known to make RV770s draw more power than in any other benchmark. Thus, using Furmark for power consumption testing makes no sense, as it doesn't represent any real-world scenario.

    Xbitlabs measures the exact power draw of the card itself (during a loop of 3DMark06 SM3.0), and their figures show the HD4870 consumes 6 watts more than the 65nm GTX260. And since the 55nm GTX260 draws 10W more than the 65nm GTX260...


    Idle figures?
    The reference HD4870 BIOS has quite conservative PowerPlay settings for the hungry GDDR5; users can downclock their memory and cut ~40W... That's a "no-can-do" on a GTX260.
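
    Spelling out where that trailing comparison lands, using only the deltas quoted above (a sketch with the 65nm GTX260 as the zero point):

```python
# Card power in Xbitlabs' 3DMark06 SM3.0 loop, expressed relative to
# the 65nm GTX260 baseline, using only the deltas quoted above.
gtx260_65nm_w = 0                    # baseline
hd4870_w      = gtx260_65nm_w + 6    # HD4870 draws 6 W more than the 65nm GTX260
gtx260_55nm_w = gtx260_65nm_w + 10   # 55nm GTX260 draws 10 W more than the 65nm card

print(f"HD4870 relative to 55nm GTX260: {hd4870_w - gtx260_55nm_w:+d} W")
# -> -4 W: by these figures, the HD4870 actually draws slightly less
# than the 55nm GTX260 in that test.
```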
    You were not supposed to see this.

  15. #65
    Xtreme Enthusiast
    Join Date
    Jul 2008
    Posts
    950
    Quote Originally Posted by LaMpiR View Post
    I just wanna know, will it clock better than the 65nm GTX260 216SP...
    Got my 260-216 clocked to 750/1540/1250; I'm happy with that.

  16. #66
    Xtreme Addict
    Join Date
    Feb 2007
    Location
    Arizona, USA
    Posts
    1,700
    Quote Originally Posted by LexDiamonds View Post
    Something that everyone should keep in mind when judging measured heat on these 55nm GPUs is thermal density. I don't know whether nVidia lowered vGPU on these new cards, but assuming they did not, a similar amount of heat is being dissipated through a smaller contact area with the HSF, and therefore the measured temperature is equal or higher. (Thermal density could be higher even with a decrease in vGPU...)

    This being said, I think what everyone is still hoping to see is a decrease in net power consumption.

    I also subscribe to the theory that nVidia is closely binning these first 55nm chips for use in the GX2s. Releasing a "new" 260 216 is a great way to use up chips that leak more current than they should on the new process. They are also more than likely binning current chips for the upcoming 285 as well. If I had to guess, these 285 bins probably fall somewhere between the GX2 chips and the "recycled" 55nm 216 parts for leakage.
    Good point on the thermal dissipation!

    I too "subscribe" to the theory that nVidia is binning the hell out of these new chips. At this point, nVidia cares a heck of a lot more about the performance crown than about mainstream parts such as this new model.

    Quote Originally Posted by largon View Post
    Expreview's power consumption measurements were done with Furmark, which is known to make RV770s draw more power than in any other benchmark. Thus, using Furmark for power consumption testing makes no sense, as it doesn't represent any real-world scenario.

    Xbitlabs measures the exact power draw of the card itself (during a loop of 3DMark06 SM3.0), and their figures show the HD4870 consumes 6 watts more than the 65nm GTX260. And since the 55nm GTX260 draws 10W more than the 65nm GTX260...


    Idle figures?
    The reference HD4870 BIOS has quite conservative PowerPlay settings for the hungry GDDR5; users can downclock their memory and cut ~40W... That's a "no-can-do" on a GTX260.
    Thank you for pointing this out largon.


    Core i7 920 D0 B-batch (4.1) (Kinda Stable?) | DFI X58 T3eH8 (Fed up with its issues, may get a new board soon) | Patriot 1600 (9-9-9-24) (for now) | XFX HD 4890 (971/1065) (for now) |
    80GB X25-m G2 | WD 640GB | PCP&C 750 | Dell 2408 LCD | NEC 1970GX LCD | Win7 Pro | CoolerMaster ATCS 840 {Modded to reverse-ATX, WC'ing internal}

    CPU Loop: MCP655 > HK 3.0 LT > ST 320 (3x Scythe G's) > ST Res >Pump
    GPU Loop: MCP655 > MCW-60 > PA160 (1x YL D12SH) > ST Res > BIP 220 (2x YL D12SH) >Pump

  17. #67
    Xtreme Member
    Join Date
    Jun 2006
    Location
    Banja Luka, Bosnia and Herzegovina
    Posts
    423
    Quote Originally Posted by dan7777 View Post
    Got my 260-216 clocked to 750/1540/1250; I'm happy with that.
    What cooling method are you using?

  18. #68
    Xtreme Enthusiast
    Join Date
    Apr 2005
    Posts
    757
    Quote Originally Posted by largon View Post
    Expreview's power consumption measurements were done with Furmark, which is known to make RV770s draw more power than in any other benchmark. Thus, using Furmark for power consumption testing makes no sense, as it doesn't represent any real-world scenario.

    Xbitlabs measures the exact power draw of the card itself (during a loop of 3DMark06 SM3.0), and their figures show the HD4870 consumes 6 watts more than the 65nm GTX260. And since the 55nm GTX260 draws 10W more than the 65nm GTX260...


    Idle figures?
    The reference HD4870 BIOS has quite conservative PowerPlay settings for the hungry GDDR5; users can downclock their memory and cut ~40W... That's a "no-can-do" on a GTX260.

    Why would it make no sense to use Furmark just because it stresses a video card more than 3DMark06 does? If you want to see the maximum current draw, you choose a benchmark that stresses a card to its maximum. Disregarding Furmark just for the sake of showing ATI in a more favorable light is just as bad as disregarding any other benchmark to show favorable results for one piece of hardware over another.

    While Furmark may not be a typical real-world scenario today, who's to say it won't be with tomorrow's games? Furmark is therefore just as valid as any other benchmark, assuming one manufacturer is not somehow optimizing their power circuitry to show favorable results, which in this case isn't happening.

  19. #69
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by dan7777 View Post
    Got my 260-216 clocked to 750/1540/1250; I'm happy with that.
    55 or 65nm?
    Bring... bring the amber lamps.

  20. #70
    I am Xtreme
    Join Date
    Apr 2004
    Location
    United Kingdom
    Posts
    6,822
    Quote Originally Posted by largon View Post
    Expreview's power consumption measurements were done with Furmark, which is known to make RV770s draw more power than in any other benchmark. Thus, using Furmark for power consumption testing makes no sense, as it doesn't represent any real-world scenario.
    Err... actually Furmark is kinda MORE relevant, as it represents a future real-world scenario. Just look at Crysis: if game software keeps evolving at that rate, Furmark will be outdated in a year.

    "Prowler"
    X570 Tomahawk | R7 3700X | 2x16GB Klevv BoltX @ 3600MHz CL18 | Powercolor 6800XT Red Devil | Xonar DX 7.1 | 2TB Barracuda | 256GB & 512GB Asgard NVMe drives | 2x DVD & Blu-Ray opticals | EVGA Supernova 1000w G2

    Cooling:

    6x 140mm LED fans, 1x 200mm LED fan | Modified CoolerMaster Masterliquid 240

    Asrock Z77 thread! | Asrock Z77 Extreme6 Review | Asrock P67 Extreme4 Review | Asrock P67 Extreme4/6 Pro3 thread | Asrock Z68 Extreme4 thread | Asrock Z68 Extreme4 Review | Asrock Z68 Gen3 Thread | 8GB G-Skill review | TK 2.ZERO homepage | P5Q series mBIOS thread
    Modded X570 Aorus UEFIs

  21. #71
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Quote Originally Posted by Blkout
    Why would it make no sense to use Furmark just because it stresses a video card more than 3DMark06 does? If you want to see the maximum current draw, you choose a benchmark that stresses a card to its maximum.
    Why would it make sense to benchmark power consumption and actual performance with different standards?
    RV770 totally and absolutely dominates G200 in Furmark. So does RV770 dominate in real-world performance?
    Quote Originally Posted by Ket View Post
    Furmark is kinda MORE relevant, as it represents a future real-world scenario.
    But not the present. And I like to deal with the present. But that's just me.


    Btw, what's it like in the future?
    You were not supposed to see this.

  22. #72
    Xtreme Enthusiast
    Join Date
    Apr 2005
    Posts
    757
    Quote Originally Posted by largon View Post
    Why would it make sense to benchmark power consumption and actual performance with different standards?
    RV770 totally and absolutely dominates G200 in Furmark. So does RV770 dominate in real-world performance?
    But not the present. And I like to deal with the present. But that's just me.


    Btw, what's it like in the future?

    We're not judging performance; we're judging power consumption. Did you get confused?

  23. #73
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858

    What's the relevance of power consumption in some non-real-world situation? One cannot draw any perf/watt conclusions based on such figures.
    You were not supposed to see this.

  24. #74
    Xtreme Mentor
    Join Date
    Jul 2004
    Posts
    3,247
    Quote Originally Posted by ChinStrap View Post
    The only one I could find online was the eVGA, and the dang thing's sold out almost everywhere it's offered.
    In stock at Newegg

    EVGA 896-P3-1255-AR GeForce GTX 260 Core 216 @$254.99
    http://www.newegg.com/Product/Produc...82E16814130434

    EVGA 896-P3-1257-AR GeForce GTX 260 Core 216 Superclocked Edition @$264.99
    http://www.newegg.com/Product/Produc...82E16814130433

  25. #75
    Xtreme Member
    Join Date
    Oct 2008
    Location
    New Orleans
    Posts
    496
    Quote Originally Posted by ChinStrap View Post
    The only one I could find online was the eVGA, and the dang thing's sold out almost everywhere it's offered.
    Newegg wasn't supposed to have any until Jan 5, but they got some in yesterday.
