Page 5 of 9 FirstFirst ... 2345678 ... LastLast
Results 101 to 125 of 223

Thread: Nvidia GTX 580 Reviews

  1. #101
    Registered User
    Join Date
    Nov 2008
    Posts
    72
    Quote Originally Posted by saaya View Post
    i bet the only reason the 580 consumes 10% less is because it has a much better cooler.
    lower temps = lower power consumption, we all know that...
    so id love to see somebody switch heatsinks on a 480 and 580 or compare both cards with either one of the two heatsinks
    There's a difference, but not a big one at same temperature.
    http://www.hardwarecanucks.com/forum...review-22.html
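For what it's worth, saaya's "lower temps = lower power consumption" point is mostly about leakage current, which rises roughly exponentially with junction temperature, so a better cooler really can shave watts off an otherwise identical chip. A toy model of that effect (every number below is invented for illustration, nothing here is a measured 480/580 figure):

```python
# Toy model: total GPU power = dynamic power (roughly temperature-independent)
# + leakage power, which grows roughly exponentially with die temperature.
# All constants are invented for illustration, not GF100/GF110 measurements.

def total_power(p_dynamic_w, leak_at_25c_w, temp_c, doubling_c=30.0):
    """Leakage assumed to double every `doubling_c` degrees above 25 C."""
    leakage_w = leak_at_25c_w * 2 ** ((temp_c - 25.0) / doubling_c)
    return p_dynamic_w + leakage_w

# Same hypothetical chip under a stock cooler (~90 C) vs a better one (~70 C):
hot = total_power(180.0, 30.0, 90.0)
cool = total_power(180.0, 30.0, 70.0)
print("%.0f W at 90 C vs %.0f W at 70 C" % (hot, cool))
```

In this made-up model the cooler chip draws roughly 50 W less at identical clocks, purely from reduced leakage, which is why same-temperature comparisons like the Hardware Canucks one are the fair way to isolate chip-level improvements.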

  2. #102
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Location
    Wild West, USA
    Posts
    655
    Quote Originally Posted by damha View Post
    I'd only buy this if I had more money than brains. Shouldn't take long for that anyways, but still
    I just want to ask: are you getting dumber with time, or wealthier? I have a direct interest in your answer, as I'm getting older too

    Abit IC7 P4 2.8a @4.21 | P4 3.4e @4.9 | Gainward 6800GT GS @486/1386
    Asus P4P800 SE Dothan 730-PM @ 2900 | EVGA 6800 Ultra GS @521/1376

    e8400@4.3G & 8800GTS G92 800/1932/1132 as gaming rig 24/7

    Custom self build chillbox with watercooling @-28c 24/7 | chilled wc " cpu -18c idle/-3c load
    3DMark 2005 Score Dothan & 6800U
    3DMark 2005 Score p4 & 6800GT

  3. #103
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    I think people are tired of the name game already. With 15~20% extra performance and basically no new features, not even hd audio bitstreaming support like on gf104, the gf110 hardly deserves a 500 moniker. Yet most people seem to just let it pass because it's Nvidia and that's what they do.

    As for the 5970, I'm inclined to agree that it's not as good a solution as a single GTX 580. Crossfire support is flaky at best, and minimum framerates suck according to reviews. The fact of the matter is that it's outdated now, soon to be replaced, and not a good buy unless at a considerably lower price. I see it can be had for 390€ in Germany, while the cheapest GTX 580s are around 450€. Now that's a price low enough to make you reconsider buying a GTX 580. And I suppose that's the idea too: AMD dropped the price on a few SKUs just to make the GTX 580 look worse, even though the 5970 stock will be sold out in no time.
    "No, you'll warrant no villain's exposition from me."

  4. #104
    Xtreme Addict
    Join Date
    Feb 2008
    Location
    Russia
    Posts
    1,910
    Nice performance. Will wait for some 580 vs 6970 numbers.

    Intel Q9650 @500x9MHz/1,3V
    Asus Maximus II Formula @Performance Level=7
    OCZ OCZ2B1200LV4GK 4x2GB @1200MHz/5-5-5-15/1,8V
    OCZ SSD Vertex 3 120Gb
    Seagate RAID0 2x ST1000DM003
    XFX HD7970 3GB @1111MHz
    Thermaltake Xaser VI BWS
    Seasonic Platinum SS-1000XP
    M-Audio Audiophile 192
    LG W2486L
    Liquid Cooling System :
    ThermoChill PA120.3 + Coolgate 4x120
    Swiftech Apogee XT, Swiftech MCW-NBMAX Northbridge
    Watercool HeatKiller GPU-X3 79X0 Ni-Bl + HeatKiller GPU Backplate 79X0
    Laing 12V DDC-1Plus with XSPC Laing DDC Reservoir Top
    3x Scythe S-FLEX "F", 4x Scythe Gentle Typhoon "15", Scythe Kaze Master Ace 5,25''

    Apple MacBook Pro 17` Early 2011:
    CPU: Sandy Bridge Intel Core i7 2720QM
    RAM: Crucial 2x4GB DDR3 1333
    SSD: Samsung 840 Pro 256 GB SSD
    HDD: ADATA Nobility NH13 1TB White
    OS: Mac OS X Mavericks

  5. #105
    Xtreme Enthusiast
    Join Date
    Jul 2008
    Location
    Portugal
    Posts
    811
    Quote Originally Posted by saaya View Post
    i bet the only reason the 580 consumes 10% less is because it has a much better cooler.
    That and the fact that it has 200,000,000 less transistors than GF100 eh?
    ASUS Sabertooth P67B3· nVidia GTX580 1536MB PhysX · Intel Core i7 2600K 4.5GHz · Corsair TX850W · Creative X-Fi Titanium Fatal1ty
    8GB GSKill Sniper PC3-16000 7-8-7 · OCZ Agility3 SSD 240GB + Intel 320 SSD 160GB + Samsung F3 2TB + WD 640AAKS 640GB · Corsair 650D · DELL U2711 27"

  6. #106
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    Was there really an official number of transistors? Last I heard they were both ~3 billion, and Nvidia didn't give any more specific number. And since GTX 580 didn't drop any support afaik what exactly was in those now supposedly missing transistors?
    "No, you'll warrant no villain's exposition from me."

  7. #107
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Pantsu View Post
    Was there really an official number of transistors? Last I heard they were both ~3 billion, and Nvidia didn't give any more specific number. And since GTX 580 didn't drop any support afaik what exactly was in those now supposedly missing transistors?
    There was, yep, 3.2B vs 3B.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  8. #108
    Xtreme Mentor
    Join Date
    Jan 2009
    Location
    Oslo - Norway
    Posts
    2,879
    Quote Originally Posted by Pantsu View Post
    Was there really an official number of transistors? Last I heard they were both ~3 billion, and Nvidia didn't give any more specific number. And since GTX 580 didn't drop any support afaik what exactly was in those now supposedly missing transistors?
    I'm not sure about the details at the micro level, but I guess they have taken out some of the Tesla-like functions. Those functions in the original GF100 were meant to come in handy for GPGPU, but then it got hot and power hungry. I'm not sure though, and I'm trying to find out more about it.

    ASUS P8P67 Deluxe (BIOS 1305)
    2600K @4.5GHz 1.27v , 1 hour Prime
    Silver Arrow , push/pull
    2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
    GTX560 GB OC @910/2400 0.987v
    Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
    CM Storm Scout + Corsair HX 1000W
    +
    EVGA SR-2 , A50
    2 x Xeon X5650 @3.86GHz(203x19) 1.20v
    Megahalem + Silver Arrow , push/pull
    3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
    XFX GTX 295 @650/1200/1402
    Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
    SilverStone Fortress FT01 + Corsair AX 1200W

  9. #109
    Xtreme Enthusiast
    Join Date
    Sep 2008
    Location
    ROMANIA
    Posts
    687
    I don't think they cut any functions; they've just reorganized the transistor arrangement. They use a third type of transistor, and they used fewer transistors because some of them were in excess... I mean, I think they cut some leaky transistors

    Thus the trick to making a good GPU is to use leaky transistors where you must, and use slower transistors elsewhere. This is exactly what NVIDIA did for GF100, where they primarily used 2 types of transistors differentiated in this manner. At a functional unit level we’re not sure which units used what, but it’s a good bet that most devices operating on the shader clock used the leakier transistors, while devices attached to the base clock could use the slower transistors. Of course GF100 ended up being power hungry – and by extension we assume leaky anyhow – so that design didn’t necessarily work out well for NVIDIA.

    For GF110, NVIDIA included a 3rd type of transistor, which they describe as having “properties between the two previous ones”. Or in other words, NVIDIA began using a transistor that was leakier than a slow transistor, but not as leaky as the leakiest transistors in GF100. Again we don’t know which types of transistors were used where, but in using all 3 types NVIDIA ultimately was able to lower power consumption without needing to slow any parts of the chip down. In fact this is where virtually all of NVIDIA’s power savings come from, as NVIDIA only outright removed few if any transistors considering that GF110 retains all of GF100’s functionality.
    http://www.anandtech.com/show/4008/n...orce-gtx-580/3
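The tradeoff the Anandtech excerpt describes can be sketched as a small assignment problem: use the least-leaky transistor type that still meets each path's timing budget, with GF110 adding a middle option between GF100's two. The names and delay/leakage numbers below are entirely invented, just to show the mechanism:

```python
# Sketch of the transistor-mix tradeoff Anandtech describes: pick the
# least-leaky transistor type whose delay still fits each path's timing
# budget. Delay and leakage figures are invented for illustration.

TRANSISTOR_TYPES = [
    # (name, relative delay, relative leakage) - most frugal first
    ("slow", 1.5, 1.0),
    ("medium", 1.2, 3.0),   # the new in-between option on GF110
    ("fast", 1.0, 10.0),
]

def pick_type(timing_budget):
    """Return the lowest-leakage type whose delay fits the budget."""
    for name, delay, leak in TRANSISTOR_TYPES:
        if delay <= timing_budget:
            return name, leak
    raise ValueError("no transistor type meets this budget")

# Hypothetical paths: shader-clock logic is tight, base-clock logic relaxed.
paths = {"shader ALU": 1.0, "scheduler": 1.25, "I/O": 1.6}
total = 0.0
for path, budget in paths.items():
    name, leak = pick_type(budget)
    total += leak
    print("%s: %s (leakage %g)" % (path, name, leak))
print("total relative leakage: %g" % total)
```

Delete the "medium" row and the 1.25-budget path has to fall back to the fast/leaky type (total leakage 21 instead of 14 in this toy setup), which is roughly the two-type GF100 situation the article contrasts against.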
    i5 2500K@ 4.5Ghz
    Asrock P67 PRO3


    P55 PRO & i5 750
    http://valid.canardpc.com/show_oc.php?id=966385
    239 BCLK validation on cold air
    http://valid.canardpc.com/show_oc.php?id=966536
    Almost 5GHz, air.

  10. #110
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    I wonder, if and when OC versions and custom versions appear, whether it would be worth upgrading from a single-PCB GTX 295 to one of these puppies.
    The noise is what impresses me most and the performance appears to be good too
    Must admit the extra VRAM will come in handy for me in GTA IV and EFLC!
    John
    Stop looking at the walls, look out the window

  11. #111
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    Quote Originally Posted by JohnZS View Post
    I wonder, if and when OC versions and custom versions appear, whether it would be worth upgrading from a single-PCB GTX 295 to one of these puppies.
    The noise is what impresses me most and the performance appears to be good too
    Must admit the extra VRAM will come in handy for me in GTA IV and EFLC!
    John
    http://www.guru3d.com/news/sparkle-g...0-and-calibre/

    Besides the Sparkle card only Evga waterblock version is custom, the rest are afaik reference boards, though some might have meager OC clocks.
    "No, you'll warrant no villain's exposition from me."

  12. #112
    Xtreme Enthusiast
    Join Date
    Nov 2009
    Posts
    526
    Quote Originally Posted by zalbard View Post
    There was, yep, 3.2B vs 3B.
    No, nVidia has not given an official transistor count other than ~3B for both GF100 & GF110.

  13. #113
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601

  14. #114
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by AffenJack View Post
    There's a difference, but not a big one at same temperature.
    http://www.hardwarecanucks.com/forum...review-22.html
    5-15W... and every chip comes binned to a different voltage which results in differences...

    then again, on the other hand the 580 has more sps and is clocked higher... so even at the same temperature it consumes less, with more logic enabled at higher clocks... so there really is an improvement chip-wise... cool...

    could also be the pwm... but i think its identical for 480 and 580 right?

    thx for the link, and good job hwcanucks!

    Quote Originally Posted by Pantsu View Post
    I think people are tired of the name game already. With 15~20% extra performance and basically no new features, not even hd audio bitstreaming support like on gf104, the gf110 hardly deserves a 500 moniker. Yet most people seem to just let it pass because it's Nvidia and that's what they do.

    As for the 5970, I'm inclined to agree that it's not as good a solution as a single GTX 580. Crossfire support is flaky at best, and minimum framerates suck according to reviews. The fact of the matter is that it's outdated now, soon to be replaced, and not a good buy unless with considerably lower price. I see it can be had for 390€ in Germany, while the cheapest GTX 580 are around 450€. Now that's a price low enough for a 5970 that makes you reconsider buying a GTX 580. And I suppose that's the idea too, AMD dropped the price for a few SKU just to make GTX 580 look worse, even though the 5970 stock will be sold out in no time.
    idk man... read the bittech review... at 2560x1600 with aa the 580 is notably faster and gets playable fps, especially min fps, while the 480 just doesnt cut it...

    its still "only" 20-30% up there as well, but its the 20-30% that was missing to have it playable with a 480... so idk... for a 2560x1600 gaming rig a 580 sounds great... you no longer need sli or xfire...

    the only thing is that availability right now is 0, and everybody claims itll be bad at best... and ati has a competing card around the corner, supposedly... so... even if i WANTED to upgrade, id HAVE to wait anyways, and by the time i could buy it, the 6900 is probably going to be out heh...

    Quote Originally Posted by NaMcO View Post
    That and the fact that it has 200,000,000 less transistors than GF100 eh?
    idk... do you think that makes a big diff?
    doubt those transistors were used a lot to begin with, otherwise nvidia wouldnt have cut them off, cause they want more perf, not less with gf110...
    not used a lot = dont add to power consumption...

  15. #115
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Higher temps than 5970... Almost as much noise.
    Not very good scaling at 2560 either, hmmm... 6870 definitely has better scaling.
    Last edited by zalbard; 11-10-2010 at 08:44 AM.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  16. #116
    Xtreme Mentor
    Join Date
    Jul 2008
    Location
    Shimla , India
    Posts
    2,631
    5970 CF is missing!!
    Coming Soon

  17. #117
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    Ryan said in the comments they didn't have two 5970 to use.
    "No, you'll warrant no villain's exposition from me."

  18. #118
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by safan80 View Post
    I only use max details and filtering; if I wanted to deal with anything less I would play games on consoles. I myself used an 8800GTX on my 3007wfp. In single card fashion it would only give me 30-40 fps in games, which is OK for single player games, but not good at all for multi-player games. Later I switched to dual GTX 285s and they gave me 60-80 fps, which made multi-player games playable without AA, but I don't use that for multi-player anyway. Nowadays I like to keep the frames around 70-100fps for online play.
    I'm not taking issue with your personal preference in regards to what you need/want/like, simply with that line about "People that run 2560x1600 need dual gpu". It's an exaggeration, because we don't all "need" one, nor do I personally want to deal with a multi-GPU config to game at 25x16, even with an X58 MB and a CrossFire-ready AMD board.

    If a game is good its enjoyable to me just the same without high shadows or uber reflections whether I'm playing a multiplayer games online or a single player game.
    Last edited by highoctane; 11-10-2010 at 11:51 AM.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  19. #119
    Xtreme Enthusiast
    Join Date
    Sep 2008
    Location
    ROMANIA
    Posts
    687
    Sparkle GTX 580 at $509, and with the 10% promo code (it has one) it costs $459.
    http://www.newegg.com/Product/Produc...82E16814187125
    460$ sounds much better...
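For anyone checking the discount arithmetic, a quick sketch (the helper name is made up):

```python
def promo_price(list_price, percent_off):
    """Price after a simple percentage promo code, rounded to cents."""
    return round(list_price * (1 - percent_off / 100), 2)

print(promo_price(509, 10))  # prints 458.1 - close to the quoted ~$459
```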
    i5 2500K@ 4.5Ghz
    Asrock P67 PRO3


    P55 PRO & i5 750
    http://valid.canardpc.com/show_oc.php?id=966385
    239 BCLK validation on cold air
    http://valid.canardpc.com/show_oc.php?id=966536
    Almost 5GHz, air.

  20. #120
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Posts
    516
    Quote Originally Posted by saaya View Post
    the only thing is that availability right now is 0, and everybody claims itll be bad at best...
    Who is everybody? Charlie? Let's see what we have atm.

    http://www.newegg.com/Product/Produc...iption=gtx+580

    Nine models, all in stock at newegg.

    http://ncix.com/search/?categoryid=0&q=gtx+580

    Eight models, 5 in stock at NCIX. This is anything but the paper launch/no cards till 2011 that Charlie claimed.

  21. #121
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by saaya View Post
    idk... do you think that makes a big diff?
    doubt those transistors were used a lot to begin with, otherwise nvidia wouldnt have cut them off, cause they want more perf, not less with gf110...
    lopping off 200M transistors is probably a lot simpler than what they actually did to fermi. maybe they found a way to do the same thing with less logic? i dont really know and there is very little information available that can offer insight into that subject. furthermore, the number of gates is not a good measurement of area because speed affects transistor size. if fermi were designed for half of the original clockspeed it could be half the size and an even smaller fraction of the power. there are so many other things i could list that they might have changed but they would only go unanswered.

    i have a hunch that the power circuit was improved. the ammeter and current limiter allows them to use a more efficient power circuit because peak current will be much lower. intel did something similar with montecito. it has 64 clockspeeds which saved them a lot of power when performance wasnt needed.
    not used a lot = dont add to power consumption...
    with leakage it does not matter if the transistors are being used or not. the power is still being consumed. dynamic power involves switching, which is affected by activity. i would bet that the majority of gf100 and gf110's power consumption comes from leakage, and more so than for other chips.
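The leakage-vs-dynamic split in this post can be written out: dynamic power scales with switching activity, effective capacitance, voltage squared, and clock, while leakage current is paid whether the logic switches or not. A sketch with invented constants (not GF100/GF110 measurements):

```python
# Toy split of chip power into dynamic power (activity * C * V^2 * f,
# which drops when logic idles) and leakage power (V * I_leak, paid
# whether transistors switch or not). All constants are invented.

def chip_power(v_volts, f_hz, c_eff_farads, activity, i_leak_amps):
    p_dynamic = activity * c_eff_farads * v_volts ** 2 * f_hz
    p_leakage = v_volts * i_leak_amps
    return p_dynamic, p_leakage

# Same voltage and clock, busy vs mostly idle: dynamic power collapses,
# leakage does not - unused transistors still cost power.
busy = chip_power(1.0, 700e6, 250e-9, 0.50, 90.0)
idle = chip_power(1.0, 700e6, 250e-9, 0.05, 90.0)
print("busy: %.1f W dynamic + %.1f W leakage" % busy)
print("idle: %.2f W dynamic + %.1f W leakage" % idle)
```

In this made-up example leakage dominates at idle, which is the poster's point: activity only modulates the dynamic term.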

  22. #122
    Xtreme Member
    Join Date
    Aug 2008
    Location
    Poland
    Posts
    199
    Quote Originally Posted by saaya View Post
    has anybody actually run gf100 and gf110 side by side at the same clocks and compared performance?
    im curious how much those tweaks boost performance and in what situation...
    Somebody did.

    http://nvision.pl/GeForce-GTX-580-GF...etails-18.html

  23. #123
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by JohnZS View Post
    I wonder, if and when OC versions and custom versions appear, whether it would be worth upgrading from a single-PCB GTX 295 to one of these puppies.
    The noise is what impresses me most and the performance appears to be good too
    Must admit the extra VRAM will come in handy for me in GTA IV and EFLC!
    John
    EVGA have announced 4 models (2 OC in reality; the packaging differs)
    *EVGA GeForce GTX 580: 772MHz core, 1544MHz shaders, 4008MHz memory
    * EVGA GeForce GTX 580 Superclocked: 797MHz core, 1594MHz shaders, 4050MHz memory
    *EVGA GeForce GTX 580 Call of Duty: Black Ops Edition: Same as Superclocked but with a CoD: Black Ops style fan shroud and a poster. This model does not include the actual game.
    *EVGA GeForce GTX 580 FTW Hydro Copper 2: 850MHz core, 1700MHz shaders, 4196MHz memory

    Pricing:

    1. GTX 580 – 479.90 EUR
    2. GTX 580 Superclocked – 495 EUR
    3. GTX 580 Call of Duty: Black Ops edition – 499.90 EUR
    4. GTX 580 FTW Hydro Copper – 695.90 EUR (lol at the +100 EUR for a waterblock, get an EK or another block instead)
    Last edited by Lanek; 11-10-2010 at 12:18 PM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  24. #124
    Xtreme Enthusiast
    Join Date
    Jul 2009
    Location
    Perth Australia
    Posts
    651
    My apologies if I have missed it, but has the "Evga GTX 580 FTW Hydro Copper" been reviewed yet?
    I have read it's not yet released.

  25. #125
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    112
    Quote Originally Posted by zalbard View Post
    Higher temps than 5970... Almost as much noise.
    Not very good scaling at 2560 either, hmmm... 6870 definitely has better
    scaling.
    The 6870 is a very slow card compared to the 580,
    so 2x 580 will suffer from a lack of CPU power.
    But despite this, 580 SLI is doing very well.

    But what happens to AMD here??
    It is quite unplayable on AMD hardware

    Intel i7 2600K 5GHZ Watercooled. 2x Asus DirectCU II TOP GTX670 SLI @1250/7000/Watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
    8GB Corsair 2000Mhz Ram. 4x OCZ Vertex3 120GB SSD. .3xSamsung F1 1TB All in A Lian li Tyr PC-X2000 Chassi. Logitech diNovo Edge keybord
    MX Revolution mouse and Z-5500 Digital 5.1 speakers Corsair HX-1200W PSU Samsung 244T 24"+ 3xPhilips 24¨in nVidia Surround

