There's a difference, but not a big one at same temperature.
http://www.hardwarecanucks.com/forum...review-22.html
Abit IC7 P4 2.8a @4.21 | P4 3.4e @4.9 | Gainward 6800GT GS @486/1386
Asus P4P800 SE Dothan 730-PM @ 2900 | EVGA 6800 Ultra GS @521/1376
e8400@4.3G & 8800GTS G92 800/1932/1132 as gaming rig 24/7
Custom self-built chillbox with watercooling @ -28C 24/7 | chilled WC: CPU -18C idle / -3C load
3DMark 2005 Score Dothan & 6800U
3DMark 2005 Score p4 & 6800GT
I think people are tired of the name game already. With 15~20% extra performance and basically no new features, not even HD audio bitstreaming support like on GF104, the GF110 hardly deserves the 500-series moniker. Yet most people seem to just let it pass because it's Nvidia and that's what they do.
As for the 5970, I'm inclined to agree that it's not as good a solution as a single GTX 580. Crossfire support is flaky at best, and minimum framerates suck according to reviews. The fact of the matter is that it's outdated now, soon to be replaced, and not a good buy unless it's at a considerably lower price. I see it can be had for 390€ in Germany, while the cheapest GTX 580s are around 450€. That's a 5970 price low enough to make you reconsider buying a GTX 580. And I suppose that's the idea too: AMD dropped the price on a few SKUs just to make the GTX 580 look worse, even though the 5970 stock will be sold out in no time.
"No, you'll warrant no villain's exposition from me."
Nice performance. Will wait for some 580 vs 6970 numbers.
Intel Q9650 @500x9MHz/1,3V
Asus Maximus II Formula @Performance Level=7
OCZ OCZ2B1200LV4GK 4x2GB @1200MHz/5-5-5-15/1,8V
OCZ SSD Vertex 3 120Gb
Seagate RAID0 2x ST1000DM003
XFX HD7970 3GB @1111MHz
Thermaltake Xaser VI BWS
Seasonic Platinum SS-1000XP
M-Audio Audiophile 192
LG W2486L
Liquid Cooling System :
ThermoChill PA120.3 + Coolgate 4x120
Swiftech Apogee XT, Swiftech MCW-NBMAX Northbridge
Watercool HeatKiller GPU-X3 79X0 Ni-Bl + HeatKiller GPU Backplate 79X0
Laing 12V DDC-1Plus with XSPC Laing DDC Reservoir Top
3x Scythe S-FLEX "F", 4x Scythe Gentle Typhoon "15", Scythe Kaze Master Ace 5,25''
Apple MacBook Pro 17" Early 2011:
CPU: Sandy Bridge Intel Core i7 2720QM
RAM: Crucial 2x4GB DDR3 1333
SSD: Samsung 840 Pro 256 GB SSD
HDD: ADATA Nobility NH13 1TB White
OS: Mac OS X Mavericks
ASUS Sabertooth P67B3· nVidia GTX580 1536MB PhysX · Intel Core i7 2600K 4.5GHz · Corsair TX850W · Creative X-Fi Titanium Fatal1ty
8GB GSKill Sniper PC3-16000 7-8-7 · OCZ Agility3 SSD 240GB + Intel 320 SSD 160GB + Samsung F3 2TB + WD 640AAKS 640GB · Corsair 650D · DELL U2711 27"
Was there really an official number of transistors? Last I heard they were both ~3 billion, and Nvidia didn't give any more specific number. And since GTX 580 didn't drop any support afaik what exactly was in those now supposedly missing transistors?
"No, you'll warrant no villain's exposition from me."
► ASUS P8P67 Deluxe (BIOS 1305)
► 2600K @4.5GHz 1.27v , 1 hour Prime
► Silver Arrow , push/pull
► 2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
► GTX560 GB OC @910/2400 0.987v
► Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
► CM Storm Scout + Corsair HX 1000W
+
► EVGA SR-2 , A50
► 2 x Xeon X5650 @3.86GHz(203x19) 1.20v
► Megahalem + Silver Arrow , push/pull
► 3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
► XFX GTX 295 @650/1200/1402
► Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
► SilverStone Fortress FT01 + Corsair AX 1200W
I don't think they cut any functions; they just reorganized the transistor arrangement. They use a third type of transistor, and they used fewer transistors because some of them were in excess. I mean, I think they cut some leakage transistors.
http://www.anandtech.com/show/4008/n...orce-gtx-580/3
Thus the trick to making a good GPU is to use leaky transistors where you must, and use slower transistors elsewhere. This is exactly what NVIDIA did for GF100, where they primarily used 2 types of transistors differentiated in this manner. At a functional unit level we’re not sure which units used what, but it’s a good bet that most devices operating on the shader clock used the leakier transistors, while devices attached to the base clock could use the slower transistors. Of course GF100 ended up being power hungry – and by extension we assume leaky anyhow – so that design didn’t necessarily work out well for NVIDIA.
For GF110, NVIDIA included a 3rd type of transistor, which they describe as having “properties between the two previous ones”. Or in other words, NVIDIA began using a transistor that was leakier than a slow transistor, but not as leaky as the leakiest transistors in GF100. Again we don’t know which types of transistors were used where, but in using all 3 types NVIDIA ultimately was able to lower power consumption without needing to slow any parts of the chip down. In fact this is where virtually all of NVIDIA’s power savings come from, as NVIDIA only outright removed few if any transistors considering that GF110 retains all of GF100’s functionality.
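To make the transistor-mix idea above a bit more concrete, here is a rough back-of-the-envelope sketch in Python. The per-transistor leakage values and the mix fractions are invented purely for illustration (NVIDIA has published nothing of the sort); the only point is that shifting part of the leaky budget onto an in-between transistor type cuts static power without removing any logic.

# Toy model of static (leakage) power for a ~3B-transistor GPU.
# All numbers below are assumptions for illustration, not NVIDIA data.
LEAK_NW = {"fast_leaky": 10.0, "medium": 4.0, "slow": 1.0}  # assumed nW leaked per transistor

def static_power_watts(mix, total_transistors=3.0e9):
    # mix maps transistor type -> fraction of the die built from that type
    avg_nw = sum(LEAK_NW[kind] * frac for kind, frac in mix.items())
    return avg_nw * 1e-9 * total_transistors

# GF100-like: only two types, with the shader-clock logic on the leaky ones (assumed split).
gf100_like = {"fast_leaky": 0.40, "medium": 0.00, "slow": 0.60}
# GF110-like: part of the leaky budget moved to the in-between type (assumed split).
gf110_like = {"fast_leaky": 0.25, "medium": 0.15, "slow": 0.60}

print(static_power_watts(gf100_like))  # ~13.8 W of leakage in this toy model
print(static_power_watts(gf110_like))  # ~11.1 W, same transistor count and features

Same transistor budget, same functionality, lower leakage; that is the whole trick the article describes, just with made-up numbers.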
i5 2500K@ 4.5Ghz
Asrock P67 PRO3
P55 PRO & i5 750
http://valid.canardpc.com/show_oc.php?id=966385
239 BCLK validation on cold air
http://valid.canardpc.com/show_oc.php?id=966536
Almost 5GHz, air.
I wonder, if and when OC and custom versions appear, whether it would be worth upgrading from a single-PCB GTX 295 to one of these puppies.
The noise is what impresses me most, and the performance appears to be good too.
Must admit the extra VRAM will come in handy for me in GTA IV and EFLC!
John
Stop looking at the walls, look out the window
http://www.guru3d.com/news/sparkle-g...0-and-calibre/
Besides the Sparkle card, only the EVGA waterblock version is custom; the rest are, afaik, reference boards, though some might have modest OC clocks.
"No, you'll warrant no villain's exposition from me."
5-15W... and every chip comes binned to a different voltage, which results in differences...
Then again, the 580 has more SPs and is clocked higher... so even at the same temperature it consumes less, with more logic enabled at higher clocks... so there really is an improvement chip-wise... cool...
Could also be the PWM... but I think it's identical for the 480 and 580, right?
Thanks for the link, and good job, Hardware Canucks!
Idk man... read the bit-tech review... at 2560x1600 with AA the 580 is notably faster and gets playable fps, especially min fps, while the 480 just doesn't cut it...
It's still "only" 20-30% up there as well, but it's the 20-30% that was missing to make it playable on a 480... so idk... for a 2560x1600 gaming rig a 580 sounds great... you no longer need SLI or Crossfire...
The only thing is that availability right now is zero, and everybody claims it'll be bad at best... and ATI supposedly has a competing card around the corner... so even if I WANTED to upgrade, I'd HAVE to wait anyway, and by the time I could buy one, the 6900 will probably be out, heh...
Idk... do you think that makes a big difference?
I doubt those transistors were used a lot to begin with, otherwise Nvidia wouldn't have cut them off, 'cause they want more perf, not less, with GF110...
Not used a lot = don't add to power consumption...
Last edited by zalbard; 11-10-2010 at 08:44 AM.
Ryan said in the comments they didn't have two 5970s to use.
"No, you'll warrant no villain's exposition from me."
I'm not taking issue with your personal preference in regards to what you need/want/like; it's simply that the line about "people that run 2560x1600 need dual GPU" is an exaggeration, because we don't all "need" it, nor do I personally want to deal with a multi-GPU config to game at 25x16, even with an X58 motherboard and a Crossfire-ready AMD board.
If a game is good, it's just as enjoyable to me without high shadows or uber reflections, whether I'm playing a multiplayer game online or a single-player game.
Last edited by highoctane; 11-10-2010 at 11:51 AM.
Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
3x2048 GSkill pi Black DDR3 1600, Quadro 600
PCPower & Cooling Silencer 750, CM Stacker 810
Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
3x4096 GSkill DDR3 1600, PNY 660ti
PCPower & Cooling Silencer 750, CM Stacker 830
AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
2x2gb Patriot DDR2 800, PowerColor 4850
Corsair VX450
The Sparkle GTX 580 is at $509, and with the 10% promo code (it has one too) it costs $459.
http://www.newegg.com/Product/Produc...82E16814187125
$460 sounds much better...
i5 2500K@ 4.5Ghz
Asrock P67 PRO3
P55 PRO & i5 750
http://valid.canardpc.com/show_oc.php?id=966385
239 BCLK validation on cold air
http://valid.canardpc.com/show_oc.php?id=966536
Almost 5GHz, air.
Who is everybody? Charlie? Let's see what we have atm.
http://www.newegg.com/Product/Produc...iption=gtx+580
Nine models, all in stock at newegg.
http://ncix.com/search/?categoryid=0&q=gtx+580
Eight models, 5 in stock at NCIX. This is anything but the paper launch / "no cards till 2011" that Charlie claimed.
Lopping off 200M transistors is probably a lot simpler than what they actually did to Fermi. Maybe they found a way to do the same thing with less logic? I don't really know, and there is very little information available that can offer insight into that subject.
Furthermore, the number of gates is not a good measure of area, because speed affects transistor size. If Fermi were designed for half of the original clock speed, it could be half the size and an even smaller fraction of the power. There are so many other things I could list that they might have changed, but they would only go unanswered.
I have a hunch that the power circuit was improved. The ammeter and current limiter allow them to use a more efficient power circuit, because peak current will be much lower. Intel did something similar with Montecito: it has 64 clock speeds, which saved a lot of power when performance wasn't needed.
With leakage it does not matter whether the transistors are being used or not; the power is still being consumed. Dynamic power involves switching, which is affected by activity. I would bet that the majority of GF100's and GF110's power consumption comes from leakage, more so than for other chips. So much for "not used a lot = don't add to power consumption"...
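To put rough numbers on the dynamic-versus-leakage point, here is a minimal sketch; the voltage, clock, effective capacitance and leakage current are placeholder values I picked, not measured GF100/GF110 figures. The shape of the result is what matters: leakage stays constant regardless of load, while dynamic power tracks switching activity.

# Dynamic power scales with switching activity: P_dyn = a * C * V^2 * f
def dynamic_power(activity, c_eff_farads, voltage, freq_hz):
    return activity * c_eff_farads * voltage ** 2 * freq_hz

# Leakage is drawn whether or not transistors switch: P_leak = V * I_leak
def leakage_power(voltage, i_leak_amps):
    return voltage * i_leak_amps

V, F = 1.0, 1.4e9            # roughly Fermi-era shader voltage and clock (assumed)
C_EFF, I_LEAK = 1.2e-7, 80.0 # assumed effective capacitance (F) and leakage current (A)

for activity in (0.0, 0.2, 0.5):  # idle -> light load -> heavy load
    p_dyn = dynamic_power(activity, C_EFF, V, F)
    p_leak = leakage_power(V, I_LEAK)
    print(f"activity={activity:.1f}: dynamic={p_dyn:.1f} W, leakage={p_leak:.1f} W")

At activity 0.0 the dynamic term is zero but the (assumed) 80 W of leakage is still there, which is exactly the "used or not, the power is still being consumed" argument above.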
Somebody did.
http://nvision.pl/GeForce-GTX-580-GF...etails-18.html
EVGA has announced 4 models (2 OC models in reality; the packaging differs):
* EVGA GeForce GTX 580: 772MHz core, 1544MHz shaders, 4008MHz memory
* EVGA GeForce GTX 580 Superclocked: 797MHz core, 1594MHz shaders, 4050MHz memory
* EVGA GeForce GTX 580 Call of Duty: Black Ops Edition: same as the Superclocked but with a CoD: Black Ops style fan shroud and a poster. This model does not include the actual game.
* EVGA GeForce GTX 580 FTW Hydro Copper 2: 850MHz core, 1700MHz shaders, 4196MHz memory
Pricing:
1. GTX 580 - 479.90 EUR
2. GTX 580 Superclocked - 495 EUR
3. GTX 580 Call of Duty: Black Ops Edition - 499.90 EUR
4. GTX 580 FTW Hydro Copper - 695.90 EUR (lol at the +100 EUR for the waterblock; get an EK or another block instead)
Last edited by Lanek; 11-10-2010 at 12:18 PM.
CPU: - I7 4930K (EK Supremacy )
GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
Motherboard: Asus x79 Deluxe
RAM: G-skill Ares C9 2133mhz 16GB
Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0
My apologies if I have missed it, but has the "Evga GTX 580 FTW Hydro Copper" been reviewed yet?
I have read it's not yet released.
Intel i7 2600K 5GHZ Watercooled. 2x Asus DirectCU II TOP GTX670 SLI @1250/7000/Watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
8GB Corsair 2000MHz RAM. 4x OCZ Vertex3 120GB SSD. 3x Samsung F1 1TB. All in a Lian Li Tyr PC-X2000 chassis. Logitech diNovo Edge keyboard,
MX Revolution mouse and Z-5500 Digital 5.1 speakers. Corsair HX-1200W PSU. Samsung 244T 24" + 3x Philips 24" in nVidia Surround