NVIDIA GeForce GTX 580 TDP is 244W, includes 128 TMU, Benchmarks Leaked
http://vr-zone.com/articles/report-n...ked/10202.html
It wasn't as bad as one would think, though, at least for the 6870.
http://www.legitreviews.com/article/1455/9/
A hair below the GTX 470 and slightly above the 5850 is where I would put the speed of a 6870.
The 6850 is an anomaly, however, which I imagine will change with future drivers.
Core i7 920 @ 4.66GHz (H2O)
6gb OCZ platinum
4870x2 + 4890 in Trifire
2*640 WD Blacks
750GB Seagate.
Where is ZED_X to add some joy to this thread?
I'm sure he's already got a dual-580 GTX 595 sitting in his lab and wants to tell us all tales of AMD's doomsday...
Gigabyte Z77X-UD5H
G-Skill Ripjaws X 16Gb - 2133Mhz
Thermalright Ultra-120 eXtreme
i7 2600k @ 4.4Ghz
Sapphire 7970 OC 1.2Ghz
Mushkin Chronos Deluxe 128Gb
Yep, especially coming from a guy who kept trumpeting the propaganda of a certain nVidia fanboy from the Czech Republic, like this:
BRAVO !
Have you ever heard of a thing called GOODWILL? AMD still has some in the tank; not so sure about nVidia, given the renaming game. You want to shed a bad light on AMD? At least let it happen first; accusation can only take you so far. Calling out other members as fanboys of a certain side is very much the pot calling the kettle black.
I don't think Cayman XT will be at GF110's level TDP-wise; my guess is around 210-220 W. They put on an 8-pin PEG connector, supporting 300 W max board power, perhaps for OCing headroom, or because of a less efficient VRM design (compared to the Cypress board) which doesn't use digital VRMs again, judging from the leaked PCB shot.
And we already know that nVidia's definition of TDP for the Fermi GF100 is quite, hmm, what should we call it? Different? The last leak suggested a 244 W TDP for the GTX 580; we have to see first whether it's like the GTX 480's fake 250 W TDP or the more accurate 160 W TDP of the GTX 460.
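For context, the 300 W figure quoted above falls out of the PCI-SIG power-delivery limits: 75 W from the x16 slot, 75 W per 6-pin plug, and 150 W per 8-pin plug. A minimal sketch of that arithmetic (the function name is just for illustration):

```python
# PCI-SIG board-power limits of the era:
# 75 W from the x16 slot, 75 W per 6-pin plug, 150 W per 8-pin plug.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_limit(six_pins: int, eight_pins: int) -> int:
    """Maximum spec-compliant board power for a given power-plug layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# A 6-pin + 8-pin layout (as on the leaked Cayman PCB) allows 300 W:
print(board_power_limit(six_pins=1, eight_pins=1))  # 300
# Two 6-pin plugs (the Cypress reference layout) cap out at 225 W:
print(board_power_limit(six_pins=2, eight_pins=0))  # 225
```

This is why a 244 W TDP card needs at least a 6-pin + 8-pin layout to stay within spec.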
If the numbers are true, then it will fall close to what they had to work with on the GTX 295. But we were given a lot of power-usage numbers for the 480 before nVidia came clean with the final Tesla PDFs.
Nvidia engineers are willing to push the edges more than ATI when it comes to power, however; if they feel threatened, they may just do it anyway. I'm pretty sure they've already done some work towards such a card, they just won't sell it to OEMs. Let's not forget, ATI has already allowed card makers to follow this path on their own.
I guess it's something you only do when you're sure the effort will result in good market share; otherwise you could just sit back, try not to lose too much money, and strike back on the next node.
Heh, TDP of 244 W, GTX 480 had TDP of 225 W.
If it translates to similar power consumption, then GTX 580 will draw some 240 W at load, peak at about 270 W and max over 320 W.
IIRC, the GTX 480's TDP is officially 250 W, though we know a leaked slide mentioned a <299 W TDP, which is closer to its actual worst-case power consumption.
Judging from the leaked cooler pic and the way it looks and is designed, I think the GTX 580's leaked TDP of 244 W seems more legit this time, at least not overshooting as much as its predecessor. But expecting an official, PCI-SIG-compliant dual-chip board out of it from nVidia is rather too optimistic at the moment.
Last edited by spursindonesia; 11-02-2010 at 05:46 PM.
Two generations in a row????? No way!!!
I hope on the 9th we get more than the world's fastest excel graph...
Nvidia can do a paper launch IF they give all the outlets, a la AnandTech, samples for full reviews. If Nvidia makes me wait and the 6970 comes out first, I see no reason to even consider a 580.
The only thing I'm iffy on is ATI's drivers, which have been less than marginal for a long time.
According to whom? The hordes of nvidia die hards?
ATI fanboys love to make fun of specific things about nvidia, aka TDP, lateness to market, etc. But nvidia boys go to much greater lengths to make themselves special. Countless whine threads about ATI drivers; the self-proclaimed driver gods are nvidia fans. A lot of the time, the thread starters even post their sigs, which say they own not one but several nvidia cards. They whine about ATI while they own nvidia!
I'll tell you what: if you buy a single Cayman and find driver bugs significant enough to make fun gaming impossible for you, I will gladly buy that Cayman off you. Crossfire and SLI both carry bugs; it's just that ATI boys are a little more accepting of that fact than nvidia boys.
PS: now watch as I soon get a reply saying that I'm wrong because ATI drivers suck and nvidia's are awesome.
Last edited by Dimitriman; 11-02-2010 at 06:54 PM.
Intel i7 2600K 5GHz watercooled. 2x Asus DirectCU II TOP GTX 670 SLI @ 1250/7000, watercooled. Asus Maximus IV Extreme. PCI Express X-Fi Titanium Fatal1ty Champion Series.
8GB Corsair 2000MHz RAM. 4x OCZ Vertex 3 120GB SSD. 3x Samsung F1 1TB. All in a Lian Li Tyr PC-X2000 chassis. Logitech diNovo Edge keyboard,
MX Revolution mouse and Z-5500 Digital 5.1 speakers. Corsair HX-1200W PSU. Samsung 244T 24" + 3x Philips 24" in nVidia Surround.
Honestly, I think nVidia has given up the fastest-graphics-card title this round, but not the fastest SINGLE-GPU title, certainly not without a good fight.
Putting two GF114s together seems futile if the card would be compared against the dual-Cayman board named Antilles; that wouldn't shed the best light on nVidia. For the king-of-the-hill title, it's number one or nothing at all, because the stakes are high and there's no second-place winner for such a halo card.
Unlike the old days, when both the HD 4870X2 and 3870X2 used two midrange chips and were clearly aimed at nVidia's single huge GPU, this time AMD will use two high-end Cayman chips to power its dual-GPU board in taking (preserving) the numero uno crown. Either nVidia builds a PCI-SIG-compliant dual-GF110 card of their own and ends up uncompetitive, or they let the AIBs do the dirty work of creating a nuclear-reactor card.
nVidia will try to maintain the status quo, GTX 580 > Cayman XT, while covering up the existence of Antilles as much as they can.
Last edited by spursindonesia; 11-02-2010 at 07:42 PM.
I was asked by people here to translate his blog, because I speak the language...
Sure, he's biased towards NV, everybody knows that, but he got most of the things right. And don't forget, that AMD tried to find the leak this time, so the false info was coming from them.
I hope NVIDIA will allow their partners to make their own coolers, because I never liked reference coolers; they usually run warmer. I'd like to see a Gigabyte Windforce2 GTX 580.
AMD Threadripper 12 core 1920x CPU OC at 4Ghz | ASUS ROG Zenith Extreme X399 motherboard | 32GB G.Skill Trident RGB 3200Mhz DDR4 RAM | Gigabyte 11GB GTX 1080 Ti Aorus Xtreme GPU | SilverStone Strider Platinum 1000W Power Supply | Crucial 1050GB MX300 SSD | 4TB Western Digital HDD | 60" Samsung JU7000 4K UHD TV at 3840x2160
Accurate? His predictions are all over the place. First he says there will be no availability, then good availability for the 6870 but not the 6850. Then he says no news until next year, but we know Cayman is still launching this year.
According to Newegg, 11 of 14 listed 6800-series cards are available. According to Prisjakt.nu (a Swedish price search engine), availability of the 6850 is quite good. Availability at launch, at least in Sweden, was good (much better than the 5800 launch).
OBR is not what I would call a good source..