There's a difference, but not a big one at same temperature.
http://www.hardwarecanucks.com/forum...review-22.html
I think people are tired of the name game already. With 15-20% extra performance and basically no new features, not even HD audio bitstreaming support like on GF104, the GF110 hardly deserves a 500-series moniker. Yet most people seem to just let it pass because it's Nvidia and that's what they do.
As for the 5970, I'm inclined to agree that it's not as good a solution as a single GTX 580. Crossfire support is flaky at best, and minimum framerates suck according to reviews. The fact of the matter is that it's outdated now, soon to be replaced, and not a good buy unless it comes at a considerably lower price. I see it can be had for 390€ in Germany, while the cheapest GTX 580 is around 450€. Now that's a price low enough for a 5970 to make you reconsider buying a GTX 580. And I suppose that's the idea too: AMD dropped the price on a few SKUs just to make the GTX 580 look worse, even though the 5970 stock will be sold out in no time.
Nice performance. I'll wait for some 580 vs 6970 numbers.
Was there really an official transistor count? Last I heard they were both ~3 billion, and Nvidia didn't give a more specific number. And since the GTX 580 didn't drop any features afaik, what exactly was in those now supposedly missing transistors?
I'm not sure about the details at the micro level, but I guess they have taken out some of the Tesla-like functions. Those functions in the original GF100 were meant to come in handy for GPGPU, but then it got hot and power hungry. I'm not sure though, and I'm trying to find out more about it.
I don't think they cut any functions; they've just reorganized the transistor arrangement. They use a third type of transistor, and they used fewer transistors because some of them were in excess... I mean I think they cut some leaky transistors.
http://images.anandtech.com/reviews/...transistor.jpg
http://www.anandtech.com/show/4008/n...orce-gtx-580/3
Quote:
Thus the trick to making a good GPU is to use leaky transistors where you must, and use slower transistors elsewhere. This is exactly what NVIDIA did for GF100, where they primarily used 2 types of transistors differentiated in this manner. At a functional unit level we’re not sure which units used what, but it’s a good bet that most devices operating on the shader clock used the leakier transistors, while devices attached to the base clock could use the slower transistors. Of course GF100 ended up being power hungry – and by extension we assume leaky anyhow – so that design didn’t necessarily work out well for NVIDIA.
For GF110, NVIDIA included a 3rd type of transistor, which they describe as having “properties between the two previous ones”. Or in other words, NVIDIA began using a transistor that was leakier than a slow transistor, but not as leaky as the leakiest transistors in GF100. Again we don’t know which types of transistors were used where, but in using all 3 types NVIDIA ultimately was able to lower power consumption without needing to slow any parts of the chip down. In fact this is where virtually all of NVIDIA’s power savings come from, as NVIDIA only outright removed few if any transistors considering that GF110 retains all of GF100’s functionality.
I wonder, if and when OC and custom versions appear, whether it would be worth upgrading from a single-PCB GTX 295 to one of these puppies.
The noise is what impresses me most and the performance appears to be good too :)
Must admit the extra VRAM will come in handy for me in GTA IV and EFLC!
John
http://www.guru3d.com/news/sparkle-g...0-and-calibre/
Besides the Sparkle card, only the EVGA waterblock version is custom; the rest are afaik reference boards, though some might have meager OC clocks.
5-15W... and every chip comes binned to a different voltage which results in differences...
then again, on the other hand the 580 has more sps and is clocked higher... so even at the same temperature it consumes less, with more logic enabled at higher clocks... so there really is an improvement chip-wise... cool...
could also be the pwm... but i think its identical for 480 and 580 right?
thx for the link, and good job hwcanucks! :toast:
idk man... read the bittech review... at 2560x1600 with aa the 580 is notably faster and gets playable fps, especially min fps, while the 480 just doesnt cut it...
its still "only" 20-30% up there as well, but its the 20-30% that was missing to have it playable with a 480... so idk... for a 2560x1600 gaming rig a 580 sounds great... you no longer need sli or xfire...
the only thing is that availability right now is 0, and everybody claims itll be bad at best... and ati has a competing card around the corner, supposedly... so... even if i WANTED to upgrade, id HAVE to wait anyways, and by the time i could buy it, the 6900 is probably going to be out heh...
idk... do you think that makes a big diff?
doubt those transistors were used a lot to begin with, otherwise nvidia wouldnt have cut them off, cause they want more perf, not less with gf110... :D
not used a lot = dont add to power consumption...
Ryan said in the comments they didn't have two 5970 to use.
I'm not taking issue with your personal preference in regards to what you need/want/like, simply with that line about "people that run 2560x1600 need dual GPU". It's an exaggeration, because we don't all "need" it, nor do I personally want to deal with a multi-GPU config to game at 25x16, even with an X58 board and a CrossFire-ready AMD board.
If a game is good, it's just as enjoyable to me without high shadows or uber reflections, whether I'm playing a multiplayer game online or a single-player game.
Sparkle GTX 580 at $509, and with the 10% promo code (it has one too) it costs $459.
http://www.newegg.com/Product/Produc...82E16814187125
460$ sounds much better...:)
Who is everybody? Charlie? Let's see what we have atm.
http://www.newegg.com/Product/Produc...iption=gtx+580
Nine models, all in stock at newegg.
http://ncix.com/search/?categoryid=0&q=gtx+580
Eight models, 5 in stock at ncix. This is anything but the paper launch/no cards till 2011 charlie claimed.
lopping off 200M transistors is probably a lot simpler than what they actually did to fermi. maybe they found a way to do the same thing with less logic? i dont really know and there is very little information available that can offer insight into that subject. furthermore, the number of gates is not a good measurement of area because speed affects transistor size. if fermi were designed for half of the original clockspeed it could be half the size and an even smaller fraction of the power. there are so many other things i could list that they might have changed but they would only go unanswered.
i have a hunch that the power circuit was improved. the ammeter and current limiter allow them to use a more efficient power circuit because peak current will be much lower. intel did something similar with montecito: it has 64 clockspeeds, which saved them a lot of power when performance wasnt needed. with leakage it does not matter if the transistors are being used or not, the power is still being consumed. dynamic power involves switching, which is affected by activity. i would bet that the majority of gf100 and gf110's power consumption comes from leakage, and more so than for other chips.
Quote:
not used a lot = dont add to power consumption...
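Since the leakage-vs-dynamic-power point keeps coming up, here's a minimal back-of-envelope sketch of that argument in Python. All the numbers (leakage wattage, effective switched capacitance, activity factors) are made up purely for illustration and are not real GF100/GF110 figures; the only point is that the leakage term is burned whether the transistors switch or not, while the dynamic term scales with activity, voltage squared and clock.

```python
# First-order CMOS power model: total = leakage + dynamic,
# where dynamic = activity * C_effective * V^2 * f and leakage is activity-independent.
# All figures below are illustrative placeholders, not measured GF100/GF110 data.

def chip_power(p_leakage_w, activity, c_eff_farads, voltage_v, freq_hz):
    """Return (leakage W, dynamic W, total W) for a crude whole-chip estimate."""
    p_dynamic = activity * c_eff_farads * voltage_v ** 2 * freq_hz
    return p_leakage_w, p_dynamic, p_leakage_w + p_dynamic

# Same hypothetical chip, light vs. heavy switching activity:
for label, activity in (("light load", 0.05), ("heavy load", 0.60)):
    leak, dyn, total = chip_power(p_leakage_w=90.0,       # assumed static/leakage power
                                  activity=activity,
                                  c_eff_farads=3.2e-7,    # assumed effective switched capacitance
                                  voltage_v=1.0,
                                  freq_hz=772e6)
    print(f"{label}: leakage {leak:.0f} W + dynamic {dyn:.0f} W = {total:.0f} W")
```

Under that kind of split, swapping leaky transistors for slower cells where speed isn't needed (as the Anandtech quote above describes) attacks the one term that reducing activity can't touch.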
Somebody did.
http://nvision.pl/GeForce-GTX-580-GF...etails-18.html
EVGA has announced 4 models (2 OC models in reality; the packaging differs):
* EVGA GeForce GTX 580: 772MHz core, 1544MHz shaders, 4008MHz memory
* EVGA GeForce GTX 580 Superclocked: 797MHz core, 1594MHz shaders, 4050MHz memory
* EVGA GeForce GTX 580 Call of Duty: Black Ops Edition: same as the Superclocked but with a CoD: Black Ops style fan shroud and a poster. This model does not include the actual game.
* EVGA GeForce GTX 580 FTW Hydro Copper 2: 850MHz core, 1700MHz shaders, 4196MHz memory
Pricing:
1. GTX 580 - 479.90 EUR
2. GTX 580 Superclocked - 495 EUR
3. GTX 580 Call of Duty: Black Ops Edition - 499.90 EUR
4. GTX 580 FTW Hydro Copper - 695.90 EUR (lol at the +100 EUR for a waterblock; get an EK or another block instead)
My apologies if I have missed it, but has the "Evga GTX 580 FTW Hydro Copper" been reviewed yet?
I have read it's not yet released.
The 6870 is a very slow card compared to the 580,
so 2x 580 will suffer from a lack of CPU power.
But despite this, 580 SLI is doing very well :clap::clap:
But what happens to AMD here??
It is quite unplayable on AMD hardware :confused::confused:
[IMG]http://img.photobucket.com/albums/v4...E30M/33815.png[/IMG]
Maybe I should run the benchmark to confirm that... 2.8 fps looks like too low a number.
Something does not add up on this slide. 5870 CF getting lower FPS than a single card? Driver bug perhaps?
I am busy now, but I will attempt to have a benchie done by tomorrow morning on a 5970 at several resolutions; heaps of those numbers look odd.
Sounds like Newegg is very greedy. Shame on them...
Having shadows on etc. is in most cases the difference between being killed by the guy behind you or killing him. I like to play with max details for this reason, and like I said, if I wanted anything less I would play games on a console. To get the full experience of any game at 2560x1600 you need a dual-GPU configuration, and anything less is unacceptable.
No, it has not been reviewed yet. Most review sites are too chicken to touch water-cooled cards. It does look attractive at an 850MHz core clock and 1700MHz shader clock.
You can buy it off evga's site though
http://www.evga.com/products/prodlist.asp
ASUS GTX 580 SLI Review
http://www.overclock3d.net/reviews/g...usive_review/1
Is Hardwareheaven.com credible?
At the bottom of the page they post videos of the card's fans noise.
The GTX580 and GTX580 SLI videos are EXACTLY the same. :confused:
Load both of them up first, then try to start them at almost exactly the same time (I got very close :D). You will see they are the exact same video. Someone ought to tell them they made a booboo.
I've read the language of the article and it does have that alluring NV bias to it.
Hope they fix those vids before someone cries foul. ;)
I don't know if they can be trusted; the only problem I have seen was with F1, which they were not able to run in CFX... but here are my results with CFX in F1 2010 with 10.10a + CAP (old bench screenshot, without 10.10c and the new CAP)... all in-game settings maxed: 16xAF + 8xMSAA - DX11 ultra settings... 1920x1080
They call that a driver problem? They don't even know how to set it up, or they haven't used the latest drivers... (i7 @ 4GHz+ anyway)
http://img179.imageshack.us/img179/6/f12010bench.png
It is possible someone got burned by an OC :shakes: :rofl:
If you see unreal performance drops, lags and weird behavior from your PC, it may be overclocked too high.
Anyway, I call that overclock data loss: the card is repairing data with more cycles than usual. Maybe raising the PCI-e voltage will help, maybe the GPU voltage, maybe the graphics memory :shrug:
Off topic:
I know you people haven't played a real game that sledgehammers the FPU/math coprocessor at 100%, where even a slight OC makes the game crash (it's not that the game is badly written).
Anyway, I wrote this: Tips for quality gaming + Networks And Windows Tips
for vets (old players) in the game > That makes me a master too, not just an admin :D :p:
http://www.bz2maps.com/phpBB/viewtopic.php?t=1597
Does a 15% performance update justify upping the generation number by a whole digit?
Either the evolution of GPU performance has *really* slowed lately, or we're dealing with fraud here (not a first for nVidia, I'm sure).
How come they can get away with it? Computer hardware is a multi-billion dollar industry; such tricks can make a great difference (the card's real name is GTX 485).
my bad, i stand corrected!
but no, he wasnt the only one that said availability will suck... several others have mentioned that theres supposedly a small batch of cards for launch and thats it... supply after that will be bad or non-existent.
in the end, who cares though, cards ARE available right now, you're right... and thats what matters... whoever wants one can get it. thats good news, and really well done by nvidia, gotta give them that :toast:
i didnt expect this... they really did a 180 after their fermi pre-pre-pre launch bs and delays...
thx! :toast:
maybe gf100 had dividers in place to adjust the clocks of individual segments of the chip? and before gf100 launched they had already figured out which ratios work best and in gf110 they cemented the ratios and could remove buffers and dividers?
and maybe the reworked power planes allowed them to move things around a bit, avoiding hotspots which in turn allowed them to pack transistors more tightly?
too bad we will never know... this is the kind of stuff id love to know, and its a shame that by the time its not sensitive information anymore, its just forgotten instead of made public... would be such a nice read...
sweet, thx! :toast:
too bad they didnt run more tests... too bad theres no way to limit a 580 to 480sps, looks like we will have to wait for a 570 for that... clock for clock a 580 is around 8% faster than a 480? and it has more sps which would result in a 6-8% boost, so... it looks like there is close to 0 ipc boost?
the boost seems to be sps and clocks only, right?
hmmm but that wouldnt explain the bittech numbers... they see huge gains at 2560x1600 of the 580 over the 480...
how can a 10% clock boost and 7% more sps result in some cases in a performance boost of more than double those theoretical gains combined?
so either gf110 is tweaked for 2560x1600 and/or aa, or maybe bittech used older 480 numbers of an older driver and compared it to a 580 with the latest driver? but i cant imagine them doing that...
hmmm
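For reference, the theoretical gain from units and clocks alone is easy to sanity-check. A quick sketch, assuming the commonly cited specs (480 SPs @ 1401MHz shader for the GTX 480, 512 SPs @ 1544MHz for the GTX 580) and treating shader throughput as SP count times shader clock to a first approximation:

```python
# Theoretical GTX 480 -> GTX 580 shader-throughput gain from unit count and clock alone.
gtx480_sps, gtx480_shader_mhz = 480, 1401
gtx580_sps, gtx580_shader_mhz = 512, 1544

sp_gain    = gtx580_sps / gtx480_sps - 1                 # ~ +6.7%
clock_gain = gtx580_shader_mhz / gtx480_shader_mhz - 1   # ~ +10.2%
combined   = (1 + sp_gain) * (1 + clock_gain) - 1        # ~ +17.6%

print(f"SPs: +{sp_gain:.1%}  shader clock: +{clock_gain:.1%}  combined: +{combined:.1%}")
```

So anything consistently above ~18% at 2560x1600 with AA has to come from somewhere other than raw shader throughput: memory clock, the culling/full-speed-FP16 tweaks mentioned later in the thread, or simply different driver versions between the two test runs.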
Product naming is pure marketing and both companies are doing what every one else would do (this time that is).
Both HD68xx and GTX580 have new bits introduced to them. Just to name few:
GTX580 - better culling, full speed FP16, other minor tweaks.
HD68xx - new UVD, improved display output capabilities, improved tessellation performance, other minor tweaks.
Is that enough for new generation moniker? I think it is, at least in current realities of I.T.
if we had a poll, i think we would see a majority choose 485 over 580
however since there is nothing really wrong with picking a new number, of course marketing would suggest they go with a new number to remove any implications of the old issues (lack of cores, high tdp)
If they named it 485 what would they name all the other incoming GF11x parts?
so theres plans for more than just a 570 in the upcoming months till 28nm?
when is a 560 coming?
There were a few cards in Norway at launch, but there are no GTX 580s to be found in stock now!
With AMD's back-out on the 6990 and recent news about delays of the 6870 & 6850, it won't surprise me if we get an artificial shortage of the GTX 580 (the same kind of shortage as with the 5970 that made the "real" price go up and up).
Are real prices (for in-stock GTX 580s) holding, or going up?
Sorry guyz, have you seen a review where the GTX 580 and GTX 480 use the same thermal solution?
manicdan, some are already speculating that gf104 384 will be a 5xx part... wouldnt be surprised to see that happen...
Isn't it the case now that if you want to save a little power, you go AMD, and if you want to run fast and have the best of the best, it's nVidia? I just think they're making too much out of a few watts. It feels a bit like you'd rather run a little sh.t car that gets slightly better mileage than a high-speed luxury car that uses slightly more fuel.
if you put the same cooler on both cards, you can find out how much better the process has improved the efficiency.
and power turns to heat
heat needs to be removed
air movement turns to fan noise
noise turns to annoyance
annoyance turns to PC-out-window
its also just good to see how far nvidia has come with their learning of the 40nm process. people wanted to see improvement from 480-580, not amd vs nvidia.
dude you can't talk crap about his rig, if the specs in his sig are true it's a total beast. when you have that kinda money you expect massive performance and 480 SLI delivers the best performance (and game support) for the price he paid.
you would rather buy a PhysX card? let me remind you that PhysX cards were over $200 and offered much less PhysX performance compared to an $80 GT430 (or a comparable older card). Nvidia also increased the number of games that support PhysX from 2 to like 8 (thats a 4x increase :p:). the ageia cards were not designed very well and the price was high. I do however think that Nvidia should make it easier to integrate a dedicated PhysX GPU into an ATI system (without hacks etc)
While that's true, when there is even decent availability the 2 big local etailers always get some stock. It might be just that they are waiting on shipments, or it could be that Nvidia had limited stock and shipped it towards Newegg and a few other large etailers to make it appear as if there is good supply. The card definitely looks very good though, despite being GF100 done right. Hopefully competition will drive prices from both companies down in the next few months. Getting a 6950 or 570 at 250-300€ would be nice.
Let's not forget that while certainly quite efficient, Physx isn't "free" on Nvidia cards.
It will reduce framerates. Whether you have frames to spare is debatable, but it does reduce overall performance.
You can but the joke needs to be funny :p:
lol I went onto ebay and found an old school PCI Physx card for $250 US, I made the guy a counter offer of $80 US, he answered that...
"those things retail for hundreds!! check out this link!"
which took me to a site that sells obsolete hardware at sky high prices... I loled so hard all I could do is wish him luck with the sale..
:D
......................
More like it annihilates overall performance. I remember seeing 30ish FPS with a GTX 480 on freaking BATMAN, which is a game you can maximize on a 8800GT. SWEAR TO ME!!!! Yeah, not 30 FPS all day, just on certain places, but come on, the card is GTX 480 and the game is Batman.
Flying paper sheets and the worst fog effect I have ever seen in my life (yeah, it's interactive fog... which means it kind of moves when you go through it, which is unfortunate because then you get to see the worst fog effect ever IN ACTION), is it worth it to halve the performance?
Still on i7 920s I see. Getting a little long in the tooth, plenty of 32nm out there already. I'd loan you some but I use my stack of 980Xs as space heaters.
Randomly trolling accomplishes nothing as I have just demonstrated. Troll another forum if you must troll. I highly suggest 4Chan's b/.
Both GTX 480 & GTX 580 clocked at 701/924. The game tested was Metro 2033
at a resolution of 1920x1200 (also 1280). At 1920:
580: 39.33 FPS
480: 38.00 FPS
This is a difference of 1.33 FPS. I only wish they had tested more DX11 games.
source
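Expressed as a relative gain rather than raw FPS, that clock-for-clock result is small; a trivial check of the quoted numbers:

```python
# Relative clock-for-clock gain from the quoted Metro 2033 numbers at 1920x1200.
gtx580_fps, gtx480_fps = 39.33, 38.00
print(f"GTX 580 ahead by {gtx580_fps / gtx480_fps - 1:.1%} at matched clocks")  # ~3.5%
```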
fermi-done-right versus amd/ati's biggest and best chip ever -- a battle for the ages
Newegg has really gone to hell over the past couple years. They used to have the lowest prices online, and were typically 25% below what I'd pay if I walked into Fry's Electronics, MicroCenter, or CompUSA (when they had stores). Sad to say, but I usually drive 15 miles to MicroCenter in Tustin, CA to buy most computer parts since most of their prices are easily on par with the online e-tailers, plus I don't have to wait for UPS to come. They even price their CPU's according to the best prices at the internet stores. I paid $199 for my i7 860 a year ago, while Newegg is still $279 right this moment.
And FYI.... Microcenter isn't charging a dime over MSRP for this video card. :)
I'm quite surprised how nVidia (and ATi) even put up with retailers adding markups over MSRP to their products. A very common practice when drafting a contract between manufacturer/distributor and retailer is to have it contain a clause stating that authorized retailers are not permitted to charge over MSRP. In addition, contracts tend to contain rules about how much below MSRP (above dealer cost) a product can be sold. We did that at Oettinger GmbH (a VW and Audi tuner) in order to keep our nationwide dealer network from killing each other, and their own businesses if one shop had a distinct advantage because of low overhead costs. It also ensured we could prevent gray market goods from sneaking in as well as making sure parts were sold and installed via our dealers. Inexperienced body shops tended to do horrible jobs on body kit fit; even simple exhaust installs would wind up crooked.
http://www.techpowerup.com/134460/Di...ing-GPU-Z.html
Quote:
Disable GeForce GTX 580 Power Throttling using GPU-Z
NVIDIA shook the high-end PC hardware industry earlier this month with the surprise launch of its GeForce GTX 580 graphics card, which extended the lead for single-GPU performance NVIDIA has been holding. It also managed to come up with some great performance per Watt improvements over the previous generation. The reference design board, however, made use of a clock speed throttling logic which reduced clock speeds when an extremely demanding 3D application such as Furmark or OCCT is run. While this is a novel way to protect components saving consumers from potentially permanent damage to the hardware, it does come as a gripe to expert users, enthusiasts and overclockers, who know what they're doing.
GPU-Z developer and our boss W1zzard has devised a way to make disabling this protection accessible to everyone (who knows what he's dealing with), and came up with a nifty new feature for GPU-Z, our popular GPU diagnostics and monitoring utility, that can disable the speed throttling mechanism. It is a new command-line argument for GPU-Z, that's "/GTX580OCP". Start the GPU-Z executable (within Windows, using Command Prompt or shortcut), using that argument, and it will disable the clock speed throttling mechanism. For example, "X:gpuz.exe /GTX580OCP" It will stay disabled for the remainder of the session, you can close GPU-Z. It will be enabled again on the next boot....
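If you'd rather not type the switch by hand every session, a tiny script can launch it for you. A minimal sketch; the GPU-Z path below is an assumption, adjust it to wherever your copy actually lives:

```python
# Launch GPU-Z with the /GTX580OCP switch to disable GTX 580 power throttling
# for the current session (per the TechPowerUp post above; it re-arms on the next boot).
import subprocess

GPUZ_PATH = r"C:\Tools\GPU-Z.exe"  # hypothetical install location - change as needed

subprocess.run([GPUZ_PATH, "/GTX580OCP"], check=True)
```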
W1zzard is the man. period.
Almost all of Newegg's 580s have a 10% off promo and free shipping. Stupid, I know, but it still works out to MSRP.
Metro 2033, to put it kindly, has problems with drivers. All that tells you is how poorly optimized it is. The same thing happens with Fallout 3 and Oblivion... they should rename their exe to FalloutNV.
doesnt seem all that much better in SLI than the 480s in SLI, wonder if ATI's 69XX series will rape the 580??
That's not talking crap? I just thought the typo of watercooled vs wattercooled was hilarious on a rig with 2 480's. I also know very well that those cards deliver, sadly at a high price but they still deliver.
As for the 8800GT for physx that was mainly a test and I took it out not long after since I think it's a waste of power. *removes from sig*
@Johnny87au :
I was thinking the same; wonder how much that will change with newer drivers.
Really love the fight nvidia & amd are having, giving us something interesting to look at and more hardware to play with :D
A little update from us, GTX 580 article number 3:
GTX 580 vs HD 5970 - The gamer's test | LAB501 | Google Translate
Performance tests in 21 games, with min/max/avg fps.
no 1600? spending 550-700$ on a 5970 to play at 1080/1200 is a bit weird imo
there are cards that cost 1/3 and do the job :D
with 8aa you CAN push those cards at 1080/1200, but it doesnt really make sense...
you're better off getting a 1600 display and playing with 4aa or even 2aa... better experience if you ask me :)
if you spend this much on a vga, spending less than that on a display is an unbalanced decision imo
interesting numbers nevertheless... at high aa the 580 is doing well... too bad theres no 480 in there to compare to
More than you would think. 2560x1600 monitors are still pretty expensive at $1200+. But I do agree, they do need to test at both resolutions. I guess it is somewhat understandable that they only tested at one because of the number of games.
Are 5970 still 700 dollars in romania?
i looked at all the average fps, quite a few were way past 60 average, but a decent chunk still were under or around 60, meaning 1920 is still very stressable.
but i do agree that if you're going to spend 500+ on a gpu, get a 1000$ monitor that will last for many generations.
however it might mean you need to spend more on gpus just to use that resolution.
(ps, im really mad that gateway stopped making their 30", when it was going away the price was 940$ with free shipping, nothing is even close to that anymore)
Guys, the launch review I made has 1920x1200 with 8x/2xAA and 2560x1600 with 4x/no AA. That is my type of review, which I consider much more accurate than this one we are talking about. This was a "gamer's review", so I had to choose the best-case scenario for these boards. Now as far as my Analytics statistics tell me, and as far as Steam statistics tell me, very few people have 30" monitors.
I thought about it like this:
1920x1200 with everything maxed out, UD7 + i950, HD 5970 or GTX 580 = typical high-end machine
2560x1600, UD9, i980X, 2x GTX 580 or more = extreme enthusiast machine
I simply had to choose the best scenario for these two cards, in order to be able to test 21 games. Also, this is a test for our readers, so they can say which review they like the most, this one or our initial GTX 580 one. My point in the end is that both reviews have the same result (GTX 580 vs HD 5970 is pretty much a draw), whether you test 6 or 21 games. The part I like the most about our first review is the fact that it is much more accurate, using 2 AA settings and 2 resolutions, no Fraps, and so on.
The thing is, both reviews take the same amount of time to work on if I make them like this. If I did the 21-games review with 2 resolutions, 2 AA settings and 3 cards, it would take an enormous amount of time we do not have. So this is pretty much a question of choice for our readers. Which one do you prefer: the accurate launch review with fewer games, or the 21-games review with only one setting and min/max/avg fps from Fraps, and so on?
wow wtf??? 30" prices went up by 30%??? when did that happen?
and what happened to the 27" 1600 panels?
but there are 13" 1600x900 panels for laptops... the disp industry is weird lol :confused:
well with 8aa... :D
monstru, you're right... good research!
i didnt notice that 1600 panels were almost dead... i really wonder what happened...
<200$ 1080
300$ 1200
>3500$ 1536 :stick:
1200$ 1600
:confused:
so the only thing above 1080/1200 is basically dual 1080/1200?
but that sucks for most games... 3 displays is much better than 2... but it still sucks and its so many pixels it will be hard to feed again...
sigh... this shows how disconnected videocard makers and the display makers are... or could it be that this is some weird transition phase and we will have very cheap 1600 panels soon? :confused:
both nvidia and ati really need that, cause a 580 really doesnt make much sense for 1080/1200... and neither will a 6900 or 5970 make sense...
http://www.evga.com/forums/tm.aspx?m=694378
Quote:
PALIT recalling their GTX 580 Sonic?
Dear customer
Thank you for purchasing Palit product.
We found few GPUs seem have unstable status in factory clock so we issue the recall notice.
Most of the GTX 580 Sonic products don't have such an issue but we rather to have more conservative consideration so we issue the recall notice.
Please contact your vendor to proceed further recall steps.
Thanks.
Palit Support
GeForce GTX 580 vs. Radeon HD 5970 2GB Performance
The HD 5970 delivers faster gaming framerates compared to the GeForce GTX 580. However, if you look at the actual usability of those frames the picture is a bit different. The GeForce GTX 580 allows a consistently higher level of gameplay experience compared to the Radeon HD 5970. We were able to game at higher settings with the GTX 580 than we were with the Radeon HD 5970. The most important factor, beyond framerates, is the visual quality and experience returned by the product. The GeForce GTX 580 allows a more immersive, smoother, and consistent quality of gameplay.
http://www.hardocp.com/article/2010/...gb_performance
Smoother gameplay, huh?.. I think I've seen it before...
if anything they should have reviewed with a triple 1080p surround/eyefinity considering buying 3 23inch 1080p monitors is still half as expensive as one 30inch...
and lets be honest if you have the money for a 1600p screen you should have the money and smarts to be running SLI GTX 470's or Crossfire 5870's at a MINIMUM. more like GTX 480 SLI with that kind of cash...
People avoid multi GPU setups for reasons other than cost. Being able to afford something doesnt make it an automatic purchase.
When the 5970 launched I purchased 2 5970s, but I sold the second one later for cash. The 480 was not the full Fermi, so I refused to buy it. The GTX 580 was late to the party, and it is not even a thought at this point because it would not be an upgrade; it performs the same as my year-old 5970.
If I use the drop down menu in your review on the translated page it changes languages. I don't see why you don't have an embedded translation link on your page similar to other non-English sites.
On to your point. Have you seen my sig? Why would I go out and buy 2 Nvidia cards when ATI's new cards (Cayman) are just around the corner and I would then be able to make a clear decision? I have the brains not to go out and waste my money. Who in their right mind would pay for a UD9 and 4 GTX 580s when SLI does not scale well beyond 2 cards? You basically did the same thing that Guru3D did 9 days before you with their GTX 580 review
http://www.guru3d.com/article/geforce-gtx-580-review/15
honestly there was no new information in your review.
+1 exactly, power consumption being one of those.