Thanks for the enlightenment cold2010 :)
http://www.expc.ca/i-17979-ASUS_ENGTX590_3DIS_3GD5.html
A bit too expensive...
E: posted before already... soz
This thread has degraded into a half-assed flame war, so I'll give a different view.
I think this generation of dual-GPU cards is a joke. Seriously. In the past, dual-GPU cards such as the 7950GX2, GTX 295, HD 4870x2, HD 5970, etc. have always stayed fully within the PCI Express power spec, and they've always pushed the envelope while doing it. All of those solutions were "extreme", but not one of them had major compromises. They're all power hungry, they all have beefy coolers, and they all use specialized bridge chips to communicate between GPUs. It's the stuff we've come to expect.
Now, this generation, it looks like we're going to get two flavors of dual-GPU solution: either quiet and less powerful (GTX 590) or very loud and more powerful (HD 6990). It's sort of like choosing between the Republican and Democratic parties; both are fighting for your vote, but on different topics and platforms. I, for one, hate it. Why? Because by the sound of it, the competition is no longer about who can build the "best card" but about who can build the "best solution". This market has reached a point where the technology is pushed to such insane levels that we have to choose which wave of "insanity" we wish to pursue, and each one comes with severe compromises. What a joke.
If this is where the future of GPUs is headed, I'll be retiring from PC gaming at the end of my system's life. I want no part of it. 375 watts from a video card?!?!?!? My big CCFL-backlit LCD monitor uses less energy, gives off less heat, and is more useful. Both manufacturers have lost their way, and if they don't change quickly, I have a feeling this market will begin shrinking. Nobody wants to pay for a dustbuster, and nobody wants to pay for a sub-par "top end" video card either. When single-GPU variants are approaching the 300-watt cap, you can't make a dual-GPU solution without giving something up. I welcome the advent of vapor chambers as the successor to large heatpipe solutions. That's wonderful. But does that mean we can now build GPUs that are quieter? Or just GPUs that use more power and give off more heat?
Nvidia and AMD... you both lose. :down:
Nobody is forcing you to buy it.
No, they're not, but as somebody who loves to play PC games, I have to keep up with current technology through periodic upgrades every 3 to 4 years. Each time I've upgraded, the GPUs have given off a little more heat and used a little more power. A couple of times the coolers actually got quieter, so I thought they were headed in the right direction. When I saw the cooler on the 8800 GTX for the first time, I was like :shocked: . Now, that style of cooler is the norm on virtually every card from about $130 up.
This was a high-end GPU from 2002:
Geforce 4 Ti 4600
http://www.activewin.com/reviews/har...i4600board.jpg
This is a low-end GPU today:
Geforce GT 430 low-profile
http://www.takesontech.com/wp-conten...gt430-lg11.jpg
So what's the big deal? The cooling solution on the GT 430 is bigger, and I bet anything that the GT 430 draws more power than the Ti 4600 did 9 years ago. This is a $70 card we're talking about.
I mean, CPUs still only require very small HSF solutions to actually run; we put larger ones on by choice. What's Nvidia's and AMD's excuse?
GPUs have advanced many times more than CPUs over the same time period, so that's not a fair comparison. If you make a personal choice to only purchase products within a given price or power envelope, you would still see significant gains in performance each generation. It really shouldn't matter if the fastest products get hotter or louder as long as there is a wide range of products to choose from. Just pick the one that matches your needs!
So why is it that a $230 CPU can accept virtually any multi-GPU configuration and present virtually no bottleneck in gameplay? The Sandy Bridge i5s/i7s are very powerful and put out extremely high framerates if the GPU is fast enough to keep up.
Sorry, but I think GPUs are holding back current CPUs, and those CPUs are on dinky coolers too. In essence, a 300+ watt GPU is holding back a 95-watt CPU from reaching its full potential... something is wrong with that picture.
EDIT: Also, I'm aware of a GPU's compute capabilities. They are far greater than a CPU's. But until more programs come out that actually use the power of a GPU in a useful way (folding and GRID are neat, but that's about it), it's really not a good example of how GPUs are being used.
607 MHz?
Translated link:
http://translate.google.com/translat...590gtx-p3d3gd5
:shocked: Quote:
Over-voltage function supports performance improvements of over 38%
In addition to basic fan-speed control and burn-in testing, the MSI N590GTX-P3D3GD5 supports Afterburner's over-voltage function: with added voltage, the core clock can be raised from the 607MHz stock setting to 840MHz, overclocking headroom of up to 38%, which together with its advanced graphics technology gives players a smoother gaming experience.
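As a sanity check on the quoted figure, the "38%" is just the ratio of the claimed overclock to the stock core clock; a quick sketch:

```python
# Overclock headroom as a fraction of the stock clock.
# Clocks are the ones quoted above for the MSI N590GTX-P3D3GD5.
stock_mhz = 607
oc_mhz = 840

gain = (oc_mhz - stock_mhz) / stock_mhz
print(f"{gain:.1%}")  # -> 38.4%
```

So the marketing copy rounds 38.4% down to "over 38%".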
Taking a different view (i.e. games players = beta-testers for numeric co-processors),
these rumoured clocks are faster than those used in Tesla Fermis. So for all us cuda.elves,
this card may be an interesting upgrade for our fleets of dusty 295's.
:shrug:
Nvidia to launch GeForce GTX 590
Quote:
Nvidia will release its GeForce GTX 590 graphics chip on March 22 to take on AMD's recently released AMD Radeon HD 6990, according to industry sources. The AMD Radeon HD 6990 is priced at US$699.
Facing competition from Nvidia, AMD has begun to cut prices by 20-30% for a number of older models, including Radeon HD6870/6850/5870 and 5850. Nvidia followed suit, lowering prices for its GT220, GTS450, GTX460/465 also in a range of 20-30%.
Consequently, Asustek Computer has lowered the price of its Radeon HD5870 graphics card to NT$6,000 (US$203) recently from NT$14,000 in 2010.
Huum, is DigiTimes living in the past or what? The HD 5870/5850 are completely EOL, the GTX 465 is no longer produced, and the same goes for the other Nvidia cards listed..
840MHz? Not bad. Considering an overclocked 6990 doesn't match the performance of stock-clocked 580s in SLI in 95% of benchmarks, the GTX 590 might turn out to be faster after all.
Would be very eager to see: 6990 + 6970 VS GTX 590 + GTX 580.
You cannot do GTX590+GTX580. It's dual SLI (one 590) or quad SLI (two 590s) only.
Wrong. Full Fermi consumer chips like GF110 have full DP support. It's just that performance is artificially limited to 1/4 of a similar Tesla chip's. DP performance is still much faster than on older Nvidia chips.
DP is practically never used in games, and it isn't even used in GPGPU apps like Folding@Home. I'm actually considering putting 2-4 of these GTX 590s in my folding rig :cool:
590 OC ability depends on the cooling, on how much power you can draw from the PCI-E connectors, and on how well the circuitry handles it. With stock cooling it's going to be hard to keep an overvolted card cool even if it could draw enough power from the PSU. And the PSU's 12V rails need to be quite beefy if the amperage exceeds the limits considerably.
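For anyone tallying up the PSU side of this: the in-spec power budget is just the slot limit plus the auxiliary connectors, and dividing by 12V gives the current the rails must supply. A minimal sketch, assuming the standard PCI Express limits (75W slot, 75W per 6-pin, 150W per 8-pin) and the GTX 590's two 8-pin connectors:

```python
# In-spec board power budget from PCI Express limits (watts).
SLOT_W = 75    # PCIe x16 slot
PIN6_W = 75    # per 6-pin auxiliary connector
PIN8_W = 150   # per 8-pin auxiliary connector

def board_power_limit(n6=0, n8=0):
    """Maximum in-spec board power for a card with the given connectors."""
    return SLOT_W + n6 * PIN6_W + n8 * PIN8_W

limit = board_power_limit(n8=2)   # GTX 590: slot + two 8-pin = 375 W
amps_12v = limit / 12.0           # current the PSU 12V rails must deliver
print(limit, round(amps_12v, 2))  # 375 W -> about 31.25 A at 12V
```

That's before any overvolting; push the card past spec and the 12V draw climbs above that ~31A figure, which is why the rail headroom matters.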
So, Fermi does not have full DP support.... :shakes:
If you are able to devote $1600-3200 towards humanity's good, then awesome! Just hurry up with the orders; rumour has it there won't be a big supply of those... :rolleyes:Quote:
I'm actually considering putting 2-4 of these GTX590's to my folding rig :cool: