Page 28 of 42
Results 676 to 700 of 1028

Thread: NVIDIA GTX 595 (picture+Details)

  1. #676
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601
    Quote Originally Posted by cold2010 View Post
    Thanks for the enlightenment cold2010

  2. #677
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    247
    http://www.expc.ca/i-17979-ASUS_ENGTX590_3DIS_3GD5.html

    A bit too expensive...

    E: posted before already... soz

  3. #678
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    This thread has degraded into a half-assed flame war, so I'll give a different view.

    I think this generation of dual-GPU cards is a joke. Seriously. In the past, when we've gotten dual GPU cards such as the 7950GX2, GTX 295, HD 4870x2, HD 5970, etc., these cards have always been fully within spec of PCI-express power limitations. It's also always been done in a way that pushes the envelope. However, while all of the solutions listed have been "extreme", not one of them has had major compromises. They're all power hungry, they all have beefy coolers, and they all use specialized bridge chips to communicate between GPUs. It's stuff that we've come to expect.

    Now, this generation, it looks like we're going to get 2 flavors of dual GPU solutions. Either quiet and less powerful (GTX 590) or very loud and more powerful (HD 6990). It's sort of like choosing between the Republican and Democrat parties; both are fighting for your votes but on different topics and platforms. I, for one, hate it. Why? Because by the sound of it, the competition is no longer about who can build the "best card", but it's about who can build the "best solution". This market has now approached a point where the technology is pushed to such insane levels that we have to choose which wave of "insanity" we wish to pursue, and each one of them has severe compromises. What a joke.

    If this is where the future of GPUs is going, I'll be retiring from PC gaming at the end of my system's life. I want no part of it. 375 watts from a video card?!?!?!? My big CCFL-backlit LCD monitor uses less energy and gives off less heat, and it's more useful. Both manufacturers have lost their way, and if they don't change quickly, I have a feeling that this market will begin shrinking. Nobody wants to pay for a dustbuster, and nobody wants to pay for a sub-par "top end" video card either. When single GPU variants are approaching the 300-watt cap, you can't make a dual GPU solution without giving something up. I welcome the advent of vapor chambers as the successor to large heatpipe solutions. That's wonderful. Does that mean we can now build GPUs that are quieter? Or can we just build GPUs that use more power and give off more heat?
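    For what it's worth, that 375 W ceiling isn't arbitrary: it's just the PCI-Express connector power budget added up. A quick sketch, using the commonly cited PCIe CEM per-connector limits (figures from the spec as I remember it, not from this thread):

```python
# Rough sketch of the PCI-Express board-power budget under discussion.
SLOT_W = 75        # max draw through the x16 slot itself
SIX_PIN_W = 75     # max per 6-pin auxiliary connector
EIGHT_PIN_W = 150  # max per 8-pin auxiliary connector

def board_power_cap(six_pins: int = 0, eight_pins: int = 0) -> int:
    """Total in-spec board power for a card with the given aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# A dual 8-pin card (HD 6990 / GTX 590 style) tops out at exactly the
# 375 W figure being complained about:
print(board_power_cap(eight_pins=2))  # 375
```

    A single-GPU card with one 6-pin and one 8-pin, by the same math, caps out at 300 W, which is why the "300-watt cap" keeps coming up for single-GPU flagships.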

    Nvidia and AMD... you both lose.
    Last edited by Mad Pistol; 03-19-2011 at 12:57 PM.
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  4. #679
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Nobody is forcing you to buy it.

  5. #680
    Xtreme Enthusiast
    Join Date
    Dec 2009
    Location
    Burbank, CA
    Posts
    563
    Quote Originally Posted by Mad Pistol View Post
    This thread has degraded into a half-assed flame war, so I'll give a different view.

    I think this generation of dual-GPU cards is a joke. Seriously. In the past, when we've gotten dual GPU cards such as the 7950GX2, GTX 295, HD 4870x2, HD 5970, etc., these cards have always been fully within spec of PCI-express power limitations. It's also always been done in a way that pushes the envelope. However, while all of the solutions listed have been "extreme", not one of them has had major compromises. They're all power hungry, they all have beefy coolers, and they all use specialized bridge chips to communicate between GPUs. It's stuff that we've come to expect.

    Now, this generation, it looks like we're going to get 2 flavors of dual GPU solutions. Either quiet and less powerful (GTX 590) or very loud and more powerful (HD 6990). It's sort of like choosing between the Republican and Democrat parties; both are fighting for your votes but on different topics and platforms. I, for one, hate it. Why? Because by the sound of it, the competition is no longer about who can build the "best card", but it's about who can build the "best solution". This market has now approached a point where the technology is pushed to such insane levels that we have to choose which wave of "insanity" we wish to pursue, and each one of them has severe compromises. What a joke.

    If this is where the future of GPUs is going, I'll be retiring from PC gaming at the end of my system's life. I want no part of it. 375 watts from a video card?!?!?!? My big CCFL-backlit LCD monitor uses less energy and gives off less heat, and it's more useful. Both manufacturers have lost their way, and if they don't change quickly, I have a feeling that this market will begin shrinking. Nobody wants to pay for a dustbuster, and nobody wants to pay for a sub-par "top end" video card either. When single GPU variants are approaching the 300-watt cap, you can't make a dual GPU solution without giving something up. I welcome the advent of vapor chambers as the successor to large heatpipe solutions. That's wonderful. Does that mean we can now build GPUs that are quieter? Or can we just build GPUs that use more power and give off more heat?

    Nvidia and AMD... you both lose.
    I kind of feel like graphics have gone nowhere in the past 3 years; it's mainly because consoles are holding back games. GPUs today are pretty disappointing, from both Nvidia and AMD.

  6. #681
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Quote Originally Posted by trinibwoy View Post
    Nobody is forcing you to buy it.
    No, they're not, but as somebody who loves to play PC games, I must keep up with current technology with periodic upgrades every 3 to 4 years. Each time I've upgraded, the GPUs have given off a little more heat and used a little more power. A couple of times, the coolers actually got quieter, so I thought they were headed in the right direction. When I saw the cooler on the 8800 GTX for the first time, I was floored. Now, that style of cooler is the norm on virtually all cards from about $130 up.

    This was a high-end GPU from 2002:

    Geforce 4 Ti 4600 (image)

    This is a low-end GPU today:

    Geforce GT 430 low-profile (image)

    So what's the big deal? The cooling solution on the GT 430 is bigger, and I bet anything that the GT 430 draws more power than the Ti 4600 did 9 years ago. This is a $70 card we're talking about.

    I mean, CPUs still only require very small HSF solutions to actually run. We put larger ones on by choice. What's Nvidia's and AMD's excuse?
    Last edited by Mad Pistol; 03-19-2011 at 06:15 PM.
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  7. #682
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    GPUs have advanced many times more than CPUs in the same time period so that's not a fair comparison. If you make a personal choice to only purchase products with a given price or power envelope you would still see significant gains in performance each generation. It really shouldn't matter if the fastest products get hotter or louder as long as there is a wide range of products to choose from. Just pick the one that matches your needs!

  8. #683

  9. #684
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Quote Originally Posted by trinibwoy View Post
    GPUs have advanced many times more than CPUs in the same time period so that's not a fair comparison. If you make a personal choice to only purchase products with a given price or power envelope you would still see significant gains in performance each generation. It really shouldn't matter if the fastest products get hotter or louder as long as there is a wide range of products to choose from. Just pick the one that matches your needs!
    So why is it that a $230 CPU can accept virtually any multi-GPU configuration and present virtually no bottleneck in gameplay? The Sandy Bridge i5/i7s are very powerful and put out extremely high framerates if the GPU is fast enough to keep up.

    Sorry, but I think GPUs are holding back current CPUs, and those CPUs are on dinky coolers too. In essence, a 300+ watt GPU is holding back a 95-watt CPU from reaching its full potential... something is wrong with that picture.

    EDIT: Also, I'm aware of a GPU's compute capabilities. They are far greater than a CPU's. But until more programs come out that actually use the power of a GPU in a useful way (folding and GRID are neat, but that's about it), it's really not a good example of how GPUs are being used.
    Last edited by Mad Pistol; 03-19-2011 at 07:33 PM.
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  10. #685
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    607 MHz?
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  11. #686
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    442
    Quote Originally Posted by cold2010 View Post
    Translated link:
    http://translate.google.com/translat...590gtx-p3d3gd5

    Supports a voltage function with over 38% performance headroom

    In addition to basic fan speed control and burn-in testing, the MSI N590GTX-P3D3GD5 supports Afterburner's Super Voltage function: with added voltage, the 607MHz stock core clock can be raised to 840MHz, an overclock of up to 38%, giving players a smoother experience in demanding games.
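    The quoted "up to 38%" checks out against the two clocks in the blurb; simple arithmetic, nothing assumed beyond those numbers:

```python
base_mhz = 607   # stock GTX 590 core clock quoted above
oc_mhz = 840     # Afterburner Super Voltage target

# Relative overclock headroom:
gain = (oc_mhz - base_mhz) / base_mhz
print(f"{gain:.1%}")  # 38.4%
```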
    PII 965BE @ 3.8Ghz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600mhz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

  12. #687
    Registered User
    Join Date
    Nov 2007
    Posts
    9
    Taking a different view (i.e. gamesplayers = beta-testers for numeric co-processors),
    these rumoured clocks are faster than used in Tesla Fermis. So for all us cuda.elves,
    this card may be an interesting upgrade for our fleets of dusty 295's.


  13. #688
    Xtreme Enthusiast
    Join Date
    Jun 2010
    Posts
    588
    Nvidia to launch GeForce GTX 590

    Nvidia will release its GeForce GTX 590 graphics chip on March 22 to take on AMD's recently released AMD Radeon HD 6990, according to industry sources. The AMD Radeon HD 6990 is priced at US$699.

    Facing competition from Nvidia, AMD has begun to cut prices by 20-30% for a number of older models, including Radeon HD6870/6850/5870 and 5850. Nvidia followed suit, lowering prices for its GT220, GTS450, GTX460/465 also in a range of 20-30%.

    Consequently, Asustek Computer has lowered the price of its Radeon HD5870 graphics card to NT$6,000 (US$203) recently from NT$14,000 in 2010.

  14. #689
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Hmm, is DigiTimes living in the past? The 5870/5850 are completely EOL, the GTX 465 is no longer produced, and the same goes for the other Nvidia cards listed..
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  15. #690
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    399
    Quote Originally Posted by nnunn View Post
    So for all us cuda.elves,
    this card may be an interesting upgrade for our fleets of dusty 295's.
    Fermis don't have full DP support.

    Nvidia followed suit, lowering prices for its GT220, GTS450, GTX460/465 also in a range of 20-30%.
    That would make the GTX 550 officially pointless, as long as 460s are available.

  16. #691
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Sydney, Australia
    Posts
    466
    Quote Originally Posted by Mad Pistol View Post
    So again, Fermi cards scream and OC like demons. Fastest card (dual-GPU) is a given.

  17. #692
    Xtreme Enthusiast
    Join Date
    Sep 2008
    Location
    Fort Rucker, Alabama
    Posts
    626
    840MHz? Not bad. Considering an overclocked 6990 doesn't match the performance of stock-clocked 580s in SLI in 95% of benchmarks, the GTX 590 might turn out to be faster after all.
    GPU: 4-Way SLI GTX Titan's (1202 MHz Core / 3724 MHz Mem) with EK water blocks and back-plates
    CPU: 3960X - 5.2 GHz with Koolance 380i water block
    MB: ASUS Rampage IV Extreme with EK full board water block
    RAM: 16 GB 2400 MHz Team Group with Bitspower water blocks
    DISPLAY: 3x 120Hz Portrait Perfect Motion Clarity 2D Lightboost Surround
    SOUND: Asus Xonar Essence -One- USB DAC/AMP
    PSU: EVGA SuperNOVA NEX1500
    SSD: Raid 0 - Samsung 840 Pro's
    BUILD THREAD: http://hardforum.com/showthread.php?t=1751610

  18. #693
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    399
    Quote Originally Posted by Callsign_Vega View Post
    840Mhz? Not bad. Considering an over clocked 6990 doesn't match the performance of stock clocked 580's in SLI in 95% of benchmarks, the GTX590 might turn out to be faster after all.
    ... when overclocked and volted properly. Not surprising when it's downclocked that much from the start, is it?

    Why isn't it launched with higher clocks from the start if it clocks so well?

  19. #694
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Hong Kong
    Posts
    1,905
    Would be very eager to see: 6990 + 6970 VS GTX 590 + GTX 580.
    -


    "Language cuts the grooves in which our thoughts must move" | Frank Herbert, The Santaroga Barrier
    2600K | GTX 580 SLI | Asus MIV Gene-Z | 16GB @ 1600 | Silverstone Strider 1200W Gold | Crucial C300 64 | Crucial M4 64 | Intel X25-M 160 G2 | OCZ Vertex 60 | Hitachi 2TB | WD 320

  20. #695
    Xtreme Member
    Join Date
    Oct 2010
    Location
    香港 , Hong Kong
    Posts
    463
    Quote Originally Posted by CedricFP View Post
    Would be very eager to see: 6990 + 6970 VS GTX 590 + GTX 580.
    I think Nvidia will win
    If you live each day as if it was your last, someday you'll most certainly be right.

  21. #696
    Xtreme Enthusiast
    Join Date
    Sep 2008
    Location
    Fort Rucker, Alabama
    Posts
    626
    You cannot do GTX 590 + GTX 580. It's dual SLI (one 590) or quad SLI (two 590s) only.
    GPU: 4-Way SLI GTX Titan's (1202 MHz Core / 3724 MHz Mem) with EK water blocks and back-plates
    CPU: 3960X - 5.2 GHz with Koolance 380i water block
    MB: ASUS Rampage IV Extreme with EK full board water block
    RAM: 16 GB 2400 MHz Team Group with Bitspower water blocks
    DISPLAY: 3x 120Hz Portrait Perfect Motion Clarity 2D Lightboost Surround
    SOUND: Asus Xonar Essence -One- USB DAC/AMP
    PSU: EVGA SuperNOVA NEX1500
    SSD: Raid 0 - Samsung 840 Pro's
    BUILD THREAD: http://hardforum.com/showthread.php?t=1751610

  22. #697
    Registered User
    Join Date
    Apr 2010
    Posts
    11
    Quote Originally Posted by DarthShader View Post
    Fermi's don't have full DP support.
    Wrong. Full-Fermi consumer chips like GF110 do have full DP support. It's just that performance is artificially limited to 1/4 of the similar Tesla chip's. DP performance is still much faster than on older Nvidia chips.

    DP is practically never used in games, and not even in GPGPU apps like Folding@Home. I'm actually considering putting 2-4 of these GTX 590s in my folding rig
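    To put illustrative numbers on that 1/4 cap: Tesla-class Fermi runs DP at half the SP rate, and (per the claim above) GeForce parts are capped at a quarter of the equivalent Tesla's DP rate. The 1 TFLOP SP peak below is a made-up input for the sketch, not a real spec:

```python
# Sketch of the Fermi DP-rate relationship described above.
def fermi_dp_rates(sp_gflops):
    tesla_dp = sp_gflops / 2     # full-rate DP on Tesla-class parts
    geforce_dp = tesla_dp / 4    # artificial cap on consumer GeForce parts
    return tesla_dp, geforce_dp

tesla, geforce = fermi_dp_rates(1000.0)  # hypothetical 1 TFLOP SP peak
print(tesla, geforce)  # 500.0 125.0  -> GeForce DP ends up at 1/8 of SP
```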

  23. #698
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    The 590's OC ability depends on the cooling, how much power you can draw from the PCI-e connectors, and how well the circuitry handles it. With stock cooling it's going to be hard to keep an overvolted card cool even if it could draw enough power from the PSU. And the PSU's 12V rails need to be quite beefy if the amperage exceeds the limits considerably.
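    The 12 V amperage point is easy to put numbers on: current is just board power over rail voltage, ignoring VRM conversion losses. The 365 W stock figure is the commonly reported GTX 590 board power; the 450 W overvolted draw is a made-up example:

```python
def rail_amps(board_watts, rail_volts=12.0):
    """Current a card pulls from the 12 V side (conversion losses ignored)."""
    return board_watts / rail_volts

print(round(rail_amps(365), 1))  # 30.4 A at stock board power
print(round(rail_amps(450), 1))  # 37.5 A if overvolting pushes it to 450 W
```

    Either way, that is a lot of current to feed through two 8-pin plugs and the slot, which is the "beefy 12V rails" point above.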
    "No, you'll warrant no villain's exposition from me."

  24. #699
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    399
    Quote Originally Posted by ahu View Post
    Wrong. Full Fermi consumer chips like GF110 have full DP support. It's just that performance is artificially limited to 1/4 of a similar Tesla chip.
    So, Fermi does not have full DP support....

    I'm actually considering putting 2-4 of these GTX590's to my folding rig
    If you are able to devote $1600-3200 towards humanity's good, then awesome! Just hurry up with the orders; rumour has it there won't be a big supply of those...

  25. #700
    Xtreme Member
    Join Date
    Aug 2010
    Location
    Athens, Greece
    Posts
    116
    Quote Originally Posted by DarthShader View Post
    Fermi's don't have full DP support.
    You mean desktop cards based on the Fermi architecture, because Tesla cards based on Fermi DO have full DP support.
    Intel Core i7 920@4GHz, ASUS GENE II, 3 x 4GB DDR-3 1333MHz Kingston, 2x ASUS HD6950 1G CU II, Intel SSD 320 120GB, Windows 7 Ultimate 64bit, DELL 2311HM

    AMD FX8150 vs Intel 2500K, 1080p DX-11 gaming evaluation.

