
Thread: NVIDIA GTX 595 (picture+Details)


  1. #11
    Xtreme Member
    Join Date: Jun 2005
    Posts: 442
    This thread has degraded into a half-assed flame war, so I'll offer a different view.

    I think this generation of dual-GPU cards is a joke. Seriously. In the past, when we've gotten dual-GPU cards such as the 7950GX2, GTX 295, HD 4870X2, HD 5970, etc., those cards always stayed fully within the PCI Express power limits, and they still pushed the envelope while doing so. Every one of those solutions was "extreme", yet not one of them had major compromises. They're all power hungry, they all have beefy coolers, and they all use specialized bridge chips to communicate between GPUs. It's the stuff we've come to expect.

    Now, this generation, it looks like we're going to get two flavors of dual-GPU solution: either quiet and less powerful (GTX 590) or very loud and more powerful (HD 6990). It's sort of like choosing between the Republican and Democratic parties; both are fighting for your vote, just on different topics and platforms. I, for one, hate it. Why? Because by the sound of it, the competition is no longer about who can build the "best card" but about who can build the "best solution". This market has reached a point where the technology is pushed to such insane levels that we have to choose which wave of "insanity" we want to pursue, and each one comes with severe compromises. What a joke.

    If this is where the future of GPUs is headed, I'll be retiring from PC gaming at the end of my system's life. I want no part of it. 375 watts from a single video card?!? My big CCFL-backlit LCD monitor uses less energy, gives off less heat, and is more useful. Both manufacturers have lost their way, and if they don't change course quickly, I have a feeling this market will start shrinking. Nobody wants to pay for a dustbuster, and nobody wants to pay for a sub-par "top end" video card either. When single-GPU variants are already approaching the 300-watt cap, you can't build a dual-GPU solution without giving something up. I welcome the arrival of vapor chambers as the successor to big heatpipe coolers. That's wonderful. But does it mean we can now build GPUs that are quieter? Or just GPUs that use more power and give off more heat?
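
    For the record, here's where those wattage figures come from, as I understand the PCI Express power delivery rules (rough numbers, so take the exact sums with a grain of salt):

        PCIe x16 slot:        up to  75 W
        6-pin PEG connector:  up to  75 W
        8-pin PEG connector:  up to 150 W

        Slot + 6-pin + 8-pin = 75 + 75 + 150  = 300 W  (the "300-watt cap")
        Slot + 8-pin + 8-pin = 75 + 150 + 150 = 375 W  (the figure being thrown around for these dual-GPU cards)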

    Nvidia and AMD... you both lose.
    Last edited by Mad Pistol; 03-19-2011 at 12:57 PM.
    PII 965BE @ 3.8GHz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600MHz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.

