Page 19 of 42 FirstFirst ... 91617181920212229 ... LastLast
Results 451 to 475 of 1028

Thread: NVIDIA GTX 595 (picture+Details)

  1. #451
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by Pantsu View Post
And I don't see how they could pull it off considering a single GTX 570 has a TDP of 220 W.
    Well, according to TPU's power consumption numbers for a single 570, it's very realistic for a dual 570 to operate at a 375 W level.

    http://www.techpowerup.com/reviews/H...D_6970/27.html
    Last edited by highoctane; 03-07-2011 at 04:28 PM.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450
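The back-of-envelope behind that 375 W claim can be sketched like this. The voltages and the f·V² dynamic-power model below are assumptions for illustration only, not measured specs:

```python
# Does a dual downclocked GF110 board fit in 375 W?
# Assumed: GTX 570 = 220 W TDP at 732 MHz core; voltages are hypothetical.

def gpu_power(base_w, base_mhz, base_v, new_mhz, new_v):
    """Scale dynamic power roughly with clock and the square of voltage."""
    return base_w * (new_mhz / base_mhz) * (new_v / base_v) ** 2

# Downclock one 570-class chip to 650 MHz with a modest voltage drop:
single = gpu_power(220, 732, 1.000, 650, 0.950)
dual = 2 * single  # ignores shared board overhead savings

print(f"one downclocked GPU: {single:.0f} W, dual card: {dual:.0f} W")
# → one downclocked GPU: 176 W, dual card: 353 W
```

With those (made-up) numbers the dual card lands under the 375 W ceiling, which is why the claim isn't crazy on its face.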

  2. #452
    Banned
    Join Date
    May 2006
    Location
    Brazil
    Posts
    580
2x GF110 with 512 SPs @ 650-670 MHz may be able to beat the HD 6990 at default clocks (830 MHz).

    700 MHz seems high but not impossible with handpicked chips and aggressive power control. C'mon Nvidia, surprise us...

  3. #453
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Pantsu View Post
For the GTX 590 to be faster than the 6990, it would have to be faster than 570 SLI. Now, can Nvidia pull off a 570 SLI in one card? They can of course use full GF110 chips, but they need to drop the voltage and clocks to a 375 W level. And I don't see how they could pull it off considering a single GTX 570 has a TDP of 220 W.

    Personally I think, performance-wise, Nvidia will admit defeat, but the dual-GPU card could still be a good offer if priced accordingly. It would give a good option for Nvidia Surround. Also, the reference cooler might actually be something usable.
I hope ATI and NVIDIA don't get caught up in a perf race... both cards will be monstrous regardless of clocks. Who cares if A is 10% faster than B if both are MORE than fast enough? I hope they don't release super hot 'n' noisy cards which make SLI and CrossFire look like a better alternative lol.

    Looking back at the past, my guess is ATI will be faster but hotter and noisier.

  4. #454
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by JohnZS View Post
Thank you for clearing that up for me, DarthShader.
    Although I am sure I did hear somewhere that PhysX 3.0 would be kinder in multi-GPU situations. At the moment, on the single-PCB GTX 295, all PhysX processing is done on GPU B.

    Rendering is done on both GPUs A+B. So in games which use PhysX, GPU B is working a lot harder than GPU A. If the work could be split across multiple GPUs, then PhysX would have less of an impact.

    But hey, nothing wrong with having some SSE and multi-threading love

    John
I don't think you will see PhysX split across multiple GPUs in games, as it already induces too much latency to be used in real time with just a single GPU. That's why all "GPU" PhysX effects are eye candy only and not interactive. Interactive physics stays on the CPU.

    All along the watchtower the watchmen watch the eternal return.

  5. #455
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by -Sweeper_ View Post
    2x GF110 with 512 SPs @ 650-670mhz may be able to beat HD6990 at default clocks (830mhz)

    700mhz seems high but not impossible with handpicked chips and agressive power control, c'mon nvidia, surprise us...
You're optimistic (this is not a troll): 2x 580 @ 650 MHz will never get close to them.

    The 580 runs at 772 MHz............ 570 SLI has a 732 MHz base clock.

    And we all know how big an impact core/shader clock has on Nvidia's ALUs.
    I'm not saying AMD or Nvidia will be faster (waiting for the tests, and not that I care lol), just that 650-670 MHz doesn't look like enough compared to a 6950 CFX with 30 MHz more and the full 6970 SP count, or an 880 MHz version with full 6970 CFX core speed and SPs.

    This is not to start a fight over who will win or lose (not that I care), just a comment on your numbers.
    Last edited by Lanek; 03-07-2011 at 05:45 PM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  6. #456
    Banned
    Join Date
    May 2006
    Location
    Brazil
    Posts
    580
    Quote Originally Posted by Lanek View Post
You're optimistic (this is not a troll): 2x 580 @ 650 MHz will never get close to them.

    The 580 runs at 772 MHz, and the 6990 should be a little bit under that (5-7%)....

    And we all know how big an impact core/shader clock has on Nvidia's ALUs. I'm not saying AMD or Nvidia will be faster (waiting for the tests), just that 650-670 MHz doesn't look like enough.
At 670 MHz it should be just a hair under 2x GTX 570s in SLI and equal to an HD 6990 @ 830 MHz.

    Heck, let's just wait for the actual reviews.
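The clock arithmetic the two posters are arguing over can be laid out directly. This assumes performance scales linearly with shader throughput (clock × shader count), which real games don't follow exactly; it's only a bound on the units being debated:

```python
# Rough shader-throughput comparison of the clocks under discussion.
# Linear scaling with clock and SP count is an assumption, not a measurement.

gtx580_sli_mhz = 772    # stock GTX 580 core clock
gtx570_sli_mhz = 732    # stock GTX 570 core clock
rumored_dual_mhz = 660  # midpoint of the rumored 650-670 MHz range

# Full GF110 pair at the rumored clock vs stock 580 SLI
# (both use 512 SPs per chip, so clock is the only variable):
rel_vs_580sli = rumored_dual_mhz / gtx580_sli_mhz
print(f"~{rel_vs_580sli:.0%} of stock 580 SLI shader throughput")

# Vs 570 SLI, the rumored card also gains 512/480 shaders per chip:
rel_vs_570sli = (rumored_dual_mhz * 512) / (gtx570_sli_mhz * 480)
print(f"~{rel_vs_570sli:.0%} of stock 570 SLI shader throughput")
```

On paper this comes out around 85% of 580 SLI and about 96% of 570 SLI, i.e. "a hair under 570 SLI" as claimed, though memory clocks and real-game scaling could pull that down further.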

  7. #457
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
I don't think so... underclock 2x GTX 580 and you will understand the problem. But yes, you are right, let's wait for the "reviews".
    Last edited by Lanek; 03-07-2011 at 05:51 PM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  8. #458
    Xtreme Enthusiast
    Join Date
    Jun 2008
    Posts
    660
An unfortunate person is one who tries to fart but sh1ts instead...

    My Water Cooling Case Build (closed)

  9. #459
    Xtreme Addict Chrono Detector's Avatar
    Join Date
    May 2009
    Posts
    1,142
^^ 3 DVIs? No HDMI? That's kinda odd because I would have expected it to have HDMI, but whatever, I prefer DVI.
    AMD Threadripper 12 core 1920x CPU OC at 4Ghz | ASUS ROG Zenith Extreme X399 motherboard | 32GB G.Skill Trident RGB 3200Mhz DDR4 RAM | Gigabyte 11GB GTX 1080 Ti Aorus Xtreme GPU | SilverStone Strider Platinum 1000W Power Supply | Crucial 1050GB MX300 SSD | 4TB Western Digital HDD | 60" Samsung JU7000 4K UHD TV at 3840x2160

  10. #460
    Xtreme Member
    Join Date
    Jul 2007
    Posts
    371
    Quote Originally Posted by -Sweeper_ View Post
At 670 MHz it should be just a hair under 2x GTX 570s in SLI and equal to an HD 6990 @ 830 MHz.

    Heck, let's just wait for the actual reviews.
    You're forgetting that the memory will be downclocked as well (I'm guessing to 900-950 MHz) to lower power consumption. So 670 MHz may still come up short.

  11. #461
    Wanna look under my kilt?
    Join Date
    Jun 2005
    Location
    Glasgow-ish U.K.
    Posts
    4,396
    Quote Originally Posted by saaya View Post
    Who cares if a is 10% faster than b if both are MORE than fast enough.

    ATI and nV care Saaya.... you don't think they're doing this for us, do you?

    THEY compete for the bragging rights and HALO effect... we just buy the resulting product
    Quote Originally Posted by T_M View Post
    Not sure i totally follow anything you said, but regardless of that you helped me come up with a very good idea....
    Quote Originally Posted by soundood View Post
    you sigged that?

    why?
    ______

    Sometimes, it's not your time. Sometimes, you have to make it your time. Sometimes, it can ONLY be your time.

  12. #462
    Registered User
    Join Date
    Dec 2010
    Location
    Sweden
    Posts
    66
I think Nvidia is in a lot of trouble if they really want that performance crown.
    They will need two downclocked GTX 570s to compete at the same power consumption as the 6990:



    BUT who knows what lies or tricks Nvidia is ready to use
    CPU: Intel i5 2500K + Antec Khuler 620 Memory: 4GB DDR3 Corsair DHX @ 1600MHz CL7 GPU: Nvidia GTX 560Ti + Antec Khuler 620
    Motherboard: Zotac Z68ITX-A-E HDD: Crucial M4 128GB + 2TB Samsung F4EG Chassi: Lian Li Q11B PSU: Cooler Master Silent Pro 850W OS: Windows 7 x64
    Welcome to my home theater! | mattBLACK Gallery | Minima "H20" Gallery

  13. #463
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by Kristers Bensin View Post
I think Nvidia is in a lot of trouble if they really want that performance crown.
    They will need two downclocked GTX 570s to compete at the same power consumption as the 6990:



    BUT who knows what lies or tricks Nvidia is ready to use
I think that will be their (our) benefit.

    If you have to pay for two downclocked chips, hopefully the price you pay is for their current performance (so like $600 instead of $800+); then all we gotta do is watercool and overclock, get 30% more perf out of it, and catch up to 580 SLI OC. Most people paying for such cards either don't care about a higher price or know how to overclock.
    2500k @ 4900mhz - Asus Maxiums IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  14. #464
    Registered User
    Join Date
    Dec 2010
    Location
    Sweden
    Posts
    66
    Quote Originally Posted by Manicdan View Post
I think that will be their (our) benefit.

    If you have to pay for two downclocked chips, hopefully the price you pay is for their current performance (so like $600 instead of $800+); then all we gotta do is watercool and overclock, get 30% more perf out of it, and catch up to 580 SLI OC. Most people paying for such cards either don't care about a higher price or know how to overclock.
    The performance crown isn't won by watercooled OCed cards, but by the reference design. If the 6990 is better on paper, Nvidia might lose some PR, but then again gain some if they manage some WRs with it.
    CPU: Intel i5 2500K + Antec Khuler 620 Memory: 4GB DDR3 Corsair DHX @ 1600MHz CL7 GPU: Nvidia GTX 560Ti + Antec Khuler 620
    Motherboard: Zotac Z68ITX-A-E HDD: Crucial M4 128GB + 2TB Samsung F4EG Chassi: Lian Li Q11B PSU: Cooler Master Silent Pro 850W OS: Windows 7 x64
    Welcome to my home theater! | mattBLACK Gallery | Minima "H20" Gallery

  15. #465
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by Kristers Bensin View Post
The performance crown isn't won by watercooled OCed cards, but by the reference design. If the 6990 is better on paper, Nvidia might lose some PR, but then again gain some if they manage some WRs with it.
    Remember the 5970: they advertised non-stop the ability to OC it past spec. That's how it's going to be in the future, if people still try to squeeze as much as possible into 300 W, or they build it for more and just ship a 300 W profile. The perf crown back in the day was simply who's the strongest, but now it seems to be who has the more efficient design at exactly 300 W. They're trying to be smarter about packing in more perf while maintaining PCIe compliance, and testers need to be aware of that too, so they can give a better idea of real-world efficiency and perf instead of judging power consumption by one benchmark that is nowhere near real-life use.
    2500k @ 4900mhz - Asus Maxiums IV Gene Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acteal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  16. #466
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
IF the GTX 590 is within 10% of the Radeon 6990, costs £100 less, and is a lot quieter and a lot cooler, then nVidia will win this round...... in my opinion.
    However, I can see the GTX 590 being more expensive, hotter, and potentially louder too.
    = A DRAW!
    John
    Stop looking at the walls, look out the window

  17. #467
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Kristers Bensin View Post
    I think Nvidia is in alot of trouble if they really want that performance crown.
    They will need 2 downclocked GTX 570's to compete in the same powerconsumption as 6990:



    BUT who knows what lies or tricks nvidia is ready to use
    What the......

  18. #468
    Xtreme Enthusiast
    Join Date
    Jun 2010
    Posts
    588
    What the......
    lol

  19. #469
    Registered User
    Join Date
    Dec 2010
    Location
    Sweden
    Posts
    66
    Quote Originally Posted by SKYMTL View Post
    What the......
That is Sweclockers.com's test, and those numbers are wattage during a normal Vantage run, which represents real power draw better than Furmark. Sweclockers 6990 review
    CPU: Intel i5 2500K + Antec Khuler 620 Memory: 4GB DDR3 Corsair DHX @ 1600MHz CL7 GPU: Nvidia GTX 560Ti + Antec Khuler 620
    Motherboard: Zotac Z68ITX-A-E HDD: Crucial M4 128GB + 2TB Samsung F4EG Chassi: Lian Li Q11B PSU: Cooler Master Silent Pro 850W OS: Windows 7 x64
    Welcome to my home theater! | mattBLACK Gallery | Minima "H20" Gallery

  20. #470
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Kristers Bensin View Post
That is Sweclockers.com's test, and those numbers are wattage during a normal Vantage run, which represents real power draw better than Furmark. Sweclockers 6990 review
    How many runs were done?

As many know, Vantage peaks in several different areas, many of which are less than a second long and may not be picked up by a standard power meter.

    In addition, CPU usage is a HUGE factor and can increase/decrease the numbers accordingly, and in a non-linear fashion.

    Looking at that chart, it seems like the readings for some cards are VERY high while others are low. It could be that the monitor is picking up the areas where CPU + GPU peaks converge in some situations and registering non-convergence in others.
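The sampling problem described above is easy to demonstrate with a toy trace. Everything here is invented for illustration: a synthetic 60 s power trace with sub-second spikes, polled by a meter that only takes an instantaneous reading once per second:

```python
# Toy sketch: a 1 Hz meter can miss power spikes shorter than its interval.
# 60 s trace at 10 ms resolution: 300 W baseline, ~0.3 s spikes to 450 W.
trace = [450.0 if (t % 997) < 30 else 300.0 for t in range(6000)]

true_peak = max(trace)           # what the card actually drew
meter_readings = trace[50::100]  # instantaneous samples, one per second
meter_peak = max(meter_readings) # what this meter happens to report

print(true_peak, meter_peak)  # → 450.0 300.0: every spike falls between samples
```

Shift the sampling phase (`trace[0::100]`) and the same meter catches a 450 W spike instead, which is one plausible reason two reviews of the same card can disagree so much.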

  21. #471
    Xtreme Cruncher
    Join Date
    Apr 2006
    Posts
    3,012
    Quote Originally Posted by SKYMTL View Post
    What the......
That's what I thought. The 6990 drawing less power than a GTX 560 SLI..... not happening.
    CPU: Intel Core i7 3930K @ 4.5GHz
    Mobo: Asus Rampage IV Extreme
    RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
    GPU: EVGA GTX Titan (1087Boost/6700Mem)
    Physx: Evga GTX 560 2GB
    Sound: Creative XFI Titanium
    Case: Modded 700D
    PSU: Corsair 1200AX (Fully Sleeved)
    Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
    Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's

  22. #472
    Xtreme Enthusiast
    Join Date
    Jun 2007
    Location
    Long Beach, CA
    Posts
    972
Yeah, I agree, that chart looks really fishy.....

    Kristers Bensin, can you please link us that review?

    Last edited by Lu(ky; 03-08-2011 at 11:24 AM.
    CPU: Intel Core i7-4770K 4.8GHz
    MOBO: GIGABYTE GA-G1.Sniper M5 MATX 1150
    MEMORY: G.SKILL Trident X 8GB 2400MHz 9-11-11-31 1T
    GPU: 2 x eVGA GTX 780 SC
    SOUND KRK Rokit 5 Limited Edition White Studio Monitors
    SSD: 4 x Samsung 128GB Pro's Raid 0
    PSU: SeaSonic Platinum 1000W
    COOLING: 2 x Alphacool NexXxoS UT60 Full Copper 420mm 6 x Swiftech Helix 140mm Fans
    CASE: Lian Li PC-C32B TECH STATION MOD build log coming soon
    MONITOR: ASUS VG278HE Black 27" 149Hz
    O.S: Windows 7 Pro x64

  23. #473
    Registered User
    Join Date
    Dec 2010
    Location
    Sweden
    Posts
    66
    Quote Originally Posted by Lu(ky View Post
Yeah, I agree, that chart looks really fishy.....

    Kristers Bensin, can you please link us that review?
It's already linked, and here is the Furmark part:



    As you can see, Furmark doesn't show a real-world perspective, either because the card gets downclocked by AMD PowerTune or because they just show the "peak" wattage consumption.
    Here is also a review from NordicHardware which shows a 472 W draw for the whole system; these results of course differ depending on the equipment used (different GPU and CPU samples). Click here.

    Last edited by Kristers Bensin; 03-08-2011 at 11:51 AM.
    CPU: Intel i5 2500K + Antec Khuler 620 Memory: 4GB DDR3 Corsair DHX @ 1600MHz CL7 GPU: Nvidia GTX 560Ti + Antec Khuler 620
    Motherboard: Zotac Z68ITX-A-E HDD: Crucial M4 128GB + 2TB Samsung F4EG Chassi: Lian Li Q11B PSU: Cooler Master Silent Pro 850W OS: Windows 7 x64
    Welcome to my home theater! | mattBLACK Gallery | Minima "H20" Gallery

  24. #474
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
The NordicHardware chart doesn't include other comparative solutions.

  25. #475
    Registered User
    Join Date
    Dec 2010
    Location
    Sweden
    Posts
    66
    Quote Originally Posted by SKYMTL View Post
    The Nordic hardware chart doesn't include other comparative solutions.
Whatever, as I was saying, Nvidia will have a hard time battling the 6990 within the same power consumption. Especially when you look at how close the single GTX 580 is to the 6990. Even the SLI 570 is above the 6990 in terms of power consumption.

    It will be interesting to see Nvidia's binned 580 cores competing against AMD's binned 6970 cores.
    CPU: Intel i5 2500K + Antec Khuler 620 Memory: 4GB DDR3 Corsair DHX @ 1600MHz CL7 GPU: Nvidia GTX 560Ti + Antec Khuler 620
    Motherboard: Zotac Z68ITX-A-E HDD: Crucial M4 128GB + 2TB Samsung F4EG Chassi: Lian Li Q11B PSU: Cooler Master Silent Pro 850W OS: Windows 7 x64
    Welcome to my home theater! | mattBLACK Gallery | Minima "H20" Gallery

