
Thread: NVIDIA GTX 595 (picture+Details)

  1. #51
    I am Xtreme
    Join Date
    Oct 2004
    Location
    U.S.A.
    Posts
    4,743
    Quote Originally Posted by LAMB OF GOD View Post
Is it possible that we'll see a dual GTX 470 or a GF114 x2?
As far as most are concerned, NVIDIA has never released a GPU that is a pair of its top-notch chips; the GTX 295 was never a 2x GTX 285 either.
Maybe another contender could be a 2x GTX 570.
But making a 2x GTX 580 doesn't make sense from a marketing point of view: if a pair of downclocked 512 SP chips ends up giving horsepower equal to two 488 SP chips, why use a full GF110 at all?
Apart from that, can vapour-chamber cooling handle a 400W+ combo within a dual-slot cooler?
The ARES is a 436W part: http://www.techpowerup.com/reviews/ASUS/ARES/29.html

The cooler works fine; scroll down in this link:
http://www.techpowerup.com/reviews/ASUS/ARES/34.html

Nvidia stayed within PCI-E spec and underclocked its dual-GPU cards while using full-spec chips, but used bad cooling designs, so the temps were high.

The whole point of dual-GPU setups is to target the high-end market: people who want more power out of their GPUs to run resolutions like my 2560x1600. That is why I own a 5970.

Saying it can't be done is different from asking whether they will actually do it. Nvidia could do it; they would just need to put more R&D into getting it done right to meet the demands of users in the target segment.
    Last edited by safan80; 11-19-2010 at 01:20 PM.


    Asus Z9PE-D8 WS with 64GB of registered ECC ram.|Dell 30" LCD 3008wfp:7970 video card

    LSI series raid controller
    SSDs: Crucial C300 256GB
    Standard drives: Seagate ST32000641AS & WD 1TB black
    OSes: Linux and Windows x64

  2. #52
    Xtreme Addict
    Join Date
    Apr 2005
    Posts
    1,087
Looks like NVIDIA intends to break the power consumption record.


    All systems sold. Will be back after Sandy Bridge!

  3. #53
    Assistant Administrator systemviper's Avatar
    Join Date
    Nov 2007
    Location
    Newtown, CT
    Posts
    2,875
    me want one, or what the hell 2.....
    HWbot - Team: XtremeSystems
    XS cruncher - Team: XtremeSystems
    OCN Feedback
    HEAT


*** Being kind is sometimes better than being right.

  4. #54
    Xtreme Addict
    Join Date
    Mar 2009
    Posts
    1,116
It is ironic that there are people on this forum who own a GTX 275 or GTX 295 and do volt mods, yet the people in this thread are struggling to wrap their brains around a dual GF110.

Hello: just because a GTX 580 uses 240 watts doesn't mean two of those chips have to use 480 watts. Nvidia can bin chips that run at lower voltages and choose board components that use less power. They've done this before; the GTX 295 does not use twice the power of a GTX 275.

  5. #55
    Xtreme Enthusiast
    Join Date
    Apr 2006
    Posts
    939
    Quote Originally Posted by bamtan2 View Post
It is ironic that there are people on this forum who own a GTX 275 or GTX 295 and do volt mods, yet the people in this thread are struggling to wrap their brains around a dual GF110.

Hello: just because a GTX 580 uses 240 watts doesn't mean two of those chips have to use 480 watts. Nvidia can bin chips that run at lower voltages and choose board components that use less power. They've done this before; the GTX 295 does not use twice the power of a GTX 275.
This is true, but it is still a bit of a shock. We all knew that the higher-end GT200s used a lot of power, but it was kind of drowned out by the release of the 480.

  6. #56
    Xtreme Member
    Join Date
    Oct 2010
    Location
    192.168.1.1
    Posts
    221
A 2x 580 wouldn't be 488W, since one PCB is saved and that should shave off some power, bringing it to maybe around 450W.

Staying within the 300W limit would be suicide for dual 580s. Just think: one 580 is 250W, and if you downclock two 580s so that they consume a total of 300W, you've gained only 50W worth of power in your card, PLUS all the downsides associated with SLI. It wouldn't be incredibly faster than a single 580, yet it would cost at least $800.

Sure, you can always "overclock" the GPUs back to their normal clocks yourself, provided Nvidia includes enough PCIe power connectors. But all review sites would naturally review the card at stock clocks, so it has to be insanely powerful at stock for Nvidia to be able to market it as worth $800.
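For what it's worth, here is a quick back-of-envelope sketch of that 300W argument (taking the 250W single-580 figure above at face value; none of these numbers are official specs):

Code:
# Rough sketch of the PCI-E power-budget argument above.
# The 250W figure for a single GTX 580 is the number used in this thread,
# not an official spec.

single_580_power = 250.0   # W, one GTX 580 at stock clocks (as assumed above)
pcie_limit = 300.0         # W, PCI-E ceiling for a single board

per_gpu_budget = pcie_limit / 2                       # each GF110 gets 150 W
headroom_over_single = pcie_limit - single_580_power  # extra budget vs. one 580

print(f"Per-GPU budget within spec: {per_gpu_budget:.0f} W "
      f"({per_gpu_budget / single_580_power:.0%} of a stock 580)")
print(f"Extra power over one stock 580: {headroom_over_single:.0f} W")
# -> 150 W per GPU (60% of stock) and only 50 W of extra budget,
#    which is the "only 50W gained" point made above.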

  7. #57
    Xtreme Addict
    Join Date
    Oct 2006
    Location
    new jersey
    Posts
    1,100
Why would you think they would use the 580 chip and not the 570 chip?
Rumor has it the 570 is a 480 that runs cooler and is less power hungry; some say the same as a 470, so 220W or so.
220W means the GTX 570 has almost the same TDP as the GeForce GTX 275 (219W), and they made dualies out of that chip, no problem.
All based on limited info on the 570, so it's just speculation on my part, of course.
    _________________

  8. #58

  9. #59
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Jakarta, Indonesia
    Posts
    924
GT200b cards' TDP is quite "different" from GF100 and, subsequently, GF110 cards' TDP. While I won't put it past nVidia's engineers to pull off such a coup with this card, IMHO it would require a massive engineering effort, extreme binning and, most likely, drastic clock and/or unit reductions.

  10. #60
    Xtreme Guru
    Join Date
    Jul 2004
    Location
    10009
    Posts
    3,628
Does this mean the ATI 6900 series is so great that Nvidia is going to extremes to keep up? Or are they testing a new HWBot benching category for the highest possible electric bill?
    Massman?

  11. #61
    Xtreme Addict Chrono Detector's Avatar
    Join Date
    May 2009
    Posts
    1,142
If this card really will have dual GF110 chips, I wonder how much power it will consume, unless NVIDIA really did pull off a miracle here and lowered the power consumption. I also wonder whether this card will cost over $900.
    AMD Threadripper 12 core 1920x CPU OC at 4Ghz | ASUS ROG Zenith Extreme X399 motherboard | 32GB G.Skill Trident RGB 3200Mhz DDR4 RAM | Gigabyte 11GB GTX 1080 Ti Aorus Xtreme GPU | SilverStone Strider Platinum 1000W Power Supply | Crucial 1050GB MX300 SSD | 4TB Western Digital HDD | 60" Samsung JU7000 4K UHD TV at 3840x2160

  12. #62
    Xtreme Member
    Join Date
    May 2010
    Posts
    154
Let's be logical here: there is no way NV would release a card with a $900 price tag in today's market. It's just not feasible to have these insanely priced cards that target such a tiny part of the market, especially if there is anything that can remotely compete with them. If you want to price a card that much higher than anything else, it has to stand on its own in terms of performance and overall quality, and even then $900 is absurd.
    Last edited by Monkeyface; 11-19-2010 at 08:24 PM.

  13. #63
    Xtreme Enthusiast
    Join Date
    Apr 2006
    Posts
    939
Owning the crown is reason enough; AMD has released more expensive gaming cards.

  14. #64
    Xtreme Mentor
    Join Date
    Feb 2009
    Location
    Bangkok,Thailand (DamHot)
    Posts
    2,693
To fight the dual-GPU card from AMD.
    Intel Core i5 6600K + ASRock Z170 OC Formula + Galax HOF 4000 (8GBx2) + Antec 1200W OC Version
    EK SupremeHF + BlackIce GTX360 + Swiftech 655 + XSPC ResTop
    Macbook Pro 15" Late 2011 (i7 2760QM + HD 6770M)
    Samsung Galaxy Note 10.1 (2014) , Huawei Nexus 6P
    [history system]80286 80386 80486 Cyrix K5 Pentium133 Pentium II Duron1G Athlon1G E2180 E3300 E5300 E7200 E8200 E8400 E8500 E8600 Q9550 QX6800 X3-720BE i7-920 i3-530 i5-750 Semp140@x2 955BE X4-B55 Q6600 i5-2500K i7-2600K X4-B60 X6-1055T FX-8120 i7-4790K

  15. #65
    Xtreme Member
    Join Date
    May 2004
    Location
    somewhere called australia
    Posts
    121
Now I'm half interested in upgrading... it'll probably take a GTX 695 at least for me to bother.
    i7 4770K (EK-HF)
    ASRock Z87 Professional Fatal1ty (BIOS L1.42)
    Trident X F3-2400C10D-8GTX
    GTX 480 SLI (Koolance NX-480) / E-MU 1212M / Antec Quattro 1.2kW / HP ZR30w / Lian Li 1010B
    Cooling; TFC Xchanger 240 & RX360 / 655 w/EK-Top V2 / Tygon / BP Compressions

  16. #66
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Location
    Wild West, USA
    Posts
    655
This is where AMD has a clear advantage over Nvidia. With dual cards you want a cool, efficient GPU. Not sure how Nvidia will make it work if it's indeed a full-blown GF110. I just hope it's not another urban legend. Why not make a dual card with a full-blown GF104 for starters? It would be much easier to meet the power envelope, and it could be affordable too.

Personally I'm not so fond of dual-GPU cards, but the show gets more interesting day by day.
    Abit IC7 P4 2.8a @4.21 | P4 3.4e @4.9 | Gainward 6800GT GS @486/1386
    Asus P4P800 SE Dothan 730-PM @ 2900 | EVGA 6800 Ultra GS @521/1376

    e8400@4.3G & 8800GTS G92 800/1932/1132 as gaming rig 24/7

    Custom self build chillbox with watercooling @-28c 24/7 | chilled wc " cpu -18c idle/-3c load
    3DMark 2005 Score Dothan & 6800U
    3DMark 2005 Score p4 & 6800GT

  17. #67
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
Quad-slot cooling solution, anyone?

    All along the watchtower the watchmen watch the eternal return.

  18. #68
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    Quote Originally Posted by safan80 View Post
The ARES is a 436W part: http://www.techpowerup.com/reviews/ASUS/ARES/29.html

The cooler works fine; scroll down in this link:
http://www.techpowerup.com/reviews/ASUS/ARES/34.html

Nvidia stayed within PCI-E spec and underclocked its dual-GPU cards while using full-spec chips, but used bad cooling designs, so the temps were high.

The whole point of dual-GPU setups is to target the high-end market: people who want more power out of their GPUs to run resolutions like my 2560x1600. That is why I own a 5970.

Saying it can't be done is different from asking whether they will actually do it. Nvidia could do it; they would just need to put more R&D into getting it done right to meet the demands of users in the target segment.

And the ARES isn't a reference model ...


    so???
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  19. #69
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    Quote Originally Posted by Iconyu View Post
Owning the crown is reason enough; AMD has released more expensive gaming cards.
Owning the crown in a 1% market means nothing when your competition is hammering the markets that matter most... but hey, you're right.
    Last edited by Sn0wm@n; 11-20-2010 at 12:36 AM.
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  20. #70
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Lancaster, UK
    Posts
    473
    ^ halo effect?
    CPU: Intel 2500k (4.8ghz)
    Mobo: Asus P8P67 PRO
    GPU: HIS 6950 flashed to Asus 6970 (1000/1400) under water
    Sound: Corsair SP2500 with X-Fi
    Storage: Intel X-25M g2 160GB + 1x1TB f1
Case: Silverstone Raven RV02
    PSU: Corsair HX850
    Cooling: Custom loop: EK Supreme HF, EK 6970
    Screens: BenQ XL2410T 120hz


    Help for Heroes

  21. #71
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
And what effect will this halo have on deals for laptops or netbooks, etc., or even low-power cheapo desktops? Ah yes... they have the high-end GPUs, so they sell more low-end crap... of course...
    WILL CUDDLE FOR FOOD

    Quote Originally Posted by JF-AMD View Post
    Dual proc client systems are like sex in high school. Everyone talks about it but nobody is really doing it.

  22. #72
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
The halo effect is for the fanboys.

  23. #73
    Xtreme Enthusiast
    Join Date
    Jun 2008
    Posts
    619
    Very cool. I like dual gpu cards. More competition is great for us.
    ASRock 990FX Extreme4
    AMD FX 8350
    Kingston 16GB (4GBx4) DDR3 1333
    Gigabyte NVidia GTX 680 2GB
    Silverstone 1000W PSU

  24. #74
    Xtreme Guru
    Join Date
    Jun 2010
    Location
    In the Land down -under-
    Posts
    4,452
    Quote Originally Posted by trans am View Post
Does this mean the ATI 6900 series is so great that Nvidia is going to extremes to keep up? Or are they testing a new HWBot benching category for the highest possible electric bill?
    Massman?
+1. The 570, now the 580, now the 595, lol?? Still waiting on Barts XT... I think it's rumours!

    Another thing I find funny is AMD/Intel would snipe any of our Moms on a grocery run if it meant good quarterly results, and you are forever whining about what feser did?

  25. #75
    Xtreme Member
    Join Date
    Aug 2010
    Location
    Athens, Greece
    Posts
    116
It could be done with the following specs:

2x GF110 chips
2x 480 shaders @ 600MHz
2x 320-bit memory bus and 2560MB

Max board power: 289W

Performance could be about the same as GTX 470 SLI, but with better DX11 tessellation performance due to two more PolyMorph engines.
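As a sanity check on that 289W figure, here is a minimal sketch using the common P ∝ f·V² approximation for dynamic power (the 244W / 772MHz / 1.00V baseline is an assumed stock GF110 reference point, not a confirmed spec):

Code:
# Minimal sketch checking the "max board power 289W" estimate above,
# assuming dynamic power scales roughly as P ~ f * V^2.
# The 244W / 772MHz / 1.00V baseline is an assumed GTX 580-class reference,
# not an official figure, and TDP is not purely dynamic power.

base_power, base_clock, base_volt = 244.0, 772.0, 1.00  # assumed GF110 baseline
target_board_power = 289.0                              # figure proposed above
dual_clock = 600.0                                      # proposed clock per GPU

per_gpu_budget = target_board_power / 2
clock_factor = dual_clock / base_clock

# Clock scaling alone, then the voltage cut needed to reach the per-GPU budget:
power_after_clock = base_power * clock_factor
voltage_factor = (per_gpu_budget / power_after_clock) ** 0.5

print(f"Per-GPU budget: {per_gpu_budget:.1f} W")
print(f"Clock drop alone gives: {power_after_clock:.0f} W per GPU")
print(f"Required voltage vs. baseline: {voltage_factor:.2f}x (~{voltage_factor * base_volt:.2f} V)")
# -> ~190 W per GPU from the clock drop alone; hitting ~145 W each would also
#    need roughly a 13% lower voltage, i.e. the extreme binning mentioned above.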
    Last edited by Aten-Ra; 11-21-2010 at 01:56 AM.
    Intel Core i7 920@4GHz, ASUS GENE II, 3 x 4GB DDR-3 1333MHz Kingston, 2x ASUS HD6950 1G CU II, Intel SSD 320 120GB, Windows 7 Ultimate 64bit, DELL 2311HM

    AMD FX8150 vs Intel 2500K, 1080p DX-11 gaming evaluation.
