Page 3 of 18
Results 51 to 75 of 436

Thread: nVidia 'Kepler' GeForce GTX 680 Reviews

  1. #51
    Xtreme Member
    Join Date
    Aug 2011
    Posts
    180
    3 times more shaders, but only 30% or so more performance? Not really a worthy upgrade over the GTX 580, TBH.

  2. #52
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,738
    if it had 3x the old shaders, it would have been a 450W GPU at around 800mm², even after the shrink!
    whether it's a worthy upgrade or not is your opinion, though.
    2500k @ 4900mhz - Asus Maximus IV Gene-Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acetal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  3. #53
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,544
    Quote Originally Posted by ice_chill View Post
    3 times more shaders, but only 30% or so more performance? Not really a worthy upgrade over the GTX 580, TBH.
    It is important to understand the architectural development on Fermi and Kepler and how they differ.

    Essentially, Fermi was die size limited which meant NVIDIA had to add performance without increasing the transistor count. In order to do this, they ran the shader domain at double the speed of the other processing stages. Unfortunately doing so increased heat production and power consumption.

    GK104 meanwhile doesn't have the same limitation, partially due to the 28nm manufacturing process and partially due to optimizations within the architecture that limit the number of transistors needed for certain processing stages. This has allowed for a drastic increase in the core count, but in order to keep power consumption at reasonable levels, NVIDIA is now running the clocks at a 1:1 ratio. There are some other changes, like PolyMorph Engine reductions (even though they do run at close to double the speed) and caching changes, that also contribute to the non-linear performance increase from a Fermi GPC to a Kepler GPC.

    If you have any particular questions, let me know.
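    To put rough numbers on the "3x shaders, 30% performance" question, here is a back-of-the-envelope sketch using the reference clocks and peak FP32 throughput only; real-world gains are smaller still because of memory bandwidth, caching, and scheduling differences:

    ```python
    # Peak FP32 throughput: cores x 2 ops (one FMA) per cycle x clock.
    # GTX 580 (Fermi): 512 shaders hot-clocked at 1544 MHz (2x the 772 MHz core).
    # GTX 680 (Kepler GK104): 1536 shaders at the 1006 MHz base clock, 1:1 ratio.

    def gflops(cores, clock_mhz):
        """Theoretical peak FP32 GFLOPS, assuming one FMA (2 ops) per core per cycle."""
        return cores * 2 * clock_mhz / 1000.0

    gtx580 = gflops(512, 1544)
    gtx680 = gflops(1536, 1006)

    print(f"GTX 580: {gtx580:.0f} GFLOPS")     # ~1581
    print(f"GTX 680: {gtx680:.0f} GFLOPS")     # ~3090
    print(f"ratio:   {gtx680 / gtx580:.2f}x")  # ~1.95x
    ```

    So tripling the shader count while dropping the hot clock only doubles the theoretical throughput; the rest of the gap down to the observed ~30% comes from the bottlenecks above.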

  4. #54
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,635
    Quote Originally Posted by [XC] gomeler View Post
    Wow, NVIDIA has a killer core on their hands. The price, performance and power consumption are all massive improvements. I like this new NVIDIA a lot. Now I can just hope this pushes the HD 7970 prices down to something like $450.
    Well then again, if the 7950 / 7970 had been more appropriately priced, the GTX 680 would have cost even less.

    Remember that GK104 was always meant to have been this generation's midrange, not the high end. Now Nvidia get to sit back and take it easy for a year or so with GK104 performing so well, and then whenever AMD manage to catch up yet again, spurting their 'Verdetrol' garbage marketing, out comes GK110, which beats AMD again with a year-old architecture.

    And please note I was an ATI fanboy up until the HD 5000 range; both the 6000 and 7000 ranges have disappointed me significantly compared to what Nvidia have had on offer since their Fermi refresh.

    Quote Originally Posted by ice_chill View Post
    Not really a worthy upgrade over the GTX 580, TBH.
    Well, GK104 was meant to have been an 'upgrade' over the GTX 560 Ti, not an upgrade of the GTX 580. GK110 was meant to have been the GTX 580 replacement. I was meant to be purchasing two GK104s right now for <450 for my next SLI setup, but due to AMD's underwhelming HD 7000 cards that won't be happening.
    Last edited by Mungri; 03-22-2012 at 10:33 AM.

  5. #55
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,738
    Quote Originally Posted by bhavv View Post
    Well then again, if the 7950 / 7970 had been more appropriately priced, the GTX 680 would have cost even less.
    no way to know that
    the 7000 GPUs are nothing special when it comes to price/perf, but if you wanted lower power and more perf, you paid for it, simple as that. it beat the competition in every way but pricing. if they had launched the 7970 at $450, I doubt nvidia would have sold the 680 for anything under $450, since they can still push it as a faster GPU worth more.
    2500k @ 4900mhz - Asus Maximus IV Gene-Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acetal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  6. #56
    Xtreme Addict
    Join Date
    Feb 2007
    Posts
    1,674
    I was hoping it would fare better in LuxRender =(

  7. #57
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,635
    Quote Originally Posted by Manicdan View Post
    no way to know that
    the 7000 GPUs are nothing special when it comes to price/perf, but if you wanted lower power and more perf, you paid for it, simple as that. it beat the competition in every way but pricing. if they had launched the 7970 at $450, I doubt nvidia would have sold the 680 for anything under $450, since they can still push it as a faster GPU worth more.
    Sorry, I didn't just mean the pricing, but also if the 7950 / 7970 had performed significantly better and given Nvidia a reason to release GK110 to compete in the high end.

    That didn't happen.

  8. #58
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,738
    Quote Originally Posted by bhavv View Post
    Sorry, I didn't just mean the pricing, but also if the 7950 / 7970 had performed significantly better and given Nvidia a reason to release GK110 to compete in the high end.

    That didn't happen.
    trust me, if they had GK110 ready to go, it would be out now and selling for $800.
    if they wait until september, they sure don't save money by sitting on a ready part, but they will lose about $200-300 per GPU that would have sold between now and then.
    and 2 months ago when the 7900s launched, they would have been too late in the process to make any drastic changes like launching in 2 months vs 10 months. their schedule internally has been set for september (if that really is their launch time) for at least a year now.
    2500k @ 4900mhz - Asus Maximus IV Gene-Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acetal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  9. #59
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,635
    I just don't believe that the 256-bit GK104 was ever designed or meant to be released as a competitor in the high-end GPU segment if AMD had pulled off what was expected from their 28nm architecture.

  10. #60
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,198
    There go the Nvidia haters: if it outperforms the 7970, it's going to cost more. Nvidia was actually being modest when they said they expected more from AMD.

    Just imagine if GK110 had come out first and GK104 was released second. It would have been a beatdown worse than the 8800 GTX.

    GK110 is going to be a beast. At $649 or $699, GK110 would completely warrant its price, as it should step on the toes of the 7990.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  11. #61
    Xtreme Member
    Join Date
    Sep 2006
    Location
    UK
    Posts
    141
    Does the 680 support Bitstreaming audio?
    Intel i7 3770K @ 4.5ghz
    Asus P8Z77-V
    8GB Crucial 1866Mhz CL9
    AMD Sapphire Radeon HD 6970
    Crucial RealSSD M4 128GB
    2x WD Raptor X
    Enermax Galaxy 1000W DXX
    NEC LCD2690WUXi
    Yamaha RX-V667 Receiver
    Monitor Audio Vector 5.1

  12. #62
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,544
    Quote Originally Posted by kam03 View Post
    does the 680 support bitstreaming audio?
    yes.

  13. #63
    Xtreme Addict
    Join Date
    Feb 2008
    Location
    America's Finest City
    Posts
    2,077
    Here's my review if anyone's interested - http://bit.ly/GTX680Review
    Quote Originally Posted by FUGGER View Post
    I am magical.

  14. #64
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,414
    Quote Originally Posted by Manicdan View Post
    trust me, if they had GK110 ready to go, it would be out now and selling for $800.
    if they wait until september, they sure don't save money by sitting on a ready part, but they will lose about $200-300 per GPU that would have sold between now and then.
    and 2 months ago when the 7900s launched, they would have been too late in the process to make any drastic changes like launching in 2 months vs 10 months. their schedule internally has been set for september (if that really is their launch time) for at least a year now.
    Exactly. Holding back a new core that could absolutely stomp AMD would make very little financial sense. Launching GK104, a smaller die on the unknown 28nm process, gives NVIDIA the chance to figure things out before they release a massive 250-300W monster. That is, of course, if NVIDIA chooses that route. Who knows, maybe they'll instead release a dual-GPU card based on binned GK104 cores. I'd be happier to see NVIDIA adopt the AMD route of smaller, more efficient cores.


    Now, how in the heck is AMD going to counter this? They can't really ramp up the clocks on Tahiti to gain another 30% in performance, and I couldn't imagine AMD would want to get into the gigantic-GPU market that NVIDIA has traditionally run in. Do they release a 7990 with binned Tahiti cores? Or figure out their scaling issues between Pitcairn and Tahiti (7870/7970)? It would be great if it all came down to driver optimizations.

  15. #65
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,635
    Quote Originally Posted by [XC] gomeler View Post
    Exactly. Holding back a new core that could absolutely stomp AMD would make very little financial sense.
    I disagree; holding it back means that they don't need to spend as much on R&D for their next card. They have a card out now that comfortably beats the 7970, so they had no need to release a better one.

    The GTX 680 was actually originally meant to have been the 670 Ti:

    http://www.xtremesystems.org/forums/...-as-GTX-670-Ti

    I'm sure that the GTX 680 would have at least been a 384-bit card with an 8+6-pin power connector.

  16. #66
    Xtreme Enthusiast
    Join Date
    Jun 2006
    Location
    Space
    Posts
    769
    Mmmm, the card looks great, until I read the XBit review and saw the overclocking results. Is it me, or does it perform slightly worse than the 7970 when they are both overclocked? If this is the case, then I'll have to think long and hard about which to go for, as I generally OC anything I buy.

  17. #67
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,414
    Quote Originally Posted by bhavv View Post
    I disagree; holding it back means that they don't need to spend as much on R&D for their next card. They have a card out now that comfortably beats the 7970, so they had no need to release a better one.

    The GTX 680 was actually originally meant to have been the 670 Ti:

    http://www.xtremesystems.org/forums/...-as-GTX-670-Ti

    I'm sure that the GTX 680 would have at least been a 384-bit card with an 8+6-pin power connector.
    Perhaps. With the GTX 680 in theory being cheaper to manufacture than the HD 7970, it would make sense to hold the GTX 680 as the flagship card until AMD releases their HD 7990. Then, releasing a GTX 685/690 based on a 300W Kepler core (with a die size somewhere around GF100-GT200) would set them up to compete on performance, with less expensive silicon going into this performance-crown card.

    Still, I doubt they developed GK104 and GK110 at the same time. It makes more sense from a risk-management point of view to develop a small core on 28nm. Beats the hell out of having another GTX 480.

  18. #68
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,635
    Quote Originally Posted by Motiv View Post
    Mmmm, the card looks great, until I read the XBit review and saw the overclocking results. Is it me, or does it perform slightly worse than the 7970 when they are both overclocked? If this is the case, then I'll have to think long and hard about which to go for, as I generally OC anything I buy.
    When both cards are overclocked in that review, there is so little difference between them that I wouldn't base my purchase decision on FPS alone. You have to remember that the 7970 is being overclocked from 925 to 1150 MHz, and the GTX 680 from 1006 to 1186 MHz in that review, so the 7970 is being given a larger boost over its reference clocks than the GTX 680 is.
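    Worked out as percentages of the stock clocks from that review, the gap in headroom looks like this (a quick sketch; it ignores the GTX 680's dynamic boost clock, which complicates a direct comparison):

    ```python
    def oc_gain(stock_mhz, oc_mhz):
        """Percentage clock increase over the stock clock."""
        return (oc_mhz / stock_mhz - 1) * 100.0

    hd7970_gain = oc_gain(925, 1150)   # HD 7970, reference -> overclocked
    gtx680_gain = oc_gain(1006, 1186)  # GTX 680, base clock -> overclocked

    print(f"HD 7970: +{hd7970_gain:.1f}%")  # +24.3%
    print(f"GTX 680: +{gtx680_gain:.1f}%")  # +17.9%
    ```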

    Quote Originally Posted by [XC] gomeler View Post
    Still, I doubt they developed GK104 and GK110 at the same time. Makes more sense from a risk management point to develop a small core on 28nm. Beats the hell out of having another GTX 480.
    I don't think that they did either, but I think that a 384-bit GK104 was being developed as the GTX 680, and TechPowerUp have shown solid evidence that this 256-bit version was originally the GTX 670 Ti. GK110 is coming out much later.
    Last edited by Mungri; 03-22-2012 at 12:57 PM.

  19. #69
    Xtreme Member
    Join Date
    Sep 2008
    Location
    Chattanooga, TN
    Posts
    372
    Quote Originally Posted by SKYMTL View Post
    It is important to understand the architectural development on Fermi and Kepler and how they differ.

    Essentially, Fermi was die size limited which meant NVIDIA had to add performance without increasing the transistor count. In order to do this, they ran the shader domain at double the speed of the other processing stages. Unfortunately doing so increased heat production and power consumption.

    GK104 meanwhile doesn't have the same limitation partially due to the 28nm manufacturing process and partially due to optimizations within the architecture that limit the number of transistors needed for certain processing stages. This has allowed for a drastic increase in the core count but in order to keep power consumption to reasonable levels, NVIDIA is now running the clocks at a 1:1 ratio. There are some other changes like PolyMorph Engine reductions (even though they do run at close to double the speed) and caching changes that can also count towards the non-linear performance increase from Kepler GPC to Fermi GPC.

    If you have any particular questions. Let me know.
    I thought the shaders were still running at twice the core frequency on the 680?

    Core i5 2500K @ 4.8GHz 1.31v [L041C123] | Gigabyte P67-UD3P-B3 | Corsair H60 | 4GB G.Skill Ripjaws X DDR3-1866 8-9-8-24-1T | Samsung F3 1TB |
    ASUS GTX 670 2GB| PCP&C Silencer 500W | Antec Three Hundred "Blacked Out" | Dell Ultrasharp 23" U2311H IPS 1920x1080



  20. #70
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,657
    Quote Originally Posted by OCX600RR View Post
    I thought the shaders were still running at twice the core frequency on the 680?
    I think it was the early GPU-Z build that was reporting that in error; it's 1:1 now.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  21. #71
    Registered User
    Join Date
    Mar 2008
    Location
    Norway
    Posts
    9
    Quote Originally Posted by Russian View Post
    Here's my review if anyone's interested - http://bit.ly/GTX680Review
    Nice one, thanks.

  22. #72
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,326
    As usual I like SweClockers' reviews: http://translate.google.com/translat...mt-sli&act=url

    Luckily I'm a Swedish-speaking Finn, so I don't need to use Google Translate either. It has SLI vs CrossFire tests too, overclocking, power consumption under both circumstances for both ATI and Nvidia, etc. Nice to see that a maxed-out stable OC on the GTX 680 draws just marginally more power than a stock HD 7970.

    On the other hand, the HD 7970 allows for slightly bigger OC potential, at least comparing standard GPU BIOSes (the GTX 680 could probably use a little more than 1.1V even on the standard cooler, at least 1.125-1.150V IMO; I personally wish 1.150V had been the max with the default BIOS, which would probably be enough to reach a roughly similar OC headroom ratio to the HD 7970).

    Last edited by RPGWiZaRD; 03-22-2012 at 02:02 PM.
    Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  23. #73
    Xtreme Mentor
    Join Date
    Jul 2008
    Location
    Shimla , India
    Posts
    2,624
    I would imagine GK110 will add back the features that were cut from this release. Maybe it's something like what Intel has started doing, with a separate consumer-targeted architecture and a separate one for professionals. So that would mean GK104 = SNB and GK110 = SNB-E.

    Very good card, an excellent alternative to the 7970, or for that matter the 7950. It's almost weird that early reviews pointed to huge load power consumption whereas the latest ones don't...

    Nonetheless, AMD can't just OC the 7970 and sell it against the 680; it gives up too much in efficiency.
    Coming Soon

  24. #74
    Xtreme Addict
    Join Date
    Feb 2008
    Location
    America's Finest City
    Posts
    2,077
    I'm hearing whisperings from the AMD camp... They don't sound scared.
    Quote Originally Posted by FUGGER View Post
    I am magical.

  25. #75
    Xtreme Addict
    Join Date
    Sep 2010
    Location
    Australia / Europe
    Posts
    1,309
    Quote Originally Posted by Russian View Post
    I'm hearing whisperings from the AMD camp... They don't sound scared.
    Well, can you go into any detail?
    Not worried as in a dual card coming? Or a faster single-chip card scenario?

    Sent from my GT-I9100 using Tapatalk
    Does some transcendent 'law' governing human destiny, some 'hand of God', exist in this world? At the very least, man is not even free to command his own will...
