
Thread: Kepler Nvidia GeForce GTX 780

  1. #1001
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Quote Originally Posted by xBanzai89 View Post
    Kyle expects the 680 to be faster at this point in time.

    If and when we finally see GK110, it's likely going to be a monster performer. AMD may really have set the standard too low this time.
    Okay, didn't he say a few posts back that he knew how fast it is?

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5GHz | +0.185 offset : 1.352v

  2. #1002
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Kyle claims: http://hardforum.com/showpost.php?p=...9&postcount=89

    You will start to see benchmark leaks and such probably in the next couple of days. I think most of those will be at less than 2560 resolution. And let's face it, you don't need a flagship card for resolution that small. Honestly, like always, we are going to have to put it through [H] large resolution gaming paces before we make a call on where it is compared to 7970. I think the 680 will be faster but by how much, we don't know yet, and I have seen NO MSRPs confirmed.

  3. #1003
    Xtreme Addict
    Join Date
    Jun 2005
    Location
    Madison, WI
    Posts
    1,004
    The tension in this thread is really starting to build...
    \Project\ Triple Surround Fury
    Case:
    Mountain Mods Ascension (modded)
    CPU: i7 920 @ 4GHz + EK Supreme HF (plate #1)
    GPU: GTX 670 3-Way SLI + XSPC Razor GTX670 water blocks
    Mobo: ASUS Rampage III Extreme + EK FB R3E water block
    RAM: 3x 2GB Mushkin Enhanced Ridgeback DDR3 @ 6-8-6-24 1T
    SSD: Crucial M4 256GB, 0309 firmware
    PSU: 2x Corsair HX1000s on separate circuits
    LCD: 3x ASUS VW266H 26" Nvidia Surround @ 6030 x 1200
    OS: Windows 7 64-bit Home Premium
    Games: AoE II: HD, BF4, MKKE, MW2 via FourDeltaOne (Domination all day!)

  4. #1004
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    Show me the card!!!
    I going to the great country United States of America next month, second best country after Kazakhstan... and I going to buy brand new PeeCees! Very niiice!
    Choosing between a 7870 OC and a 680... please bring it out soon!
    Gigabyte Z77X-UD5H
    G.Skill Ripjaws X 16GB @ 2133MHz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4GHz
    Sapphire 7970 OC @ 1.2GHz
    Mushkin Chronos Deluxe 128GB

  5. #1005
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by SubZero.it View Post
    How can you believe this
    @Damien I totally agree with you
    I have a feeling it's a misquote and they meant 200 W for the video card. That part may be believable.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  6. #1006
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    This dynamic clocking rumor is very interesting. I hope it's sophisticated. I could imagine the TDP budget being shifted around on the GPU as needed: highly utilized units are clocked higher while waiting or idle units are clocked lower. This would increase average power consumption but stay within the TDP. I hope one can deactivate it for older games, though, and that it doesn't kick in when you are CPU bound or have vsync enabled and the chip isn't running at full capacity.
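
    To make that idea concrete, here is a minimal sketch (in Python, purely illustrative) of what a per-unit TDP-budget shuffle could look like. Nothing here is NVIDIA's actual boost algorithm; the unit names, the 195 W budget and the watts-per-100-MHz costs are all made-up assumptions.

    # Illustrative per-unit TDP-budget shifting -- all names and numbers are assumptions.
    TDP_BUDGET_W = 195.0  # hypothetical board power limit in watts

    def redistribute_clocks(units):
        """units: list of dicts with 'name', 'utilization' (0..1),
        'base_clock' (MHz) and 'watts_per_100mhz' (assumed power cost of raising the clock)."""
        clocks = {u['name']: u['base_clock'] for u in units}
        # Estimate the headroom left after every unit runs at base clock (crude linear model).
        spent = sum(u['base_clock'] / 100.0 * u['watts_per_100mhz'] for u in units)
        headroom = TDP_BUDGET_W - spent

        # Idle units give watts back to the pool; heavily used units spend them.
        for u in sorted(units, key=lambda u: u['utilization']):
            if u['utilization'] < 0.3:
                clocks[u['name']] = u['base_clock'] * 0.8
                headroom += 0.2 * u['base_clock'] / 100.0 * u['watts_per_100mhz']
            elif u['utilization'] > 0.9 and headroom > 0:
                boost = min(150.0, headroom * 100.0 / u['watts_per_100mhz'])
                clocks[u['name']] = u['base_clock'] + boost
                headroom -= boost / 100.0 * u['watts_per_100mhz']
        return clocks

    print(redistribute_clocks([
        {'name': 'shaders', 'utilization': 0.98, 'base_clock': 705, 'watts_per_100mhz': 18.0},
        {'name': 'video engine', 'utilization': 0.05, 'base_clock': 705, 'watts_per_100mhz': 2.0},
    ]))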

  7. #1007
    all outta gum
    Join Date
    Dec 2006
    Location
    Poland
    Posts
    3,390
    Quote Originally Posted by Lanek View Post
    I wish reviewers good luck benchmarking this card: knowing what speed the card is actually running at, and what happened during the run. Is the card at 750, 780, 800, 900 MHz? Will the result be the same if you bench the card after an hour of play, once it is running hot? Is it overclocked per application to gain some fps in a benchmark, and will the result be the same when someone plays BF3 for an hour? Does it detect game benchmarks and adapt the OC profile, and does it behave the same way in actual gaming?
    It's been the same thing with benchmarking CPUs that have turbo mode. You run the bench ten times in a row and you get ten different results... numbers in reviews are more like close approximations; that's why a score difference of 1-3% between competing products is called a draw, not a win.
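
    As a toy illustration of that point (the fps numbers for the two cards below are made up), the average gap between two cards can easily sit inside the run-to-run noise:

    from statistics import mean, stdev

    card_a = [61.2, 59.8, 62.0, 60.5, 61.1, 60.0, 61.8, 60.9, 59.9, 61.4]  # fps, 10 runs
    card_b = [60.1, 61.5, 59.7, 60.8, 61.9, 60.2, 59.5, 61.0, 60.6, 61.2]

    gap_pct = (mean(card_a) - mean(card_b)) / mean(card_b) * 100
    noise_pct = max(stdev(card_a), stdev(card_b)) / mean(card_b) * 100

    # If the average gap is within the run-to-run noise (or under ~3%), call it a draw.
    verdict = "draw" if abs(gap_pct) <= max(3.0, noise_pct) else "win"
    print(f"gap {gap_pct:+.1f}%, run-to-run noise ~{noise_pct:.1f}% -> {verdict}")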
    www.teampclab.pl
    MOA 2009 Poland #2, AMD Black Ops 2010, MOA 2011 Poland #1, MOA 2011 EMEA #12

    Test bench: empty

  8. #1008
    Xtreme Member
    Join Date
    Nov 2010
    Location
    Valencia, España
    Posts
    146
    Compared to the 7970, the new card's claimed advantages are: lower voltage, higher frequency (breaking the GHz barrier), lower power consumption, lower noise, higher performance (in DX11 and with AA), and stronger support from game developers.

    PHK Expreview -> http://bbs.expreview.com/thread-49806-1-1.html

  9. #1009
    Xtreme Enthusiast
    Join Date
    Feb 2007
    Location
    So near, yet so far.
    Posts
    737
    GTX 6xx pics: the card, another PCB shot, and those power connectors.

    Source:http://we.pcinlife.com/thread-1849001-1-1.html






    [[Daily R!G]]
    Core i7 920 D0 @ 4.0GHz w/ 1.325 vcore.
    Rampage II Gene||CM HAF 932||HX850||MSI GTX 660ti PE OC||Corsair H50||G.Skill Phoenix 3 240GB||G.Skill NQ 6x2GB||Samsung 2333SW

    flickr

  10. #1010
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by boxleitnerb View Post
    This dynamic clocking rumor is very interesting. I hope it's sophisticated. I could imagine the TDP budget being shifted around on the GPU as needed: highly utilized units are clocked higher while waiting or idle units are clocked lower. This would increase average power consumption but stay within the TDP. I hope one can deactivate it for older games, though, and that it doesn't kick in when you are CPU bound or have vsync enabled and the chip isn't running at full capacity.
    Do you realize what sort of transistor budget that would add?
    Nvidia didn't even want to go with a simple hardware solution for Fermi; they went with software to try to keep the cards under control, and we all know how well that worked out...

    Not saying it is impossible, but I would be very surprised to see them jump into that all at once rather than gradually implementing the feature over a generation or two.
    I can see them implementing something in hardware to control TDP, like AMD's PowerTune, but anything more seems unlikely at this point, especially something relatively complex.
    Last edited by LordEC911; 03-09-2012 at 02:10 AM.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  11. #1011
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Several sources speak of decoupled shader and core clocks. This is already different from PowerTune. It seems both clock domains are dynamic (705-950 MHz for the core and up to 1411 MHz for the ALUs).
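
    For scale, a quick back-of-the-envelope with those rumored numbers; counting 2 FLOPs per ALU per clock (one fused multiply-add) is an assumption about the architecture, not a confirmed spec.

    alus = 1536
    alu_clock_hz = 1411e6
    flops_per_alu_per_clock = 2  # assumes one FMA per ALU per clock

    peak_gflops = alus * alu_clock_hz * flops_per_alu_per_clock / 1e9
    print(f"~{peak_gflops:.0f} GFLOPS peak single precision")  # ~4335 GFLOPS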

  12. #1012
    Xtreme Addict
    Join Date
    Oct 2008
    Location
    The Curragh.
    Posts
    1,294
    As long as those connectors don't make the card thicker than a standard two-slot design, I don't care.

    If they do, others and I might have trouble getting it into an M-ITX system.

  13. #1013
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by boxleitnerb View Post
    Several sources speak of decoupled shader and core clocks. This is already different from PowerTune. It seems both clock domains are dynamic (705-950 MHz for the core and up to 1411 MHz for the ALUs).
    It seems I have missed a few articles over the last 24 hours. It's hard trying to stay up to date during this silly season.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  14. #1014
    Xtreme Member
    Join Date
    May 2005
    Posts
    193

  15. #1015
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by jam2k View Post
    ROFL, well done.

  16. #1016

  17. #1017
    all outta gum
    Join Date
    Dec 2006
    Location
    Poland
    Posts
    3,390
    Quote Originally Posted by boxleitnerb View Post
    Several sources speak of decoupled shader and core clocks. This is already different from PowerTune. It seems both clock domains are dynamic (705-950 MHz for the core and up to 1411 MHz for the ALUs).
    This is nonsense :P So the shaders would work at less than a 2:1 clock ratio? That might have been possible two GPU generations ago.
    www.teampclab.pl
    MOA 2009 Poland #2, AMD Black Ops 2010, MOA 2011 Poland #1, MOA 2011 EMEA #12

    Test bench: empty

  18. #1018
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Why nonsense? Please elaborate and remember that we're talking about 1536 of those bad boys here, not just 512 or so.

  19. #1019
    all outta gum
    Join Date
    Dec 2006
    Location
    Poland
    Posts
    3,390
    Ask yourself:
    1) what kind of clocks can Nvidia expect from TSMC 28 nm?
    2) what kind of transistor budget they have?
    3) what was the shader clock in every generation of GPUs that had hot clocks?

    And finally:
    4) is it therefore possible to have 1536 SPs at 1500+ MHz and still fit into thermal and power constraints?
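
    A rough back-of-the-envelope on question 4, scaling from the GTX 580's known 512 hot-clocked SPs at ~1544 MHz; the shader-domain share of board power and the 28 nm efficiency factor below are assumptions, not measured figures.

    gf110_sps, gf110_shader_clock_mhz = 512, 1544.0   # known GTX 580 figures
    gf110_shader_power_w = 130.0                      # assumed share of the ~244 W board power
    rumored_sps, rumored_clock_mhz = 1536, 1500.0     # the configuration questioned above
    process_factor = 0.6                              # assumed per-transistor power saving, 40 nm -> 28 nm

    estimate_w = (gf110_shader_power_w
                  * (rumored_sps / gf110_sps)
                  * (rumored_clock_mhz / gf110_shader_clock_mhz)
                  * process_factor)
    print(f"~{estimate_w:.0f} W for the shader domain alone")  # ~227 W -- hard to fit in a ~250 W board budget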
    www.teampclab.pl
    MOA 2009 Poland #2, AMD Black Ops 2010, MOA 2011 Poland #1, MOA 2011 EMEA #12

    Test bench: empty

  20. #1020
    Xtreme Addict
    Join Date
    Mar 2010
    Posts
    1,079
    Quote Originally Posted by jam2k View Post
    MWA HA HA A HA
    Awesome!

  21. #1021
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by G.Foyle View Post
    Ask yourself:
    1) what kind of clocks can Nvidia expect from TSMC 28 nm?
    2) what kind of transistor budget they have?
    3) what was the shader clock in every generation of GPUs that had hot clocks?

    And finally:
    4) is it therefore possible to have 1536 SPs at 1500+ MHz and still fit into thermal and power constraints?
    When you saw G70, did you think G80 was possible?

  22. #1022
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by GoldenTiger View Post
    I beg to differ.

    It is nearly impossible to play games like BF3, Skyrim (plus official texture pack), Batman AC and several other AAA titles with image quality maximized WITHOUT a flagship card at 2560.

  23. #1023
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by SKYMTL View Post
    I beg to differ.

    It is nearly impossible to play games like BF3, Skyrim (plus official texture pack), Batman AC and several other AAA titles with image quality maximized WITHOUT a flagship card at 2560.
    Agreed, though I think he was referring to 1920x1080... at least I hope he was haha.

  24. #1024
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by SKYMTL View Post
    I beg to differ.

    It is nearly impossible to play games like BF3, Skyrim (plus official texture pack), Batman AC and several other AAA titles with image quality maximized WITHOUT a flagship card at 2560.
    Pretty sure he was talking about resolutions under 2560x. I misread it too the first time, so don't feel too bad.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  25. #1025
    Xtreme Member
    Join Date
    May 2005
    Posts
    193
    IMHO, what Kyle is saying is what everyone suspects: GK104 won't stand a chance against Tahiti in 2560x1600 gaming, let alone at multi-monitor resolutions.
