
Thread: Kepler Nvidia GeForce GTX 780

  1. #3301
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by BababooeyHTJ View Post
    You can buy a QNIX QX2710 for $300 now.
    True, but I'm sure those will stop being good batches for OC'ing soon enough, just like the Catleaps did before them. I have an X-Star DP2710 (which is the same thing as the Qnix) that goes to 120Hz nicely; it just dims a little towards the top-right due to PLS tech.

  2. #3302
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    Quote Originally Posted by Chickenfeed View Post
    Had my fingers crossed for a $499-549 USD MSRP but with the lack of other competitive options in that bracket I doubt this will happen... Until Maxwell I guess!
    If Maxwell has no competitive options, you won't see that price for it either.

    NVIDIA may well have become the "Intel of GPUs" with this generation. With people buying all the 690s and Titans they can make for months on end at $1000 in 2012 and 2013, AMD only able to put out a chip that eclipsed the 580 by 20% (until they OC'd it), and AMD then leaving that 20% GPU as their flagship for two whole years, I think NVIDIA has turned the corner.

    From here on out we may have a $1000 "Extreme Edition" NVIDIA card, and then $400-$700 cards for the masses.

    It's gotten so bad I actually saw "Gee do you think AMD will have a new driver to counter the 700 launch?" on another forum. Think about that: AMD fans are reduced to hoping they'll get a driver refresh with significant performance increases one and a half years after the card hit the market.
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  3. #3303
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    144


    The GeForce GTX 780 reference card will use the same cooler featured on the GTX Titan, and should keep noise levels down to around 40-45 dBA. This means the reference GTX 780 will be much quieter than the GTX 680. We should expect the GeForce GTX 780 to be around 25% to 50% faster than the Radeon HD 7970 GHz Edition, except in Tomb Raider, where the AMD card excels against most GeForce GPUs.


    http://www.tweaktown.com/news/30453/...-23/index.html
    i7 4930k @ 4.4 Vcore 1.39
    Custom Water Cooling Setup
    Rampage IV BE
    Dominator GT 2133mhz @cas 9 16G
    Samsung 850 Pro 512GB SSD
    2 x Evga GTX 780 Ti SC SLI @ 1120mhz
    Corsair AX1500i
    2 x WD 6TB Caviar Green

  4. #3304
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by BababooeyHTJ View Post
    You can buy a QNIX QX2710 for $300 now.
    "Widespread" and "Quality" does no equate a grey market Korean panel that doesn't have a native refresh rate of 120Hz. All they are doing is overdriving a lower refresh rate panel.

  5. #3305
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by Rollo View Post
    If Maxwell has no competitive options, you won't see that price for it either.
    AMD is releasing Hawaii this year. They will no doubt also beat Nvidia to a process shrink.

  6. #3306
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    144
    Last edited by TRANCEFORMER; 05-19-2013 at 01:47 PM.
    i7 4930k @ 4.4 Vcore 1.39
    Custom Water Cooling Setup
    Rampage IV BE
    Dominator GT 2133mhz @cas 9 16G
    Samsung 850 Pro 512GB SSD
    2 x Evga GTX 780 Ti SC SLI @ 1120mhz
    Corsair AX1500i
    2 x WD 6TB Caviar Green

  7. #3307
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by SKYMTL View Post
    "Widespread" and "Quality" does no equate a grey market Korean panel that doesn't have a native refresh rate of 120Hz.
    You'll probably never see a 2560x1440 display rated for a 120hz input. I don't even think that the next hdmi standard will accept that.
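
    Rough numbers on that, just back-of-the-envelope (the blanking overheads below are my approximations, not spec values):
    Code:
    # Pixel clock needed for 2560x1440 @ 120Hz with reduced-blanking-style timings
    h_total = 2560 + 160    # horizontal active + approximate reduced blanking
    v_total = 1440 + 41     # vertical active + approximate reduced blanking
    refresh = 120
    pixel_clock_mhz = h_total * v_total * refresh / 1e6
    print(f"~{pixel_clock_mhz:.0f} MHz")  # ~483 MHz
    # HDMI 1.4's TMDS clock tops out at 340 MHz and dual-link DVI at ~330 MHz,
    # which is why these panels get overclocked over DL-DVI instead of being
    # rated for a 120Hz input.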

    I'm not sure what your point is. Those displays are wildly popular among the target market for these $500 or so video cards.

    All they are doing is overdriving a lower-refresh-rate panel.
    I'm not sure what your point is here either. I do remember you saying that anything more than 40fps isn't noticeable, though.

  8. #3308
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    263
    Quote Originally Posted by BababooeyHTJ View Post
    You'll probably never see a 2560x1440 display rated for a 120hz input. I don't even think that the next hdmi standard will accept that.

    I'm not sure what your point is. Those displays are wildly popular among the target market for these $500 or so video cards.



    I'm not sure what your point is here either. I do remember you saying that anything more than 40fps isn't noticeable, though.
    If it's a constant 40fps vs constant 120fps, there is no difference
    Gigabyte z68 UD3
    2600k @ stock
    8Gb G Skill
    2x eVGA GTX 680 @ stock
    Intel 320 80Gb
    Intel 320 120Gb x2
    Intel 320 160Gb x2

  9. #3309
    Xtreme Addict
    Join Date
    Apr 2011
    Location
    North Queensland Australia
    Posts
    1,445
    Quote Originally Posted by BababooeyHTJ View Post
    You can buy a QNIX QX2710 for $300 now.
    Exactly what I've ordered.

    -PB
    -Project Sakura-
    Intel i7 860 @ 4.0Ghz, Asus Maximus III Formula, 8GB G-Skill Ripjaws X F3 (@ 1600Mhz), 2x GTX 295 Quad SLI
    2x 120GB OCZ Vertex 2 RAID 0, OCZ ZX 1000W, NZXT Phantom (Pink), Dell SX2210T Touch Screen, Windows 8.1 Pro

    Koolance RP-401X2 1.1 (w/ Swiftech MCP35X), XSPC EX420, XSPC X-Flow 240, DT Sniper, EK-FC 295s (w/ RAM Blocks), Enzotech M3F Mosfet+NB/SB

  10. #3310
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by schoolslave View Post
    If it's a constant 40fps vs constant 120fps, there is no difference
    Quote Originally Posted by SKYMTL View Post
    "Widespread" and "Quality" does no equate a grey market Korean panel that doesn't have a native refresh rate of 120Hz. All they are doing is overdriving a lower refresh rate panel.
    There's a huge difference, and these do not skip frames (they do true 120Hz and have been reviewed/tested by end users and one major monitor review site). It's a beautiful thing in motion. It's just overclocking, the same as taking a base reference eVGA card that's capable of Superclock speeds and overclocking it to those speeds. They are not "overdriven" regular panels; overdrive is an entirely different technical term for the technique many manufacturers use to reduce ghosting on their native 120Hz panels. Also, the manufacturer has never marketed these as 120Hz, implied or otherwise. People realized they were highly overclockable and bought them to do it.

    They also aren't grey market, as I see people calling them. They are manufactured by real companies that sell them to vendors, who then re-sell them as any other store would, to people anywhere in the world with international shipping, and with no form of restriction ever mentioned, implied, or intended by the manufacturer. It's no different from someone in Canada buying from the US Newegg instead of newegg.ca. No, they aren't from Amazon, but they're in no way tainted. You could call CD keys designed for a low-income market and then sold digitally to Americans grey market, sure, but these? They're nothing of the sort. Unless you think a piece of clothing sold by Macy's that was manufactured in China is somehow grey market too.

  11. #3311
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by schoolslave View Post
    If it's a constant 40fps vs constant 120fps, there is no difference
    No, just no.

    All along the watchtower the watchmen watch the eternal return.

  12. #3312
    Xtreme Member
    Join Date
    Jun 2007
    Posts
    495
    I was so hoping for a 770GTX with 4GB ram and a 780GTX with 5GB ram.
    Gaming/Rendering rig:
    eVGA X58 Tri-SLI
    Intel i7-970 w/ Corsair H100
    24gigs Corsair 2000s
    eVGA GTX580 3GB
    Too many HDD's
    LG Blu-ray player
    Corsair hx1050 psu
    Corsair 800D case

  13. #3313
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Rollo View Post
    NVIDIA may well have become the "Intel of GPUs" with this generation.
    GPUs are quite a bit more expensive than CPUs.
    Quote Originally Posted by clo007 View Post
    I was so hoping for a 770GTX with 4GB ram and a 780GTX with 5GB ram.
    4GB 770 = 4GB 680, just buy the latter. What's the problem?
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  14. #3314
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    263
    Quote Originally Posted by STEvil View Post
    No, just no.
    I've tested this with Skyrim and Cod MW3 at a LAN with two monitors side by side, there is no difference, at least not to me and about 10-15 of my friends who were there.
    I can only conclude that people read way too much into this whole FPS thing, end up spending tons of money for no improvement, and then need to justify their purchase.
    Gigabyte z68 UD3
    2600k @ stock
    8Gb G Skill
    2x eVGA GTX 680 @ stock
    Intel 320 80Gb
    Intel 320 120Gb x2
    Intel 320 160Gb x2

  15. #3315
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    Quote Originally Posted by clo007 View Post
    I was so hoping for a 770GTX with 4GB ram and a 780GTX with 5GB ram.
    To what end?

    As you currently have one 670 2GB, odds are good you're not doing surround. 3GB should be plenty for any single panel; even 2GB is good enough 95% of the time at 2560x1600.

    What's your resolution you want all the VRAM for?
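
    Some quick back-of-the-envelope math on why raw resolution alone doesn't eat much VRAM (illustrative only; in practice textures and render targets dominate):
    Code:
    # Color + depth/stencil buffer sizing; treat these as lower bounds on usage
    def buffers_mb(width, height, msaa=1):
        color = width * height * 4 * msaa   # 32-bit color
        depth = width * height * 4 * msaa   # 24-bit depth + 8-bit stencil
        return (color + depth) / 2**20

    print(f"2560x1600, no AA:  {buffers_mb(2560, 1600):.0f} MB")     # ~31 MB
    print(f"2560x1600, 4xMSAA: {buffers_mb(2560, 1600, 4):.0f} MB")  # ~125 MB
    print(f"5760x1200, 4xMSAA: {buffers_mb(5760, 1200, 4):.0f} MB")  # ~211 MB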
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  16. #3316
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Posts
    872
    Quote Originally Posted by BababooeyHTJ View Post
    AMD is releasing Hawaii this year. They will no doubt also beat Nvidia to a process shrink.
    I'm skeptical we'll see any performance chips on 20nm this year. Mobile and OEM, sure, but if we see 8970 I'll be pretty surprised.
    Intel 990x/Corsair H80 /Asus Rampage III
    Coolermaster HAF932 case
    Patriot 3 X 2GB
    EVGA GTX Titan SC
    Dell 3008

  17. #3317
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by schoolslave View Post
    I've tested this with Skyrim and Cod MW3 at a LAN with two monitors side by side, there is no difference, at least not to me and about 10-15 of my friends who were there.
    I can only conclude that people read way too much into this whole FPS thing, end up spending tons of money for no improvement, and then need to justify their purchase.
    Humorous fantasy, or your test was flawed, because it has been scientifically shown that people can perceive in excess of 200fps, and to almost everyone 40 vs 60 is obvious, let alone 40 vs 120. To me, anything beneath 50 is instantly, obviously choppy, and 85 or higher looks better and better.

  18. #3318
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by schoolslave View Post
    I've tested this with Skyrim and Cod MW3 at a LAN with two monitors side by side, there is no difference, at least not to me and about 10-15 of my friends who were there.
    I can only conclude that people read way too much into this whole FPS thing, end up spending tons of money for no improvement, and then need to justify their purchase.
    Two identical monitors, or one LCD/LED and one CRT? Also, were they both 60Hz inputs, or 60Hz and 120Hz? 40fps vs 120fps (constant) isn't as bad as, say, 30fps vs 60fps, but it's still a noticeable difference to many people. Maybe not to you, or maybe because of something you had configured, but to say 40fps and 120fps are just as good as each other simply isn't true. 40fps is quite acceptable in some cases, but if 40fps were always acceptable, why would we bother allowing more than 40fps?

    Here are a few items that can help on your adventure:
    3dfx 60/30/15 demo http://www.falconfly.de/artwork.htm (you will need a glide wrapper)
    15/30/60 demo using flash http://boallen.com/fps-compare.html

    Someone should make a flash demo where you can enter your own custom FPS rates and stretch it to whatever size you want. Maybe add a "stutter" option that inserts short or long frames, or double/triple frames, periodically.
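
    Until someone does, here's a rough sketch of the idea in Python/pygame rather than flash (purely illustrative; all the constants are made up):
    Code:
    # Minimal FPS-comparison sketch in Python/pygame (purely illustrative).
    # The loop runs at BOTTOM_FPS; the top bar's position is only resampled
    # at TOP_FPS, so it visibly judders next to the bottom bar.
    import pygame

    BOTTOM_FPS = 120     # loop rate; your display must actually refresh this fast
    TOP_FPS = 40         # simulated lower frame rate
    STUTTER_EVERY = 0    # e.g. 90 = hold the top bar one extra tick every 90 frames

    pygame.init()
    screen = pygame.display.set_mode((800, 300))
    clock = pygame.time.Clock()

    x = 0.0              # continuously updated position (pixels)
    x_top = 0.0          # position as sampled at TOP_FPS
    speed = 300.0        # pixels per second
    frame = 0

    running = True
    while running:
        dt = clock.tick(BOTTOM_FPS) / 1000.0
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False

        x = (x + speed * dt) % 800
        frame += 1
        # resample the top bar at the lower rate, optionally holding a frame
        if frame % max(1, round(BOTTOM_FPS / TOP_FPS)) == 0:
            if not (STUTTER_EVERY and frame % STUTTER_EVERY == 0):
                x_top = x

        screen.fill((0, 0, 0))
        pygame.draw.rect(screen, (255, 80, 80), (int(x_top), 60, 40, 40))  # TOP_FPS bar
        pygame.draw.rect(screen, (80, 255, 80), (int(x), 200, 40, 40))     # BOTTOM_FPS bar
        pygame.display.flip()

    pygame.quit()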
    Last edited by STEvil; 05-19-2013 at 07:10 PM.

    All along the watchtower the watchmen watch the eternal return.

  19. #3319
    Xtreme Member
    Join Date
    Jun 2007
    Posts
    495
    Quote Originally Posted by Rollo View Post
    To what end?

    As you currently have one 670 2GB odds are good you're not doing surround. 3GB should be plenty for any single panel. Even 2GB is good enough 95% of the time for 25X16.

    What's your resolution you want all the VRAM for?
    Let me clarify. I work at an architecture firm and do renders at home. The more VRAM, the safer I am with high-poly models. I can't justify spending $999 on a Titan, so I was hoping for a 780 with 5GB of VRAM, since the 780 is also spec'd with more CUDA cores than the GTX 680.

    Oh, I need to update my sig. I sold my 670 FTW and currently only have my GTX580 3GB. And I used to play BF3 fine in surround on medium settings.
    Last edited by clo007; 05-19-2013 at 07:18 PM.
    Gaming/Rendering rig:
    eVGA X58 Tri-SLI
    Intel i7-970 w/ Corsair H100
    24gigs Corsair 2000s
    eVGA GTX580 3GB
    Too many HDD's
    LG Blu-ray player
    Corsair hx1050 psu
    Corsair 800D case

  20. #3320
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    235
    Hmm, not sure if I should keep my 2 x 4GB GTX 680 FTW+ cards or upgrade to the GTX 780. Do you guys think the drop to 3GB Vram will make a difference in 2D surround? I wouldn't really like losing 1GB of Vram.

    I'm thinking that I will stick with the GTX 680s.

  21. #3321
    Xtreme Addict
    Join Date
    Sep 2006
    Location
    California
    Posts
    1,917
    Quote Originally Posted by Ice009 View Post
    Hmm, not sure if I should keep my 2 x 4GB GTX 680 FTW+ cards or upgrade to the GTX 780. Do you guys think the drop to 3GB Vram will make a difference in 2D surround? I wouldn't really like losing 1GB of Vram.

    I'm thinking that I will stick with the GTX 680s.
    I can't imagine why you'd want to "upgrade" to a rebranded version of the same card, unless you think two slightly better heatsinks are worth dropping $1,000 for.
    My Videos
    GRID Demolition Derby * GRID Camaro vs. Mustang * Audiosurf - Speed Racer
    I Shot the Hosties * Slightly Stupid * Dump Truck


    Intel Haswell 4770K * 2x8GB Mushkin Redline DDR3 1866 CL9 * Asus Maximus VI Gene * Sapphire 7870 GHz Edition
    500GB Samsung 840 Series SSD + 2TB WD Raid Edition 3 magnetic * SilverStone Temjin case * Corsair TX750 PSU * Corsair H60 water cooler * Win7 Pro x64

  22. #3322
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    263
    Quote Originally Posted by STEvil View Post
    Two identical monitors, or one LCD/LED and one CRT? Also, were they both 60Hz inputs, or 60Hz and 120Hz? 40fps vs 120fps (constant) isn't as bad as, say, 30fps vs 60fps, but it's still a noticeable difference to many people. Maybe not to you, or maybe because of something you had configured, but to say 40fps and 120fps are just as good as each other simply isn't true. 40fps is quite acceptable in some cases, but if 40fps were always acceptable, why would we bother allowing more than 40fps?

    Here's a few items that can help on your adventure
    3dfx 60/30/15 demo http://www.falconfly.de/artwork.htm (you will need a glide wrapper)
    15/30/60 demo using flash http://boallen.com/fps-compare.html

    Someone should make a flash demo where you can enter your own custom FPS rates and stretch it to whatever size you want. Maybe add a "stutter" option that inserts short or long frames, or double/triple frames, periodically.
    I may be way off from reality here, but I think it also has to do with "smoothness". With 120Hz you are effectively doubling the range of possible FPS values. If your hardware is weaker (the GPU was a 460, I think), wouldn't there theoretically be bigger "jumps", from say 120 FPS in less intensive scenes down to 80, 70, or 40 in more intensive ones? With 60Hz the inconsistencies are more limited in range, and to me a drop from a 60 FPS cap to 45 is more bearable than from 120 to 45.
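
    Quick frame-time arithmetic to illustrate what I mean (illustrative numbers; the point is the size of the jump, not the exact rates):
    Code:
    # Frame-time jump when dropping from a capped rate to a lower one
    for cap, low in [(60, 45), (120, 45)]:
        jump_ms = 1000 / low - 1000 / cap
        print(f"{cap} -> {low} fps: frame time grows by {jump_ms:.1f} ms")
    # 60 -> 45 fps:  +5.6 ms per frame
    # 120 -> 45 fps: +13.9 ms per frame (a much larger discontinuity)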
    Gigabyte z68 UD3
    2600k @ stock
    8Gb G Skill
    2x eVGA GTX 680 @ stock
    Intel 320 80Gb
    Intel 320 120Gb x2
    Intel 320 160Gb x2

  23. #3323
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by xMrBunglex View Post
    I can't imagine why you'd want to "upgrade" to a rebranded version of the same card, unless you think two slightly better heatsinks are worth dropping $1,000 for.
    Huh? The 780 isn't going to be $1000, and it's not a rebrand of the 680, it's basically a slightly cut-down Titan.

    Quote Originally Posted by Ice009 View Post
    Hmm, not sure if I should keep my 2 x 4GB GTX 680 FTW+ cards or upgrade to the GTX 780. Do you guys think the drop to 3GB Vram will make a difference in 2D surround? I wouldn't really like losing 1GB of Vram.

    I'm thinking that I will stick with the GTX 680s.
    The VRAM loss would be meaningless.

  24. #3324
    Moderator
    Join Date
    Oct 2007
    Location
    Oregon - USA
    Posts
    830
    There is a lot of misinformation floating about. The human eye does not see in frames per second; we do not have shutters in our eyes that open and close to let light in. A single frame is a still image. As frames are flashed faster and faster, our brains begin to process this information and perceive it as fluid motion. In "real" life the eye tracks smooth movement and transmits that information to the brain. The brain will group parts of the information together, or omit information as necessary for the situation. Take driving, for instance: while you are calm and at rest, your brain omits unnecessary data, which is why you may not notice something dart into your path at first; once the brain is "startled", it processes visual information much more rapidly and omits quite a bit less, which produces the "slow motion" feel people experience under duress.
    That being said, 24fps became the standard in the past because, while the body is at rest and relaxed, that is the minimum frame rate the brain will process into what we perceive as fluid motion. During gaming, or highly stressful movies, our brains process more and more information more quickly, and it is now widely accepted that frame rates much higher than 24fps yield a more desirable outcome.
    The argument then becomes not what maximum fps the eye can see, but what fps yields the more desirable experience for the situation. Everyone's brain reacts differently: for someone who remains very calm during gaming, 60fps may be enough and anything higher unnoticeable, while people who become excited more quickly may find 80-120fps noticeably smoother and more desirable.
    Last edited by Ace123; 05-20-2013 at 01:25 PM.
    Asus Rampage IV Extreme
    4930k @4.875
    G.Skill Trident X 2666 Cl10
    Gtx 780 SC
    1600w Lepa Gold
    Samsung 840 Pro 256GB


  25. #3325
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by Ace123 View Post
    There is a lot of misinformation floating about. The human eye does not see in frames per second; we do not have shutters in our eyes that open and close to let light in. A single frame is a still image. As frames are flashed faster and faster, our brains begin to process this information and perceive it as fluid motion. In "real" life the eye tracks smooth movement and transmits that information to the brain. The brain will group parts of the information together, or omit information as necessary for the situation.
    Yep, as you said, our eyes are not digital; we don't "see" once every X milliseconds. We see continuously, and a lot of factors go into how our eyes receive an image and relay it to our brains. The reason I brought up "people have been tested to be able to see higher than X fps" was just as a quick rebuttal to the whole "people only see 30/40/whatever fps" remark. In reality it depends on the person, their situation/mood, how good their eyes are, and of course the equipment they're using. I don't know offhand of any testing that shows "most people notice a big difference with X vs. Y refresh rate & fps", but to say the eye can't tell the difference is completely wrong. Anecdotally, most people who try high-end hardware and high-refresh-rate setups say they really enjoy the smoothness and can tell, though occasionally someone says they didn't really notice anything.

