Page 50 of 143
Results 1,226 to 1,250 of 3567

Thread: Kepler Nvidia GeForce GTX 780

  1. #1226
    Xtreme Guru
    Join Date
    Dec 2002
    Posts
    4,046
    since you guys mentioned 1080p/2GB/Skyrim:

    here's Skyrim / latest update / official HD pack / misc pack / STEP v201a pack / 2x ultra settings / 1080p / 3x GTX 580 3GB

    I can only show shots at 4096 shadow map res, because at 8192 the card's 3,060MB is filled very fast and I have to Esc out of the game before it crashes.

    iShadowMapResolution=4096
    fShadowDistance=2000.0000

    high rez short distance shadows


    no shadows at long distance


    iShadowMapResolution=4096
    fShadowDistance=32000.0000

    low rez short distance shadows


    shadows at long distance

  2. #1227
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Fallout 3 even maxes out the VRAM on my GTX 480s with the mods I'm using at 1920x1080. I'm sure that higher resolutions could easily top 2GB.

  3. #1228
    Xtreme Addict
    Join Date
    Jun 2005
    Location
    Madison, WI
    Posts
    1,004
    Quote Originally Posted by memmem View Post
    But IF (everything based on rumors):

    1) GK104 has almost the same die size as the 7970
    2) GK104 has almost the same price as the 7970
    3) GK104 will be called the GTX 680,

    can't we consider it high end?

    ... Will we see a GTX 685 as a high-end card?

    ... Won't the GTX 690 be two GPUs on the same PCB?

    I think it's been a long time since AMD and Nvidia shared the same vision of what a high-end part is.
    My thoughts exactly. If Nvidia is calling this so called "mid-range" card the GTX 680, what will the so called "high-end" card be called? GTX 680 Ti? GTX 680 Ultra? GTX 685? GTX 780?

    Oh, Nvidia, you so craaazay!
    \Project\ Triple Surround Fury
    Case:
    Mountain Mods Ascension (modded)
    CPU: i7 920 @ 4GHz + EK Supreme HF (plate #1)
    GPU: GTX 670 3-Way SLI + XSPC Razor GTX670 water blocks
    Mobo: ASUS Rampage III Extreme + EK FB R3E water block
    RAM: 3x 2GB Mushkin Enhanced Ridgeback DDR3 @ 6-8-6-24 1T
    SSD: Crucial M4 256GB, 0309 firmware
    PSU: 2x Corsair HX1000s on separate circuits
    LCD: 3x ASUS VW266H 26" Nvidia Surround @ 6030 x 1200
    OS: Windows 7 64-bit Home Premium
    Games: AoE II: HD, BF4, MKKE, MW2 via FourDeltaOne (Domination all day!)

  4. #1229
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    464
    Quote Originally Posted by memmem View Post
    But IF (everything based on rumors):

    1) GK104 has almost the same die size as the 7970
    2) GK104 has almost the same price as the 7970
    3) GK104 will be called the GTX 680,

    can't we consider it high end?

    ... Will we see a GTX 685 as a high-end card?

    ... Won't the GTX 690 be two GPUs on the same PCB?

    I think it's been a long time since AMD and Nvidia shared the same vision of what a high-end part is.
    The 7970 is not their high end; the 7990 is, and the dual card has been AMD's high end for the past few years.

    Maybe NV is doing the same thing now.

  5. #1230
    Xtreme Member
    Join Date
    Apr 2011
    Location
    Alberta Canada
    Posts
    288
    Quote Originally Posted by SKYMTL View Post
    Food for thought: maybe NVIDIA realized that they don't NEED what would be deemed a "high end Kepler" to trounce AMD this round. Maybe their mid range core is more than enough.

    /runs back out of thread
    All the rumors definitely point to that possibility and I'm willing to bet that is the case. Midrange card with a high price tag and it's all thanks to AMD/ATI...lol
    My toys...
    Asus X79 Deluxe | i7 4820K | Koolance CPU-380I w/Triple Rad/Swiftech Pump | RipjawsX 16GB 1866MHz | eVGA GTX 780 TRI-SLI | X-Fi Surround 5.1 Pro USB | Intel 530 120GB *2 RAID 0, Intel 510 250GB, Samsung 840 Pro 120GB, Samsung 840 500GB, Kingston V300 240GB | Corsair AX1200i | In Win D-Frame Orange | Win 8.1 Pro 64
    Asus Sabertooth Z77 | i7 3770K | NH-C12P SE14 | Vengeance 32GB LP | eVGA GT 240 | X-Fi Titanium Fatality | LSI SAS 9211-4i | Intel 330 120GB, Seagate 500GB *2, Samsung 200GB, WD 320GB *4 RAID 10, 500GB, Raptor 74GB | Antec TPQ-1200W | Corsair 650D | Win 8.1 Pro 64
    Asus Sabertooth P67 | i7 2600K | NH-U12P SE2 | Vengeance Pro 16GB 1866MHz | eVGA GTX 680 | Sound via HDMI | Intel 330 60GB, Samsung 840 Pro 120GB, WD VRaptor 300GB, 150GB *2 | Antec HCG-750W | Lian Li PC-60FNWB | Win 8.1 Pro 64
    Asus P8H77-M/CSM | i3 3220 | Shuriken | Vengeance 16GB LP | eVGA GT 610 | Sound Blaster Play | Hauppauge WinTV-HVR-1600 & HD PVR | Asus PCE-AC66 | Kingston V100 128GB, WD 1GB, 500GB, Seagate 2TB | Enermax Liberty 500W | Fractal Design Core 1000 | Win 8 Pro 64 w/Media Center
    Asus P8H77-M/CSM | i3 3220T | Hyper 212 Evo | Vengeance 8GB | eVGA 210 | Hauppauge WinTV-PVR-250 | Intel 330 60GB, WD 750GB, 250GB | Enermax Liberty 500W | Antec 300 | Win 7 Premium 32

    Axial SCX10 2012 Jeep Wrangler Unlimited Rubicon Modified

  6. #1231
    Registered User
    Join Date
    Feb 2010
    Location
    Cebu, Philippines
    Posts
    59
    $550 launch price
    $339 in 2-3 months after launch...

    Gigabyte GA-X38-DQ6
    Core 2 Quad Q9450 @ 3.4Ghz (Zalman CNPS9700 LED)
    Corsair Twin2X4096-6400C4DHX @ DDR2-1066
    RIP GeForce 9800 GX2 715/1720/1050
    2 x 500GB WD Caviar SE (RAID 0)
    Corsair HX-620W
    ACER P243WAID 1920 x 1200

  7. #1232
    Xtreme Member
    Join Date
    Jun 2008
    Location
    Winter Springs, FL
    Posts
    278
    Quote Originally Posted by zalbard View Post
    A mid-range card sold for $550 would be taking the piss. Smells like price fixing.
    With a small die like that and just 2GB of VRAM it can't possibly cost more than $200 to manufacture (probably WAY less).
    This seems to be a common misconception among lots of people. CPUs and lots of other hardware cost very little to manufacture (I imagine the SB-E chips are probably $50 apiece, if that). However, the reason they cost so much is R&D: each chip design costs hundreds of millions of dollars and years of work. I once spoke to a man who tried to manufacture his own CPU, and by the time he was done he was in for about $50 million and the chips were $10 each to make.
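    That amortization argument is easy to sketch in a few lines of Python (the figures are the ones from the anecdote above, and the helper name is my own, not anything official):

    ```python
    # Hypothetical per-unit cost model: the marginal (manufacturing)
    # cost is small, but amortized R&D dominates until volume is large.
    # Figures from the anecdote: ~$50M R&D, ~$10 per chip to make.

    def unit_cost(rnd_cost, marginal_cost, units_sold):
        """Effective cost per chip once R&D is spread over all units."""
        return marginal_cost + rnd_cost / units_sold

    RND = 50_000_000   # one-time R&D outlay, dollars
    MARGINAL = 10      # manufacturing cost per chip, dollars

    for units in (10_000, 100_000, 1_000_000, 10_000_000):
        print(f"{units:>10,} units -> ${unit_cost(RND, MARGINAL, units):,.2f} per chip")
    ```

    At 10,000 units each chip effectively costs thousands of dollars; only at millions of units does it approach the $10 marginal cost, which is the whole point of the post.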
    Why yes, yes I do use Koolance..*Flame Wall Inbound*

  8. #1233
    Xtreme Addict
    Join Date
    Mar 2009
    Posts
    1,116
    Quote Originally Posted by UrbanSmooth View Post
    If Nvidia is calling this so called "mid-range" card the GTX 680, what will the so called "high-end" card be called?
    you've missed the plot! they skipped the big chip. gk104 (which is late) IS the high end.

    the next big chip will be so late it need not be 600-anything. expect gtx 780 for christmas...

  9. #1234
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    are we there yet?

    All along the watchtower the watchmen watch the eternal return.

  10. #1235
    Xtreme Addict
    Join Date
    Sep 2010
    Location
    Australia / Europe
    Posts
    1,310
    ^^ LOL I wanna put down bets that we'll hit 75 pages... any takers?

    oh by the way Stevil
    http://www.xtremesystems.org/forums/...=1#post5067626

  11. #1236
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    480
    Quote Originally Posted by dasa View Post
    If I had to guess, I would say that the shadow res is stretched across the entire shadow field of view rather than applied on a per-shadow basis, since reducing the shadow view distance also sharpens the shadows.
    Ah ok.
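    dasa's guess works out numerically. A rough sketch (Python; the values are the SkyrimPrefs.ini settings posted earlier in the thread, and the density formula is my own back-of-the-envelope reading of the guess, not anything from Bethesda):

    ```python
    # If one shadow map is stretched across the whole fShadowDistance,
    # effective sharpness is roughly iShadowMapResolution / fShadowDistance
    # (texels per game unit). Values from the posted INI settings.

    def shadow_texel_density(map_resolution, shadow_distance):
        return map_resolution / shadow_distance

    sharp = shadow_texel_density(4096, 2000.0)    # short distance: crisp shadows
    soft = shadow_texel_density(4096, 32000.0)    # long distance: blurry shadows

    print(f"fShadowDistance=2000:  {sharp:.3f} texels/unit")
    print(f"fShadowDistance=32000: {soft:.3f} texels/unit")
    print(f"ratio: {sharp / soft:.0f}x")  # 16x fewer texels per unit at 32000
    ```

    That 16x drop in texel density is exactly the "low rez short distance shadows" tradeoff shown in the screenshots above.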

  12. #1237
    Xtreme Member
    Join Date
    Jul 2010
    Location
    california
    Posts
    150
    GK104=GTX680, NDA ends on March 22nd

    Pictured at chiphell:
    As of now, the author says he still hasn't got the drivers yet (so no benchmarks are possible).

    Link: http://www.chiphell.com/thread-367824-1-1.html
    This guy is xtremely lazy

  13. #1238
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Sydney , Australia
    Posts
    1,600
    If chiphell can't benchmark it because they don't have drivers, then are all the other leaked benchmarks fakes?

    The high-end GK110 will be a GTX 780... unless AMD can pull something magical out of their hat/bag/ass lolol

    Nvidia should call this a 670 and leave the 680 name in reserve; otherwise we get the weird situation where the next release has no new tech/new architecture etc. I want the 110 now, even if it's $1000 per card. Not like we haven't done that before (maybe you guys haven't paid US$1000, but we have paid over AU$1000, that's for sure).

    Bencher/Gamer(1) 4930K - Asus R4E - 2x R9 290x - G.skill Pi 2200c7 or Team 2400LV 4x4GB - EK Supreme HF - SR1-420 - Qnix 2560x1440
    Netbox AMD 5600K - Gigabyte mitx - Aten DVI/USB/120Hz KVM
    PB 1xTitan=16453(3D11), 1xGTX680=13343(3D11), 1x GTX580=8733(3D11)38000(3D06) 1x7970=12059(3D11)40000(vantage)395k(AM3) Folding for team 24

    AUSTRALIAN DRAG RACING http://www.youtube.com/watch?v=OFsbfEIy3Yw

  14. #1239
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601
    294mm²
    3.54 billion transistors
    195W
    http://www.chiphell.com/forum.php?mo...9&pid=11424899

  15. #1240
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by The Jesus View Post
    This seems to be a common misconception from lots of people. CPUs and lots of other hardware cost very little to manufacture (imagine the SB-E chips are probably $50 a piece, if that). However, the reason they cost so much is because of R&D. It costs hundreds of millions in research for each chipset and years of work.
    Oh, I am fully aware of this. However, it makes the calculation far more complex, since you'd have to account for money gains from future products as well (R&D doesn't have to pay off from a single card model, there will be a whole new line-up using the same tech, quite similar professional cards, and future refreshes may be largely based on it as well).

    My point simply is, while there may be lot of R&D involved, there is no doubt that calculated profits from selling these cards (at rumoured $549 MSRP) are going to be FAR greater (per card) than what you'd typically expect from such products, thus implying that they are priced too high and Nvidia is hardly trying to undercut competition and bring more reasonable pricing to the table.

    To put it short, this is a rip-off. These cost way too much (due to poor competition, price fixing or just sheer greed). If all of this is true, I sure hope that these sell in low quantities, so while profits per card are going to be WAY up, the overall profits from selling these are lower than expected, teaching Nvidia a lesson.

    And talking about performance... Let's say these are 5% faster in games than the 7970 (which is 25% faster than the GTX 580), making them ~30% faster than the GTX 580. Is this impressive? Not really. A simple die shrink of the GTX 580 (which has a hot and large die) would easily allow for a 30% overclock, so we'd end up with a cheaper (smaller die), 30%-higher-clocked-than-GTX580 card at the same 250W TDP. So, according to the rumours, we are going to pay a good $250 extra for Nvidia lowering the TDP by 50W (and some GPGPU performance improvements most people don't care about). Doesn't really strike me as an awesome investment.
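    The ~30% figure is just the two rumoured percentages compounded (these are rumours, not measurements):

    ```python
    # If GK104 is ~5% faster than a 7970, and the 7970 is ~25% faster
    # than a GTX 580, then relative to the GTX 580 the speedups multiply:
    gk104_vs_7970 = 1.05
    hd7970_vs_gtx580 = 1.25

    gk104_vs_gtx580 = gk104_vs_7970 * hd7970_vs_gtx580
    print(f"GK104 vs GTX 580: +{(gk104_vs_gtx580 - 1) * 100:.2f}%")  # +31.25%
    ```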

    We need GK100 (GK110?).
    Last edited by zalbard; 03-12-2012 at 11:23 PM.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  16. #1241
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by SKYMTL View Post
    Food for thought: maybe NVIDIA realized that they don't NEED what would be deemed a "high end Kepler" to trounce AMD this round. Maybe their mid range core is more than enough.

    /runs back out of thread
    One thing is sure: with 2x SLI connectors on it, I doubt they will release anything faster for a while... or they have drastically changed their policy about it (allowing tri- and quad-SLI with a midrange part... huum).

    IF there's a higher-end part that should drop, I hope we will get real info, not just marketing and rumors for 4-5 more months.
    Last edited by Lanek; 03-12-2012 at 11:31 PM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  17. #1242
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    420
    Quote Originally Posted by SKYMTL View Post
    Food for thought: maybe NVIDIA realized that they don't NEED what would be deemed a "high end Kepler" to trounce AMD this round. Maybe their mid range core is more than enough.

    /runs back out of thread
    If that's true then Nvidia is adorable. I have no problem hanging onto a GTX580 3GB until they release GK110 or GK112 (whichever model number is the high-end single GPU). If the 7970 were a better value (or anything remotely approaching value) I'd have jumped on it already.

    Unless GK104 gives significantly better performance than an overclocked GTX580 3GB @ 2560x1600, Nvidia can bugger off. And if they're serious with 2GB VRAM then that'll mean a few games will either be pushing or maxing out its VRAM already, let alone any games that haven't even been released yet. I mean 2GB VRAM on a "top end" card.........in 2012? I sure hope not.


    Quote Originally Posted by zalbard View Post
    And talking about performance... Let's say these are 5% faster in games than the 7970 (which is 25% faster than the GTX 580), making them ~30% faster than the GTX 580. Is this impressive? Not really. A simple die shrink of the GTX 580 (which has a hot and large die) would easily allow for a 30% overclock, so we'd end up with a cheaper (smaller die), 30%-higher-clocked-than-GTX580 card at the same 250W TDP. So, according to the rumours, we are going to pay a good $250 extra for Nvidia lowering the TDP by 50W (and some GPGPU performance improvements most people don't care about). Doesn't really strike me as an awesome investment.

    We need GK100 (GK110?).
    Whoa, whoa, whoa, wait a minute. A 7970 is 25% faster than a stock GTX 580 1.5GB, and that's in best-case scenarios. A brand new 3GB GTX 580 is still 40-50 euros cheaper than the 7970.

    Minimum frame rate is just as important as (or more important than) the average. A 45FPS average is worthless if you're dipping down into the teens. At 1920x1200 and below, the stock GTX 580 1.5GB actually beat the 7970 in minimum frame rate in Dirt 3 (coincidentally the same title that lots of 7970 manufacturers seem to bundle with their cards): http://www.anandtech.com/show/5261/a...7970-review/18

    40-50 euros more for a whopping 6.7fps increase in minimum frame rate @ 2560x1600, and potentially equal or lower minimums at lower resolutions? I don't see the incentive. It's still overpriced. If you ignore brand-specific features (Eyefinity), I fail to see any justifiable reason for paying the 7970 MSRP.

    I have zero loyalty to Nvidia but let's try to avoid Apple-esque reality distortion.

    I agree that we need GK110/112/100/whatever the hell the high-end single GPU is called.
    Last edited by kgk; 03-13-2012 at 12:32 AM.
    Bill Cosby: Stewie, what do you think candy is made out of?
    Stewie Griffin: Sunshine and farts! What the hell kind of question is that?!

  18. #1243
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Has anybody actually discussed how awesome it would be if 190W beats the current high-end cards by a fair margin? I mean 190W!!! That's fantastic! (if true)

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  19. #1244
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by bamtan2 View Post
    you've missed the plot! they skipped the big chip. gk104 (which is late) IS the high end.

    the next big chip will be so late it need not be 600-anything. expect gtx 780 for christmas...
    A few months behind a competitor isn't late; 6+ months is late. We've discussed that earlier.

    GK104 is only being looked at as high end because Nvidia found themselves in a unique position where their midrange GPU was strong enough to beat their opposition's high end. As such, they can sit and reap major profits and force AMD to show their next card, which will have to deal with GK110.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  20. #1245
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by Tim View Post
    Has anybody actually discussed how awesome it would be if 190w beats the current high end cards by a fair margin? I mean 190w!!! That's fantastic! (if true)
    Fantastic? Hm, I would say it's normal

  21. #1246
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    While competing with Tahiti using a smaller chip sounds like a great improvement for Nvidia, the same can't be said about GK110 launch schedule, so I guess it balances out. Most 580 owners probably won't "upgrade" for Tahiti level performance even if GK104 were a lot more efficient. It's not surprising though that Nvidia will follow AMD in slapping their performance GPU with an enthusiast price tag.
    "No, you'll warrant no villain's exposition from me."

  22. #1247
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    420
    Quote Originally Posted by DilTech View Post
    A few months behind a competitor isn't late; 6+ months is late. We've discussed that earlier.

    GK104 is only being looked at as high end because Nvidia found themselves in a unique position where their midrange GPU was strong enough to beat their opposition's high end. As such, they can sit and reap major profits and force AMD to show their next card, which will have to deal with GK110.
    That's my reasoning as well. Until we see actual GK104 benchmarks, though, it's hard to be certain.

    The problem this presents for some people is that if GK104 is only a smidgen ahead of the 7970, we wind up with two overpriced and unimpressive cards, because AMD doesn't have anything ready to regain the top spot (dual-GPU monstrosities are irrelevant in this case). So Nvidia rides out GK104 until AMD can release an 8000-series card, at which point they finally dump the flagship GK110/112/100/whatever it's called.

    Quote Originally Posted by Pantsu View Post
    While competing with Tahiti using a smaller chip sounds like a great improvement for Nvidia, the same can't be said about GK110 launch schedule, so I guess it balances out. Most 580 owners probably won't "upgrade" for Tahiti level performance even if GK104 were a lot more efficient. It's not surprising though that Nvidia will follow AMD in slapping their performance GPU with an enthusiast price tag.
    And that's what I don't understand. The GTX 580 has been out since November of 2010. Plenty of 580 owners are ready to upgrade (myself included), but I'm not going to do it for a little gnat-fart GK104 when everyone knows the big leap will be GK100/110. GK104 is seeming more and more like a stop-gap card to keep Nvidia from hemorrhaging any more customers to AMD.
    Last edited by kgk; 03-13-2012 at 12:44 AM.
    Bill Cosby: Stewie, what do you think candy is made out of?
    Stewie Griffin: Sunshine and farts! What the hell kind of question is that?!

  23. #1248
    Xtreme Member
    Join Date
    Jul 2010
    Location
    california
    Posts
    150
    Quote Originally Posted by kgk View Post
    If that's true then Nvidia is adorable. I have no problem hanging onto a GTX580 3GB until they release GK110 or GK112 (whichever model number is the high-end single GPU). If the 7970 were a better value (or anything remotely approaching value) I'd have jumped on it already.

    Unless GK104 gives significantly better performance than an overclocked GTX580 3GB @ 2560x1600, Nvidia can bugger off. And if they're serious with 2GB VRAM then that'll mean a few games will either be pushing or maxing out its VRAM already, let alone any games that haven't even been released yet. I mean 2GB VRAM on a "top end" card.........in 2012? I sure hope not.
    I'm with you. Recall that the S3 ViRGE had 2MB VRAM about 15 years ago. Now we have 2GB VRAM on mainstream cards. That's a 1024-fold increase, strikingly consistent with Moore's Law (a doubling every 18 months)!

    For an enthusiast dumping serious money into high-end graphics cards (especially for SLI to max out graphics, including AA), it doesn't make sense to run quad-SLI with only 2GB of VRAM in 2012, not even at 1920x1080.
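    The arithmetic behind that Moore's Law claim checks out, since 1024 is exactly ten doublings:

    ```python
    import math

    # Sanity check on the VRAM growth claim: 2MB -> 2GB over ~15 years.
    old_vram_mb = 2
    new_vram_mb = 2048  # 2GB

    growth = new_vram_mb / old_vram_mb  # 1024x
    doublings = math.log2(growth)       # 10 doublings
    years = doublings * 18 / 12         # at one doubling per 18 months

    print(f"{growth:.0f}x growth = {doublings:.0f} doublings = {years:.0f} years")
    # -> 1024x growth = 10 doublings = 15 years, matching the post
    ```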
    Last edited by minpayne; 03-13-2012 at 12:48 AM.
    This guy is xtremely lazy

  24. #1249
    Xtreme Member
    Join Date
    Nov 2010
    Location
    Valencia, España
    Posts
    146

  25. #1250
    Xtreme Member
    Join Date
    Jan 2008
    Posts
    299
    I guess 50% better than 580 in 3dmark11 isn't to be sniffed at.
    Last edited by spicypixel; 03-13-2012 at 01:40 AM.

