
Thread: Nvidia unveils the GeForce GTX 780 Ti

  1. #76
    Xtreme Member
    Join Date
    Apr 2010
    Posts
    145
    Quote Originally Posted by tajoh111 View Post
    So now let's look at GK110 to GK180 again for the professional class of cards.

    732 MHz --> 900 MHz
    2688 shaders --> 2880 shaders
    235 W TDP --> 225 W TDP

    Now let's fill in the blanks for Titan to GTX 780 Ti:
    876 MHz --> ????
    2688 shaders --> 2880 shaders
    250 W TDP --> ???
    Doesn't the TITAN have an 837 MHz core clock? Even so, we get 1029 MHz core [5.93 TFLOPS] and 239 W TDP [24.8 GFLOPS/W] if the increase is proportional (probably won't be the case), and 933 MHz core [5.4 TFLOPS] and 245 W TDP [22.0 GFLOPS/W] if the increase is half that of the K20X to K6000 transition.

    We can also look at memory speed. The M2070 has 3.132 Gbps memory and the 6000 has 3 Gbps memory, so somewhat close.

    K20X to K6000: 5.2 Gbps to 6.0 Gbps

    TITAN to 780 Ti (taking into account the difference between the M2070 and 6000): 6.0 Gbps to 7.2 Gbps [347 GB/s] (yeah right) proportional and 6.6 Gbps [317 GB/s] with half the increase.
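    Spelled out, the arithmetic above can be sketched as a quick script. It assumes 2 single-precision FLOPs per shader per clock (FMA) and Titan's 384-bit memory bus; the clocks fed in are the speculative numbers from this post, not confirmed specs.

```python
# Sketch of the single-precision throughput and bandwidth arithmetic above.
# Assumes 2 FLOPs per shader per clock (FMA) and a 384-bit memory bus.

def sp_tflops(shaders, core_mhz):
    """Peak single-precision TFLOPS: shaders * 2 FLOPs/clock * clock."""
    return shaders * 2 * core_mhz * 1e6 / 1e12

def bandwidth_gbs(mem_gbps, bus_bits=384):
    """Memory bandwidth in GB/s: per-pin data rate * bus width / 8 bits."""
    return mem_gbps * bus_bits / 8

print(sp_tflops(2880, 1029))   # ~5.93 TFLOPS (proportional-increase case)
print(sp_tflops(2880, 933))    # ~5.37 TFLOPS (half-increase case)
print(bandwidth_gbs(7.2))      # ~345.6 GB/s
print(bandwidth_gbs(6.6))      # ~316.8 GB/s
```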
    Last edited by iMacmatician; 10-22-2013 at 03:15 PM.
    Quote Originally Posted by defect9 View Post
    Will the 9000 series will be named Pen Island?
    Quote Originally Posted by eXa View Post
    GTX 650 Ti Ghz edition?

  2. #77
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by iMacmatician View Post
    Doesn't the TITAN have an 837 MHz core clock? Even so, we get 1029 MHz core [5.93 TFLOPS] and 239 W TDP [24.8 GFLOPS/W] if the increase is proportional (probably won't be the case), and 933 MHz core [5.4 TFLOPS] and 245 W TDP [22.0 GFLOPS/W] if the increase is half that of the K20X to K6000 transition.

    We can also look at memory speed. The M2070 has 3.132 Gbps memory and the 6000 has 3 Gbps memory, so somewhat close.

    K20X to K6000: 5.2 Gbps to 6.0 Gbps

    TITAN to 780 Ti (taking into account the difference between the M2070 and 6000): 6.0 Gbps to 7.2 Gbps [347 GB/s] (yeah right) proportional and 6.6 Gbps [317 GB/s] with half the increase.
    It has a core clock of 837 MHz but never actually runs at that speed; in gaming scenarios it's typically above 900 MHz. The 732 MHz clock of the K20X, on the other hand, is constant. So in practice there is an even larger difference between Titan and the K20X.

    There's also one more thing that's going to result in power savings for the GTX 780 Ti: versus the K6000, it only has 3 GB of memory to address rather than 12 GB. That is likely to be balanced out by using 7 GHz Samsung chips rather than 6 GHz Hynix chips (if they choose to use different memory).

    As I said earlier, I am not assuming the best-case scenario where the increase is proportional, but even with just a little bit, say half the increase from the K20X to the K6000, we still get a very impressive improvement for a respin of GK110. But this depends on a couple of things: whether they actually use GK180, and the bin-to-bin variability of GK180.

    Knowing Nvidia as of late, though, they are going to clock it low so they can brag about the fantastic performance per watt. But board partners can pick up the slack here. With cards like a Lightning or Classified GTX 780 Ti, I would expect 1000-1050 MHz clocks, provided the process improvements let the GTX 780 Ti clock at least as well as GTX 780s do. That's a pretty low bar, and I would expect them to at least match it.
    Last edited by tajoh111; 10-22-2013 at 03:46 PM.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  3. #78
    Xtreme Member
    Join Date
    Apr 2010
    Posts
    145
    Quote Originally Posted by tajoh111 View Post
    But knowing Nvidia as of late, they are going to clock it low, so they can brag about the fantastic performance per watt.
    I'm half-expecting the 780 Ti to have "only" a 225 W TDP.
    Quote Originally Posted by defect9 View Post
    Will the 9000 series will be named Pen Island?
    Quote Originally Posted by eXa View Post
    GTX 650 Ti Ghz edition?

  4. #79
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    Ok, so where are the facts confirming that the 780 Ti is going to be faster than the Titan? Is there any other evidence out there besides Oj101 saying so? (no offense)

    Strange how it has suddenly become fact in this thread... let me know if I missed something.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  5. #80
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by Dimitriman View Post
    Ok, so where are the facts confirming that the 780 Ti is going to be faster than the Titan? Is there any other evidence out there besides Oj101 saying so? (no offense)

    Strange how it has suddenly become fact in this thread... let me know if I missed something.
    Jen-Hsun Huang did say it was the fastest graphics card they've ever built when he unveiled it. So I guess you missed that.
    Last edited by tajoh111; 10-22-2013 at 07:05 PM.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  6. #81
    Xtreme Member KiSUAN's Avatar
    Join Date
    Oct 2007
    Location
    Banana Republic
    Posts
    133
    Quote Originally Posted by tajoh111 View Post
    Jen-Hsun Huang did say it was the fastest graphics card they've ever built when he unveiled it. So I guess you missed that.
    And the prophet said:

    Take some screws, some glue and make the most powerful video cardzzzz...

    [Attached image: 20091012nvidia11.jpg]

    All the fanboyz hail the prophet


    Guess he didn't miss anything

  7. #82
    Xtreme Addict
    Join Date
    Mar 2005
    Location
    Rotterdam
    Posts
    1,553
    That kinda sucks if you're a Titan owner then. You buy a card last month for 1000 bucks, and now someone else buys something faster for 650. If this mystery card really is faster than Titan, I find it hard to believe it will sell for only 650.
    Gigabyte Z77X-UD5H
    G-Skill Ripjaws X 16Gb - 2133Mhz
    Thermalright Ultra-120 eXtreme
    i7 2600k @ 4.4Ghz
    Sapphire 7970 OC 1.2Ghz
    Mushkin Chronos Deluxe 128Gb

  8. #83
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by Dimitriman View Post
    That kinda sucks if you're a Titan owner then. You buy a card last month for 1000 bucks, and now someone else buys something faster for 650. If this mystery card really is faster than Titan, I find it hard to believe it will sell for only 650.
    It really does. Titan's $1000 pricing was ridiculous, and I personally hoped it would be a sales failure, because people buying a $1000 graphics card sends a bad message to Nvidia: that people will buy at $1000 if no competition is present. I would hope people looking for that kind of performance after the GTX 780 released just bought an overclocked GTX 780 rather than a Titan.

    Nvidia was really conservative when they released Titan and the GTX 780.

    The GTX 780 had 20% of its shaders disabled, which is ridiculous for a $650 part. It's not even close to a full chip, yet it's priced as one. It's even more cut down than the GTX 470 was, and that was pretty cut down.

    For a chip the size of GK110, it's easy to make a card faster than Titan when you think about it. Titan is clocked like a lame duck, and as a result there are GTX 780s that are faster than it, even with the big shader disadvantage.

    Cards like the GTX 780 ACX are 5% faster than Titan at high resolutions.

    http://www.techpowerup.com/reviews/E...Cooler/26.html

    All at the slightly-less-than-mythical price of $660.

    As long as Nvidia uses GK110 with a decent number of shaders enabled and doesn't clock it as low as Titan, they should have a part that is easily faster.

    And at this point, what do you think is smarter?

    Continue to charge $1000 when your competitor has something competitive at $650, so that neither your $1000 product nor your current $650 product sells at all?

    Or release something faster in the $650 range to take on that competitor's product, and get sales?

    The GTX 580 was released prior to the 6970 and was 14% faster than a GTX 480; it was faster, yet the cost was the same. Sure, it might have annoyed GTX 480 owners that their card was being replaced with one at the same price that was faster and better everywhere (noise, heat and overclocking). But Nvidia released it anyway, 8 or 9 months later (the same gap as this launch).

    Personally, I still think the GTX 780 Ti will have fewer than 2880 shaders. There's a good chance it won't have 7 GHz memory either, but what it should be is faster than Titan. There are too many OC versions of the GTX 780 that are already faster than Titan at $650. Releasing something slower than Titan at $650 would be a regression from the current GTX 780 OC lineup; it would have no impact and wouldn't be a counter.

    Well, at least those Titan owners still have 6 GB of RAM and non-crippled double precision (which is the only reason they should have bought it in the first place).
    Last edited by tajoh111; 10-22-2013 at 08:18 PM.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  9. #84
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by tajoh111 View Post
    If we examine the specs of the GK110-based K20X Tesla, its clock was only 732 MHz, yet it had a rated TDP of 235 watts. Pretty low clocks, but the GK110-based Titan still goes to 876 MHz (even more in practice).

    Look at the GK180-based K6000 and you have a card that is clocked at 900 MHz, has more shaders, and has a TDP 10 watts lower than the K20X. That's a massive improvement. Before you say they aren't comparable since Tesla is different from the workstation Quadro, let's look at the Fermi generation.

    The Tesla M2070 and Quadro 6000 are both based on the same GF100, and if you look at the clocks, they are exactly the same at 574 MHz; not only that, they have the same TDP at 225 watts.

    So now let's look at GK110 to GK180 again for the professional class of cards.

    732 MHz --> 900 MHz
    2688 shaders --> 2880 shaders
    235 W TDP --> 225 W TDP

    Now let's fill in the blanks for Titan to GTX 780 Ti:
    876 MHz --> ????
    2688 shaders --> 2880 shaders
    250 W TDP --> ???

    So if any of GK180's improvements carry over, even just a little bit (and they use GK180 instead of GK110), then there's room for monster-clocked cards, just as much as the GTX 780 if not more.

    The 900 MHz clocks and 225 W TDP on a fully enabled big Kepler are very impressive for a Quadro.
    One important question:
    Does Quadro have the full DP rate or not? AFAIK, DP is more power-intensive, hence the lower clocks on the Teslas, and on Titan once you enable DP. It was always my understanding that Quadro is a workstation card (geometry, graphics, etc.) and Tesla is for DP. That's the whole point of Maximus: put one Quadro and one Tesla together, each with its specialized tasks.

    Second question:
    Will a potential 15 SMX-GeForce get GK180 or GK110 and if there really is an improved energy efficiency, is it due to design or due to better binning/process improvements over time?

    Third question:
    TDP != power consumption. Or better: Is TDP (Quadro/Tesla) comparable to TDP (GeForce)?

    Under sustained gaming load, Titan and the GTX 780 often clock near the base clock due to the low temperature target:
    http://ht4u.net/reviews/2013/nvidia_...st/index10.php
    http://ht4u.net/reviews/2013/nvidia_...iew/index9.php

    http://www.hardware.fr/articles/887-...ost-tests.html
    http://www.hardware.fr/articles/894-...ost-tests.html

    https://www.computerbase.de/artikel/...-gtx-titan/19/
    https://www.computerbase.de/artikel/...80-im-test/11/

    Now with those lower clocks, power efficiency is much better already. The Titan uses about 206W on average, the 780 uses 189W (cards only, no efficiency losses at the power supply, direct measurement):
    http://www.3dcenter.org/artikel/eine...rafikkarten-st

    I simply don't see that much potential for improving energy efficiency between comparable operating points (either no boost vs no boost or full boost vs full boost). Voltage is key here. The difference between no boost and boost is a whopping 0.16V! With my Titan, I measure 50-70W difference between base clock@1.0V and 1006 MHz@1.162V (whole system), and this is in line with the measurements of other reviews that made this kind of investigation.
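    As a rough sanity check on those voltage numbers, here's a sketch using the common first-order model that dynamic power scales with f·V². The model itself is an approximation (leakage and static power are ignored), and the 837 MHz / 1.0 V and 1006 MHz / 1.162 V operating points are the ones measured on my card, not official specs.

```python
# First-order check of the boost-vs-base power gap, using the common
# approximation P_dynamic ~ f * V^2 (leakage and static power ignored).

def power_ratio(f1_mhz, v1, f2_mhz, v2):
    """Ratio of dynamic power at operating point 2 vs operating point 1."""
    return (f2_mhz / f1_mhz) * (v2 / v1) ** 2

# Titan: base clock 837 MHz @ 1.0 V vs full boost 1006 MHz @ 1.162 V
r = power_ratio(837, 1.0, 1006, 1.162)
print(round(r, 2))  # ~1.62x the dynamic power at full boost
```

    If roughly half of the ~206 W average card power is core dynamic power, a ~1.62x ratio puts the boost-vs-base delta in the same 50-70 W ballpark as the measurements above.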
    Last edited by boxleitnerb; 10-22-2013 at 09:36 PM.

  10. #85
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    If the 780 Ti > Titan and the 780 Ti is approximately $650, then this shows me how Nvidia panicked because of the 290X. I still don't totally get it (the console deals explain some of Nvidia's panic, but not all of the panic after the 290X), but of course they saw something I don't see, that I am not able to see.

    EDIT: Maybe they didn't expect the 290X to get this much attention, and didn't expect AMD to make a card with this much performance at this price.

    What makes me happy is that all my predictions from before Titan's release seem to be coming true. Of course there's still a way to go, but they are still on track. After today I will not call them predictions, I will call them forecasts.
    Last edited by kromosto; 10-22-2013 at 09:50 PM.


    When i'm being paid i always do my job through.

  11. #86
    Xtreme Addict
    Join Date
    Apr 2011
    Location
    North Queensland Australia
    Posts
    1,445
    Oh well, buyer's remorse for buying top-tier gear, I suppose.

    -PB
    -Project Sakura-
    Intel i7 860 @ 4.0Ghz, Asus Maximus III Formula, 8GB G-Skill Ripjaws X F3 (@ 1600Mhz), 2x GTX 295 Quad SLI
    2x 120GB OCZ Vertex 2 RAID 0, OCZ ZX 1000W, NZXT Phantom (Pink), Dell SX2210T Touch Screen, Windows 8.1 Pro

    Koolance RP-401X2 1.1 (w/ Swiftech MCP35X), XSPC EX420, XSPC X-Flow 240, DT Sniper, EK-FC 295s (w/ RAM Blocks), Enzotech M3F Mosfet+NB/SB

  12. #87
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by kromosto View Post
    if 780 ti > titan and 780 ti approx 650 then this shows me how nvidia paniced because of 290x. i still don't get it totally (console deals explain some panic of nvidia but not all the panic after 290x) but of course they saw something i don't see, i am not able to see.
    I wouldn't call it panic, more like a natural response.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  13. #88
    I am Xtreme
    Join Date
    Feb 2007
    Posts
    5,413
    Still installing dual Titans in my LAN rig. Four 780s already in my main one. Three extra 780s, perhaps, to upgrade my son's three 680s.
    "Thing is, I no longer consider you a member but, rather a parasite...one that should be expunged."

  14. #89
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Quote Originally Posted by LordEC911 View Post
    I wouldn't call it panic, more like a natural response.
    If they had done all this after the release of the 290X, yes, I would call it a natural response. But a hastily organized event before AMD's event, announcing a new card without even having decided the price and some other things... it all seems like panic to me.


    When i'm being paid i always do my job through.

  15. #90
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by boxleitnerb View Post
    One important question:
    Does Quadro have full DP rate or not? Afaik, DP is more power intensive, hence the lower clocks on the Teslas and on Titan once you enable DP. It was always my understanding that Quadro is a workstation card. Geometry, graphics etc. And Tesla is for DP. That's the whole sense of Maximus: Put one Quadro and one Tesla together, each with their specialized tasks.

    Second question:
    Will a potential 15 SMX-GeForce get GK180 or GK110 and if there really is an improved energy efficiency, is it due to design or due to better binning/process improvements over time?

    Third question:
    TDP != power consumption. Or better: Is TDP (Quadro/Tesla) comparable to TDP (GeForce)?

    Under sustained gaming load, Titan and the GTX 780 often clock near the base clock due to the low temperature target:
    http://ht4u.net/reviews/2013/nvidia_...st/index10.php
    http://ht4u.net/reviews/2013/nvidia_...iew/index9.php

    http://www.hardware.fr/articles/887-...ost-tests.html
    http://www.hardware.fr/articles/894-...ost-tests.html

    https://www.computerbase.de/artikel/...-gtx-titan/19/
    https://www.computerbase.de/artikel/...80-im-test/11/

    Now with those lower clocks, power efficiency is much better already. The Titan uses about 206W on average, the 780 uses 189W (cards only, no efficiency losses at the power supply, direct measurement):
    http://www.3dcenter.org/artikel/eine...rafikkarten-st

    I simply don't see that much potential for improving energy efficiency between comparable operating points (either no boost vs no boost or full boost vs full boost). Voltage is key here. The difference between no boost and boost is a whopping 0.16V! With my Titan, I measure 50-70W difference between base clock@1.0V and 1006 MHz@1.162V (whole system), and this is in line with the measurements of other reviews that made this kind of investigation.
    The Quadro has uncrippled double precision like the K20X; it was one of the main bullet points for it. FP64 runs at 1/3 rate like the rest of the Kepler Tesla lineup, not 1/24 like the gaming cards.

    http://www.anandtech.com/show/7166/n...s-quadro-k6000

    That is what makes the frequency and TDP so impressive. Although TDP doesn't exactly mean power consumption, there is a correlation, and heat (which is what TDP measures) was one of the things holding back clocks. So improved TDP = improved clocks.

    No one knows the exact specs of the GTX 780 Ti, just that it's going to be faster.

    But there is one thing that should tell us whether it is based on GK110 or GK180: the number of enabled shaders. My impression was that on GK110 it was difficult, to say the least, to get 2880 working shaders. So if the GTX 780 Ti has 2880 shaders and is to be mass-producible, it has to be made on a respin; otherwise they would run out of chips too quickly.

    This is akin to GF110 and GF100. GF100 couldn't get a fully enabled 512-core part to the public; GF110, the respin with some optimizations, could. Not only did the respin show improved thermal characteristics, it also allowed 10% higher clocks while doing it. A 10% increase in clocks and 9% on the memory, while decreasing power consumption by 15%, was huge. I am not expecting that kind of improvement, as GF100 was crap in the first place, but if Nvidia is willing to push GK180 to a level similar to, let's say, a 7970 GHz Edition, combined with the improved efficiency, it would have a product just a touch slower than a GTX 690. That is a product 15 percent faster than Titan (10-15% more clocks, 7 percent more cores, 17% more bandwidth) while not using any more power. But the only way to achieve all this is if they use GK180.

    Cutting out the DP is bound to save some power and allow higher clocks. Just look at the GF100-based Quadro 6000 and the GTX 480: the Quadro 6000 only had clocks of 574 MHz while the GTX 480 was clocked at 700 MHz. The same thing sort of repeated itself, as I mentioned earlier, with the K20X and GTX Titan.

    I am not saying it's going to be a 2880-shader part or even a 2688-shader part, by the way. I have no clue about the specs besides it being faster. If we get a leak that shows one of the aforementioned figures, it's GK180-based. If it's 2496, or in the worst case 2304, shaders, then it's GK110.
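    For what it's worth, the back-of-envelope speedup can be checked by treating the gains as multiplicative, which is only valid for a perfectly compute-bound workload; real scaling is sublinear, which is why the naive product lands a bit above the 15% figure.

```python
# Naive compute-bound speedup estimate for a hypothetical full GK180 part
# vs Titan: shader gain times clock gain. Real-world scaling is sublinear,
# so treat these numbers as upper bounds.

shader_gain = 2880 / 2688                # "7 percent more cores" (~1.071x)
for clock_gain in (1.10, 1.15):          # "10-15% more clocks"
    est_speedup = shader_gain * clock_gain
    print(f"+{(clock_gain - 1) * 100:.0f}% clocks -> "
          f"~{(est_speedup - 1) * 100:.0f}% faster than Titan")
```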
    Last edited by tajoh111; 10-22-2013 at 11:12 PM.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  16. #91
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    How would disabling DP save power? The DP units are dedicated; they are dormant under gaming loads, aren't they?

    I heard that GK110 does indeed have some problems with all 15 SMX enabled, due to the cache system. My source was very vague on this, so I don't know the specifics.

    The 780 Ti will have 2688 shaders at least; that much is certain. The only questions left are whether it has even more, and where the clocks end up.

    Btw, the 902 MHz of the K6000 is only the boost clock; the base clock is 797 MHz:
    NVIDIA Quadro K6000 Graphics Card based on the GK180 GPU
    Core Count: 2880
    Base Clock: 797 MHz
    Boost Clock: 902 MHz

    The Quadro K6000 is the first NVIDIA Professional solution to incorporate Quadro Boost, a mechanism that automatically maximizes application performance while staying within the specified power envelope. For workloads that do not reach the allowed power level, the GPU clock is automatically increased to "Boost Clock" in order to leverage the remaining power budget for additional increased performance. The GPU will always try to reach the higher "Boost Clock" in order to maximize application performance.
    The Quadro Boost feature will disable "Boost Clocking" in the following scenarios:
    If clock jitter sensitive "Workstation features" like Sync, SDI, or SLI is enabled, the Quadro K6000 will clock to Base Clock automatically.
    If the user manually selects the "Prefer Consistent Performance" Control Panel option to lock explicitly to Base Clock (can be useful for performance tuning of applications during development).
    http://h18000.www1.hp.com/products/q...a/14708_na.PDF

    So (K20X -> K40/K6000):
    732 -> 797+
    2688 -> 2880

    I wouldn't expect more than 900 MHz base clock for the 780 Ti with 2880 shaders.
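    The boost policy in that HP document boils down to a simple decision rule. Here's a hypothetical sketch of it; the function and parameter names are mine for illustration, not an actual NVIDIA API.

```python
# Hypothetical sketch of the Quadro Boost clock selection described in the
# HP document above; names are illustrative, not an actual NVIDIA API.

BASE_CLOCK_MHZ = 797
BOOST_CLOCK_MHZ = 902

def target_clock(sync_sdi_or_sli_enabled, prefer_consistent_performance,
                 within_power_envelope):
    """Return the clock the K6000 would aim for under Quadro Boost rules."""
    if sync_sdi_or_sli_enabled or prefer_consistent_performance:
        return BASE_CLOCK_MHZ          # boost disabled, lock to base clock
    if within_power_envelope:
        return BOOST_CLOCK_MHZ         # power headroom available: boost
    return BASE_CLOCK_MHZ              # at the power limit: fall back

print(target_clock(False, False, True))   # 902
print(target_clock(True, False, True))    # 797
```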
    Last edited by boxleitnerb; 10-23-2013 at 12:15 AM.

  17. #92
    Xtreme Member
    Join Date
    Oct 2012
    Posts
    338
    The 780 Ti has slightly higher performance than Titan, less memory, and the price will be a little higher than the GTX 780's. In detail, the 780 Ti IS a Titan with 3 GB and higher clocks, BUT this puppy will be available in custom versions! What about a (780 Ti) Titan Lightning?

  18. #93
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    So this time around absolutely no fully unlocked version with 2880 cores for GeForce?

  19. #94
    Xtreme Member
    Join Date
    Oct 2012
    Posts
    338
    We can still hope for that. I smell something with a BLACK cooler here

  20. #95
    NooB MOD
    Join Date
    Jan 2006
    Location
    South Africa
    Posts
    5,799
    And I smell some 6 GB non-reference cards, but they won't be the cheapest.
    Xtreme SUPERCOMPUTER
    Nov 1 - Nov 8 Join Now!


    Quote Originally Posted by Jowy Atreides View Post
    Intel is about to get athlon'd
    Athlon64 3700+ KACAE 0605APAW @ 3455MHz 314x11 1.92v/Vapochill || Core 2 Duo E8500 Q807 @ 6060MHz 638x9.5 1.95v LN2 @ -120'c || Athlon64 FX-55 CABCE 0516WPMW @ 3916MHz 261x15 1.802v/LN2 @ -40c || DFI LP UT CFX3200-DR || DFI LP UT NF4 SLI-DR || DFI LP UT NF4 Ultra D || Sapphire X1950XT || 2x256MB Kingston HyperX BH-5 @ 290MHz 2-2-2-5 3.94v || 2x256MB G.Skill TCCD @ 350MHz 3-4-4-8 3.1v || 2x256MB Kingston HyperX BH-5 @ 294MHz 2-2-2-5 3.94v

  21. #96
    NooB MOD
    Join Date
    Jan 2006
    Location
    South Africa
    Posts
    5,799
    I might also be smelling 6 GB reference cards, hmmm smells good
    Xtreme SUPERCOMPUTER
    Nov 1 - Nov 8 Join Now!


    Quote Originally Posted by Jowy Atreides View Post
    Intel is about to get athlon'd
    Athlon64 3700+ KACAE 0605APAW @ 3455MHz 314x11 1.92v/Vapochill || Core 2 Duo E8500 Q807 @ 6060MHz 638x9.5 1.95v LN2 @ -120'c || Athlon64 FX-55 CABCE 0516WPMW @ 3916MHz 261x15 1.802v/LN2 @ -40c || DFI LP UT CFX3200-DR || DFI LP UT NF4 SLI-DR || DFI LP UT NF4 Ultra D || Sapphire X1950XT || 2x256MB Kingston HyperX BH-5 @ 290MHz 2-2-2-5 3.94v || 2x256MB G.Skill TCCD @ 350MHz 3-4-4-8 3.1v || 2x256MB Kingston HyperX BH-5 @ 294MHz 2-2-2-5 3.94v

  22. #97
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Sydney , Australia
    Posts
    1,600
    Quote Originally Posted by PedantOne View Post
    The 780 Ti has slightly higher performance than Titan, less memory, and the price will be a little higher than the GTX 780's. In detail, the 780 Ti IS a Titan with 3 GB and higher clocks, BUT this puppy will be available in custom versions! What about a (780 Ti) Titan Lightning?
    I'm so annoyed at having bought a GTX 780 Lightning at launch now; only two weeks old and it's outdated already... Watch the pricing on the 780 Ti Lightning: if it's faster than Titan, it will probably cost more too...

    Bencher/Gamer(1) 4930K - Asus R4E - 2x R9 290x - G.skill Pi 2200c7 or Team 2400LV 4x4GB - EK Supreme HF - SR1-420 - Qnix 2560x1440
    Netbox AMD 5600K - Gigabyte mitx - Aten DVI/USB/120Hz KVM
    PB 1xTitan=16453(3D11), 1xGTX680=13343(3D11), 1x GTX580=8733(3D11)38000(3D06) 1x7970=12059(3D11)40000(vantage)395k(AM3) Folding for team 24

    AUSTRALIAN DRAG RACING http://www.youtube.com/watch?v=OFsbfEIy3Yw

  23. #98
    Xtreme Member
    Join Date
    Oct 2012
    Posts
    338
    I think the 780 Ti Lightning will not be here that soon; in two or three months, maybe

  24. #99
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by LordEC911 View Post
    I wouldn't call it panic, more like a natural response.
    I'm even sure they had many solutions available well before the last few weeks. In general, AMD and Nvidia test different configurations (or simulate them) just to choose which SKUs to offer, so they know roughly what they could get from each variant; they then decide based on TDP, clock speed, temperature, noise, binning, availability and of course price.

    They could have already launched a 780 with the same SMX count as the Titan, with 3 GB.
    Last edited by Lanek; 10-23-2013 at 02:38 AM.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  25. #100
    NooB MOD
    Join Date
    Jan 2006
    Location
    South Africa
    Posts
    5,799
    Yup, but it IS coming.
    Xtreme SUPERCOMPUTER
    Nov 1 - Nov 8 Join Now!


    Quote Originally Posted by Jowy Atreides View Post
    Intel is about to get athlon'd
    Athlon64 3700+ KACAE 0605APAW @ 3455MHz 314x11 1.92v/Vapochill || Core 2 Duo E8500 Q807 @ 6060MHz 638x9.5 1.95v LN2 @ -120'c || Athlon64 FX-55 CABCE 0516WPMW @ 3916MHz 261x15 1.802v/LN2 @ -40c || DFI LP UT CFX3200-DR || DFI LP UT NF4 SLI-DR || DFI LP UT NF4 Ultra D || Sapphire X1950XT || 2x256MB Kingston HyperX BH-5 @ 290MHz 2-2-2-5 3.94v || 2x256MB G.Skill TCCD @ 350MHz 3-4-4-8 3.1v || 2x256MB Kingston HyperX BH-5 @ 294MHz 2-2-2-5 3.94v

