You don't jump from 12M/mm2 to 14M+/mm2 at the high end when you can't even hit 12M/mm2 on your smaller chips.
If the transistor count really is 7.08B, then they are looking at a best case of ~540mm2 and a worst case of ~585mm2.
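For reference, a minimal back-of-the-envelope sketch (Python) of where those die-size figures come from; the 7.08B transistor count is from the post above, while the ~12.1-13.1M/mm2 densities are assumptions chosen only to reproduce the quoted range.
Code:
# Rough die-area estimate from transistor count and density.
transistors = 7.08e9  # claimed GK110 transistor count

for label, density_per_mm2 in [("best case", 13.1e6), ("worst case", 12.1e6)]:
    area_mm2 = transistors / density_per_mm2
    print(f"{label}: ~{area_mm2:.0f} mm^2")

# best case: ~540 mm^2, worst case: ~585 mm^2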
It doesn't matter. Going from 80 to 120 is a 50% improvement, or going from 120 to 80 is a 33% drop, depending on which number you take as the baseline. But whether you are talking in percent, mph, liters or whatever, the math is the same.
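A quick worked example of that baseline asymmetry (a minimal Python sketch, nothing GPU-specific):
Code:
old, new = 80, 120
print(f"{(new - old) / old:+.0%}")  # +50%, measured against 80
print(f"{(old - new) / new:+.0%}")  # -33%, measured against 120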
---
ON TOPIC: I still don't understand how it is possible that a chip not designed for rendering can perform very close to a GTX 690 while having lower power consumption, on the same node. Unless the benchmarks used are cherry-picked, it seems veeeery weird.
Regarding large dies and heatspreaders:
Guys, this may be a silly question and I'm sure there is a logical answer for it, but what stops GPU manufacturers from putting multiple cores on a single die, CPU-style? Surely now, looking at how small and powerful the 680 is, this would be a possibility, right?
Counting down the days until Titan is released....
:D
~585mm2 would pretty much be a 1:1 scale-up of GK104 at the same average transistor density; I really don't think that will be the case.
If the chip has a lot of cache for GPU compute, that will push the transistor count up fast, but cache can be laid out much more densely than logic.
Either way, Nvidia is not new to making large dies; if anybody can pull off engineering an insanely large 7B-transistor die for retail consumption, it's Nvidia.
In some sense, that's essentially what CUDA cores are. :p:
Source
Quote:
Originally Posted by fellix@B3D
Yeah, even a DisplayPort would work, BUT I use an Apple Cinema Display with my GTX 590, which only has a male mini DisplayPort on it, so I need a male DisplayPort to female mini DisplayPort adapter... which I can't find ANYWHERE.
So female mini DisplayPort to male DisplayPort...
Somebody please tell me what the hell this card is supposed to be. I haven't been up to date for the last 5 or 6 months, did not read all the pages, and am now unsure about its place/rank/...
So... is this "TITAN" the flagship single-GPU of the next generation (780GTX?), with lower spec cards following? Or is this just THE Kepler, that was replaced by GK104 at launch?
What confuses me the most is the price ($900?) and the delay of AMD's next-gen cards.
Thanks in advance.
Cheers
It's just the big Kepler that was supposed to come out back when the GTX 680 was originally released. No new generation is coming any time soon, though this will be able to compete with that too. Most likely, though, this will stay a curiosity like the 8800 Ultra back in the day, mostly for PR and some xtreme enthusiasts, and it will never actually drop to reasonable prices.
The estimate is 520mm2, not 502mm2... as said, 14.1M transistors/mm2 would be really surprising on 28nm... and the GK110 that will be used on Titan was already released 6 months ago with the Tesla K20...
If you believe GK110 = GK100 you are really far from reality. The power of marketing.
I don't think GK100 could have been released at GK110 speeds, but I wouldn't doubt that Nvidia could have released a cut-down, lower-clocked part that was 25% faster than a GTX 680. However, it would only have served to hurt Nvidia financially, as such a card would have pushed down the price of the GTX 680 (which couldn't have kept that name anymore), and the very low volumes and high price of GK100 would hardly have offset the difference. 28nm seems to be less troublesome for both companies this generation and I can imagine yields are better (if not the supply of wafers). Financially, however, 28nm isn't as good for Nvidia as 40nm was, since they now pay per wafer rather than per good die, so monoliths make less sense for mass production. It's best to wait for yields to improve, I imagine, and sell the early stuff at as high a margin as possible in the professional market.
When Nvidia turned what was very likely a GTX 660 Ti into the GTX 680, while turning GK110 into GeForce Titan, they made a very intelligent marketing move. By bumping up an efficient gaming card that doesn't consume massive amounts of power at high voltage, Nvidia made the GTX 680 look like a better card than it really was for $500. With the absolutely nutty pricing of the 7970, it really looked like Nvidia took home every metric of power, performance and price.
Nvidia took advantage of AMD's idiocy (low clocks, high price), which made it easy to make the GTX 680 look good.
The GTX 680 isn't really better than a 7970; AMD's blundering marketing team simply made it possible for an overclocked GTX 660 Ti to look like a darling flagship card.
Who gives a fluke about GK100 ? GK110 was taped out before GK104 was on the street.
Bubble> Launch is in six days, yes six, not five. This is the most powerful VGA card in history, only a few percent under the dual-chip GTX 690. Without the microstuttering, game profiles and noisy cooler, but at the dual-chip price. The price is the same as the GTX 690. The card is really beautiful, looking really great. The cooler is made with great materials, like the GTX 690's.
OBR removed the results from your website, reason?
Yes, I really hope this isn't just a rumor and it actually ends up coming out.
Because I just sold my 680 lightning for $350...
NEO- I would have bought your Lightning for that .... used ones are still $500+ here
Let's hope for miracles...
I'm sorry, I suspected this was the original big Kepler as well; what's the difference between GK100 and GK110? (Excuse the n00b question... I just bench them...)
:D
There is/was no GK100, at least not in any conceivable form. Drawing boards maybe.
LOL The "680 is really a 660!" debate will never end. As far as I know, CPU and GPU manufacturers always have multiple designs on the drawing board, and decisions on what to sell are governed by what can be sold most profitably. So if GK100 or whatever couldn't be sold 11 months ago as a product line, what turned out to be the 680 was sold instead. Similarly, AMD probably intended much higher clocks for the 7970, or perhaps even hoped to have the 8000 chip done by then, but we got the 7970 as it was because that's what they could produce at that point in time.
Everyone seems to forget these are for profit companies whose primary concern is to make money for shareholders, not give you cards that you can overclock the highest, or cards that perform the highest if doing so comes at a loss on current process.
There were rumors NV would put out a mid-range chip that offered 90% of the performance of the 7970 before the 680 launched, and that a bigger chip would come in the fall. At the time I said, "If they price it $100 less it will clean up based on branding and feature set." Imagine their delight when it turned out to deliver 110% of the 7970.
Corell> Everything in that article has been known for many days. Titan will not beat the GTX 690, but it's a single GPU up against a bloody problematic dual-GPU card.
nVidia has always used the code name GK110, which actually had its first tape-out in December 2011 and was completely functional in May 2012.
Will a DUAL GK110 be possible?
Imagine a single-GPU GK110 accompanied by a dual-GPU GK110; that would be something very interesting. :D
Dual GK110 on a single card with a waterblock + radiator + pump included would be quite nice. All for $500... please? :D
ARES II is Water Cooled too, why not dual GK110
At the price they are selling it ($1500) they should do a dual GK110... otherwise it's way overpriced.
Quote:
ARES II is Water Cooled too, why not dual GK110
1000€... :mad::(
http://framebuffer.com.br/forum/attachment.php?aid=2037
Link
IMO that is just the launch price; in 2 weeks it will settle.
http://framebuffer.com.br/forum/attachment.php?aid=2038
http://www.2compute.net/Asus/GTXTITAN-6GD5.htm
That price is very discouraging; maybe if the price of the GTX 680 drops I'll do SLI instead...
The price of the GTX 680 will come down, I think to between $400-$440, soon.
Quote:
That price is very discouraging; maybe if the price of the GTX 680 drops I'll do SLI instead...
You can delete this post by going to Edit Post, then Delete Post, and deleting the post that way :)
http://www.abload.de/img/doggsyanx.jpg
Waterblocks, already ready for Titan.
Got a mail from EK Water Blocks.. a poll to vote on the design of their next waterblock.. for the Titan..
http://thinkcell.ekwb.com/idea/new-f...oose-your-best
Quote:
EK Water Blocks, Ljubljana based premium water cooling gear manufacturer, has opened a poll on EK-ThinkCell site where you can vote and choose your favourite design for the upcoming nVidia Geforce Titan Full Cover series water block as well as the redesign of the existing EK-Supremacy CPU water block.
(All my hopes of getting a PCB shot ruined.)
Quote:
Just for the record. The VGA card in the back is NOT Geforce Titan!
And the question is: why, then, didn't Nvidia release it around that time to completely dominate the market, given that AMD seemed unable to respond for close to two years in a row?
Is there some weird anti-monopoly law that would prevent one competitor from releasing something that far ahead of the competition? It seems senseless to me that Nvidia gave AMD the chance to breathe and even momentarily take the crown (driver update). What was the point of that?
I normally don't post but it really seems to me the vast majority of people here and around the net are overlooking the most basic detail with this card.
One way of dividing a company's products into two categories is: products that are made to be sold, and products that are sold because they were made. This may seem like the same thing, but it is very different; almost all GPUs made for consumers are in the first category, while Titan is a rare exception and falls into the second.
Titan GPUs are not made to be sold as Titans; they are K20Xs that don't quite make the cut. So what they do is take a K20X, disable one of the core clusters (going from 15 to 14), remove the ECC memory, and then use the BIOS to turn the DP units into integer units to increase graphics performance while killing their server usefulness in one blow.
So when you ask yourself why these didn't come out earlier: it was a function of how long it took to build a stockpile of these mostly-working but not-quite-up-to-par chips. The sale price is likewise set to get as much money out of them as possible without drawing too much ire from consumers for exploiting the demand.
So there it is: this GPU is not made in response to any consumer demand versus profit calculation, so stop thinking in those terms. The single-GPU performance crown and all the other stuff here is just a bonus, nothing more.
K20X has 14 clusters. And it is not really clear if the DP units are SP capable. Assuming there are dedicated DP units, that is. That much is not really clear.
I was asking myself the same thing a few days ago..
Anandtech's presentation of the K20X:
Quote:
Fundamentally GK110 is a highly enhanced if not equally specialized version of the Kepler architecture. The SMX, first introduced with GK104, is the basis of GK110. Each GK104 SMX contained 192 FP32 CUDA cores, 8 FP64 CUDA cores, 256KB of register file space, 64KB of L1 cache, 48KB of uniform cache. In turn it was fed by 4 warp schedulers, each with two dispatch units, allowing GK104 to issue instructions from warps in a superscalar manner.
GK110 builds on that by keeping the same general design, but tweaking it for GK110’s compute-focused needs. The single biggest change here is that rather than 8 FP64 CUDA cores GK110 has 64 FP64 CUDA cores, giving it 8 times the FP64 performance of a GK104 SMX. The SMXes are otherwise very similar at a high level, featuring the same 256KB of register file space, 64KB of L1 cache, 48KB of uniform cache, and the same warp scheduler structure. This of course does not include a number of low level changes that further set apart GK104 and GK110.
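To make the quoted SMX figures concrete, here is a minimal sketch of per-SMX peak throughput; the 1 GHz clock is a purely hypothetical round number for illustration, not an actual GK104/GK110 clock.
Code:
# Peak per-SMX throughput from the core counts quoted above (FMA = 2 FLOPs/clock).
clock_ghz = 1.0  # hypothetical round number, for illustration only

def smx_gflops(cores, clock_ghz=clock_ghz):
    return cores * 2 * clock_ghz

print(smx_gflops(192))  # GK104 SMX FP32: 384 GFLOPS
print(smx_gflops(8))    # GK104 SMX FP64:  16 GFLOPS
print(smx_gflops(64))   # GK110 SMX FP64: 128 GFLOPS
print(smx_gflops(64) / smx_gflops(8))  # 8.0 -> the "8 times the FP64 performance"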
Oops, delete post, I was mistaken.
The K20X's die has 15 clusters but only 14 are active; the K20 has 13, like I thought.
Details from the anandtech review of the K20, http://www.anandtech.com/show/6446/n...rrives-at-last .
Due to driver enhancements last year (specifically 12.11), AMD managed to outperform the GTX 680 with the HD 7970 by the end of 2012 (GCN latency issues aside), and it costs a full 20% less. (Link)
I don't think anti-monopoly laws come into play if no companies are even trying to compete with you. If NVIDIA had these cards ready to go last year, I think it could only have benefited them. (The only thing I can think of that they'd have to worry about is if they sold below cost to drive AMD from the market.)
Nvidia is not crazy.. this card was not ready for consumers, or a GK100 product would have existed. (When I spoke earlier about the difference between GK100 and GK110, I didn't mean that Nvidia never made a GK100, just that they never took GK100 beyond its early tape-out, which happened around August 2011, so it's impossible to know what the Kepler GK100 originally was.) They only got GK110 to tape-out in May (2 months after the GK104 release and 7 months after the initial GK100 test samples), and GK110 entered production at the end of summer. (That would be fast for consumer production, but not when you only have to ship dies to an HPC center, where yields aren't that important; what matters is that you can deliver them, which finally became possible for 14,000 chips 3 months later.) The Tesla K20 series was then officially launched in Q4 2012.
If Nvidia had wanted to launch GK100 around March or even earlier (January 2012), given that it takes about 6 months from tape-out to the start of final production, they would have had to keep paying TSMC for all those months to launch a GK100 that was in the end shelved in December. (That would have cost too much time, engineering and money only to drop it because GK104 was enough..)
It's not a question of whether they could have released it or not. What I said is that when people claim this is the Kepler that was expected to be the GTX 680, that is not true. GK100 was taped out in August 2011 and never released; GK110 is not the Kepler GK100. GK100 was taped out and then lost somewhere between August 2011 and December 2011. We all know how Nvidia works: GX100 = the first core, GX110 = the second one, based on the same architecture.
GK110 is really a revisited GK100 (certainly with more differences from GK100 than you can see in GK104: more FP64 cores (64 per SMX), no PolyMorph engines (no need for them in compute; they are replaced by pure DP units)).
If the GK110 used in August 2012 for the Titan supercomputer isn't even able to use all of the cores and SMXes available (one is disabled), it is really hard to imagine they could have had this chip ready in Q1 2012 as a consumer/gaming card, especially when it was taped out in May 2012 (so after GK104 had entered the market), and it is finally being released as a "Titan card" 8 months later, which so far doesn't look to be part of a 700 series.
If we look back at the Kepler rumors, some reported a GTX 680 using GK100/110; how close that information was to what we see now makes me believe that yes, GK110 was ready at that time.
For example this table: it falls only a little short of being 100% reliable. If we understand how things turned out this time, we can see that those results were real and those cards already existed back then, as prototypes of course...
http://tof.canardpc.com/view/685d64f...d96cc9dc9b.jpg
http://legitreviews.com/images/revie...Benchmarks.jpg
It's not so bad in the UK, once you convert a Newegg price to Sterling and add 20% VAT the prices are within £5-10 on most components. Sure you may have to shop around but component resellers don't have anywhere near the profit margins of high street retailers. I'm sure in Eastern Europe this observation isn't true, but for the UK at least if you make the effort the only real overhead is VAT.
@kersh: I think it's mainly related to the VAT included / not included difference. But you're also right, they also add a little extra for Europeans :'(
Any reliable source or just fantasy? Do you think Nvidia is so stupid that it would repeat the mistake it made with Fermi, i.e. release a large GPU on an unproven new process?
You are going on and on about a GK100, but you have no proof whatsoever. Just stop if you cannot back it up. Seriously, it gets so old.
Lanek:
GK100 didn't exist and won't exist.
You can believe whatever you want; the truth doesn't necessarily reflect what you think or even wish.
If you want to get it right, at least talk about GK204 (3072 SP), which should be the next iteration to juice some more $$$ for nV later on.
Yes, I believe I referred to that. But imagine if Titan had been out around July/August? No amount of driver updates would have saved AMD..
I have read quite a few opinions above on why Nvidia chose not to release it last year, but I'm still unsure which stand to reason and which don't...
Yep, $899 USD = $1199 AUD and $999 USD = $1299 AUD, even though our dollar is stronger than their dollar. And our stupid government wonders why Australians import so many goods :rolleyes:
There is no talking about when. Pretty sure it isn't coming at all.
GMxxx is Maxwell, GK204 is Kepler. By the time we have 3072 SP on a single gaming GPU we will be well into the Maxwell generation.
And I find it unlikely that there will be more GKxx0 professional GPUs released, either.
Maxwell taped out a few months ago, but NV is waiting for 20nm production.
GK110 has 15 SMX. A 16 SMX GPU would be a completely new design. Highly unlikely that we'll see that with Kepler since it really isn't necessary and would just cost money to design, tape out etc.
There are rumors that initial Maxwell parts will be 28nm too. Either they skip the traditional refresh and release Maxwell@28nm in Q4 2013/Q1 2014 or they release the Kepler refresh and release Maxwell@20nm somewhere in H2 2014.
Source? Which Maxwell GPU?
Key of your post... H2 2014 for Maxwell ;)
What do you think about the Maxwell@28nm rumors, then?
We'll get a Kepler refresh until then, right?
What I'm curious about is to see if this part really replaces GTX690 as rumoured.
As long as they're at different price/performance points, I don't see why it should. (And I have to wonder whether, with improved yields on the 114s and the smaller memory bus, 690s aren't cheaper to make.)
Am interested to see the reviews; with GTX 680 SLI in my main box I probably don't have a lot of reason to buy this card. But I do love to play with new video cards, the rarer the better. I'd have a 690 if the quick sellouts hadn't outlasted my lust for one. Was on auto-notify with EVGA and Newegg for weeks trying to buy one.
Given the limited numbers too (in addition to it not being performance-matched to the 690), I don't see Titan displacing any of the current 600 series. It's faster than a 680 but slower than a 690, so 690 and 680 SLI users have little incentive to buy one (unless they're desperate to run a single card and would take a performance hit to do it). SLI Titans would be badass, but the price is likely to be hideous, and unless you are playing modded Skyrim or Hitman etc., 2GB of RAM is enough... and 4GB 670s/680s are available much cheaper. Having a rare powerhouse of a card is the only real appeal, and given that no one knows just how good the 700 series will be... it's hard for me to drop $1200+ per card on something esoteric and 'fun to play with' when I'd need 2 to not be a downgrade...
How are these cards going to be anything more than $750-800 if a 690 goes for $1k?
Besides them just screwing us and then gouging on top of that, even two OC'd 670s could top a 690 for $200 less.
Guess we will see how it goes in a week or so
Nope.
This.
At some point next year.
This year we will see GK114 (or something similar). Rumours about ~1920SP make sense. And then, the next GPU iteration is going to be Maxwell based. :)
"Limited edition!" "Fastest single GPU!" "Faster GPU than in the upcoming GTX 780!" "6GB of VRAM, enough to last you for years!"
Lots of ways to market this, I'm sure you get the vibe...
For the record, with AMD's 8xxx now 7 months away, and nothing new from NV coming anytime soon either, I'll be grabbing two Titans. :)
I had a bad experience with 5970's and I don't think I'll be going with dual gpu cards ever again.
I think $799 would be a fair price, especially considering there is 6GB of memory on these cards, which in the past has added $100 to a card's price tag. All things considered, these cards probably cost at least 50 percent more to make than a GTX 690, as the die size is close to double and we all know what yields are like for a chip this big. I personally think the GTX 690 is overpriced by a lot more than this card, considering the entire last generation's MSRPs were inflated.
I think cards around or under 300mm2 with 2GB of RAM should all be priced at $400 and under. And in addition, a 2x version of a GPU should not cost double; there should be some saving involved.
I don't think it is commercially feasible for Nvidia to continue to sell big monolithic chips for 500 dollars like in the past. The R&D and the number of wafers (plus wafer prices going up) such cards take to produce make it not worth it. Next generation I would be happy to see GM104 chips selling for $399, GM100 chips selling for $750-800 (if the performance difference is 50%), and a GM104 x2 selling for the same. I want the big monolithic cards to continue to be sold as consumer cards, and I think this means selling them well past 500 dollars.
Considering I bought my 470 SOC (big phat GF100) for 209 euros 2 years ago, it seems there has been a problem with nVidia card prices since the 6x0 series.
edit :
http://tof.canardpc.com/view/9bf806a...4b0f1798bc.jpg
http://bbs.expreview.com/thread-55910-1-1.html
Quote:
Oh Hey, 7970 must be overclocked to 2GHz to overcome TT
Guys, I know this is pure speculation, but what would you expect from a GTX 780 compared to Titan? Similar performance at a 680 price point? Maybe Titan will remain the flagship and the 780 might have a more domestic quantity of RAM? Who would utilise 6GB of VRAM anyway? Someone with three 2560 screens and extra texture packs? Or will intelligently managed games start using as much VRAM as they have available?
:)
It'll require quite a bit to match the Titan. I don't think even a 20nm GK104-sized Maxwell will reach it. But of course we don't know what size chips Nvidia will release for the next gen. It does look like neither Nvidia nor AMD will release a proper new high end for desktop before 20nm.
Let me make this easy for you: the GTX 680 replacement will have less absolute performance, but more performance per dollar. This is unquestionable and obvious IMO.
The last thing we need in the next 48 hours is 100 posts about how GTX Titan is a waste of money compared to the GTX 780 in 10 months. Yes guys, we know that.
NVIDIA GeForce GTX Titan Final Specifications and Clock Speeds Confirmed – NDA Lifts on 18th February
And with only a few hours left until the NDA ends, there is nothing on Nvidia's site related to any launch. Is Titan a hoax? :rofl:
More like ares II 7990 > 690 > 780 > 7970 ghz > 680
Lol firestrike!!
Not with the rumored 837/876 clocks. The 690 would still be about 30% faster.
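For what it's worth, a rough sketch of where a figure like that could come from, comparing theoretical FP32 throughput; the Titan core count (2688 = 14 SMX x 192) and the GTX 690 figures (2 x 1536 cores at ~1019 MHz boost) are assumptions based on specs circulating at the time, not confirmed numbers.
Code:
# Back-of-the-envelope FP32 throughput comparison (FMA = 2 FLOPs per core per clock).
def tflops(cores, mhz):
    return cores * 2 * mhz / 1e6

titan  = tflops(2688, 876)       # ~4.7 TFLOPS at the rumored 876 MHz boost
gtx690 = tflops(2 * 1536, 1019)  # ~6.3 TFLOPS at its boost clock
print(f"690 ahead by ~{gtx690 / titan - 1:.0%}")  # ~33%, i.e. roughly the "about 30%" above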
I wouldn't expect the GTX7xx line before Q4 and the 790 (if there even will be one) before Q1 2014.
Hmmmm, so will prices be cut on GTX 690s I wonder? And by how much? :D
Those specs on WFFC are about what I got, but without working Boost.
^^^^^^^^^^^^^^^^^
Supreme Commander was maxing out, more like crapping out, because of not enough VRAM on a 2GB GTX 285 at a single 1080p screen 4 years, yes 4 years, ago.
2GB of RAM is enough for what?! Benchmarking? Sure.. get 3GB and then you'll see how "enough" 2GB is.. get 4GB and then you'll see how "enough" 3GB is.. get my drift?
How about Crysis 3? Max out the cvar configurator and you'll need 4GB at a single 1080p.
As for enough GPU power, you can always add more GPU power if needed, and it's definitely needed.
You guys OC/push your systems, yet you don't push your apps/games.. you guys expect the developers to come through?
These fookers are releasing more 32-bit/low-VRAM consoles, and we are going to be stuck on this ratchet bandwagon for a long time to come.
Don't agree with me? Don't fret!! I'll be enjoying/maxing out every bit of 4 Titans on at least a couple dozen games at a single 1080p.
Is the "near 690 performance" supposed to be with those low clocks too?
:)
The leaking has begun! That is definitely real.
First impressions are: 6+8 pin, 6GB, 250W+ (the vapor-chamber heatsink is huge), and it COULD BE 12"+ long. Titan is definitely a big baby. Only the 7990 is 12.5" long, so I'm guessing Titan is 11.5"-12", and obviously they want it no longer than 11.5", because some cases are limited to that.
Thx boxleitnerb and Olivon!
75W from the PCIe slot, 75W from the 6 pin and 150W from the 8 pin. So 75+75+150=300W
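As a tiny sanity check of that sum (a minimal sketch, using the connector power limits quoted in the post above):
Code:
# Board power budget by connector, per the PCIe limits quoted above.
budget_w = {"PCIe slot": 75, "6-pin": 75, "8-pin": 150}
print(sum(budget_w.values()), "W")  # 300 W ceiling for a 6+8 pin card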