This is the place I first saw the rumor (translation).
Lenzfire and Chiphell are basically the same chart.
Swap core labels
3 stream processors = 1 CUDA core.
GK100 was 6 GPCs, 2304 SP (768 CUDA cores).
GK110 is supposed to be double GK104, which is 4 GPCs, 1536 SP (512 CUDA cores), so
GK110 would be 8 GPCs, 3072 SP (1024 CUDA cores).
If they switched to stream processors, they wouldn't need to hotclock the shaders.
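Those conversions are just the post's 3:1 assumption applied to the rumored GPC counts; a quick sketch (the chip configs here are the thread's rumors, not confirmed specs):

```python
# Rumored Kepler lineup from the post above (not confirmed specs),
# under the assumption that 3 stream processors = 1 CUDA core.
chips = {"GK100": (6, 2304), "GK104": (4, 1536), "GK110": (8, 3072)}  # name: (GPCs, SPs)

for name, (gpcs, sps) in chips.items():
    print(f"{name}: {gpcs} GPCs, {sps} SP = {sps // 3} CUDA cores")
# GK100: 6 GPCs, 2304 SP = 768 CUDA cores
# GK104: 4 GPCs, 1536 SP = 512 CUDA cores
# GK110: 8 GPCs, 3072 SP = 1024 CUDA cores
```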
Your post makes absolutely no sense.
Why put 1536 cores with 2+GHz into a mainstream part? That card would eat power like crazy and you would have to increase TMUs, ROPs etc. accordingly to accommodate these cores. No way they have 1536 cores and hotclocks at the same time.
If those charts are true, the nv next gen cards will be sweet.
Haven't we seen that info before and now it's just nicely put into a table or?
Anyway, from a quick glimpse those specs look to be the most realistic presented so far. There's just one thing I find a little amusing when you compare which cards get 1.5GB/1.75GB VRAM vs 2GB VRAM. :P Well, the GTX 660 looks like the very good bang-for-buck card I've been waiting for.
550mm² sounds very large for GK110, though; I can't help wondering what power consumption on a 550mm² / 6.4B-transistor card would be like. :o The efficiency had better be good or we're probably around the ~300W barrier, but I'd personally bet it's more like 280W.
Oh well, I hope at least the GK104 specs are true. Nvidia would no doubt sell a lot of those cards: 290mm² is relatively small for Nvidia, and the TDP rating is probably not too huge, maybe 175-189W-ish. If it also overclocks nicely (I'm thinking up to 1050-1100MHz, of course depending on the ability to tweak the voltage), Nvidia could have a big winner there.
I hope the 690 comes with more than 1.75GB of video memory; at a freaking $999 I'm pretty sure they can squeeze in more memory for that much.
I am traveling, so I don't really want to read the whole thing instead of watching YouTube...
New - http://semiaccurate.com/2012/02/06/h...plergk104-die/
http://semiaccurate.com/2012/02/01/p...lergk104-fast/
Wonder how hard it would be to get one of these for a sponsored build :cool:
Looks like GK104 gained some weight since the last time Charlie saw it. :ROTF: I'm sure these double and triple confirmed specs and price will change a dozen times before launch.
I don't know if the specs are correct, but I do expect them to launch in April.
too good to be true???
Italian PR confirms an April launch...
The specs look believable... not saying it's true or fake. If true, I'm really disappointed with the VRAM sizes. ATI has 3GB across the board.
More people have larger-res screens now... 1.5GB on a $400 x60 Ti - very disappointed.
we will see
Hmm, this was already posted here on Feb 2, 2012, when Lenzfire released the article, and a second time yesterday when Expreview released an article based on the Lenzfire one, closely followed by Fudzilla, who reposted the Expreview article.
If we keep reposting what has already been posted and discussed for four days...
Seen that chart already, if it is legit, too little VRAM, again.
My bad...
Nvidia had better hope GK104 is good if their big chip is coming late this year. By that time AMD will be close to bringing their refresh of the 7000 series.
Some more news.
The GTX 680, or its naming equivalent (the name hasn't been set in stone yet), will be based on GK104. You heard it right: they want to fight AMD's high end with only the GK104 chip. But don't worry, bigger things are still coming.
This means that the card coming in April will be lower clocked, we're talking 900-950MHz, with performance between the 7950 and 7970, and it will after all be priced at $299.
The last bit of info reveals an answer to Eyefinity, which should not surprise anyone.
Do you really think Nvidia can produce a video card for $299 that AMD has to charge $550 for? Or is AMD charging this because they can (against the GTX 580 RRP)? If that's the case, how could they announce a 50% price cut to stay competitive? Or will they just hang in there until the next refresh?
:)
If vardant is right, it could mean Charlie's latest article, about GK110 having only just taped out, is right too, and you won't see it before Q3 or the end of Q3. (GK100 abandoned?) (Of course I'll wait for real official info; I can't say I trust anyone today.)
The problem I see there is GK104's performance. I really doubt Nvidia would price it at $300 if the card were good enough to fight the 7950 (in cherry-picked resolutions, games and settings). Yes, in some games the 7970 is only just above the 580 (popular games too; I'm thinking of BF3 or Dirt), while in others it does a lot better. If AMD can pull some performance gains out of a driver in those games, it will start to get really hard for Nvidia if they don't have the high end.
I don't know why, but GK104 performance around the 580 looks about right, so let's say more or less 7950 performance (the 7950 is a bit faster, and I'm not even talking about OC models, but that doesn't matter yet)... That would fit this pricing: 7970 at $500, 7950 at $400, GK104 at $300?
Those are performance cards, and the price doesn't have to be $450+, but if GK104 really lands between the 7950 and 7970, it will compete on price with the 7950, so $350-375-400. If the point is that they don't have a high-end card yet, $300 could also be a play to win on price-for-performance (and we know those mid-range cards sell the most), or to put the card in a better light. But I don't see a card with so low a price sitting between the 7950 and 7970.
We can't say Nvidia has been an angel when it comes to pricing its cards, so, even more so if they have no high end to sell, why not put it close to the 7950's price and fight the price war only if needed?
About the naming scheme (something that makes me really doubt this type of info): marketing-wise it would be completely nuts to call it GTX 680, whatever its performance, if it doesn't end up faster and the real high end is still coming; call it a 660 Ti and play the waiting game for the high end.
(Again, I'm responding to vardant's post; I don't say I trust any info yet. I mean: this plus Charlie's post is a bit too suspect when they arrive at the same time; it could be just noise.)
More likely it's somewhere around GTX 580 or 7950 performance and priced close to $400. If the GTX 580 is really EOL like some claim, GK104 is the only thing that can replace it. Hoping for 7970 performance at a $300 price is absurd; that's simply not the way the market works, even the 4800 series wasn't that good of a deal. Of course, if the performance is only close to the 580, it might end up with a $300 price tag if we're lucky. I think Nvidia has a good spot between the 7870 and 7950, if the estimates about 7870 price and performance are true. They could price GK104 between $300-400 and AMD would have no rival there.
That performance level with that price is really great news, but if the performance is really between 7950 and 7970 then it's almost a GTX 580, right?
Will NVIDIA kill its current flagship card without releasing its big dog?
Though I really doubt they're going to price it at 299 bucks.
Who cares about the wording. It is late. Late to the market, late to beat the competition. It's been 15 months since GTX 580 release, and it's going to be 20 months in Q3.
Also, Nvidia isn't known for great price-performance. While GK104 may compete with 7950, it will be quite expensive if so, with GTX 580 also staying expensive (their high-end...), 7950/70 staying expensive, and 7990 being insanely expensive. And this may last till Q3 thanks to Nvidia... :rolleyes:
Even if they were releasing the new cards today, it would not greatly affect pricing; if they are faster, they will be more expensive than the competing products on the high end. All we would have is expensive flagship cards from both camps without a price war; at the high end they simply don't do price wars.
Midrange is another story, probably anything from $150-$200 is going to be the meat and potatoes of the lineup and where the line in the sand is drawn.
EDIT: If anything, AMD has finally given Nvidia the market pricing ranges they would prefer to be in at the high end again; Nvidia will simply follow suit in pushing the price ceiling back up to where it was in the G80 days.
There's a lot we might not be able to see in the background.
If Nvidia launches at a high price, AMD won't lower theirs unless they see stock start piling up. What the competition is doing is not their main driver for pricing; it's their own supply and demand. Naturally, having more cards in the high end means they will probably sell fewer, but for all we know they will still sell every card at the high price, even with the competition out.
If Nvidia comes out and sets their prices low, it could be because they believe they couldn't clear their stock at higher prices, perhaps due to great supply quantities. Or they're doing it knowing they earn less per card, but also undermining the competition and forcing them to earn less per card.
About it..
http://translate.google.com/translat...icherer-quelle
Do read the article too; it's a good read coming from them, knowing they too have had to run rumored specs these last months.
I really hope the 256-bit memory bus isn't true; I don't see GK104 standing a chance against the 7970 at higher resolutions.
Hmm, how long 'till some fake slides start floating around?
PS: wait, let's get started :rofl:
http://img716.imageshack.us/img716/7...ld68ayfkmh.jpg
PS: don't get mad, jk :P
Only 8.26" long, pff. I need a card at least 10" before I can get any enjoyment out of it.
I'm liking mine more :D
Attachment 124036
That's how accurate this entire thread has been.
GTX680 = GK104
3*32SPs/SM
8 TMUs/SM
4 SMs/GPC
4 GPCs
32 ROPs
256bit bus
2GB 5.0Gbps GDDR5
950MHz core
2*6pin
________
http://forum.beyond3d.com/showpost.p...postcount=1538
Someone found the originals from that OBR fake. Basically background image of a GTX550 and the benchmark slide from a GTX580.
http://forum.beyond3d.com/showpost.p...postcount=1546
The majority of their market share doesn't care about $500 cards. AMD hasn't affected the sweet-spot price on any current-gen card. They released a nice enthusiast card for multi-monitor users, but this isn't like last gen. Nvidia is competing just fine with every card in their lineup but the GTX 580.
So NVIDIA is going to name the mid-range card GTX 680? Fail naming scheme; reminds me of when they used the 9800 GTX name on a mid-range card.
You should read post 555 here. If you want, I can make it a GTX 660 or any name you like. (Thanks to ManofAtlantis on Beyond3D.)
http://img713.imageshack.us/img713/187/88aqz.jpg
And the slide for the chart is exactly the one used when the 580 was released... I suspected something yesterday, and I just found this one in a few seconds on a Bing image search (type "GTX580 slides").
http://img35.imageshack.us/img35/6276/slide4u.png
I added the white square; it's less distracting.
http://img694.imageshack.us/img694/1076/slide4mod.png
(They'd just have to redo the slide with the same color scheme and update the games with new ones (Dirt 2 > Dirt 3, BC2 > BF3, etc.), or just mod it.)
One alone would have been OK, but not both at the same time... (It's still possible Nvidia would reuse and modify the old chart, but it's really doubtful; the 550 background image used for the spec slide is easy to find on the web.)
Note: OB.... has removed his article, and someone else also grabbed this screen and discussed it earlier: http://translate.google.com/translat...icherer-quelle
nVidia GK104
Product name (top solution): GeForce GTX 680 (but not entirely sure)
28nm production at TSMC, die area approximately 340mm² (unofficial estimate)
4 Graphics Processing Clusters (GPC)
4 Streaming Multiprocessors (SM), aka shader clusters, per GPC, ergo a total of 16 shader clusters for the GK104 chip
96 stream processors (SP), aka shader units, per shader cluster, ergo a total of 1536 shader units for the GK104 chip
8 texture units (TMU) per shader cluster, ergo a total of 128 texture units for the GK104 chip
32 raster operation units (ROPs)
256-bit DDR memory interface (up to GDDR5)
Chip clock (top model): 950 MHz
Elimination of hotclocks: the shader units no longer run at a separate, higher clock
Single-precision compute performance of 2.9 TFLOPS; double precision at 1:6 = 486 GFLOPS DP; texturing performance of 121 GTex/sec
Memory clock (top model): 2500 MHz, giving a memory bandwidth of 160 GB/sec
2048 MB GDDR5 memory configuration
http://www.3dcenter.org/news/die-akt...rformance-chip
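For what it's worth, the headline numbers in that list hang together arithmetically. A quick sanity check, assuming 2 FLOPs per SP per clock (FMA) and GDDR5 transferring 2 bits per pin per memory clock at the quoted 2500 MHz:

```python
# Sanity-check the rumored GK104 figures from the 3DCenter spec list above.
# Assumptions: 2 FLOPs per SP per clock (FMA), and a DDR-style 2 bits
# per pin per memory clock for the 5.0 Gbps GDDR5.

gpcs, sms_per_gpc, sps_per_sm, tmus_per_sm = 4, 4, 96, 8
core_mhz, mem_mhz, bus_bits = 950, 2500, 256

sps = gpcs * sms_per_gpc * sps_per_sm          # 1536 shader units
tmus = gpcs * sms_per_gpc * tmus_per_sm        # 128 texture units

sp_gflops = sps * 2 * core_mhz / 1000          # 2918.4 -> "2.9 TFLOPS"
dp_gflops = sp_gflops / 6                      # 486.4  -> "486 GFLOPS" at 1:6
gtex = tmus * core_mhz / 1000                  # 121.6  -> "121 GTex/sec"
bandwidth = bus_bits / 8 * 2 * mem_mhz / 1000  # 160.0 GB/sec on a 256-bit bus

print(sps, tmus, sp_gflops, dp_gflops, gtex, bandwidth)
```

All of those round to the 2.9 TFLOPS, 486 GFLOPS DP, 121 GTex/sec and 160 GB/sec figures quoted above.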
Now where have I seen that before? ;)
Chiphell, maybe? (Apart from the TMU count, clock speed, DP FLOPS, etc.)
Anyway, this raises an interesting question: 4 SM × 96 SP gives 384 SP per GPC (4 GPCs)... the exact count of GF114 (or GF104 with 1 SM disabled), or if you prefer, a 580 with 96 SP per SM.
I wonder how those CUDA cores perform, because it's clear they are "a lot smaller", and that's not just due to 28nm, unless other parts have been drastically reduced in size or completely removed (PolyMorph engine, etc.).
If Nvidia is doing something pretty similar to AMD this round (SP-wise), this could be really interesting.
It does get them something. If most of your profits come from the server side, and your competition makes most of theirs from the client side, you might try to sell your client-side stuff as low as possible, making it harder for them to compete due to lower margins and letting you gain more of their market share. It's basically an oligopoly, and the only sales you get are the ones your competition did not get.
This might be relevant:
Rambus
I don't think Rambus has anything to offer at this point, it was merely about stopping their patent trolling attempts.
Maybe, or maybe it was "cheap", because Rambus recently lost a lot of its value.
You're making a very basic assumption that companies strive for monopoly status as a primary goal. A race to the bottom does nothing for nVidia's share price so no, there will be no freebies from the green team just to make AMD bleed.
If nVidia has an ace with Kepler they will use it to make more money, not less (see G80). There's also the question of supply - you can only undercut prices so much until you can't meet demand.
I wouldn't say nothing. If they can sell enough units at a price that hurts AMD's sales, it would increase their market share and ergo their stock price. It's all about bottom line profits, not how you get there (think state functions). If they increase their bottom line while reducing their competition, then naturally more people will be willing to invest their money with the green team.
There's no reason to believe nVidia can gain market share via lower pricing. AMD will simply match them. Net result is no market share gains and both companies lose. The company with lower costs simply loses less - only the customer wins.
Price wars work if one company has significantly lower costs due to economies of scale, vertical integration, operating efficiency etc. Otherwise it's a useless tactic.
Regardless of whether GK104 is the top chip or not, if it competes with and possibly beats (in specific games) the 7970, then what stops Nvidia charging prices similar to the 7970's, or a little lower to force a drop from AMD? If it's not the highest GPU, then why would it be called a 680 and not a 670? The $299 price point makes it more like a 660, even... There is no way any 680-labelled card will release at a $299 RRP; how much did the 280/480/580 release at?
Maybe because this card doesn't have the performance some want us to believe it has, and nothing is ready on the high end yet on Nvidia's side? (Pure supposition; I'm drunk and don't want to enter a ranged war now...)
The best thing I've read these last weeks is people who think Nvidia wants to lower market prices in order to go back to a lower price for the high end... yes, of course...
Sources are now telling SemiAccurate that Nvidia has two variants of the GK104 in the pipe. These two variants hint at a finer grained fusing ability for the end product.
The two siblings are said to be GK104-400 and GK104-335, basically a full working and a partially fused-off version of the same chip. The -400 is said to be an “8 group” device, the -335 described as a “7 group” one. If you recall the sad tale of Fermi/GF100, the chip had large swathes of shaders turned off; the ability to do less radical surgery was not there. This is a fairly painful way to deal with defects; the more granular you can make the disabling, the better off you are.
Nothing comes for free in the silicon world, and the art of chip design is balancing granularity with cost. Nvidia botched this badly in Fermi, and paid a high price. The only good that came of it was the entertainment in seeing their spokespeople spin ever increasing leaps of logic in public. This however doesn’t placate investors much, even if they do smile.
With this new description of the -400 and -335 variant of GK104, it looks like Nvidia has implemented what AMD has been doing since at least the R700 (HD4000) chips, if not earlier. Instead of being forced to fuse off large blocks of shaders as a minimum, it looks like they can now do much smaller chunks. In Fermi terms, instead of taking a CU at a whack, they can now do portions of a CU too.
This should greatly improve yields, allow for endless SKU variations, and generally make things better. Of course, it comes at a die size penalty, but after the last learning experience, it would be foolish to do otherwise. S|A
http://semiaccurate.com/2012/02/09/t...pler-variants/
Interesting, TPU has posted it with that article as a source, calling the rumor "reliable"... http://www.techpowerup.com/160263/NV...Machinery.html
?_? <-- my face after following this thread. I've never seen so many rumored specs floating around; I probably wouldn't even notice if Nvidia confirmed them, and would call them fake. :P
^QFT
This thread's reliability is inversely proportional to its number of pages; and directly proportional to the numbersss of countlessss-ish rumorsssss...
I was hoping at least a few pieces of info were solid at this point; hell, even the cards' numbering isn't confirmed. No one knows if it will still have "GeForce" on it. LOL!
Might be time for the mods to move this thread.:D
I think the lack of info is because they will launch a mid-range card below the current 580 price, and if they told us that, they would lose all sales from the moment they said it until it launches. If they had something really powerful coming soon, they would be talking about it to slow down 7970 sales.
For me, the only way it makes sense is that maybe their big X80 chip is somewhat delayed. They said stuff about the 7970 not impressing, blah blah blah, and that's all; they haven't touched on when they'll have something that makes it look ordinary. Now, the rumors of an X60 chip are pretty consistent, so I'm going to go with that. A ~$350 chip seems to be the most sensible assumption I can make out of it.
Maybe this is Fermi all over again, but this time it might be worse because they can't release an X80 chip, so we're going to see something like a GTX 465? Which in the short term will be a disaster until they get their X80 and X70 ready with decent thermal performance.
Maybe March will bring a new flavour... Nvidia should have something out by then, and AMD will have New Zealand out
edit** OMG, post 777! Better go to the casino and bet it all on red (they don't offer green as an option).
It might be worse? Not really, it looks much better. With Fermi they needed 6 months from AMD's high-end and performance chips to launch, and all the small stuff was also 6 months late compared to AMD's small stuff. This time they'll launch all chips except the top dog in April (at least in notebooks). That's just 1-2 months behind AMD's smaller parts. Nvidia's top dog might again come 6 months late, but you probably just can't launch such a big chip on an early process.
Real NVIDIA Kepler, GK104, GeForce GTX "670/680" Specs Leak Out
http://www.brightsideofnews.com/Data..._Mockup_68.jpg
Quote:
In the past few weeks, we've seen various fishy rumors on the product specifications of the first discrete GPU using the upcoming 28nm Kepler architecture, the GK104. While we have known parts of the specifications, such as no hot clocks, the doubling of the Streaming Multiprocessor (SM) node from 48 to 96 CUDA cores (i.e. stream processors), and the 256-bit memory controller, the real specifications are (finally) here... even though our information differs minimally from the information originally posted on 3DCenter.org.
Quote:
NVIDIA Kepler GK104 Architectural overview: at first look, very similar to GF110, but then you take a deeper look: 1536 Stream Processors instead of 512!
So explain to me again what makes their leak "real"?
-PB
Liked this bit:
Finally, just about time... Just a hint on "when", so maybe that April news is true after all. April 1? XD
Quote:
You won't need to wait for too long, as NVIDIA is already starting pre-sale activities, and getting ready to counter AMD and their momentum with the Radeon 7700 (Cape Verde, February 15), 7800 (Pitcairn, March 6) and 7900 Series (released).
1536, really? Uh, I am shocked.
:D
I'll believe it when I see it.
I would love it to be true, though; there are too many suggested specs around at this point. If it's true, Charlie's positive article regarding GK104 would turn out to be true, and Nvidia would truly have a winner on their hands. I dunno, it just seems unrealistically positive regarding performance/price, haha.
http://img703.imageshack.us/img703/8...nstitrexxw.png
I have put the BSN diagram and the GF110 one side by side, just to see the difference in SP density (of course it's just a schematic, lol, but it gives an idea of it)... Now try to imagine it in a 340mm² core instead of 550mm². I don't even want to try to imagine a GK100/110; this means 3x more SPs in an SM...
The only way is to basically divide the CUDA core (SP) size by 3, or I don't see how they can do it in 340mm².
Lol, yes, I doubt we'll have a core diagram before the release, or before Nvidia's first official presentation of the card.
OK, I didn't read the article. I thought that was supposed to be a legit pic of the architecture and commented on that assumption ;)
I have to assume using hotclocks decreases transistor density but who knows by how much.
What was the most recent Nvidia hotclocked GPU that we have a real die shot of?
Shaders, for AMD, don't take up as much of the die as one would imagine, at least in previous VLIW5 architectures. Jawed at B3D did a breakdown of Juniper, I think, a few years back. I'll see if I can find it.
Well these guys surely seem the most confident out of the rumors I've read, but I still don't like how these numbers look eerily similar to AMD's design. I guess the simple answer would be that Nvidia decided their superscalar design was too expensive and resulted in poor yields compared to AMD, but even if they did triple the core count, if they removed the hotclock wouldn't that result in just a minimal increase?