http://www.abload.de/img/map_c_1120x941d2o94.png
http://ascii.jp/elem/000/000/658/658986/index-3.html
For the ASCII graphs: asynchronous shader clock? 1200/3000? That looks a bit too fast for a hot clock. GK110: 20 SM x 32 SP with 4 TMU/SM (GK104 looks like a GTX 560 Ti 448SP with a faster clock and 14 SM, or like the GTX 470).
Well, at least there are new numbers to put in the rumor mill.
yeah
Quote:
Well, at least there are new numbers to put in the rumor mill.
I have never seen information so contradictory. Hot clock, no hot clock... GK110 = single chip, GK110 = 2x GK104... 384-bit, 512-bit...
Maybe Nvidia will surprise us all?
Edit:
The memory bus data for Fermi is badly messed up...
So no 512 bit memory bus? If NVIDIA decides to keep using the 384 bit memory bus, I really hope they up the VRAM to 3GB.
This time I read the translation of the Ascii.jp piece, and it's even worse than we imagined. I think they mostly compiled rumors, including some we had already heard for GCN (XDR2)...
http://translate.google.com/translat...2Findex-3.html
I think there's something we don't see when rumors come from Chinese, Taiwanese, or Japanese sites. There are a lot of "companies" and "well-informed sources" (lol) who like to launch a lot of rumors. There's a huge number of companies involved in producing electronic components, a lot of small brands that often sell just one type of GPU, and many brands we'll never see outside the Asian market.
Jensen on Kepler
In the last year, if not longer, Nvidia has talked more about Tegra than about any of its other products. This is despite the fact that anyone who has looked at their balance sheets knows that they make their money mostly on Geforce and Quadro products.
After the keynote, we managed to ask Jen Hsun Huang, Nvidia CEO, a single question. We asked, where is Kepler? The answer we got was that we have to be "patient" about it.
http://www.abload.de/img/jensen_press_ces2012kmwco.jpg
Furthermore, we discovered that there was an internal discussion about showcasing Kepler at this keynote, but Nvidia decided not to. As far as we know, it was a business decision to wait, and Kepler production is scheduled for a later launch date. Current estimates are that it will launch in the first half of 2012, possibly sometime late in Q1.
Written by Fuad Abazovic :D
I like how "Fatal1ty" is in this pic with Jen. In fact before reading your post I thought it was about him. Barely noticed Huang..
oh lol, didn't know Jonathan Wendel was still around. hahah
Q3 ex-master on top :)
The Adrenaline site is talking about the performance of the GTX 660 (GK104): it will come in 15% faster than the GTX 580 (GF110), and it will cost only $399! If we consider this to be true, it means the GTX 660 ($399) would land at roughly the same performance as the HD 7970 ($550), or slightly lower (between the Radeon HD 7970 and HD 7950), because the HD 7970 is 15-20% faster than the GTX 580 and the GTX 660 would be 15% faster than the GTX 580.
Of course it would be a major surprise if this card is priced at $399. Mathematically: the HD 7970 costs you $550 and the GTX 660 costs you $399, a difference of $151, so the GTX 660 would be about 27% cheaper (or, the other way around, the HD 7970 about 38% more expensive) for the same or slightly lower performance. That would make the GTX 660 the better buy.
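The percentage depends on which price you take as the base, which is where the original "37.5%" figure went astray. A quick sketch using the rumored prices quoted above (neither price is confirmed):

```python
# Rumored prices from the thread: $399 GTX 660 vs $550 HD 7970.
gtx660_price, hd7970_price = 399.0, 550.0

diff = hd7970_price - gtx660_price
cheaper_pct = 100 * diff / hd7970_price   # GTX 660 relative to the 7970
pricier_pct = 100 * diff / gtx660_price   # 7970 relative to the GTX 660

print(f"difference: ${diff:.0f}")                  # $151
print(f"GTX 660 is {cheaper_pct:.1f}% cheaper")    # ~27.5%
print(f"HD 7970 is {pricier_pct:.1f}% pricier")    # ~37.8%
```

So "37.8%" only holds if you measure the gap relative to the cheaper card; relative to the 7970's price, the GTX 660 would be about 27% cheaper.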
Of course, before anyone attacks me :D this is just an analysis of the rumor we're talking about... Who knows, maybe the rumor is true, maybe not.
It's happened before!
Why not? If the 660 is around the 7950, the 670 slightly above the 7970, and the 680 above that, it wouldn't be too different from earlier series. nVidia might be aiming higher than AMD; the later release suggests that.
Besides, I don't think 15% higher performance would put it that close to the 7970. The 7970 is around 35% faster than the GTX 580 at high resolutions, so it would still be closer to the GTX 580 than to the 7970. It would mirror the situation in the last series pretty well.
These rumors actually come from OBR (here) so I doubt that they are true
15% - are you sure?
e.g. http://www.computerbase.de/artikel/g...stung_mit_aaaf (14 games/2600k@4.5ghz)
1920x1080 4xAA/16xAF +20%
2560x1600 4xAA/16xAF +30%
1920x1080 8xAA/16xAF +29%
2560x1600 8xAA/16xAF +33%
1920x1080 no AA/AF +18%
2560x1600 no AA/AF +25%
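Averaging those quoted deltas gives a quick feel for where the 7970 sits relative to the GTX 580 (a throwaway sketch; the values are copied from the computerbase.de list above):

```python
# HD 7970 leads over GTX 580 in percent, per resolution/AA setting
# (computerbase.de, 14 games, 2600K @ 4.5GHz).
gains = {
    ("1920x1080", "4xAA/16xAF"): 20,
    ("2560x1600", "4xAA/16xAF"): 30,
    ("1920x1080", "8xAA/16xAF"): 29,
    ("2560x1600", "8xAA/16xAF"): 33,
    ("1920x1080", "noAA/AF"): 18,
    ("2560x1600", "noAA/AF"): 25,
}
overall = sum(gains.values()) / len(gains)
hi_res = [v for (res, _), v in gains.items() if res == "2560x1600"]
hi_res_avg = sum(hi_res) / len(hi_res)
print(f"average lead overall: {overall:.1f}%")          # ~25.8%
print(f"average lead at 2560x1600: {hi_res_avg:.1f}%")  # ~29.3%
```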
I’m talking about performance, that's best measured with no bottlenecks at higher resolutions. In those cases 7970 is at least 35% faster. Of course you won’t see such results with bottlenecks in the mix.
http://www.sweclockers.com/image/dia...1caa4640d34453
http://www.sweclockers.com/recension...deon-hd-7970/1
The review is in Swedish but I hope you can read the graphs; the first image is an index of average performance. In individual games it's sometimes faster and sometimes slower.
NVIDIA would have to be out of their minds to price a card that outperformed the GTX 580 at a level that UNDERCUTS their own product. The entire point of price structuring through old / new lineups is to effectively clear out stock of their existing products while still selling a new architecture.
Very simply and as illustrated well by AMD: if a graphics card outperforms a previous generation part that's still being sold, expect it to be priced at a higher level.
Seems like people who have no clue about the GPU market are talking out of their rear ends again in an effort to generate traffic.
Let me recap; it will be easier.
5th jan: http://www.fudzilla.com/home/item/25...amds-28nm-lead
11th : http://www.fudzilla.com/component/k2...nsen-on-kepler
Quote:
Meanwhile AMD's 28nm lineup looks very good, and despite silent promises made by Nvidia to its partners that "Kepler is much better than that", they are expressing serious concerns about AMD's lead, or rather about how to compete against AMD until Kepler launches. Of course, at this point nobody knows what to expect from Kepler in terms of performance, and all we have to go on are Nvidia's own rosy predictions.
A few partners also confirm that they have heard talk of a Q1 launch for Kepler, but there is no better data than that. A few other independent sources expect to see the cards in Q1 2012 but then to actually start selling the parts in volume in Q2 2012. In any case, AMD will launch much of its 28nm lineup by then and it will open a significant lead with the HD 7970.
We will try to learn more next week at CES, probably some huge announcements on Monday, January 9th 2012 so stay tuned.
11th, after AMD confirmed the 7950 launch for January 31st: http://www.fudzilla.com/graphics/ite...n-january-31st
Quote:
After the keynote, we managed to ask Jen Hsun Huang, Nvidia CEO, a single question. We asked, where is Kepler? The answer we got was that we have to be "patient" about it.
Quote:
We managed to overhear that AMD's Radeon HD 7950 will officially launch on January 31st. The card should be available as of that week, while the rest of the lineup should follow in February.
In case you somehow missed it, the Radeon HD 7950 features 1792 stream processors, or 28 GCN compute units, and a 384-bit memory interface paired up with 3GB of GDDR5 memory. The reason behind the January 31st launch date might have something to do with the Nvidia's plan to launch its 28nm Kepler based GK104 GPU sometime around that date and AMD simply wants something to coincide with Nvidia's launch, or ruin its day.
It's the 'still being sold' part that's the issue with your logic here. There was a huge gap between the 480 and the 460 launch, and the 460 easily has more transistors than the 280/285 before it, is faster, and was sold cheaper than the 280/285. What we're not mentioning here is the fact that TSMC has raised wafer prices, so Nvidia may not be able to price their *60 much lower than, say, the 570. But I imagine they'll just stick to the old method of launching high-end and harvested high-end parts first, then waiting a couple of months before releasing the tier below that.
Who has an idea as to how this new lineup will perform at Bitcoin crunching?
True, they're even more fake than all this..
http://videocardz.com/nvidia/geforce-700
btw if you check that OBR leak really closely, you will see a sausage in the background LMAO.
anyway, according to the latest rumors we won't see GK104 before June..
Let's hope the new cards from Nvidia can do 3 monitors off 1 card and have more than 2GB of memory, something last generation was lacking.
naa, they need to push their 3D tech and the 3-monitor setup, so I would expect it to do both and support 6 monitors, or 3 at 120fps; otherwise they fall way too far behind in tech.
Any news on whether there is any actually new tech, instead of just improvements on existing things? I would like to see something new and revolutionary. Graphics cards are boring me as of late; for something so essential, they are so lackluster in appeal as far as original engineering goes. I'm talking about getting rid of certain subsystems and completely replacing them with something else. Maybe it isn't feasible, or maybe I have been watching too much science fiction lately, but it would be nice to see something completely new.
- Living in Tech Dreamland (perhaps)
WTF is wrong with people calling Nvidia's next gen the 700 series? Logically it should be the 600 series. I am not sure why so many people are assuming this bull.
I do not think Nvidia will follow the herd and skip the 600 naming scheme just to be on par number-wise with AMD, even less so if they have a strong product. Naming is just a marketing gimmick, usually used when a product needs an artificial boost.
(regardless as to what both companies have done in the past)
Ummmm.....2XXGTX - 4XXGTX?
Short memory....
I thought they did it because it's a new architecture... 2xx was one arch, 4xx-5xx is another arch, 7xx is a new arch...
There's a difference between an architecture evolution and a new microarchitecture. Fermi opened the line of a new microarchitecture; Kepler is just an evolution of it. There will be some major changes, but not a new idea behind the architecture.
(Polymorph engine, a SIMD-based architecture... basically groups of SIMDs that include the compute parts plus 1D stream processors (CUDA cores), regrouped into GPCs.)
The question stays open about the number of SPs per SM, and how many SMs per GPC...
"Kepler" Features 256-bit and 2GB Memory - EXPreview
Quote:
Though we are unable to confirm the concrete specifications, or how the performance of the first-launched "Kepler" will turn out, a well-informed source indicated that "Kepler" will feature a 256-bit memory controller, that the corresponding graphics cards will pack 2GB of memory, and that the TDP will be 225W. We infer the first product may be the GK104 "Kepler" GPU. Judging from the memory controller and memory capacity, it will likely be difficult to defeat the AMD Radeon HD 7970 or even the Radeon HD 7950; in addition, a TDP of 225W doesn't have much going for it compared with the competitor.
Quote:
As for Radeon HD 7950, the source revealed that the graphics card will launch on January 26th, several days earlier than the previous report.
Now, that's interesting info. Late March/early April seems to confirm my earlier info from a friend at a local computer distributor, but the performance estimate is underwhelming, since earlier he said it would trade blows with Tahiti. Well, perhaps it WILL, if you use NVIDIA's review guidelines (the same kind of guidelines that put the HD 7970 60% faster than the GTX 580). :D
IF the 256-bit bus-width rumor comes true, I agree it won't be able to keep up with Tahiti, and the US$399 MSRP seems more logical that way, with performance around a GTX 570 or HD 6970.
Well, if the ~220-225W speculation is true for GK104, it somewhat worries me what to expect from GK100; another GTX 480 coming? (Provided it hasn't been cancelled by now :P) Performance-wise I still expect GK104 to be 5-10% faster than a GTX 580 (from the much higher clock speeds and increased CUDA core count), but it definitely won't beat the 7970; it's probably a pretty close fight with the 7950, though.
So the NVIDIA top dog is going to have a 256-bit memory bus and 2GB of VRAM? Is this meant to go against the 7970, or lower?
Posted Today (saturday) http://www.fudzilla.com/graphics/ite...-early-q2-2012
While 3D and 3 monitor technology seems to be the latest trend in both computer graphics and home theater, IMO 3D seems to be more of a marketing gimmick that companies have used to capitalize on the popularity of Movie Theater 3D. I know many people who ran out and bought very expensive 3D LED LCD's right after Avatar even though they already had very nice flat panel HD screens at home, and after a couple months of occasionally watching something in 3D (with very expensive glasses), pretty much none of them use it anymore. Then there is the fact that 20-25% of the population get sick watching 3D. Read this...
As for triple-screen gaming... once again, that is another niche market that I don't see becoming all that common. The vast majority of Americans typically have one computer monitor, and most of the time it's a pretty crappy one. I don't foresee Joe Public, who buys his computers at Best Buy, suddenly buying graphics cards that cost more than his entire computer and then tripling up on monitors. Only a very few, very dedicated, and well funded gamers even know about triple-monitor gaming, let alone actually own a setup like that.
^
It's about checking off features. Lots of people who buy expensive GPUs want to keep them for years and want future-proofing. If you walk into a store and one card advertises 6-screen gaming while the other shows nothing, you wonder whether you'll want those 6 screens before you upgrade again (and with current screen prices it can be pretty cheap to get 6 screens, although they will suck). It's all about marketing tactics: even if a feature is used 5% of the time, it's more stuff to keep your sales from going to your competition.
I also like the idea of pushing higher framerates; I cannot wait for the day of 240fps gaming, where you can spin your mouse fast and never lose track of your position.
I think there is a modest chance that nvidia is waiting for 7950 information.
If Nvidia is trailing AMD and only has a mid-range chip ready, it will be extra important to position this product right. They can't launch GK104 blindly and risk it looking unfavorable compared to the 7950, especially with no GK100 coming soon. They will wait, and if they can clock GK104 to match or beat the 7950, they will try their darnedest to do it.
AMD and Nvidia may be in a month-long game of chicken right now. I expect AMD to end the 7950 NDA once everyone already has one.
I agree with most of that... except one thing:
Most people that try 3d gaming stop using it.
Not one single person who has seen my eyefinity setup said it wasn't amazing.
The target audience is definitely the "very few, very dedicated, and well funded gamers", as you say. But if you put cost out the window, one is a technology that people would overwhelmingly use, whilst the other is not.
If only the top-range cards supported Eyefinity/3D and the mid-range didn't, I'd have no problem with that. It's kind of what happens anyway, because mid-range cards don't have the processing power for the extra resolution. Yes, there are exceptions of course.
I'm curious what would happen if Apple was doing all the marketing for triple-display setups.
Odd... I don't know a single person who has used 3d and doesn't continue to do so. My brother has actually been on a mission to change every single monitor in his house with 3d ones, and his main gaming rig is hooked up to a mitsubishi 73" 3d HDTV...So, where does this "most people quit using it" come from?
Maybe your brother is not part of "most people".
Let's see: with the Mac and MacBook, nothing much, 7% of market share, not really a revolution...
But I can imagine why you say that; it would be interesting to see whether, if Apple's marketing team was doing the advertising for it, it would sell more...
Well, in reality there are two major problems with 3 monitors: the space to put them, and the GPU power to drive the resolution... The latter is maybe what has put off a lot of people.
New monitors have small bezels, and no-bezel (1mm) is around the corner, but I don't know what I'd do if I found 2 more Samsungs like the old one I own now, as they have nearly 1.2cm of bezel, and 3 good monitors is a lot of money (and I don't see IPS panels coming with no bezel any time soon). So I'd need to buy 3 new monitors (for best results it is better to use 3 of the same monitor).
For the same price I can surely find a 27-30" monitor and buy a new GPU like the 7970.
But I must admit I'm really tempted by DCS A-10C. Wow, it should be extremely immersive with TrackIR 5.
http://www.chiphell.com/thread-346223-1-1.html
RE: GTX680 (not sure if this is the final model number)
May be seen as early as in February (not paper launch but retail), as Mr Huang doesn't like the HD7970 to shine most in the 28nm era.
The performance would be around the same as the HD7970, depending on the drivers.
So far the core clock is at 780MHz, with 2GB vram.
Source is from an AIC manufacturer, saying that even the retail packages are being printed now.
With 2GB of RAM they would need a 256-bit or 512-bit memory bus,
so I don't buy it.
256-bit is probably too low to even reach GTX 580 performance... unless they run the memory at 1500MHz.
512-bit seems an option... but then this is a high-end die, and a rather big one.
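The 1500MHz figure actually works out: GDDR5 transfers four bits per clock per pin, so at a 1500MHz base clock (6Gbps effective) a 256-bit bus would roughly match the GTX 580's 384-bit bus at its stock 1002MHz. A quick sketch of the arithmetic:

```python
# Peak-bandwidth comparison behind the "256-bit unless 1500MHz" argument.
# GDDR5's effective data rate is 4x the base memory clock.
def bandwidth_gbs(bus_bits: int, mem_clock_mhz: int) -> float:
    """Peak GDDR5 bandwidth in GB/s for a given bus width and clock."""
    effective_mtps = mem_clock_mhz * 4        # mega-transfers per second
    return bus_bits / 8 * effective_mtps / 1000

gtx580 = bandwidth_gbs(384, 1002)    # GTX 580 at stock memory clock
rumored = bandwidth_gbs(256, 1500)   # rumored 256-bit part at 1500MHz

print(f"GTX 580 (384-bit @ 1002MHz): {gtx580:.1f} GB/s")   # ~192.4
print(f"rumor   (256-bit @ 1500MHz): {rumored:.1f} GB/s")  # 192.0
```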
Argh, if the GTX 680, or whatever NVIDIA is going to call it, has 2GB and a 256-bit memory bus, I seriously don't see it winning against the 7970 unless NVIDIA surprises us. Enough with the low VRAM already, NVIDIA.
It might still be 384bit, but with different size memory chips. This has already been done :).
Yes, the GTX 550 Ti: 192-bit bus, 1GB of memory...
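The mixed-chip trick is easy to check with a little arithmetic: each GDDR5 chip sits on a 32-bit channel, so capacity is just chips x density. A sketch (the 1Gbit/2Gbit chip densities are my assumption, as is one chip per channel):

```python
# Which common bus widths can reach 2GB with 32-bit GDDR5 chips?
# Mixed densities on one card are allowed, as on the GTX 550 Ti.
TARGET_MB = 2048
configs = []
for chips in (8, 12, 16):                 # 256-, 384- and 512-bit buses
    bus = chips * 32
    for n_1gbit in range(chips + 1):      # 1Gbit chips = 128MB each
        n_2gbit = chips - n_1gbit         # 2Gbit chips = 256MB each
        if n_1gbit * 128 + n_2gbit * 256 == TARGET_MB:
            configs.append((bus, n_1gbit, n_2gbit))
            print(f"{bus}-bit: {n_1gbit}x1Gbit + {n_2gbit}x2Gbit = 2GB")
```

So 2GB does not force a 256-bit or 512-bit bus: a 384-bit card could also get there with 8 smaller and 4 larger chips.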
Looks more like GTX 660 specs to me... But with all the rumored specs flying around, I think even if I had written them or knew them, I wouldn't believe them myself (lol)...
GTX680 / GTX670 will be based on Kepler and will be launched in February, at least that's the plan now :)
P.S: Don't ask me the source of this, cannot say
2GB of VRAM, really? Why?
Doesn't make any sense... Especially after 3GB 580 cards.
I can max out 2GB of VRAM at just 2560x1600 already, so I was hoping to get more for 2012 titles.
Why do the latest 2GB rumors worry me more? It's starting to look like they've moved the GK104 SKU up to GTX 680/670... If true, let's prepare for a 7xx refresh in a few months...
Nvidia Kepler schedule revealed
A quarter behind AMD
We are still missing a lot of details, but according to the current schedule it looks like GK110, Nvidia's Kepler high-end card, which should show up under the Geforce GTX 680 brand, comes in April.
A chip called GK107, a much slower version of the same 28nm core, will show up in May, while the sexy performance market sweet spot part codenamed GK106 is planned for June. We still have no info when the most affordable part of this gang, codenamed GK108, comes to market.
Mobile parts should follow a different launch dynamic and we expect more than just one card in April. It looks like Nvidia wants to make a name in desktop with high-end parts and that mobile GPUs will have a much wider spectrum, simply because Nvidia has some 300+ design wins in the mobile segment and that they can make nice money off it.
Huh? So Fud speaks of GK110 for the GTX 680, not in February but April? Why GK110 and not GK100? It's not logical if the other SKUs are called GK104, GK106, GK107, etc...
I smell a lot of disinformation somewhere (then again, with the weekend I've had, I'd pollute an ocean just by diving into it).
You know what? I trust Matose more than Fuad... but I'm lost this time.
More so-called "leaks":
Abnormal performance in GT1 and GT3, which is probably a driver issue.
http://img585.imageshack.us/img585/6...zjhd4id8ad.jpg
http://img502.imageshack.us/img502/5...w7ssgapp2g.jpg
http://www.obr-hardware.com/2012/01/...-on-track.html
Quote:
You've all probably read (especially on Fudzilla) that VGAs with the Kepler graphics architecture are postponed to April, May, etc. It's all just BS and a camouflage maneuver. Launch is literally in a few weeks/days; all this is just to appease the competitor AMD ... before the crushing blow!
Maybe, but they are a company that can market their products better than most companies I know, and that's the whole point of my post: if they were to market an iMac with a 3-display setup, there would be lots of people drooling over it.
Source for the 3DMark11 score?
Surely Chiphell
Obr hahaha that guy again lol
Why can't both sets of rumours be right? Something in Feb and something in Apr. The big question for me is availability. If the 7970 can barely stay in stock at $549 what are the odds that a GTX6xx part from nVidia won't sell out in days assuming lower pricing? It would make more sense for nVidia to follow suit with a $500+ card in February to limit demand but rumours so far point to GK104 launching first.
/shrug
We should make an XS contest: everyone picks a date, and whoever is closest wins. Does someone have some reviewed hardware (PSU, motherboard) lying around somewhere?
I believe the first chip to be released, GK104, has always been planned for around February; depending on the source they've said end of January or even March, but around then, with the big chip coming later on. I don't see any new rumors here, just misinterpretation of what's already out there. I don't think there's any reason to delay GK104, which won't beat the HD 7970 but will still be a nice offering in its price class, ~$400. I could really do with more VRAM than my current 1GB, so 2GB sounds appealing right now as my 1GB keeps getting maxed out in Skyrim. I'm not keen on spending more than 250 EUR on a graphics card today, though, with PC gaming coming second for me, so I'll probably have to wait for something more reasonable.
That 3DMark11 run seems to be fake... if you don't have a TN panel you should have seen it already :D
http://www.forum-3dcenter.org/vbulle...postcount=2966
http://img.techpowerup.org/120116/Sh...116-182756.jpg
Hard or soft they better say something soon. The BS meter is off the charts already.
The CPU-Z validation ID is from the end of November, something like that; the ID is 211****, too.
Lol, besides the fact it has already been busted (it's running on a regular 7970), I'm not sure what you base that "1200 MHz" 7970 claim on.
A stock 7970 does P7924 on a i7 965 (X58 - 3.75 GHz): http://www.guru3d.com/article/radeon...fire-review/17
A 1200 MHz 7970 does nearly 12K (i7 3940K - 5.7 GHz): http://hwbot.org/submission/2241370_...70_11913_marks
http://hwbot.org/image/712656.jpg
Obviously not the P score since the author said that GT1 and GT3 were bugged. That would be stupid. Look at GT2 and GT4 instead.
Your score looks inflated.
Here's a 1125MHz run:
http://img155.imageshack.us/img155/2914/699727.jpg
Slightly slower than on Chiphell's screenshot.
Here's a 1285MHz run:
http://tof.canardpc.com/view/5272437...300872ed55.jpg
Slightly faster than on Chiphell's screenshot.
But since it's fake it's irrelevant, anyway.
You kids *may* all be too naive :)
http://img819.imageshack.us/img819/8...vlu4lv6ti9.jpg
Did the author make all that effort only to paste something in for you to play with the curve? Isn't pressing "Del" faster than pasting something there? The author claims he cannot say too much about it, as he might otherwise lose his job or face legal action. Who knows.
Not from a manufacturing/logistics point of view.
The on-sale date of the 7950 would indicate all this work has been done in the month before it. If NV was shipping any Kepler based cards soon, you'd think people would know & have solid leaks/info since it would need to ship around the same time as the 7950.
Do not underestimate the inertial forces of Chinese New Year.
28nm graphics card battle incoming !
Hope prices will be better :yepp:
It seems to me the February launch rumor was either (really) old or "seeded" bad info from a certain AIB.
or "how to catch a leaker"