Charlie says gk104 price is $299: http://semiaccurate.com/2012/01/23/e...k104-price-is/
MSRP $299 for GK104:
http://semiaccurate.com/2012/01/23/e...k104-price-is/
Damn, too slow :)
That's all fine and good for a single 1080p monitor, but if you do any of the following it runs out really fast (see the rough numbers below):
-run 2560x1600
-run Nvidia Surround with 3 displays
-use Nvidia 3D Vision
Not to mention the newer games are already on the edge with a single 1080P monitor.
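For a rough sense of scale, here's a back-of-envelope sketch (my own simplified numbers, not from any review) of how much memory just the color/depth render targets eat as resolution and MSAA go up - textures, geometry and the extra buffers of deferred engines all come on top of that:

Code:
# Very simplified framebuffer estimate: width * height * bytes per pixel * MSAA samples.
# Assumes 8 bytes per pixel per sample (RGBA8 color + D24S8 depth/stencil) and ignores
# everything else a real engine allocates (G-buffers, shadow maps, textures, ...).
def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=8):
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

setups = {
    "1920x1080 (single 1080p)":      (1920, 1080),
    "2560x1600 (30-inch panel)":     (2560, 1600),
    "5760x1080 (Surround, 3x1080p)": (5760, 1080),
}

for name, (w, h) in setups.items():
    print(f"{name}: {framebuffer_mb(w, h):6.1f} MB plain, "
          f"{framebuffer_mb(w, h, msaa=4):6.1f} MB with 4x MSAA")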
I for one would be pretty disappointed if they kept the high end to 2GB, considering it is really easy to get Metro 2033 over 2GB as it is...
^Agree
I cap out my 2GB 6970 at 5936x1080 (Eyefinity with Bezel Compensation) when I am playing Grand Theft Auto 4.
If the price is true, there's no way it stands up to the HD 7950, not to mention the 7970.
The problem is it just doesn't help :) It is either playable with 1.5GB or it isn't at all.
http://i44.tinypic.com/23hnnfk.png
Previous page - http://tof.canardpc.com/view/a2f56c4...fce6c28e95.jpg
Here's the original article these benchmarks come from (well, in reality the original is at Hardware.fr, but this way you have it in English):
http://www.behardware.com/articles/8...-surround.html
Note you can't do 3 monitors on a single GTX 580 (max 2560x1600); that's why the SLI setup is used for 5760x1080.
A $300 card that is as fast as a GTX 580, comes with 2GB of memory and has huge OC potential would totally kick a$$, following the trend of successful previous cards like the 8800GT and GTX 460. Although it won't be much of an attraction to the extreme crowd :D
But the economy is different now and games have stopped evolving; you no longer need a top-end card, except for the niche multi-screen crowd.
DP :welcome:
IMO the problem is, you are a reviewer and often have prior knowledge of new stuff. Saying "yeah right" without explaining, people are going to have some question marks. If you do know more, I don't think you should post that unless you are going to explain what you know. If you don't know more, then it's kind of trolling a bit. You obviously do know more, otherwise you wouldn't post that. I don't mean that in a derogatory way, but that is the first thing that came to my mind.
If GK104 is mid-range performance and will launch first, it would be a first that a "4" chip is launched before the flagship product, wouldn't it? Why would nVidia do that unless it was at least as fast as the 7970? From the current rumours, it seems that will be the case, even from the Charlie articles.
Please correct me if I'm wrong. :)
- In BF3 @ 2560x1440 on GTX 580 SLI [1.5GB] I left all settings at max by mistake. 30 minutes into the game it started to jerk around and the FPS had dropped to 20; VRAM was at 1538MB. I went to settings, turned off MSAA, restarted the game [same server], VRAM was now below 1500MB and I played for hours after that.
- Not a big deal on one-year-old cards, but to buy cards with 1.5GB of VRAM today? No way. To buy 2x high-end 2GB cards to keep for 2 years? Not likely either.
...Half of which gets changed or adjusted at the last minute. If anything, pricing is usually the very last thing to be released. I wouldn't expect anything he says to be based on more than intuition, which is likely far more informed than mine :).
EDIT: I imagine he's snickering at the thought of my intuition being even close to his on such things :p:
It's getting crazier!!!!
But come on guys, can anyone really believe that NV will sell us a card for less than its market value? One of the two has to be false: either it's gonna be $300-350 OR it's gonna be close to 7970 or 7950 performance. I have to say that with all the stuff flying around it's hard to make a calculated, non-noob guesstimate.
Yeah, the $299 and performance > 7970 doesn't seem right.
That would make Nvidia's whole >$200 lineup completely obsolete. So how will they fill the space above and below?
The 570 and 580 would be the worst buy ever, but they are too big to drop under $250.
If it performs like the 7970 they will price it accordingly so they can still sell 560-570-580.
The 580 is EOL. The other cards of the old gen will probably be EOL once Kepler is available, so it IS obsolete. GK104 will be the new 8800GT. I guess at $300 it will come quite close to the 7970, and GK110 will obliterate Tahiti for $400-600. AMD will drop prices big time and introduce an XTX or whatever they call it to at least hope to compete with GK110.
Let's see if I'm right on this one.
I wouldn't be shocked if Nvidia released Kepler at CeBIT.
580s wouldn't be EOL until something's ready to take their place.
-PB
Well, if such a thing were to happen it would be a bigger surprise than the 4800 series. $300 and 7970 performance makes no sense to anyone but a blind fanboy who wants a cheap card. Yet again people have even more bloated expectations than before the 7970 launched. I think it was supposed to be at least twice as fast as the previous gen, or at least 6990 performance at like $400? :rolleyes:
What does a GTX 580 get in 3DMark11? The HD 7950 got like 61xx; I just want to know if it's even realistic to expect GK104 to perform about as well as a GTX 580.
A default HD 7950 gets about a 50-point better GPU score than a GTX 580.
OK, thanks. Well then I suppose it's possible for GK104 to perform at least within a 10% margin of the GTX 580, which is something I've waited for: GTX 580 performance at a more reasonable cost. Hopefully it at least delivers that. Overclocking should probably also be a bit better.
They are not making GF110 anymore afaik, so that is EOL. You can still buy from the stock that is left, though.
AMD improved performance by around 40% (on average) compared to the 6970. This is not much for a new node. According to the latest info/speculation, Nvidia focused heavily on perf/W this time around. So it is not unlikely that GK104 will improve quite a bit more than 40% compared to its predecessor, the 560 (Ti). I don't find it unreasonable that GK104 will land in between the 7950 and 7970. That would make it only 60% faster than the 560 Ti - an increase that is not uncommon for a new node.
Also, by the time GK104 launches, prices for the 7970 might have dropped a bit due to a matured process and better availability. Considering that there is still GK110 for the segment above, why wouldn't GK104 be priced at $300? If it were overpriced like the 7970, what would they want for the GTX 680? $800? Who would buy that?
Think back to the 8800GT. It was priced in the $250-300 range and was also around 30% faster than the fastest offering of the previous generation, the X1950XTX - just like the 7970 is around 30% faster than the GTX 580. You see the parallel? It was a great deal and nobody cried "impossible". Tahiti and GK104 will have similar die sizes, so similar performance is to be expected, especially since GK104 is probably a chip trimmed for gaming while AMD had to make compromises due to HPC. The difference between Nvidia and AMD this round is that AMD again stops at a <400mm2 die while Nvidia will (as always in the last 6 years) have offerings above that. GK104 will be the new 8800GT in my opinion. And an 8800GTX and Ultra followed suit.
And please stop calling people fanboy. That is childish and immature.
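Quick sanity check on the scaling argument above (my own rough arithmetic, not from any review; the 6970-vs-560 Ti gap of ~15% is an assumption):

Code:
# Rough sanity check of the "~60% over the 560 Ti" figure. The 7970-vs-6970
# gap (~40%) is the number used above; the 6970-vs-560 Ti gap (~15%) is my
# own assumption of roughly where reviews placed those two cards.
r_7970_vs_6970  = 1.40
r_6970_vs_560ti = 1.15

for label, r_gk104_vs_7970 in [("~= 7950", 0.90), ("= 7970", 1.00)]:
    uplift = r_gk104_vs_7970 * r_7970_vs_6970 * r_6970_vs_560ti
    print(f"GK104 {label}: +{(uplift - 1) * 100:.0f}% over the 560 Ti")
# Prints roughly +45% and +61%, i.e. in the ballpark of the ~60% figure above.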
As far as I know, going back to the very first 386s and 486s... until now, and from the Matrox Mystique and Voodoo2... until now, all new generations of GPU chips have been like 20-30% "better", same as for CPUs.
Edit... I guess I shouldn't post that.
When comparing differences you should compare at high settings to avoid bottlenecks. The 7970 is more like 60%+ faster than the 6970.
No, when it's a new generation it's more often than not close to 100%, not counting half-generations such as the AMD 3000 or 6000 series or the nVidia 9000 or 500 series.
That's pretty much what I've been thinking: the 7970 isn't particularly fast for a node change. If you look back just one node change, GK104 being faster than or on par with GF110 is normal rather than exceptional. If GK104 is faster than the 7970, it just means that AMD have missed the sweet spot (not that the 7970 can really be described as a sweet-spot card) and they haven't kept the same performance-per-millimetre advantage they enjoyed at 40nm.
Nvidia's midrange offerings have tended to use more transistors and more die space than any AMD GPU since the 2900XT, so Nvidia only really needed to close the gap on performance per mm2 to make this situation occur. The only light in the tunnel for AMD is that their card has very immature drivers, but as AMD drivers have not been the highlight of 2011, that's somewhat of a mixed blessing.
- Computerbase: 30% at 2560x1600_8/16, 40% at 2560x1600_4/16 http://www.computerbase.de/artikel/g...stung_mit_aaaf
- TechPowerUp: 39% at 2560x1600 with AA/AF http://www.techpowerup.com/reviews/AMD/HD_7970/28.html
- HT4U: 35% at 2560x1440_8/16: http://ht4u.net/reviews/2011/amd_rad...st/index50.php
You can pick games where the difference is larger, but on average 60% is not true.
Even with SGSSAA the maximum difference between the 6970 and the 7970 is around 45% (in Skyrim):
http://www.computerbase.de/artikel/g...stung_mit_ssaa
Compound results with CPU-heavy benches included aren't a good basis when you want to tell the actual difference. With SC2 and Civ V you can effectively lower the advantage of any graphics card.
And any test where they don't disclose their test system is disqualified immediately in my eyes. A valid test should have at least a 2600K at 4.5GHz to avoid bottlenecks. Benching on an old, un-overclocked Nehalem doesn't cut it.
Computerbase tested with a 4.5 GHz 2600K. The influence of the CPU should diminish at these high settings, but it depends on the benchmark of course. I chose these reviews because they cover many games and have a performance rating for convenience. Can you point out at least 10 games where the 7970 is 60% ahead of the 6970?
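To illustrate the compound-rating point with purely hypothetical numbers (made up just to show the mechanics): a couple of CPU-bound titles drag the average uplift well below what the GPU-bound games show.

Code:
# Hypothetical per-game FPS for an "old" and a "new" card (made-up numbers).
# CPU-bound titles barely scale, so they pull down any compound average even
# if the GPU itself is ~60% faster in GPU-limited games.
games = {
    "GPU-bound shooter A": (40, 64),   # +60%
    "GPU-bound RPG B":     (35, 56),   # +60%
    "CPU-bound RTS C":     (60, 66),   # +10%, capped by the CPU
    "CPU-bound 4X D":      (50, 55),   # +10%, capped by the CPU
}

uplifts = [new / old for old, new in games.values()]
mean_all = sum(uplifts) / len(uplifts)
gpu_only = [u for u in uplifts if u > 1.3]
mean_gpu = sum(gpu_only) / len(gpu_only)

print(f"Average uplift, all games:      +{(mean_all - 1) * 100:.0f}%")   # ~ +35%
print(f"Average uplift, GPU-bound only: +{(mean_gpu - 1) * 100:.0f}%")   # ~ +60%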
I would just say... why not stop dreaming about a mid-range card beating (or matching) a high-end one? Technology never delivers miracles, even when it's new.
For this card to match Charlie's description it would have to either:
1- Be faster than the 7970, in which case it will likely cost $550 +/- $50
2- Be around 570~580 performance and cost $300
Both those claims above will likely never occur together with Nvidia. Unless they have an incredibly abnormal reason to do so.
Never seen NV have an Ace in the market and undercharge for it.
So we have two cards with approximately the same die size that could be close in performance. Why is one of them midrange and one highend? Highend in my opinion is reserved for the big chips in the range of 500mm2.
If Tahiti were highend, what will GK110 be, then? Super highend? Ultra highend? Makes no sense. Again: Look at the 8800GT. Same scenario.
Then again, what will GK110 be? Or do you honestly believe that GK104 will remain Nvidia's top dog? The 7950 will offer similar performance to the GTX 580 at a similar price but lower power consumption. This is perf/$ stagnation and it is not good for the customer.
Fermi was borked. Imagine Nvidia did everything right back then like it appears for Kepler now. Imagine a fixed, more efficient 460 aka 560(Ti) had been released instead of the 460, fighting with the 5870, coming close. This is the exact same situation that we will have in 2 months. Compared to a fixed 480 aka 580, 5870 is not highend. Equally, compared with GK110 a 7970 is not highend. True, there is a time shift in between these two. It changes perception, but not the facts.
It will probably not be faster. I expect GK104 around 7970-10% and 160-180W.
The 460 has fewer transistors but is bigger than the 5870 (if I remember correctly), while the 260 is bigger and has more transistors than the 4870. Size alone doesn't dictate performance, and Nvidia will price each card as high as they can get away with; that price is dictated by competition, so $300 is high for a basic mid-range card. I'm guessing Nvidia are probably expecting a price drop or new performance drivers from AMD to go up against the GTX 660 launch.
The 8800GT wasn't on the same process as the 8800GTX, the GTX was on 90nm and the GT 65nm. Personally I think Nvidia put in a lot of effort this generation to make sure they got 28nm right, while AMD were making cutbacks.
What I see is people hyping their expectations up to a point where nothing short of mind blowing performance for less than $400 will satisfy them.
To me, that's completely unrealistic.
Not one of you has given a good reason why. Why should the successor to a $300 card be priced considerably higher? Why is +60% on a new node completely unrealistic? If you follow this logic, chips will become faster but more expensive. So in 5 years we have +200% performance and +200% price? That is completely unrealistic.
I don't get why you deny that perf/$ will increase considerably - as it did with every new generation eventually...in every market segment.
Unrealistic is the proper description. Why would you sell a card which is faster than your competition's $550 card for nearly half the price? Has it happened before? Then why would it happen now?
At this rate, no matter how good the next-gen NV cards become, they will still disappoint people who expected them for half the ATI prices... o.O
I like the reviews from SweClockers.com, but they're in Swedish. They try to eliminate bottlenecks and isolate the actual performance of the cards. They too have an index, and there you can read that the 7970 is 48% faster at 1920x1080 and 2560x1600. The bigger the difference you can get in a real-world test, the fewer bottlenecks are involved and the closer you are to the real performance difference. They have a number of games where you can see about 60% better performance.
7970 has high demand as the fastest card, and low supply due to a fresh node that's more expensive to manufacture with. But as long as people are willing to buy at this price, they will sell. Nvidia will do the same, and balance the price according to demand and supply. And demand will of course be affected by what the competition is offering.
The only realistic scenario that exists in my opinion is that:
- If the $300 price is true, then:
- If the GTX 580 is gonna go EOL, it needs to be roughly as fast as a GTX 580 (+/- ~10%)
- GTX 580 performance for $200 less with a bit lower power consumption and DX11.1 = there's your new-gen benefits, very fair
- If you can't beat the competition in performance even though the competitor was first => start a price war (offer a better performance/price ratio)
- A $300 GK104 puts pressure on the HD 7950 & HD 7970, which might have to drop to AT LEAST $400 and ~$500, as I doubt the performance difference between GK104 and the HD 7950 will be big
- Everyone benefits!
- (Compare 8800GT vs 8800GTX, except $50 more expensive this time; a similar scenario has happened before in the Nvidia camp)
Yes it has happened, but with the shoe on the other foot. That product was the 4870.
The GTX 280 and GTX 260 at launch were priced at $650 and $450 respectively. The 4870 was priced at $299 or $280 and was the same speed as or a hair faster than the GTX 260. I think GK104 will be around 7950-ish as far as speed goes. Considering the size of past NV midrange parts and their performance, them catching up to AMD's high end with a midrange part would not be a surprise (the 9800GTX+, GTX 260, GTX 460 and GTX 560 all performed similarly to the 4850, 4870, 5850 and 6950 respectively).
As far as pricing goes, I think they want to price things with respect to GK110 or GK112 or whatever the high end is. I think at this point the reason the 7970 is selling out is primarily a supply one; however, quantities are getting much better quickly and it is getting easier and easier to find one in Canada. Best Buy of all places has them in stock, at the $549.99 retail price. NCIX has a decent amount of stock too. In a month, I have a feeling it will be pretty easy to find a 7970.
If GK110 is hard launched, I expect decent quantities for that card considering it will be launched later. There is a point where people don't want to pay any more, and $650 I think was the max for a card that is not a semi-limited edition. Back then we were not in a recession and people had way more money to spend. I think right now the max people will pay for a card is less than $650, if it has decent supply (not a weird halo product like the limited editions (the Ultra) or dual cards). Pricing the GTX 660 at $300 leaves room for a $550 or $600 GTX 680.
IF performance is so good, I don't know why people assume Nvidia can't put the GK104 at $380-400, the GTX 670 at $500 and the GTX 680 at $570-580... If the GK104 really performs that well, why would Nvidia worry about keeping it cheap? If a price war occurs (especially if AMD releases a 1.5GB 7950), they don't want their MSRP to be too low...
The 8800GT was an ace; the card rocked in its day and performed close enough to the 8800GTX for a fraction of the cost. I bought one on launch day from NCIX for $189, compared to the $6xx I paid for an 8800GTX.
I really cannot recall anything positive about Nvidia from Charlie, so for him to even squeak out any kind of praise is enough to raise some brows and warrant some attention.
We'll all know soon enough, whether it beats or loses to the 7970 is irrelevant, it comes down to price and performance in the end.
Things were completely different back then; the gap between high end and mid range was not the same. Today it would mostly be a 570, not a 560... The 8800GT (a die shrink) came with 112 SPs vs 128 SPs for the GTS, and the clock speed was bumped up a little. The 90nm > 65nm shrink was mainly used to decrease the cost of the GTX, not really to make a new series; the difference was only 14 GFLOPS. Even the difference between the 90nm and 65nm parts (Ultra, GTX > GS, GT) was small (750 million transistors at 65nm vs 690 million at 90nm).
As you can see, the 8800GT (like the GTX) was part of the high-end SKU, with the same transistor count of 690 and then 750 million (laser cut)... the same as the 570 is to the 580... not a different SKU.
8800 Ultra: 128 SP, 90nm, 690 million transistors, 612 MHz
8800 GTX: 128 SP, 90nm, 690 million transistors, 575 MHz, 518 GFLOPS
8800 GTS: 128 SP, 65nm, 754 million transistors, 650 MHz
8800 GT: 112 SP, 65nm, 754 million transistors, 600 MHz, 504 GFLOPS
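For reference, those 518/504 GFLOPS figures fall straight out of the shader specs. A minimal check, assuming the commonly quoted G80/G92 shader clocks (1350 MHz for the GTX, 1500 MHz for the GT) and 3 FLOPs per SP per clock (MADD + MUL):

Code:
# Theoretical shader throughput: SPs * shader clock (MHz) * FLOPs per SP per clock.
# The shader clocks and the 3 FLOPs/clock (MADD + MUL) figure are the usual
# G80/G92 numbers; they are assumptions here, not taken from the post above.
def gflops(sps, shader_mhz, flops_per_clock=3):
    return sps * shader_mhz * flops_per_clock / 1000.0

print(f"8800GTX: {gflops(128, 1350):.0f} GFLOPS")  # ~518
print(f"8800GT:  {gflops(112, 1500):.0f} GFLOPS")  # ~504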
The market demand at the time would agree that the 8800GT was in fact an ace, and it could be considered underpriced at launch when you consider that demand drove the price up nearly $100 over launch-day prices for an extended period of time. What made the 8800GT such a success was the simple idea of price/performance: you got a lot of performance for the price. Nobody cared about process, transistors or flops; it was all about price and performance.
There's not a lot of sense, to me, in debating the price, performance & market placement of hardware that we know too little about as of yet.
We'll just have to wait until launch to know what Nvidia's next gen is going to bring to the table as far as price/performance is concerned.
The GTX 570 and 6970 have been around $330 since the middle of 2011, maybe even earlier. The GTX 560 Ti 448 at $289 hit around GTX 570 performance as well; heck, even the Gigabyte GTX 560 SOC hit GTX 570 performance at around $269, and that was 10 months ago. Unless GK104 is well, well under 300mm2 on the new process, it should at least hit GTX 580 performance level (which at this point looks to be within 10% of the 7950), if the memory controller is fixed.
If, going from GF114 to GK104, Nvidia only gained 15-20% (GTX 570 performance) while using a new architecture and a new process, it would be a massive failure, worse than the 2900XT or the 5800XT in some ways, because both companies have learned their lessons by now. In addition, with the 6870 going for $180 and the GTX 560 Ti going for $200, it would be a bad value proposition. Not to mention the GTX 560 launched at $229, and getting GTX 570 performance at $299 would mean worse price-to-performance even comparing launch price to launch price. For a mainstream card, this would be idiotic.
The only unrealistic thing about GK104, I think, is not so much the performance; it is more so the price. $349 seems like a more realistic price to me.
All these comparisons to the 8800GT makes my head sore.
-PB
Are you implying GK106 has now become GK104? That would make some sense, with performance coming in between the GTX 570 and GTX 580.
NVIDIA - "We expected more from Radeon HD 7900" [Exclusive]
http://www.nordichardware.com/news/7...exclusive.html
If this is true, and 3 ports can be used at the same time, then they've already sold one to me. I currently use 2 Nvidia cards to drive 3 monitors (not in SLI) and would happily move down to just 1 card.
Quote:
source - http://semiaccurate.com/2012/01/19/n...-clear-winner/
For the doubters, both of the current cards we saw have two DVI plugs on the bottom, one HDMI, and one DP with a half slot cooling grate on one of the two slot end plates.
Not only is creating an absolute out of a relative a no-no in statistics (they created a "Baseline" and called it "1"), but all that can be assumed, if it's real, is...
A) Our new product is better than our old product.
B) Our old product was crappy as compared to our new product.
C) We aren't very good at making graphs
D) If you liked our old products, specifically ones that still don't need to be upgraded from, you might like paying more for our new products. But we don't plan on comparing it to any competitor as that would ruin our sales pitch.
or, last but not least.
E) We like the color green.
[chiphell] Keep it classy nvidia, viral marketing discovered at chiphell, Kepler rumors debunked
http://www.overclock.net/t/1205407/c...umors-debunked
Quote:
Several domestic media outlets have been running all kinds of articles about Kepler's low price and the HD 7900 prices supposedly diving to follow; take a look at the foreign media, who haven't even reported it. Together with the "explosive" so-called material some media editors have been posting on the major forums, it is clear these moves are NV's marketing activities, with an astroturfing "water army" working behind the scenes to suppress the HD 7900 and its sales momentum. When CHH sees these posts I will lock them, and I hope members can restrain themselves.
Rough translation from the Chinese: ALL of the posts on ChipHell about Kepler being $299, and the speculated performance, were from marketing shills (proven by the owners of the forum), and they have been banned.
Charlie from SA was bamboozled as well. Keep it classy Nvidia.
lol, give me a break, viral marketing from nvidia. The translation is so ridiculously bad it could literally mean anything. The bit about the viral marketing is purely made up. Anything to give nVidia a bad name ey. :rolleyes:
I'd be surprised if it's true; Charlie of all people would be the most skeptical.
I just wanted to add that not only are you 100% right, but this has happened for as long as I can remember whenever any new GPU or CPU comes out... people hype it up, talk the price down, and are disappointed. The one launch I can remember that was not seen as a letdown was G80 (and a few before its time as well), considering we had NO IDEA how fast it would be and there was close to no information on it until it came out and destroyed all of the DX9 cards. Everything since then has been hyped beyond reason.
I think you totally missed the point on this, not to mention totally missed the last 5 years of GPU pricing history. High-end GPUs since the 8800GTX have always cost anywhere from $500-700 and have outperformed the last architecture by 50% or more... the price stays the same and the performance increases in the respective price bracket.
Honestly, I don't see why people think Nvidia are going to change the formula. Since the dawn of GPU time we have seen:
-high end (best performance but poor price to performance ratio) For example: GTX 580
-mid range (good performance and best price to performance ratio) For example: GTX 560/GTX 560 Ti
-low end (poor performance and poor price to performance ratio) For example: GT 520
On a side note, TinyTomLogan over at OC3D did mention in one of his videos that he has been testing some new graphics cards and that they will be released on Tuesday (the 31st of Jan). What they are I have no idea; 7950s I would guess, but you never know.
Show us the proof please.
I have gone up and down Chiphell since seeing this post and read more piss-poor translations than I care to admit, but not once did I see any proof of this.
It is also ironic that one of the primary sites for misleading info about ANY GPU release suddenly had an attack of conscience all of a sudden.
The only thing I know is that a wide marketing push has been initiated, and we have all seen and felt it. Since the official retail launch of the 7970 (Jan 9th) there hasn't been one day without a "leak", an "info", a "price" or a "release date" about Kepler. I will quote Charlie, who is preparing an article about Nvidia's marketing (don't ask why those 2 articles were so short, see them as an introduction): "Wall Street will look like Gandhi this year compared to Nvidia."
And the funniest part, with all the "info" that has been rolling out: we still have absolutely no idea what Kepler is, nor what it looks like...
Conclusion: I will not take ChipHell's word as true, any more than I take the others we have seen so far... I think we are flying way too high to give so much credit to what we read (good or bad). That makes 4 days now that we have been debating the performance of Kepler based on a price...
So it's no longer Edelman or Bite, and Charlie is nVidia's new ho? Weird world... :D
Interesting find I just came across.
/runs for cover
http://rigmods.com/wp/wp-content/upl...x68017q223.jpg
http://rigmods.com/wp/wp-content/upl...gpu-za54qo.jpg
edit: http://rigmods.com/wp/blog/leaked-nv...110-chip-info/
At least someone around here is perceptive. :)
It's always humorous when the usual suspects are in such a rush to go after the message and the messenger that they completely miss the point. :)
What I don't get, though, is the endless rooting for a company to fail. While neither side is without fault... there just seems to be an extra dose of animus with the anti-nVidia/Intel crowd. While I currently own Intel/Nvidia, I really wanted AMD to succeed with Bulldozer. I just wasn't going to allow myself to ignore the fact that the chip was a bust. I'm also glad the initial HD 7xxx cards are fast. The better AMD does, the more pressure there is on Intel and nVidia to build killer products and actually compete for consumers' money. At least at the end of the day I get to have a laugh when the same person who is always part of the very negative crowd didn't find his complaining about the HD 7970's price to be the least bit ironic.
Early next week Nvidia starts presenting Kepler-based graphics cards to manufacturers in Asia, with full production to start two to three weeks from now.
It's time to say something new. The first piece of news (if true) is a breakthrough new feature for PhysX and CUDA processing. If this thing is really confirmed, it will be a big bomb. Further information: power consumption is said to be very low thanks to the final respin. The respin also helped NVIDIA achieve high GPU operating frequencies. As for performance, it is still the case that the middle-class Kepler card easily beats the Radeon 7970...
You've probably noticed that ChipHell banned some users, trolls who allegedly worked for Nvidia. This is bull; why would Nvidia trolls work there and not just on the famous US/EU forums... it's all just comedy. PS: banned by "nApoleon", the biggest TROLL there...
http://www.obr-hardware.com/2012/01/...-chiphell.html