It would be disappointing in that Nvidia's midrange would be beating AMD's best; that bodes poorly for future competition. I like it best when the two are neck and neck, so we don't end up like the CPU world.
I think it depends, at least for the consumer. If Nvidia and AMD performance is very similar, price cuts aren't a necessity: since the cards perform alike, they can be priced close to each other. And since AMD's pricing is at an all-time high for the company, that leaves room for very high prices, particularly at the ultra high end. This is strictly hypothetical, but if GK104 outperforms the 7970 by 20% and is priced around $500, AMD would be forced to drop the price of the 79xx series by $100 or more. The lower Nvidia's pricing, the more AMD has to drop the prices of their cards; if Nvidia prices high enough, of course, AMD won't have to drop prices at all.
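A quick sanity check on that claim, with purely hypothetical numbers (a 20% lead, a $499 GK104, and the 7970's $549 launch price; none of the GK104 figures are confirmed):

```python
# Hypothetical: GK104 delivers 1.20x the 7970's performance at $499.
# What would the 7970 have to cost to match GK104's performance-per-dollar?
gk104_perf = 1.20            # relative performance (7970 = 1.00), assumed
gk104_price = 499.0          # assumed street price in USD
r7970_launch_price = 549.0   # 7970 launch MSRP

parity_price = 1.00 * gk104_price / gk104_perf   # perf/$ parity point
print(f"7970 parity price: ${parity_price:.0f}")                   # ~$416
print(f"required cut: ${r7970_launch_price - parity_price:.0f}")   # ~$133
```

Which lines up with the "$100 or more" cut suggested above.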
Also, I can't be the only one who doesn't give two flying f***s about GK104. I want some top-end news on GK110.
If GK104 is the only thing they're farting out over the next few months, then I'll stick with my GTX 580.
If GK104 beats the 7970 and is cheaper, then it's worth a look, but of course we want the big guns.
:)
You really believe that this card is going to be named GTX 680 and it's going to be their midrange?
I don't think so.
Why not? The 6870 was AMD's midrange and took the "870" from the previous fastest card, the 5870. GK110 is not ready, and Nvidia's midrange chip can compete with AMD's performance/high-end hybrid, so this is just marketing. GK110 will probably be the 700 series. That would make sense, because it is rumored to be more capable on the shader front.
OK. But it doesn't make sense to me, and neither did AMD's previous number jump. Perhaps they're trying to catch a few more sales by giving the card a higher number, hoping that some of us won't check the card's performance before purchase. :ROTF:
Well, I hope people look at performance and price and compare it to the previous gen. If it's not at least a 40% improvement in fps/$, show it the finger. At least that's what I'm gonna do.
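For anyone who wants to apply that rule, here's a minimal fps-per-dollar check; the fps and price figures are made up purely for illustration:

```python
# Back-of-the-envelope fps-per-dollar comparison (all numbers hypothetical).
def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    return avg_fps / price_usd

old = fps_per_dollar(60, 500)   # e.g. previous-gen card: 60 fps at $500
new = fps_per_dollar(80, 450)   # e.g. new card: 80 fps at $450

improvement = new / old - 1
print(f"fps/$ improvement: {improvement:.0%}")  # ~48% here, passes the 40% bar
```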
I don't care what it's called or where they think it fits in the range; if it's faster and cheaper than a 7970, then it's worth a look. I think it's a bit unrealistic to expect a midrange card to beat the opposition's flagship, and even if they could do it they wouldn't, as it's not economically sensible: they could charge more for something slower. Besides, when has a flagship GPU ever cost US$399?
:)
If hot clocks are still there, then GK104 could cost more to produce (lower yields) and consume more power than Tahiti despite having a smaller die.
The Radeon 9700 Pro was $399.
But if Nvidia's midrange smokes AMD's high end, I would expect high prices from Nvidia, since AMD wouldn't be able to compete performance-wise. However, I think the GTX 680 is the high-end single card. I'm excited to see Nvidia's offerings in the upcoming months.
It had better do a lot better in games than the 7970, otherwise it will be disappointing to me. But if it does turn out to be the midrange chip, which seems to be the case, then on the other hand it will be fantastic: a midrange chip beating the opposition's high-end chip hasn't happened for a long time. GK110 had better arrive in a decent time span, though.
That was a while ago, mate; those things were near double that here. I'm pretty sure a mate of mine paid near $1000 for a 9800 All-In-Wonder... Ouch.
Don't you worry, Nvidia will bleed us if it's really that fast. That's why they are giving it a high-end name: so they can attach a high-end price tag to it... You will see.
:)
What makes you think Tahiti is high end? Maybe it's a bit more than a performance card, but by current standards it's not high end. Had they made it 450+ mm² (more units) or clocked it significantly higher, pushing thermals/power to what the market is willing to accept these days (250 W), then it would deserve that classification.
If it had GTX 580 +60% performance at stock, no problem. It would still be impressive for GK104, though. That would be like the GTX 560 Ti competing with the 6970, when in reality it was about 10% slower than the 6950.
Same here, sort of. I am nearing my upgrade-cycle time, so I want to buy new cards within the next four, at worst five, months (not sure I have the patience, though). And I'd rather not go for 3-way SLI to future-proof myself for the next couple of years, but if high-end Kepler is cancelled then I guess I may have to... I'd really rather stick to two cards tops, though.
I also don't like 2GB of VRAM. It seems barely enough for my 30-inch display even today, and who knows what's coming later on. And 4GB of VRAM would be too much: overkill and a waste of money. 3GB sounds so much better... :(
This could also be named the 670, and maybe later there will be a 675 and a 680?
http://semiaccurate.com/2012/03/07/t...nm-production/
Troll?
Bolded parts for emphasis on why I think it's little more than a troll by him. At least for the nVidia-related sections.
Quote:
Originally Posted by Charlie FUD-curate
Charlie again makes no sense. First Nvidia is too dumb and no one else is having 28nm problems... and then, in a complete turnaround, ALL 28nm production, not just Nvidia's, is suddenly halted. Charlie is an idiot.
Charlie, there's no use crying that you were not invited to the Kepler event in 12 days, which is for the selected only. A comment on that level was to be expected from this guy after Nvidia did not send him an invitation to the premiere of Kepler!
I definitely believe there was some kind of issue with 28nm when you look at how many 79xx-based AMD cards are out of stock at Newegg.
I am not sure you know what you are talking about. You don't cure a flu by killing the patient and reanimating him. Nothing similar happened on 40nm either. If they needed to stop production, then it must have been a different, very serious problem, not just yields: machine failure, issues with chemicals, or any of a dozen other problems that can happen inside a fab.
SKYMTL reported earlier that AIBs are having problems getting Tahiti chips and boards. That could be caused by this pause in production.
The difference between the GTX 580 1.5GB and 3GB is negligible even at extremely high resolutions with 4x/8x AA:
http://www.hardware.fr/focus/50/test...-surround.html
In other words, even 1.5GB is more than enough for your 30-inch display.
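Some rough napkin math on why render targets alone don't get anywhere near 1.5GB at that resolution; all figures below are back-of-the-envelope assumptions (no color compression, no driver overhead, textures excluded):

```python
# Render-target memory for a 30" panel at 2560x1600 with 8x MSAA.
# Assumes 32-bit color + 32-bit depth/stencil per sample, uncompressed.
width, height = 2560, 1600
samples = 8                   # 8x multisample AA
bytes_per_sample = 4 + 4      # RGBA8 color + D24S8 depth/stencil

rt_bytes = width * height * samples * bytes_per_sample
print(f"8xAA render targets: ~{rt_bytes / 2**20:.0f} MiB")   # ~250 MiB

# Even with several extra buffers this stays far below 1.5 GiB;
# it's textures, not the framebuffer, that actually fill VRAM.
```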
Apparently, a certain insider on PCInLife was saying, "forget about $399; even $449 might be unachievable with current costs."
So there you go.
Well that's p***** on my campfire.