"if somebody opposed your ideas call him a fanboy" this is the new way of "argue"ing in XS.
i think mods have to ban fanboy word :D
Still trolling...?
We are enthusiasts; it doesn't matter whose chip it is, we are having a logical discussion. It seems you're blindfolded because your inherent love for nVidia doesn't allow you to acknowledge FACTS.
My link DID give you facts. Matter of fact, there is a photo of the GTX wafer. Secondly, you now know there are 94 cores per 300mm wafer. Which before you feigned ignorance of....
So, I'm going to play connect-the-dots with you.
Here:
GameSpot
theINQUIRER
PureOC
Probably more than you'll ever want to know:
SemaTech
So... as you can see, it is widely accepted that a 300mm wafer costs about $5,000. If you wish not to accept this widely known fact, then you're just being stubborn and/or an utter "fanboi".
But, just to prove your ignorance: even if we use some fictitiously low cost of $3,500 per wafer and an industry mind-blowing 50% yield for a 1.4-billion-transistor chip, you're still looking at about $75 per GTX200 core. (Which in reality is estimated @ 20% yield, or about $277 per core.)
How do you make a complete video card with heat sink, fan and materials for $25..?
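The back-of-the-envelope math in the post above can be sketched out. This is only a rough check, not NVIDIA's actual cost accounting: the ~576 mm² die size, wafer prices and yield figures are the ones quoted in this thread, and the dies-per-wafer formula is the standard gross-area-minus-edge-loss approximation.

```python
# Rough sketch: candidate dies per 300mm wafer, and cost per good die.
# All inputs are figures quoted in the thread, not official numbers.
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic approximation: wafer area divided by die area,
    minus an edge-loss term for partial dies at the wafer rim."""
    r = wafer_diameter_mm / 2
    gross = math.pi * r**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

def cost_per_good_die(wafer_cost: float, dies: int, yield_rate: float) -> float:
    """Spread the wafer cost over only the dies that actually work."""
    return wafer_cost / (dies * yield_rate)

candidates = dies_per_wafer(300, 576)            # GT200 is ~576 mm^2
print(candidates)                                # 94, matching the thread
print(round(cost_per_good_die(3500, 94, 0.50)))  # 74 -- ballpark of the $75 claim
print(round(cost_per_good_die(5000, 94, 0.20)))  # 266 -- ballpark of the $277 claim
```

The computed figures land close to, but not exactly on, the numbers argued above, which is about as much agreement as a napkin formula like this can give.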
Quote:
Originally Posted by gojirasan
So (again) remedial mathematics suggests you have no clue as to the business end of microprocessors. We are indeed concerned for Nvidia... and also taking a stab at them while they're obviously down... simply because of their monopolistic stance. This will be my first ATI card in a long time, and I am quite sure that nearly every gamer and enthusiast is right there with me.
By the time Nvidia actually goes 55nm with their GTX200 series, ATI will probably be getting better yields than they are now and will be able to drop the price of their HD4000 series even more.
Which we are all happy about... except die-hard nVidia fans!
I'm happy they both have good cards. I'm disappointed that NV didn't manage 10.1, DisplayPort on some models, and 55nm. But I'm pleased as punch that AMD has a good quality chip, worthy of the money asked for it.
I just love that ATi can make a good comeback this round, God knows how much they need it. nVidia will be fine; they're the ones sitting on a mountain of cash and a majority of the market share, so a step back or two won't hurt them much, it will only push them harder to make better & MORE AFFORDABLE products for us.
Wow, that's the best theory I've ever heard :rofl:
Quote:
Originally Posted by gojirasan
Prove that to me so I can start my new business buying 4870's, removing the RAM and selling them to ATI for only $1200.
Wowza, those GT280 chips seem to cost nvidia dearly :x :x.
Some very informative posts here tho, especially liked the one from savantu :) thanks :)
I've learned that the new GT200 55nm will come maybe at the end of Q3, but probably at the beginning of Q4. Either way it isn't very far away :)
Since the 55nm process is a linear shrink, we'll have a ~400 mm^2 GPU in the best case (if NVIO2 isn't implemented on die). If NVIO2 is implemented, then the GPU will be a few % bigger! That's still way bigger than RV770's 260 mm^2, so I don't see how NVIDIA can compete with ATI margin-wise?!
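The ~400 mm² figure follows directly from the linear-shrink assumption: under an idealized optical shrink, die area scales with the square of the feature-size ratio. A quick sanity check, using the die sizes quoted in this thread:

```python
# Idealized linear shrink: area scales with (new_node / old_node)^2.
# Die sizes are the figures quoted in the thread, not official numbers.
gt200_65nm = 576                   # mm^2 at 65 nm
scale = (55 / 65) ** 2             # ~0.716
gt200_55nm = gt200_65nm * scale

print(round(gt200_55nm))           # 412 -- close to the ~400 mm^2 best case
print(round(gt200_55nm / 260, 2))  # 1.59 -- still ~1.6x RV770's ~260 mm^2
```

Real shrinks rarely scale perfectly (I/O pads and analog blocks shrink poorly), which is presumably why the thread's best-case estimate is a bit under the naive 412 mm².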
My prediction is that it will have less redundancy, to make it smaller than the 55nm shrink alone could. I've stated before that Nvidia may go all out and release a chip with more active components (and the same or slightly more transistors), but this will hurt them financially in the short run. In the long run their image may be more important.
This is XS guys. Just play nice, there is no reason to ruin this thread.
I just noticed Xoulz doing a lot of insulting people. I'm surprised mods haven't stepped in.
I see the 2900 as the beta for the 4870; the 2900 did not live up to expectations, perhaps because it was released a year early on an out-of-date technology!
Factors influencing die cost:
Die size
Bulk wafer cost
Fabrication process
Yield
Area G200:RV770 = 576:256 = 9:4 = 2.25
Cost (1GB DDR3 ASUS cards) GTX280:4850 = £339.56:£183.59 = 1.85
This is a difficult comparison since the 4850 1GB is the most expensive model and the only 1GB variant on Scan, while the GTX280 is their cheapest model. However, moving to more expensive GTX280s brings the cost ratio in line with the die area ratio.
So yes, the cost of the competing TSMC GPUs correlates at around 1:1 with die area, after any reduction in cost due to using a 65nm rather than 55nm process. My guess is that this is offset by poor yield due to the large area.
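For what it's worth, the ratios in that post can be checked in a few lines. The prices are the Scan.co.uk figures quoted above, and the 576/256 mm² die areas are the ones this thread has been using throughout:

```python
# Die-area vs retail-price ratio: GT200 (GTX 280) against RV770 (HD 4850),
# using the Scan.co.uk prices and die sizes quoted in the thread.
area_gt200, area_rv770 = 576, 256          # mm^2
price_gtx280, price_4850 = 339.56, 183.59  # GBP, 1GB ASUS cards

print(round(area_gt200 / area_rv770, 2))   # 2.25 -- exactly 9:4
print(round(price_gtx280 / price_4850, 2)) # 1.85
```

So the price ratio (1.85) does sit below the area ratio (2.25), consistent with the post's point that pricier GTX280 SKUs would close the gap.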
Not using the latest technology node (65nm) is part of why the 2900 (90nm) did not live up to expectations.
Sweet, do you guys think power consumption will be lower than 4870?
In Q4, NV would release the GT200b 55nm with a 256-bit bus and GDDR5
Q1-Q2 2009: New DX10.1 card with GDDR5...... :)
:)
:)
It's the Ultra and will be released on the 11th of July
- Core Clock: 738MHz
- Shader Clock: 1666MHz
- Memory Data Rate: 2520MHz
Pricing, believe it or not, is around the same (in the UK) as a pair of HD4870 512MB cards in CrossFire
I'm guessing this will be pretty fast. Perhaps the best single-card solution?
John
It might not be the GT200-400, but boards at those clocks (some brands call them the Ultra and other brands dub them GT280 Extreme) are being released by various e-tailers on the 11th of July
I hope that the cooler on the card is not the same as the GTX280's, as the clocks are pretty high; or perhaps it is indeed the GT200-400 core, which is "binned" or fabricated for higher clocks.
It is certainly making me think before purchasing 2x HD4870 cards, as I do not want to have multi-GPU problems; the Ultra/Extreme GT280 @ 738MHz, 1666MHz and 2520MHz does sound impressive on paper (and should certainly be closer to CrossFire HD4870 performance).
John
Pity Nvidia... wanting to please its stockholders by affirming an upcoming 55nm version of the GT200 chips so that there's profit to be had,
BUT
not wanting to destroy its current GT200 sales by announcing a 55nm version yet.
My sympathies for Nvidia, whose stock price has fallen more this year than in any other year for what, 10+ years?
OMG, the audacity of nvidia to pursue a die shrink!! :mad:
Seriously......
The 55nm-based GT200 was planned all along and developed alongside its 65nm counterpart. It's not a marketing gimmick to please the stockholders. A die shrink is inevitable. Nvidia prides itself on having the fastest graphics solution on the market, and die shrinks are a part of that... it's basic stuff.
And yeah... I'm sure nvidia is totally worried about hurting the sales of their current GT200 series, which has COMPLETELY taken the market by storm. :rolleyes:
Does anyone know if this GTX 280 "Ultra" will be a widely available new SKU? Will it be something that one could step-up to with the eVGA step-up program?
I don't think the 55nm GTX 280 is gonna come in August, maybe Sept/Oct? Between the rapid price drops and the constant incremental expensive upgrades, nvidia is gonna lose a lot of goodwill from their customers. That said, my GTX 280 step-up just arrived woooOO! ;)
Could you give us a link?
Oh, you mean the super overclocked version, right? It's not GT200-400, it's still GT200-200
The GT200 shader clock operates on a 27MHz crystal, so yes, 1666MHz is impossible.
Considering how GT200s OC, near 1.7GHz is not going to happen. My GTX 280's SPs are the clear weak point; they will only hit 1.458GHz. Even the binned Tesla GT200 has only a 1.5GHz shader clock.
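If the shader clock really is constrained to integer multiples of a 27MHz reference, as the posts above claim (I can't verify the actual PLL granularity, so treat this as the thread's assumption, not fact), then 1666MHz falls between two achievable steps:

```python
# Sketch under the thread's assumption that GT200 shader clocks must be
# integer multiples of a 27 MHz reference; real PLL constraints may differ.
REF = 27  # MHz

def nearest_steps(target_mhz: int, ref: int = REF):
    """Return the achievable clocks immediately below and above a target."""
    lo = (target_mhz // ref) * ref
    return lo, lo + ref

print(nearest_steps(1666))  # (1647, 1674) -- 1666 MHz is not on the grid
print(1458 % REF)           # 0 -- the 1.458 GHz clock above is exactly 54 x 27
```

Interestingly, the 1.458GHz figure quoted above divides evenly by 27, which at least fits the claimed constraint; a "1666MHz" spec would more plausibly be a rounded 1674MHz.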
If 1666MHz is impossible, we'd best lynch-mob this e-tailer for false advertising:
GeForce GTX 280 Extreme 1024MB
John
pfft, so you think an OC can reach two-4870 performance haha
Well, I have to humbly eat my hat, because despite my predictions of 6 months or so till 55nm, it seems it might be here a lot sooner. Have to admit it's looking like I was very wrong :(
Could just be process improvements.
Some GTX280s have been able to hit close to those clocks, so if they're higher binned or it's a core revision, it can be possible.
After all, the G80 Ultra was just the A3 revision of the G80 GTX core and had higher clocks.
At first, I expected the 55nm GT200 version to be an optimised core with 24 ROPs and a 384-bit GDDR5 configuration, so I predicted it would come out in November at the earliest; but now it seems the configuration will stay put and the chip will come out sooner (September?).
I don't get the Leadtek Extreme version. Such high clocks should only be possible to achieve with watercooling, yet it has standard air cooling.
A cherry-picked chip at the edge of its air-cooled OC range? If the price premium permits, why not?
as much as I'm liking this 'progress' in graphics, I was gearing myself up to get a 4870X2, and now a 55nm GTX is on the horizon :cussing:
4870X2 is looking like August availability?
so September for the nvidia response...? give or take some weeks :shrug:
don't even have the 9800GTX+ yet... :confused::hehe: