There you go again. Where are you coming up with this range? Was it in the silicon quarterly or something? :confused:
I call shenanigans. The wafers are more expensive by only $23.99 per wafer, and you can take that to the bank, for real...
A huge cost is R&D. You worry about how much the wafers and heatsinks cost, but the real point to consider is R&D. The Fermi architecture, just like the G80 architecture, was a long time in development and expensive to produce. But consider a few points:
- The architecture is being made into three separate high-end products, the cheapest of which is the GeForce part. Quadro parts are often priced at twice (or more) the price of GeForce parts, and Tesla parts even higher.
- The architecture is extremely modular. Scaling up for the next generation after GTX 480 is almost as simple as CTRL+V. Scaling down is just as easy.
- Longevity. Fermi has it. It's an extremely advanced architecture with a ton of features built around the best part of DX11, tessellation.
I haven't seen anything that's made me believe that NVIDIA will be selling GPUs at a loss. Consider that even with the HD5800 series outperforming NVIDIA parts, NVIDIA is gaining more of the market, selling a ton of GPUs and making a lot of money. Even IF they were selling GF100 at a loss, they aren't in AMD's financial shoes.
Besides, why should we care if they make or lose money on a product so long as the price and performance are competitive?
Amorphous
You had me until this...
nVidia has only gained market share in the low-end and OEM markets, laptop makers are dropping them due to issues over the last two years, and OEMs are tired of renaming parts for new models. nVidia will be fine if the first round of Fermi doesn't do well, but they had better hope they get some money from somewhere. No business can afford to sell all its parts at a loss for very long.
No difference, I still think it's a contrived figure...
http://jonpeddie.com/press-releases/...er-also-beats/
Quote:
Intel was the leader in Q4'09, elevated by Atom sales for netbooks, as well as strong growth in the desktop segment. AMD gained in the notebook integrated segment, but lost some market share in discrete in both the desktop and notebook segments due to constraints in 40nm supply. Nvidia picked up a little share overall. Nvidia's increases came primarily in desktop discretes, while slipping in desktop and notebook integrated.
But they have less market share than last year.
Can we then analogize scaling down to using the DEL key? :D
Scaling down for the mid and low range would be a good idea. They'd get a much higher yield and they could fill those market segments with nice Fermi arch chips instead of evolved G80 arch chips.
Well, I for one also consider the market and social impact of my purchase. I'd like strong competition in the GPU market. What happens now in the early days of GPU computing could affect how the market looks for a decade. But we will all have our own reasons for buying a product.
Quote:
Besides, why should we care if they make or lose money on a product so long as the price and performance are competitive?
You would have to be a subscriber to get the full article.
That would DEL the whole die; what you need to use is the Backspace key :)
I hope it's scalable, otherwise we will have Gx3xx in the low-mid end based on GT200 and G92 cores.
My reason for buying is funding the GPU makers till holodecks come :D and playing some games in the meantime.
5970, 5870, 5870 2GB, 5850 vs. 480, 470 final scores:
http://bbs.pczilla.net/attachments/m...ecd61398ab.png
Total scores for what though?
I are can be like troll?
No but seriously, why even bother posting something like that? Give me 10 minutes in Excel and I can make you a more believable but completely fabricated graph.
To all the ATI fanboys in this thread: Fermi might not be insanely faster or cheaper than the 5xxx series, but at least it can run two screens :p: (so do I get my troll points yet?)
mao5 = P2MM, an infamous Chinese ATI fanboy...
Sometimes he may have a real inside source; sometimes he just posts some bull:banana::banana::banana::banana:...
He specifically wants two monitors, three just won't cut it xD
You forgot your "Radeon 7" bull:banana::banana::banana::banana:....
http://forum.beyond3d.com/showpost.p...postcount=1761
6-9 months tops, as for most cards, I guess.
The cost difference can't be significant. Must be something else. Heat, perhaps...
No reason for them not to release it once they can.
It shouldn't make a significant difference in heat dissipation.
It's probably built that way to allow a bigger radiator while keeping an acceptable width.
750MHz is too high for a reference card.
And 512sp version is supposedly coming later...
I'm sure they will make some profit. On both Geforce and (of course) Tesla cards. The price they are selling them for is bound to be higher than the manufacturing cost.
It's the only high end card with disabled parts I can think of.
There would be low availability in any case.
And we can't complain, Tesla is their primary market... At least they'll make some serious $$$! :yepp:
LOL, nice source! :rofl:
I doubt they sold many... But they sold a lot of 4870x2s, I think.
What's up with all the hate? :shrug:
Unlikely IMO... Or at least right now. But there should be definitely something similar coming out a bit later.
:rofl:
Because Tesla cards cost a lot more than Geforce cards. And Tesla is the primary reason for creating Fermi arch (GPGPU, etc). So they are just following their plan.
Looks about right, I guess.
This is typical for high end cards these days, though. Gotta blame TSMC, I suppose...
Good definition! :up:
I bet they are not using 10.3 drivers that are supposed to be MUCH faster in Dirt2.
Interesting, nonetheless.
Great! Going to be an excellent comparison then! Looking forward to it! :up:
It doesn't use much tessellation at all.
Mostly for crowds of people and some minor effects...
250W.
1.5-2x gaming performance of GTX285? 5870 is already faster than that on average...
Doubt that, just Tesla cards I think.
Yeah, stylish and sturdy. I like it, too.
Doubt that, since it needs SLI for 3 monitors...
Oh, there is real info.
Which should hopefully be covered by Tesla sales.
Good point, though.
Fail graph.
I was trolling with that comment; jeez, relax.
But honestly, the flicker issue is still present with the 10.2 drivers and the new 10.3 drivers. I've got four friends with 5870s; the Asus card doesn't flicker for some reason, but the MSI and Club3D cards both have flickering on the second screen when connected to my dual 1920x1200 monitors. I really wanted to get a 5870, but seeing as I work on my machine and have dual screens specifically for that reason, the 5870 is not a card I can risk buying.
Keeping the memory frequency at the 3D clock will fix the problem. The Asus card doesn't flicker because its BIOS keeps the memory frequency at 1200MHz in multi-monitor mode.
http://forum.beyond3d.com/showpost.p...9&postcount=49
http://www.hardwareluxx.de/community...9-post364.html
Quote:
3DMark Vantage Extreme
HD5970: 12339
GTX480: 9688
HD5870: 8912
GTX470: 7527
HD5850: 6848