Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
3x2048 GSkill pi Black DDR3 1600, Quadro 600
PCPower & Cooling Silencer 750, CM Stacker 810
Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
3x4096 GSkill DDR3 1600, PNY 660ti
PCPower & Cooling Silencer 750, CM Stacker 830
AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
2x2gb Patriot DDR2 800, PowerColor 4850
Corsair VX450
A huge cost is R&D. People worry about how much the wafers and heatsinks cost, but the real cost to consider is R&D. The Fermi architecture, just like the G80 architecture before it, was a long time in development and expensive to produce. But consider a few points:
- The architecture is being made into three separate high-end products, the cheapest of which is the GeForce part. Quadro parts are often priced at twice (or more) the price of GeForce parts, and Tesla parts even higher.
- The architecture is extremely modular. Scaling up for the next generation after GTX 480 is almost as simple as CTRL+V. Scaling down is just as easy.
- Longevity. Fermi has it. It's an extremely advanced architecture with a ton of features built around the best part of DX11, tessellation.
I haven't seen anything that's made me believe that NVIDIA will be selling GPUs at a loss. Consider that even with the HD5800 outperforming NVIDIA parts, NVIDIA is gaining more of the market, selling a ton of GPUs and making a lot of money. Even IF they were selling GF100 at a loss, they aren't in AMD's financial shoes.
Besides, why should we care if they make or lose money on a product so long as the price and performance are competitive?
Amorphous
NVIDIA Forums Administrator
you had me until this...
nVidia has only gained market share in the low-end and OEM markets; laptop makers are dropping them due to issues over the last two years, and OEMs are tired of renaming parts for new models. nVidia will be fine if the first round of Fermi doesn't do well, but they had better hope they get some money from somewhere. No business can afford to sell all its parts at a loss for very long.
Originally Posted by motown_steve
Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.
Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.
No difference, I still think it's a contrived figure...
http://jonpeddie.com/press-releases/...er-also-beats/
Intel was the leader in Q4'09, elevated by Atom sales for netbooks, as well as strong growth in the desktop segment. AMD gained in the notebook integrated segment, but lost some market share in discrete in both the desktop and notebook segments due to constraints in 40nm supply. Nvidia picked up a little share overall. Nvidia's increases came primarily in desktop discretes, while slipping in desktop and notebook integrated.
Last edited by highoctane; 03-21-2010 at 08:26 PM.
But they have less market share than last year.
Can we then analogize scaling down as using the DEL key?
Scaling down for the mid and low range would be a good idea. They'd get a much higher yield and they could fill those market segments with nice Fermi arch chips instead of evolved G80 arch chips.
Originally Posted by Amorphous
Besides, why should we care if they make or lose money on a product so long as the price and performance are competitive?
Well, I for one also consider the market and social impact of my purchase. I'd like strong competition in the GPU market. What happens now in the early days of GPU computing could affect how the market looks for a decade. But we will all have our own reasons for buying a product.
You would have to be a subscriber to get the full article.
That would DEL the whole die; what you need to use is the Backspace key.
I hope it's scalable; otherwise we'll have Gx3xx low- and mid-end parts based on GT200 and G92 cores.
My reason for buying is funding the GPU makers until holodecks come, and playing some games in the meantime.
Coming Soon
5970, 5870, 5870 2GB, 5850 vs. 480, 470 final scores:
Last edited by mao5; 03-22-2010 at 01:01 AM.
Q6600 (400x9) 2GB DDR2-1000 Asus P5K-E WIFI 2xRadeon HD 4850
Total scores for what though?
I are can be like troll?
No, but seriously, why even bother posting something like that? Give me 10 minutes in Excel and I can make you a more believable but completely fabricated graph.
To all the ATI fanboys in this thread: Fermi might not be insanely faster or cheaper than the 5xxx series, but at least it can run two screens. (So do I get my troll points yet?)
mao5 = P2MM, an infamous Chinese ATI fanboy...
Sometimes he has a real inside source, sometimes he just posts bull...
He specifically wants two monitors, three just don't cut it xD
You forgot your "Radeon 7" bull...
http://forum.beyond3d.com/showpost.p...postcount=1761
6-9 months tops, as for most cards, I guess.
The cost difference can't be significant. Must be something else. Heat, perhaps...
No reason for them not to release it once they can.
It shouldn't make a significant difference in heat dissipation.
It's probably built that way to allow a bigger radiator while keeping an acceptable width.
750MHz is too high for a reference card.
And 512sp version is supposedly coming later...
I'm sure they will make some profit. On both Geforce and (of course) Tesla cards. The price they are selling them for is bound to be higher than the manufacturing cost.
It's the only high end card with disabled parts I can think of.
There would be low availability in any case.
And we can't complain, Tesla is their primary market... At least they'll make some serious $$$!
LOL, nice source!
I doubt they sold many... But they sold a lot of 4870x2s, I think.
What's up with all the hate?
Unlikely IMO... Or at least right now. But there should be definitely something similar coming out a bit later.
Because Tesla cards cost a lot more than Geforce cards. And Tesla is the primary reason for creating Fermi arch (GPGPU, etc). So they are just following their plan.
Looks about right, I guess.
This is typical for high end cards these days, though. Gotta blame TSMC, I suppose...
Good definition!
I bet they are not using the 10.3 drivers that are supposed to be MUCH faster in Dirt 2.
Interesting, nonetheless.
Great! Going to be an excellent comparison then! Looking forward to it!
It doesn't use much tessellation at all.
Mostly for crowds of people and some minor effects...
250W.
1.5-2x gaming performance of GTX285? 5870 is already faster than that on average...
Doubt that, just Tesla cards I think.
Yeah, stylish and sturdy. I like it, too.
Doubt that, since it needs SLI for 3 monitors...
Oh, there is real info.
Which should hopefully be covered by Tesla sales.
Good point, though.
Fail graph.
Last edited by zalbard; 03-22-2010 at 01:09 AM.
I was trolling with that comment, jeez, relax.
But honestly, the flicker issue is still present with both the 10.2 drivers and the new 10.3 drivers. I have four friends with 5870s; the Asus card doesn't flicker for some reason, but the MSI and Club3D cards both have flickering on the second screen when connected to my dual 1920x1200 monitors. I really wanted to get a 5870, but seeing as I work on my machine and have dual screens specifically for that reason, the 5870 is not a card I can risk buying.
Keeping the memory frequency at the 3D clock will fix the problem. The Asus card doesn't flicker because its BIOS keeps the memory frequency at 1200MHz in multi-monitor mode.
http://forum.beyond3d.com/showpost.p...9&postcount=49
Last edited by mindfury; 03-22-2010 at 01:28 AM.
http://www.hardwareluxx.de/community...9-post364.html
3DMark Vantage Extreme:
HD5970: 12339
GTX480: 9688
HD5870: 8912
GTX470: 7527
HD5850: 6848
Last edited by WeiT.235; 03-22-2010 at 01:52 AM.