
Thread: The Fermi Thread - Part 3

  1. #826
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by LordEC911 View Post
    But the original statement still stands: 40nm wafers are more expensive than 65/55nm wafers, and yes, by a few thousand dollars per wafer.
    There you go again. Where are you coming up with this range? Was it in the silicon quarterly or something?

    I call shenanigans. The wafers are more expensive by only $23.99 per wafer, and you can take that to the bank, for real...
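    To put the wafer-price argument in perspective, here is a rough back-of-the-envelope sketch of how a per-wafer price difference turns into a per-die cost difference. Every number in it (the wafer prices, the ~530 mm² GF100 die-size estimate, the yields) is an assumption for illustration, not a disclosed figure:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard approximation for how many whole dies fit on a round wafer."""
    radius = wafer_diameter_mm / 2.0
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float, yield_rate: float) -> float:
    """Spread the wafer price over the dies that actually work."""
    return wafer_price_usd / (gross_dies_per_wafer(die_area_mm2) * yield_rate)

# Hypothetical inputs only: neither TSMC's wafer pricing nor GF100 yields are public.
die_mm2 = 530  # rough GF100 die-size estimate
print(f"$5000 wafer, 40% yield: ~${cost_per_good_die(5000, die_mm2, 0.40):.0f} per good die")
print(f"$3000 wafer, 60% yield: ~${cost_per_good_die(3000, die_mm2, 0.60):.0f} per good die")
```

    Whether the per-wafer gap is $23.99 or a few thousand dollars, it gets divided over only around a hundred die candidates per wafer at this die size, so the gap shows up almost directly in the per-chip cost.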
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  2. #827
    Registered User
    Join Date
    Feb 2010
    Location
    NVIDIA HQ
    Posts
    76
    A huge cost is R&D. You worry how much the wafers and heatsinks cost, but the real point to consider is R&D. The Fermi architecture, just like the G80 architecture, was a long time in development and expensive to produce. But consider a few points:
    • The architecture is being made into three separate high-end products, the cheapest of which is the GeForce part. Quadro parts are often priced at twice (or more) the price of GeForce parts, and Tesla parts at even more.
    • The architecture is extremely modular. Scaling up for the next generation after GTX 480 is almost as simple as CTRL+V. Scaling down is just as easy.
    • Longevity. Fermi has it. It's an extremely advanced architecture with a ton of features built around the best part of DX11, tessellation.

    I haven't seen anything that's made me believe that NVIDIA will be selling GPUs at a loss. Consider that even with HD5800 outperforming NVIDIA parts, NVIDIA is gaining market share, selling a ton of GPUs and making a lot of money. Even IF they were selling GF100 at a loss, they aren't in AMD's financial shoes.

    Besides, why should we care if they make or lose money on a product so long as the price and performance are competitive?


    Amorphous
    NVIDIA Forums Administrator

  3. #828
    Xtreme Member
    Join Date
    Sep 2006
    Posts
    171
    Quote Originally Posted by Amorphous View Post
    Besides, why should we care if they make or lose money on a product so long as the price and performance are competitive?


    Amorphous
    you had me until this...

    nVidia has only gained market share in the low-end and OEM markets; laptop makers are dropping them due to the issues over the last 2 years, and OEMs are tired of renaming parts for new models. nVidia will be fine if the first round of Fermi doesn't do well, but they had better hope they get some money from somewhere. No business can afford to sell all its parts at a loss for very long.

  4. #829
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by highoctane View Post
    There you go again. Where are you coming up with this range? Was it in the silicon quarterly or something?

    I call shenanigans. The wafers are more expensive by only $23.99 per wafer, and you can take that to the bank, for real...
    The difference is, I never asked you to believe me...

    Quote Originally Posted by Amorphous View Post
    A huge cost is R&D. You worry how much the wafers and heatsinks cost, but the real point to consider is R&D. The Fermi architecture, just like the G80 architecture, was a long time in development and expensive to produce. But consider a few points:

    I haven't seen anything that's made me believe that NVIDIA will be selling GPUs at a loss. Consider that even with HD5800 outperforming NVIDIA parts, NVIDIA is gaining market share, selling a ton of GPUs and making a lot of money. Even IF they were selling GF100 at a loss, they aren't in AMD's financial shoes.

    Amorphous
    Right... Lots of money is spent on R&D.
    Lots of money is also spent on operational expenses.

    Ummm... pretty sure Nvidia lost market share the last couple of quarters.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  5. #830
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by LordEC911 View Post
    The difference is, I never asked you to believe me...
    No difference, I still think it's a contrived figure...


    Quote Originally Posted by LordEC911 View Post
    Ummm... pretty sure Nvidia lost marketshare the last couple of quarters.
    Intel was the leader in Q4'09, elevated by Atom sales for netbooks, as well as strong growth in the desktop segment. AMD gained in the notebook integrated segment, but lost some market share in discrete in both the desktop and notebook segments due to constraints in 40nm supply. Nvidia picked up a little share overall. Nvidia's increases came primarily in desktop discretes, while slipping in desktop and notebook integrated.
    http://jonpeddie.com/press-releases/...er-also-beats/
    Last edited by highoctane; 03-21-2010 at 08:26 PM.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  6. #831
    Xtreme Addict
    Join Date
    Apr 2007
    Location
    canada
    Posts
    1,886
    but they have less market share than last year

  7. #832
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by Amorphous View Post
    • The architecture is extremely modular. Scaling up for the next generation after GTX 480 is almost as simple as CTRL+V. Scaling down is just as easy.
    Can we then analogize scaling down as using the DEL key?

    Scaling down for the mid and low range would be a good idea. They'd get a much higher yield and they could fill those market segments with nice Fermi arch chips instead of evolved G80 arch chips.
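    As a rough illustration of that yield point, the classic Poisson yield model already shows why smaller derivative dies fare much better. The defect density and die sizes below are assumptions for illustration, not TSMC figures:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Classic Poisson yield model: probability a die catches zero defects."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

D0 = 0.003  # assumed defect density (0.3 defects/cm^2), purely illustrative
for area in (530, 330, 160):  # hypothetical full, mid-range and low-end die sizes (mm^2)
    print(f"{area:>3} mm^2 -> ~{poisson_yield(area, D0):.0%} of dies defect-free")
```

    With the same (assumed) defect density, the full-size die comes out around 20% defect-free while the small derivative is closer to 60%, which is the whole case for filling the mid and low range with cut-down Fermi chips rather than stretched G80 derivatives.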

    Besides, why should we care if they make or lose money on a product so long as the price and performance are competitive?
    Well, I for one also consider the market and social impact of my purchase. I'd like strong competition in the GPU market. What happens now in the early days of GPU computing could affect how the market looks for a decade. But we all will have our own reasons for buying a product.

  8. #833
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Sn0wm@n View Post
    but they have less market share than last year
    Yeah, is the table wrong?
    It shows a decrease for everyone but Intel...
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  9. #834
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    You would have to be a subscriber to get the full article.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  10. #835
    Xtreme Mentor
    Join Date
    Jul 2008
    Location
    Shimla , India
    Posts
    2,631
    Quote Originally Posted by Solus Corvus View Post
    Can we then analogize scaling down as using the DEL key?

    Scaling down for the mid and low range would be a good idea. They'd get a much higher yield and they could fill those market segments with nice Fermi arch chips instead of evolved G80 arch chips.


    Well, I for one also consider the market and social impact of my purchase. I'd like strong competition in the GPU market. What happens now in the early days of GPU computing could affect how the market looks for a decade. But we all will have our own reasons for buying a product.
    That would DEL the whole die; what you need to use is the Backspace key.

    I hope it's scalable, otherwise we will have Gx3xx parts in the low-to-mid end based on GT200 and G92 cores.

    My reason for buying is to fund the GPU makers until holodecks arrive, and to play some games in the meantime.
    Coming Soon

  11. #836
    Registered User
    Join Date
    Jan 2006
    Posts
    80
    5970,5870,5870 2GB 5850 VS. 480, 470 final scores:

    Last edited by mao5; 03-22-2010 at 01:01 AM.
    Q6600 (400x9) 2GB DDR2-1000 Asus P5K-E WIFI 2xRadeon HD 4850

  12. #837
    Xtreme Member
    Join Date
    Mar 2007
    Location
    Pilipinas
    Posts
    445
    Total scores for what though?

  13. #838
    of the Strawhat crew.
    Join Date
    Mar 2008
    Location
    West TN
    Posts
    1,646
    Quote Originally Posted by insurgent View Post
    Total scores for what though?
    Troll points it seems. I thought someone kept these threads clean?

    XtremeSystems BF3 Platoon - Any XS member is welcome.

  14. #839
    Wanna look under my kilt?
    Join Date
    Jun 2005
    Location
    Glasgow-ish U.K.
    Posts
    4,396
    Quote Originally Posted by mao5 View Post
    5970,5870,5870 2GB 5850 VS. 480, 470 final scores:

    WTH is that? I've never seen so little context or relevant annotation in a graph.

    That's the graph maker's fault BTW, not yours. Unless you made it...
    Quote Originally Posted by T_M View Post
    Not sure i totally follow anything you said, but regardless of that you helped me come up with a very good idea....
    Quote Originally Posted by soundood View Post
    you sigged that?

    why?
    ______

    Sometimes, it's not your time. Sometimes, you have to make it your time. Sometimes, it can ONLY be your time.

  15. #840
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    403
    Quote Originally Posted by mao5 View Post
    5970,5870,5870 2GB 5850 VS. 480, 470 final scores:

    I are can be like troll?

    No but seriously, why even bother posting something like that, give me 10 minutes in excel and i can make you a more believable but completely fabricated graph.

    To all the ATI fanboys in this thread, fermi might not be insanely faster or cheaper than the 5xxx series but at least it can run two screens (so do i get my troll points yet?)

  16. #841
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    244
    mao5 = P2MM, an infamous Chinese ATI fanboy...

    Sometimes he may have some real inside source; sometimes he just posts some bull...

  17. #842
    Registered User
    Join Date
    Jan 2006
    Posts
    80
    Quote Originally Posted by Coldon View Post
    I are can be like troll?

    No but seriously, why even bother posting something like that, give me 10 minutes in excel and i can make you a more believable but completely fabricated graph.

    To all the ATI fanboys in this thread, fermi might not be insanely faster or cheaper than the 5xxx series but at least it can run two screens (so do i get my troll points yet?)
    oh, buddy, can you complain to the AIC?
    Q6600 (400x9) 2GB DDR2-1000 Asus P5K-E WIFI 2xRadeon HD 4850

  18. #843
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    244
    Quote Originally Posted by Coldon View Post
    I are can be like troll?
    To all the ATI fanboys in this thread, fermi might not be insanely faster or cheaper than the 5xxx series but at least it can run two screens (so do i get my troll points yet?)



  19. #844
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    Quote Originally Posted by Coldon View Post
    I are can be like troll?

    No but seriously, why even bother posting something like that, give me 10 minutes in excel and i can make you a more believable but completely fabricated graph.

    To all the ATI fanboys in this thread, fermi might not be insanely faster or cheaper than the 5xxx series but at least it can run two screens (so do i get my troll points yet?)
    It runs two screens; that problem was solved a long time ago. Wow, what a hater.
    Quote Originally Posted by LesGrossman View Post
    So for the last 3 months Nvidia talked about Uniengine and then Uniengine and more Uniengine and finally Uniengine. And then takes the best 5 seconds from all the benchmark run, makes a graph and then proudly shows it everywhere.

  20. #845
    Xtreme Member
    Join Date
    Mar 2007
    Location
    Pilipinas
    Posts
    445
    He specifically wants two monitors, three just don't cut it xD

  21. #846
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    244
    Quote Originally Posted by mao5 View Post
    whatever, I don't care your s hit
    You forgot your “Radeon 7" bull....

    Quote Originally Posted by mao5 View Post
    buddy, my source told me you can see "Radeon 7 Series" on the computer screen when you power on the Evergreen machine, ok? not the signature pic from sb else, thx.
    http://forum.beyond3d.com/showpost.p...postcount=1761

  22. #847
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Sn0wm@n View Post
    anyone want to take a wild guess as to when TSMC will have their 28nm process ready for Nvidia's shrunk Fermi?


    I'll start and say Q4 2011
    6-9 months tops, as for most cards, I guess.
    Quote Originally Posted by HelixPC View Post
    on a serious note, seeing how the memory will make a minimal performance difference for most people, I'm sure Nvidia chose lower-frequency GDDR5 to keep cost down, that's my guess.
    The cost difference can't be significant. Must be something else. Heat, perhaps...
    Quote Originally Posted by Chickenfeed View Post
    Thanks for that. Further supports the possibility of a 480Ultra down the road.
    No reason for them not to release it once they can.
    Quote Originally Posted by fellix_bg View Post
    The metallic cover has definitely something to do with the heat dissipation.
    It shouldn't make a significant difference in heat dissipation.
    It's probably built that way to allow a bigger radiator while keeping an acceptable width.
    Quote Originally Posted by orangekiwii View Post
    Where's my Fermi 512SP at 750MHz?

    Wasn't that the intended spec? I want to see benchmarks of that, not the 'dud' 480SP version.
    750MHz is too high for a reference card.
    And 512sp version is supposedly coming later...
    Quote Originally Posted by To(V)bo Co(V)bo View Post
    Can somebody tell me how much profit Nvidia and their AIBs will make off of each card at these current prices?

    Nvidia isn't able to price these cards much higher considering the prices ATI is selling parts for. Nvidia is bound and shackled by their lackluster performance. ATI's selling prices have a death grip on Nvidia's bottom line.
    I'm sure they will make some profit. On both Geforce and (of course) Tesla cards. The price they are selling them for is bound to be higher than the manufacturing cost.
    Quote Originally Posted by ***Deimos*** View Post
    WHAT IS THE BIG DEAL?

    512 vs 480. That's **only** 6% worth.

    End of the World because nvidia ships a card with less than all shaders?
    Broken card? Give me a break. 8800GT was **only** 112 out of 128 shaders, and one of best selling cards.
    It's the only high end card with disabled parts I can think of.
    Quote Originally Posted by annihilat0r View Post
    http://translate.google.com/translat...tm&sl=tr&tl=en

    According to this, Nvidia might have already sold huge numbers of Fermi GPUs as Tesla cards to the Chinese government. Which means a probability of very low availability of GeForce cards.
    There would be low availability in any case.
    And we can't complain, Tesla is their primary market... At least they'll make some serious $$$!
    Quote Originally Posted by saaya View Post
    breaking news! ati MIGHT have sold even higher numbers of rv870 gpus to the chinese government!

    source
    LOL, nice source!
    I doubt they sold many... But they sold a lot of 4870x2s, I think.
    Quote Originally Posted by Chruschef View Post
    This actually makes a lot of sense to me, GPUs are capable of hacking much faster than CPUs because of the mathematical nature of GPUs . . . and the Chinese govt. is allegedly quite fond of hacking things.
    What's up with all the hate?
    Quote Originally Posted by M.Beier View Post
    Probably posted somewhere here already, but there could be 2 editions of the GTX 480 coming out.

    One with 512 shaders as well as the 480-shader edition. It's probably going to be an Ultra with an Ultra-like price. Hopefully they bump the clocks up a bit like they have done in the past.
    Unlikely IMO... Or at least right now. But there should be definitely something similar coming out a bit later.
    Quote Originally Posted by dduckquack View Post
    i didnt know the chinese government played a lot of 3dmark

    Quote Originally Posted by BeepBeep2 View Post
    There are probably more computers in Seoul, South Korea than there are in North Korea as a whole...

    Anyway, why is nvidia selling Fermi to the chinese gov't.?
    Because Tesla cards cost a lot more than Geforce cards. And Tesla is the primary reason for creating Fermi arch (GPGPU, etc). So they are just following their plan.
    Quote Originally Posted by onethreehill View Post
    Looks about right, I guess.
    Quote Originally Posted by xdan View Post
    And also it seems that until 6 April the stock will be very low... almost nonexistent... and the cards will be overpriced by sellers in some countries like Romania.
    This is typical for high end cards these days, though. Gotta blame TSMC, I suppose...
    Quote Originally Posted by annihilat0r View Post
    Performance-wise, I am not expecting any surprises. Prices are good, the TDP is not as bad as it was "hyped" to be, and these cards seem to be somewhat decent.
    Good definition!
    Quote Originally Posted by Vardant View Post
    These are supposed to be new.

    http://i39.tinypic.com/14xp3qh.png
    I bet they are not using the 10.3 drivers, which are supposed to be MUCH faster in Dirt2.
    Interesting, nonetheless.
    Quote Originally Posted by SKYMTL View Post
    Honestly, it isn't even in my system yet. I am trying to re-benchmark as many cards as possible with the new 10.3a and 197-series drivers before popping the new stuff in.
    Great! Going to be an excellent comparison then! Looking forward to it!
    Quote Originally Posted by Solus Corvus View Post
    Dirt2 is also a DX11 title with tessellation - something we have been told repeatedly that Fermi is particularly good at.
    It doesn't use much tessellation at all.
    Mostly for crowds of people and some minor effects...
    Quote Originally Posted by eric66 View Post
    what is power draw of 480 ?
    250W.
    Quote Originally Posted by Olivon View Post
    1.5-2x gaming performance of GTX285? 5870 is already faster than that on average...
    Quote Originally Posted by btdvox View Post
    Will there be a 2+ GB version of the GTX 480?
    Doubt that, just Tesla cards I think.
    Quote Originally Posted by takamishanoku View Post
    Cheers mindfury for those terrific pics. The heat sink looks quality (as does the card in general)!
    Yeah, stylish and sturdy. I like it, too.
    Quote Originally Posted by ShadowFox19 View Post
    Being the card has two DVI and one mini-HDMI outputs, will they be able to support two monitors (via the DVI's) and a TV (via the HDMI) at the same time?
    Doubt that, since it needs SLI for 3 monitors...
    Quote Originally Posted by weston View Post
    whoa already 33 pages :O

    still no real info, so what's the point
    Oh, there is real info.
    Quote Originally Posted by Amorphous View Post
    A huge cost is R&D.
    Which should hopefully be covered by Tesla sales.
    Good point, though.
    Quote Originally Posted by mao5 View Post
    5970,5870,5870 2GB 5850 VS. 480, 470 final scores:
    Fail graph.
    Last edited by zalbard; 03-22-2010 at 01:09 AM.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  23. #848
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    403
    Quote Originally Posted by eric66 View Post
    It runs two screens; that problem was solved a long time ago. Wow, what a hater.
    I was trolling with that comment, jeez relax.

    But honestly, the flicker issue is still present with the 10.2 drivers and the new 10.3 drivers. I have 4 friends with 5870s; the Asus card doesn't flicker for some reason, but the MSI and Club3D cards both show flickering on the second screen when connected to my dual 1920x1200 monitors. I really wanted to get a 5870, but seeing as I work on my machine and have dual screens specifically for that reason, the 5870 is not a card I can risk buying.

  24. #849
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    244
    Quote Originally Posted by Coldon View Post
    But honestly, the flicker issue is still present with the 10.2 drivers and the new 10.3 drivers. I have 4 friends with 5870s; the Asus card doesn't flicker for some reason, but the MSI and Club3D cards both show flickering on the second screen when connected to my dual 1920x1200 monitors. I really wanted to get a 5870, but seeing as I work on my machine and have dual screens specifically for that reason, the 5870 is not a card I can risk buying.
    Keeping the memory frequency at the 3D clock will fix the problem. The Asus card doesn't flicker because its BIOS keeps the memory frequency at 1200MHz in multi-monitor mode.

    Quote Originally Posted by Dave Baumann View Post
    Memory clock switching does not occur in multi-monitor scenarios with hardware PowerPlay systems, for the reason OpenGL guy commented on (which, as he pointed out, I've also commented on previously). With R700, the flash in multi-monitor mode wasn't so much of an issue because it basically only does it when you boot a game, and in many instances it may also be changing resolution as it loads up, so that wasn't deemed an issue; with hardware PowerPlay, because we can get much lower lows and because it is hardware-activity based, PowerPlay states will move around frequently - both at the desktop and even in games - and so in multi-panel mode we have restricted the memory clock switching so that the screen flashing as it moves between states does not occur.

    Even though, in multi-mon modes, we have to restrict the engine clock a little more than we can in single panel modes, we still achieve better power savings than we could with 2D/3D switching.
    http://forum.beyond3d.com/showpost.p...9&postcount=49
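    For what it's worth, here is a minimal sketch of the policy Dave Baumann describes, with made-up state names and clock values (this is not actual driver code): in multi-monitor mode the memory clock stays pinned at its 3D value while the engine clock is still allowed to drop.

```python
from dataclasses import dataclass

@dataclass
class ClockState:
    engine_mhz: int
    memory_mhz: int

# Illustrative power states only; real PowerPlay tables differ per card and BIOS.
IDLE = ClockState(engine_mhz=157, memory_mhz=300)
LIGHT = ClockState(engine_mhz=400, memory_mhz=1200)
FULL_3D = ClockState(engine_mhz=850, memory_mhz=1200)

def pick_state(gpu_load: float, multi_monitor: bool) -> ClockState:
    """Pick a power state; with more than one display the memory clock never drops,
    because reclocking memory mid-scanout is what causes the visible flicker."""
    if gpu_load > 0.6:
        state = FULL_3D
    elif gpu_load > 0.1:
        state = LIGHT
    else:
        state = IDLE
    if multi_monitor:
        # Hold memory at the 3D clock; only the engine clock is allowed to fall.
        state = ClockState(engine_mhz=state.engine_mhz, memory_mhz=FULL_3D.memory_mhz)
    return state

print(pick_state(gpu_load=0.05, multi_monitor=True))  # idle engine, memory held at 1200 MHz
```

    That is also why the Asus BIOS behaviour above avoids the flicker: keeping the memory at 1200MHz in multi-monitor mode trades a little idle power for a stable second screen.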
    Last edited by mindfury; 03-22-2010 at 01:28 AM.

  25. #850
    Registered User
    Join Date
    Jan 2010
    Posts
    23
    3DMark Vantage Extreme
    HD5970: 12339
    GTX480: 9688
    HD5870: 8912
    GTX470: 7527
    HD5850: 6848
    http://www.hardwareluxx.de/community...9-post364.html
    Last edited by WeiT.235; 03-22-2010 at 01:52 AM.

