Results 176 to 200 of 226

Thread: Nvidia GT300 yields are under 2%

  1. #176
    Xtreme Member
    Join Date
    Oct 2006
    Location
    Redding, CA
    Posts
    232
    Quote Originally Posted by RPGWiZaRD View Post
    I just think NVIDIA should, sooner rather than later, drop these silly tactics of focusing mainly on the highend. It's not a healthy business in the long run to rely on releasing a big-but-fast card, as many disadvantages come with that: mainly development time & cost, greater risk of yield issues, cooling & power limitation issues, and not to mention there are far fewer customers in this price range than in the one ATI is focusing on.

    NVIDIA should try to offer a great top-to-bottom product range based on a new arch. If they managed to pull off a successful series, that would be disastrous for ATI's current tactics: ATI would have to lower prices greatly, possibly still without getting any sales, and it would start chewing on the market share that ATI is so interested in atm.

    The perhaps biggest problem right now is that even if their big fat chip is fast, ATI will still get sales, as NVIDIA has nothing that competes in the same price range. I just don't see the logic behind NVIDIA atm; hopefully the HD 5xxx series will teach 'em.
    I think that the focus they put on highend chips is well placed. Highend products, particularly ones that outdo the competition, garner attention. Attention is visibility, and visibility is vital to marketing. The performance crown is an important aspect of the business.

    Where nVidia fails is that they continue with this strategy throughout the life cycle of a product. Once a performance part is released, they should then focus on refining, limiting, and segmenting the arch to fit different markets. Go for the highend first to prove they've got a decent part and they're still relevant, then scale it back to keep sales up in all the market segments, which becomes easier once yields improve. Lower end parts need a great deal of supply capacity, because they obviously address a larger market segment than the highend chips, and thus have more demand. When you're dealing with a new architecture, yields of working chips may not be too good... as looks to be happening with GT300. If nVidia focuses on highend parts first, where price premiums impact decision making to a lesser extent, then lower quantities and thus higher prices are much more acceptable. It's the low-mid range that would suffer horribly from inflated prices and non-existent availability. It's cool for a person to have a highend, kickass part that's only available in limited quantities. It's just frustrating to have a midrange card that you were charged a premium for because the company can't get their act together.

    nVidia TRIES to do this, but it's a too little, too late situation. By the time they get around to it, the competition *cough*ATI*cough* has already extended its arm into the other market segments. nVidia is left trying to rally its troops and get products out when the rest of the market is already into a full-on invasion. Look what happened with the 8 series cards. Amazing performance for the time, compared to ATI's offerings, but they focused on the highend segment for too long. The 8800GTS and 8800GTX were the only worthwhile cards for a long time. Their lower end parts were overpriced for the performance. Hell, a 7600GT was still a competitive part compared to the similarly priced 8xxx series cards in those days. ATI's 2xxx series was decent enough, the 3xxx series was whatever it was, a dx10.1 update and die shrink IIRC, but they didn't really get off their asses until the 4xxx series. By then nVidia had some decent midrange products out, but the G200 cards were still in the highend stage, and the only midrange offerings they had to speak of were 8 series cards, either branded as such or under the 9 series rename, to keep up appearances that they were actually doing something with their time. That gave ATI time to catch up, grab some market share, and get products out that could compete with or beat the G200 cards, all while maintaining and building market share in the mid-low end.

    nVidia has the right theory going: release highend parts first, then refine them to fit other market segments while simultaneously ramping up clocks and performance on the highend as yields improve. Their problem is implementation. ATI has a similar but different strategy going, where they've got their highend offerings, but from experience they seem to be throwing more thought into their mid-low end products, where nVidia's presence is lacking. The difference is that they can actually put their plan into action. It seems like nVidia just kinda releases the highend cards and keeps hyping them to death, while lackadaisically developing their lower end offerings, which end up failing spectacularly, all while they're blind to the fact that ATI has competitive solutions available already, in all segments.

  2. #177
    Xtreme Enthusiast
    Join Date
    Dec 2005
    Location
    San Diego, CA
    Posts
    529
    Those gta4 benchmarks look like the result of bad optimizations more than hardware capability. The scaling is really bad.
    Gaming rig
    Mountain mods Plateau-18 -- Core i7 980x 4ghz -- Msi big bang x-power -- 6gb Mushkin redline 998691 ddr3 1600 (6-7-6-18-1T) -- Gigabyte GTX580SO (900/1800/1100) -- Win7 64bit pro

    Heatware

  3. #178
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by bigKr33 View Post
    Those gta4 benchmarks look like the result of bad optimizations more than hardware capability. The scaling is really bad.
    yeah gta4 is one of the worst pieces of game code for sure...
    its not like previous gta games delivered on the hw they demanded to run though... gta has always been very inefficient... only battlefield has been worse in my experience...

    doesnt gta4 still scale with more than 6gb of memory?
    and it scales above 3ghz intel cpu power...

    most games scale until what, 2.6ghz cpu power and 3gb memory?

    the best eye candy/hw requirement game ive played so far is far cry2 closely followed by mass effect and gears of war...
    Last edited by saaya; 09-16-2009 at 10:12 PM.

  4. #179
    Xtreme Enthusiast
    Join Date
    Dec 2005
    Location
    San Diego, CA
    Posts
    529
    Quote Originally Posted by apt403 View Post
    I think that the focus they put on highend chips is well placed. Highend products, particularly ones that outdo the competition, garner attention. Attention is visibility, and visibility is vital to marketing. The performance crown is an important aspect of the business.

    Where nVidia fails is that they continue with this strategy throughout the life cycle of a product. Once a performance part is released, they should then focus on refining, limiting, and segmenting the arch to fit different markets. Go for the highend first to prove they've got a decent part and they're still relevant, then scale it back to keep sales up in all the market segments, which becomes easier once yields improve. Lower end parts need a great deal of supply capacity, because they obviously address a larger market segment than the highend chips, and thus have more demand. When you're dealing with a new architecture, yields of working chips may not be too good... as looks to be happening with GT300. If nVidia focuses on highend parts first, where price premiums impact decision making to a lesser extent, then lower quantities and thus higher prices are much more acceptable. It's the low-mid range that would suffer horribly from inflated prices and non-existent availability. It's cool for a person to have a highend, kickass part that's only available in limited quantities. It's just frustrating to have a midrange card that you were charged a premium for because the company can't get their act together.

    nVidia TRIES to do this, but it's a too little, too late situation. By the time they get around to it, the competition *cough*ATI*cough* has already extended its arm into the other market segments. nVidia is left trying to rally its troops and get products out when the rest of the market is already into a full-on invasion. Look what happened with the 8 series cards. Amazing performance for the time, compared to ATI's offerings, but they focused on the highend segment for too long. The 8800GTS and 8800GTX were the only worthwhile cards for a long time. Their lower end parts were overpriced for the performance. Hell, a 7600GT was still a competitive part compared to the similarly priced 8xxx series cards in those days. ATI's 2xxx series was decent enough, the 3xxx series was whatever it was, a dx10.1 update and die shrink IIRC, but they didn't really get off their asses until the 4xxx series. By then nVidia had some decent midrange products out, but the G200 cards were still in the highend stage, and the only midrange offerings they had to speak of were 8 series cards, either branded as such or under the 9 series rename, to keep up appearances that they were actually doing something with their time. That gave ATI time to catch up, grab some market share, and get products out that could compete with or beat the G200 cards, all while maintaining and building market share in the mid-low end.

    nVidia has the right theory going: release highend parts first, then refine them to fit other market segments while simultaneously ramping up clocks and performance on the highend as yields improve. Their problem is implementation. ATI has a similar but different strategy going, where they've got their highend offerings, but from experience they seem to be throwing more thought into their mid-low end products, where nVidia's presence is lacking. The difference is that they can actually put their plan into action. It seems like nVidia just kinda releases the highend cards and keeps hyping them to death, while lackadaisically developing their lower end offerings, which end up failing spectacularly, all while they're blind to the fact that ATI has competitive solutions available already, in all segments.
    Very good point. nvidia and ati are always competing on performance, and it's as if nvidia is trying to hold the top spot at all costs. And while nvidia does that, ati is flooding the midrange with well priced gpus. Nvidia is always late to the punch there. And everyone knows most of the market for gpus is midrange buyers.

    Just my .02

  5. #180
    Registered User
    Join Date
    Feb 2008
    Posts
    54
    I don't see why this is so surprising. It's the same big-chip approach as g80 and gt200 (or even bigger), and it's not based on a process as great and proven as 90nm or 65/55nm were. I'd say it was more than expected.

    But I'd take it with a grain of salt, cause we don't know what they really meant with "just 7 chips" being a success. Does that mean only seven came out uncorrupted, or only seven jumped over a bar that's been raised too high by some CEO? I'd rather expect only 20% yield, but at much lower clocks than expected. They announce ~750MHz while they reach only 650MHz on the much smaller gt200b, on an 18 month old process. I'd say the bar has been set optimistically high for gt300.
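    For what it's worth, the "under 2%" headline only turns into a percentage once you know how many die candidates were on the wafers. A quick back-of-the-envelope, assuming the article's reported figures of roughly 4 wafers with ~104 die candidates each (those counts are an assumption here, not confirmed):

    ```python
    # Back-of-the-envelope yield from the reported numbers. The wafer and
    # die-candidate counts are assumptions taken from the article.
    good_dies = 7               # "just 7 chips" reported working
    wafers = 4                  # assumed number of risk wafers
    candidates_per_wafer = 104  # assumed die candidates per 300mm wafer

    yield_pct = 100.0 * good_dies / (wafers * candidates_per_wafer)
    print(round(yield_pct, 2))  # -> 1.68, i.e. "under 2%"
    ```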

    Yep, I know they only have 128TMUs and 64RBEs nowadays, so with RV870 that old advantage over ATi has totally melted away, especially compared to the glorious days of g80's 64:24 @680MHz vs. r600's pitiful 16:16 @740MHz. But still, even at only 600MHz (I think they could reach at least a 20% yield at that clock), 128TMUs give 76.8GTex/s vs. RV870's 80TMUs (@850MHz), which reach only 68GTex/s. But on the other hand, do they really think they could make a single chip that will make ATi crawl again without some radically new architecture?!
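    The fill-rate figures above are just TMU count × core clock. Spelled out as a sketch (using the rumored TMU count and clock from the post, not confirmed specs):

    ```python
    def gtexels_per_s(tmus, clock_mhz):
        """Peak texture fill rate in GTex/s: TMUs x clock (MHz) / 1000."""
        return tmus * clock_mhz / 1000.0

    # Poster's scenario: a GT300 with 128 TMUs at a conservative 600MHz
    # would still out-fill an RV870 with 80 TMUs at 850MHz.
    print(gtexels_per_s(128, 600))  # -> 76.8 GTex/s
    print(gtexels_per_s(80, 850))   # -> 68.0 GTex/s
    ```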

  6. #181
    Registered User
    Join Date
    Feb 2008
    Posts
    54
    Quote Originally Posted by Xope_Poquar View Post
    Looks like the 5870 will stay at $400 for a long time.


    otoh .... they could make it cheaper, cause in some parts of the world there are no rebates and gpus stay overpriced even when prices steadily decline, even when they're phased out.

    hd5850 at $249 and hd5870 (basic 1gb) at $299 would be fair even for predatory manufacturers. as it is, with junipers overpriced at $199 on top, they simply rip us off.

  7. #182
    Registered User
    Join Date
    Feb 2008
    Posts
    54
    Quote Originally Posted by orangekiwii View Post
    .... or they didn't know and just expected their new architecture to actually be as good as they hoped.
    What a hell of a new architecture, if any of this is true?! It's yet another redesigned chip with g70 heritage that we've been watching for too long now.

    We can see now why they're all mumbling about GPGPU. Maybe all they redesigned and upgraded was better, more advanced CUDA support, and like we really care about that. And for its sake we had to get a huge overburdened chip that in the end has fewer FLOPS than a much smaller solution from ATi. It's more than disappointing. Why do they fight the CUDA front with gamers'/CG money, just so somebody can crunch numbers on the same chip?! Well, that question wouldn't be so ubiquitous if they really had redesigned their GPGPU approach as they claim.

  8. #183
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,034
    Quote Originally Posted by hlopek View Post


    otoh .... they could make it cheaper, cause in some parts of the world there are no rebates and gpus stay overpriced even when prices steadily decline, even when they're phased out.

    hd5850 at $249 and hd5870 (basic 1gb) at $299 would be fair even for predatory manufacturers. as it is, with junipers overpriced at $199 on top, they simply rip us off.
    But then comes the obvious WHY? Why should someone lower the prices when the competition obviously isn't there?

    Don't worry, prices will go down when GT300 is here, but AMD ain't doing charity work.
    SweClockers.com

    CPU: Phenom II X4 955BE
    Clock: 4200MHz 1.4375v
    Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
    Motherboard: ASUS Crosshair IV Formula
    GPU: HD 5770

  9. #184
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    449
    Quote Originally Posted by saaya View Post
    must have been gt200 based, def... they usually release tesla and professional graphics a couple of months after the end user cards...


    comparing gtx285 vs gts250 in crysis and crysis warhead doesnt make sense cause both are in the unplayable or barely playable range imo, even at 1280x1024 a gtx285 only pulls around 20/30 fps (min-av) the gtx285 is only 50-100% faster here...
    What.... a GTX 280 OC / GTX 285 pulls 25fps min easy at 1680x1050 w/ everything at Enthusiast / Very High, no AA. I owned a really good oc'ing 280 a while back, and at 738/1512/1200 the card could push 1920x1200 all Enthusiast while still maintaining avg fps above 25 and min 21-22.


    Quote Originally Posted by saaya View Post
    add 10 fps and you got the stalker clear sky results, and like above, the gtx285 is 50-100% faster than the gts250.

    and now for gta4, lol... a 9800gtx+ does about the same as a gtx280 here... ouch...
    http://www.pcgameshardware.com/aid,6...eviews/?page=2

    g200 3x as fast as g92 my 4ss :P




    and thats a quadcore@3.33ghz, thats where cpus stop scaling with gta4... so dont call it cpu limited :P
    http://www.pcgameshardware.de/aid,66...l/Test/?page=2


    That's exactly what I'm talking about. With the latest patch, a Q9650 @ 4.3Ghz+ and an 8800 Ultra (702/1728/1150), I'd be lucky to hit 23-24 fps in some of the bad spots of the city @ 1680x1050 with stuff maxed to High / Very High and shadows at 8.

    With a 4870x2 I can play 1920x1200 at 30fps+ easy... I'd really love to know how this site benchmarked the game, because from personal experience actually playing it, the graphics card still matters even in this game.
    Last edited by LiquidReactor; 09-16-2009 at 10:55 PM.
    --lapped Q9650 #L828A446 @ 4.608, 1.45V bios, 1.425V load.
    -- NH-D14 2x Delta AFB1212SHE push/pull and 110 cfm fan -- Coollaboratory Liquid PRO
    -- Gigabyte EP45-UD3P ( F10 ) - G.Skill 4x2Gb 9600 PI @ 1221 5-5-5-15, PL8, 2.1V
    - GTX 480 ( 875/1750/928)
    - HAF 932 - Antec TPQ 1200 -- Crucial C300 128Gb boot --
    Primary Monitor - Samsung T260

  10. #185
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by LiquidReactor View Post
    What....GTX 280 OC / GTX 285 pulls 25fps min easy at 1680x1050 w/ everything at Enthusiast / Very High no AA. I've owned a really good oc'ing 280 a while back and at 738/1512/1200 the card could push 1920x1200 all enthusiast while still maintaing avg fps above 25 and min 21-22.
    yeah but you said g200 is 3x as fast as g92...
    so that would mean you expect an overclocked gts250 in your system with the same settings to only hit 7fps min and av fps only 9? thats def not the case :P

    and those fps are really too low to enjoy crysis imo... its... ok... but really not nice...


    Quote Originally Posted by LiquidReactor View Post


    Thats exactly what Im talking about. With latest patch, Q9650 @ 4.3Ghz+ and 8800 Ultra (702/1728/1150) I'd be lucky if I hit 23-24 fps in some of the bad spots of the city @ 1680x1050 with stuff maxed to High / Very High and shadows at 8.

    With 4870x2 I can play 1920x1200 30fps+ easy...I'd really love to know how this site benchmarked the game because from personal experience in actually playing the game...graphics card still matters even in this game.
    no idea... they dont mention too many details on settings unfortunately... even 30 min fps sucks tho... really crappy coding/porting of the game

    but again, a g200 isnt 3x as fast as a g92... not even when you only look at minfps...
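    The division behind saaya's point, using the min/avg figures quoted in this exchange (thread numbers, not measured here): if G200 really were 3x G92, a GTS 250 would have to score one third of the GTX 285's frame rates.

    ```python
    # Implied GTS 250 frame rates under the "G200 is 3x G92" claim.
    min_fps_285, avg_fps_285 = 21, 25   # figures quoted earlier in the thread

    implied_gts250_min = min_fps_285 / 3
    implied_gts250_avg = avg_fps_285 / 3
    print(round(implied_gts250_min, 1))  # -> 7.0
    print(round(implied_gts250_avg, 1))  # -> 8.3 (saaya rounds up to 9)
    ```

    A real GTS 250 scores well above those implied numbers, which is the point being made.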

  11. #186
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Regarding Nvidia's strategy...

    Targeting the high end was fine especially since ATI was doing the same. In fact, the mid-range mainstream cards were largely neglected throughout the course of history, for those of us who remember that far back.

    In fact, it was usually the exception, not the norm, to have a great mid-range card. In particular, the 6600's and the X1950Pro stood out to me as great bang/buck cards - but they were part of a whole flood of bad cards. Remember how craptacular the 8400/8600's were as well as the 2400s/2600s just 2 1/2 years ago?

    Of course, we've been spoiled lately but Nvidia's strategy has definitely started to hurt regarding the $150-$300 range. The GT200 release was helped by having a G92 lineup underneath it, but the RV770s easily slotted in to the 200-300 range, especially since the GT200's priced themselves too high while the G92 performance couldn't match RV770 as easily.

    I will say though that I like ATI's new approach of releasing a whole slew of cards at once - meaning we can get a 58xx, 56xx, and 54xx all within a short period of time, rather than the old approach they had (and Nvidia still has) of releasing the flagships, then waiting 6 months to a year+ for the mid range to show up.

  12. #187
    Xtreme Member
    Join Date
    Apr 2006
    Location
    Central California
    Posts
    359
    Quote Originally Posted by zerazax View Post
    Regarding Nvidia's strategy...

    Targeting the high end was fine especially since ATI was doing the same. In fact, the mid-range mainstream cards were largely neglected throughout the course of history, for those of us who remember that far back.

    In fact, it was usually the exception, not the norm, to have a great mid-range card. In particular, the 6600's and the X1950Pro stood out to me as great bang/buck cards - but they were part of a whole flood of bad cards. Remember how craptacular the 8400/8600's were as well as the 2400s/2600s just 2 1/2 years ago?

    Of course, we've been spoiled lately but Nvidia's strategy has definitely started to hurt regarding the $150-$300 range. The GT200 release was helped by having a G92 lineup underneath it, but the RV770s easily slotted in to the 200-300 range, especially since the GT200's priced themselves too high while the G92 performance couldn't match RV770 as easily.

    I will say though that I like ATI's new approach of releasing a whole slew of cards at once - meaning we can get a 58xx, 56xx, and 54xx all within a short period of time, rather than the old approach they had (and Nvidia still has) of releasing the flagships, then waiting 6 months to a year+ for the mid range to show up.
    Definitely agree. I always bought my cards around the $200 price range for as long as I've been buying cards (since Matrox days). It seems no coincidence that when one brand falls short in the mid-high range that I don't buy at that time. It's fun to see who has the top dog but for me the real competition has always been the 6600GT, the 9800PRO, etc.

  13. #188
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    449
    Quote Originally Posted by saaya View Post
    yeah but you said g200 is 3x as fast as g92...
    so that would mean you expect an overclocked gts250 in your system with the same settings to only hit 7fps min and av fps only 9? thats def not the case :P

    and those fps are really too low to enjoy crysis imo... its... ok... but really not nice...
    I did mention GTX 280 vs 9800GTX tri-sli, since there is no other way to really experience the difference. Unless there's a tool that magically multiplies your fps by 3 for theoretical 3x numbers.

  14. #189
    Xtreme Enthusiast
    Join Date
    Dec 2005
    Location
    San Diego, CA
    Posts
    529
    Man all in all I just want to know when the gt300 series will be out. I just wish they could be out before the end of this year. I'm sure its not going to happen, and if thats the case i'll probably just go ati for this round. I'm not biased towards either company, I just go by supply and demand. Ati is ahead of the game right now. Likewise I'm sure the gt300 will be faster clock for clock, but I don't want to wait another 6+ months.

  15. #190
    Registered User
    Join Date
    Feb 2008
    Posts
    54
    Quote Originally Posted by Smartidiot89 View Post
    But then comes the obvious WHY? Why should someone lower the prices when the competition obviously isn't there?
    Charity, charity, everybody throws up that shield (does anybody even know what charity means these days?). It's not charity, just common sense about ATi's MSRP. Stores could easily raise those prices if demand overcame supply, so it would be common sense for ATi too. But to hell with reason when there are enough unreasonable buyers.

  16. #191
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by saaya View Post
    you mean your not sure? its pretty obvious isnt it?


    but that doesnt mean nvidia loses... they lose a lot of potential sales... but is it really that much?




    Is it really that much...? YES! It's significant.

    Do you understand that people don't buy computers every month, like you seem to think...? At best, people take this plunge about every 4 years!

    Don't underestimate how much people spend on Santa Claus. The increasing importance our computers have in our lives is nothing short of a cultural phenomenon. Rampant use of PCs as "family centers" and for entertainment, socializing, etc... means this year's Holiday Season is going to be huge for the computer industries!!

    And leading the charge is going to be Microsoft, with the multimedia experience of Zune and Windows 7...!

    It's going to be HUGE! OVERDONE AND MASSIVE. Don't underestimate marketing and how much people value seamlessness and ease. YouTube, Facebook, Twitter, eHarmony, stocks, etc... unite the world.

    Microsoft's Windows 7 is going to be in the center of all of that. Record sales!

    That means, indirectly, DirectX 11 will be important as a standard as well as a marketing thing; even if people don't understand DX11, the logo will be everywhere, well marketed by both ATi and Microsoft. It's going to be EVERYWHERE.

    Mac people are fascinated by QuickTime X... Microsoft can market Windows 7 and DirectX as an even deeper & richer environment than the Mac's.

    Microsoft has a bunch of technologies it is selling you with Windows 7: Natal, Zune, DX11, Aero, 64bit, easy, powerful, etc...

    Simple is good... standards are good! I think this time around, Microsoft is going to sell the idea of Windows 7 being more compatible and easier to use. Apple might need to be concerned if people start shopping.

    So, if nVidia doesn't have a product for these hungry buyers, ATi will make massive profits and will be able to drop prices once nVidia enters the market (around the Superbowl), squeezing nVidia's sales.

    All the while releasing the budget 5600 series as a final blow. nVidia will have to walk a fine line between dumping chips for market share and protecting profits. ATi could thwart nVidia's attempt to recoup their fab & funding for the GT300 if the only market is the $499 high-end GPUs, as that high-end market is minuscule compared to the profits to be made in mainstream sales.

    You do understand that the SLI and 4870x2 market is extremely small... not everyone is Xtreme; we are enthusiasts, you seem to forget that often.

    Investors don't care if nVidia's low yield, high-end ($499) GPU sells some 40k units to a bunch of computer freaks.

    Or, I could just be really high...?
    Last edited by Xoulz; 09-17-2009 at 02:33 AM.

  17. #192
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by zerazax View Post
    Regarding Nvidia's strategy...

    Targeting the high end was fine especially since ATI was doing the same. In fact, the mid-range mainstream cards were largely neglected throughout the course of history, for those of us who remember that far back.
    originally there was only one current gen card from each maker, then they made OEM/cut down cheaper retail versions so there were 2 versions... then they stopped using the last gen as mainstream/entry level product and created a card based on the latest gen but physically cut down and not just castrated by blowing fuses. from there on more and more skus developed...

    Quote Originally Posted by LiquidReactor View Post
    I did mention GTX 280 vs 9800GTX tri sli since there is now other way to really experience the difference. Unless theres a tool that magically multiplies your fps by 3 for theoretical 3x fps numbers.
    well thats a different thing then... was a misunderstanding i guess
    but i wouldnt say a gtx280 beats 3 g92 cards in sli either... actually xbitlabs made an article showing that 2 g92 cards in sli beat a gtx285 and are cheaper...
    if you can accept it being a multi gpu setup thats def a good alternative...

    i have a set of gts250 and gtx260 cards at home and the 260s are barely faster in most games, but consume notably more power, are way bigger and heavier, have a bigger heatsink and still run hotter... i def prefer the gts250 cards, cheap, small, very fast for their price and size...
    the only time i had to throw in the second card in sli to play games in 1920x1080 with all eye candy and aa is in far cry2 and crysis... besides that i could play everything fine with 1 gts250 so far...

  18. #193
    Xtreme Enthusiast
    Join Date
    Jun 2007
    Posts
    681
    Quote Originally Posted by Smartidiot89 View Post
    But then comes the obvious WHY? Why should someone lower the prices when the competition obviously isn't there?

    Don't worry, prices will go down when GT300 is here, but AMD ain't doing charity work.
    AMD has terrible market share. A few years ago Intel slashed their own CPU prices when the Core architecture came out, to flood the market with their chips and maintain brand loyalty. This worked, and Core now dominates the market. AMD is in trouble against nVidia's branding, and they might want to recruit an entirely new generation of customers right now: they could provide a superior product at a lower price, make a ton of sales, converting millions of users and setting themselves up for a coup in the next generation. If they do slim their own profit margins, they have months to sell to users before GT300 is even a threat, and when it does come out nVidia would be forced to take a loss, coming to market late with lower yields.

    It's a trade-off: give up some profits right now, in effect "buying" loyalty and market share, and then hope to exploit this in the future for profit.

  19. #194
    Xtreme Member
    Join Date
    Jan 2008
    Location
    Lexington, KY
    Posts
    401
    Quote Originally Posted by Machinus View Post
    AMD has terrible market share. A few years ago Intel slashed their own CPU prices when the Core architecture came out, to flood the market with their chips and maintain brand loyalty. This worked, and Core now dominates the market. AMD is in trouble against nVidia's branding, and they might want to recruit an entirely new generation of customers right now: they could provide a superior product at a lower price, make a ton of sales, converting millions of users and setting themselves up for a coup in the next generation. If they do slim their own profit margins, they have months to sell to users before GT300 is even a threat, and when it does come out nVidia would be forced to take a loss, coming to market late with lower yields.

    It's a trade-off: give up some profits right now, in effect "buying" loyalty and market share, and then hope to exploit this in the future for profit.
    ATI can't realistically lower the price on these parts enough to put them into the price range where sales volumes will increase dramatically and still make a profit. The kinds of people that know anything about these cards or even care enough to keep up with the technology (people like us) are usually willing to pay a price premium for the latest and greatest piece of kit, but we're definitely in the minority.

    ATI will do what it (and NVIDIA) have always done: try to win the high end and then let the reputation trickle down to your lower parts to increase sales regardless of performance.
    Gaming Box

    Ryzen R7 1700X * ASUS PRIME X370-Pro * 2x8GB Corsair Vengeance LPX 3200 * XFX Radeon RX 480 8GB * Corsair HX620 * 250GB Crucial BX100 * 1TB Seagate 7200.11

    EK Supremacy MX * Swiftech MCR320 * 3x Fractal Venture HP-12 * EK D5 PWM

  20. #195
    Xtreme Enthusiast
    Join Date
    Aug 2007
    Location
    Arlington VA
    Posts
    960
    Quote Originally Posted by Xope_Poquar View Post
    Definitely agree. I always bought my cards around the $200 price range for as long as I've been buying cards (since Matrox days). It seems no coincidence that when one brand falls short in the mid-high range that I don't buy at that time. It's fun to see who has the top dog but for me the real competition has always been the 6600GT, the 9800PRO, etc.
    The 9800 Pro was the top dog for a while, so that's not targeting "mid range".

    For me, it was really the push to LCDs that started me gunning for the high end.

    Prior to that, why bother? Fast FPS games (my favorite) always ran at 800x600 with details turned down just to make things easier to see. No real need for anything beefy.

    But now, with no true Quake/Unreal game out (mostly slower single-player games), why not see the detail? And an LCD's native res forces you to run higher-end cards.

    I hope the next-gen cards bring a lot to the table. I've been running an OC'd 8800 GTS 640 since they hit and have yet to come across anything I can't play.
    AMD Phenom II BE, ASUS Crosshair II formula, 8gb ddr2 800, 470 SLI, PC P&C 750, arcera RAID, 4x OCZ Vertex2, 2x samsung 7200 1tb, HT Omega Clario +

  21. #196
    Xtreme Addict
    Join Date
    May 2008
    Posts
    1,192
    Quote Originally Posted by Shadov View Post
    The whole text can be found here: http://www.semiaccurate.com/2009/09/...eilds-under-2/ by Charlie Demerjian

    Comments?
    Correct me if I'm wrong, but nVidia doesn't manufacture their own chips. So they can predict yield from die size using the standard models, but beyond that it's up to the fab they contract.
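    For what it's worth, the "standard model" of yield vs. die size referred to above is usually a Poisson-style defect model, where the fraction of defect-free dies falls off exponentially with die area. A minimal sketch (the defect-density and die-area numbers are illustrative assumptions only, not actual TSMC or GT300 figures):

    ```python
    import math

    def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
        """Poisson yield model: Y = exp(-A * D0).

        die_area_mm2    -- die area A in mm^2
        defects_per_cm2 -- defect density D0 in defects per cm^2
        """
        area_cm2 = die_area_mm2 / 100.0  # convert mm^2 to cm^2
        return math.exp(-area_cm2 * defects_per_cm2)

    # Hypothetical numbers: a ~500 mm^2 die vs. a ~180 mm^2 die
    # on the same process with 0.5 defects/cm^2.
    big_die = poisson_yield(500, 0.5)    # ~8% of dies defect-free
    small_die = poisson_yield(180, 0.5)  # ~41% -- smaller dies fare far better
    ```

    The point the model makes is exactly the one in the thread: at the same defect density, a big die's yield collapses much faster than a small die's, which is why the fab can quote expected yield from area alone.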
    Quote Originally Posted by alacheesu View Post
    If you were consistently able to put two pieces of lego together when you were a kid, you should have no trouble replacing the pump top.

  22. #197
    Xtreme Member
    Join Date
    Jan 2008
    Location
    Shin Osaka, Japan
    Posts
    152
    Quote Originally Posted by LiquidReactor View Post
    Games you know...not benchmarks. And really demanding games such as Crysis, Stalker SoC or GTA IV.
    Yes, games used as benchmarks. Look at some of the older articles on anandtech.com.

    This was why the GTX 280 was so underwhelming with its $650 price tag. You could get better performance from a $499 9800 GX2, and you weren't missing out on anything (neither had DX10.1, and both supported CUDA/PhysX, etc.).
    Quote Originally Posted by flippin_waffles on Intel's 32nm process and new process nodes
    1 or 2 percent of total volume like intel likes to do. And with the trouble intel seems to be having with their attempt, it [32nm] doesn't look like a very mature process.
    AMD has always been quicker to a mature process and crossover point, so by the time intel gets its issues and volume sorted out, AMD won't be very far behind at all.

  23. #198
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,034
    Quote Originally Posted by Machinus View Post
    AMD has terrible market share. A few years ago Intel slashed their own CPU prices when the Core architecture came out to flood the market with their chips and maintain brand loyalty. This worked, and Core now dominates the market. AMD is in trouble with nVidia's branding and they might want to recruit an entirely new generation of customers right now - they could provide a superior product at a lower price, make a ton of sales, converting millions of users and setting themselves up for a coup in the next generation. If they do slim their profit margins, they have months to sell to users before GT300 is even a threat, and when it does come out nVidia would be forced to take a loss, coming to market late with lower yields.

    It's a trade-off - give up some profits right now, in effect "buying" loyalty and market share, and then hope to exploit that for profit later.
    So what happens then? AMD lowers their prices, BAM, GT300 arrives, and AMD has to lower prices once again, resulting in even lower margins. One price cut is more than enough. Market share will come slowly; it won't happen overnight, and AMD can't sacrifice tons of potential income just to gain a few points of market share.
    SweClockers.com

    CPU: Phenom II X4 955BE
    Clock: 4200MHz 1.4375v
    Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
    Motherboard: ASUS Crosshair IV Formula
    GPU: HD 5770

  24. #199
    Xtreme Member
    Join Date
    Mar 2009
    Location
    Miltown, Wisconsin
    Posts
    353
    I believe they should stick to the separate-cards approach. Make a really powerful PhysX card and then make a powerful GPU. That way you can buy exactly what you want. Everyone seems to have 2+ PCIe slots these days waiting to be used. They seem to be pushing the dedicated PhysX card anyway, so why not just make one? The regular GPU can have some PhysX power, just enough for minor use. I'd rather run a separate dedicated PhysX card anyway, truthfully, so why would I need to buy more GPUs with PhysX I'm not gonna use? That way crunchers can buy more PhysX-only cards, and gamers can buy more GPUs. That way it isn't forced down your throat if all you want is PhysX or just a GPU.

  25. #200
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    640
    Quote Originally Posted by Xoulz View Post
    Don't underestimate how much people spend on Santa Claus. .....this year's Holiday Season is going to be huge for the computer industries!!

    And leading the charge is going to be Microsoft. With the multi-media experience of Zune and Windows 7 ...!

    It's going to be HUGE!

    OVERDONE AND MASSIVE.

    Microsoft's Windows 7 is going to be in the center of all of that. Record sales!

    And you're basing this on what?

    On the fact that unemployment rates continue to climb? Up to 9.7% nationwide as of August with no end in sight....the numbers continue to climb, though they at least look like they're slowing down slightly. (August's rise was "only" 0.3 points over July, whereas the previous six months, Dec '08-May '09, saw increases of 0.4-0.5 points each month.)

    And we'll have a national unemployment rate well over 10% by Christmas....count on it.

    Or maybe you're basing your optimism on the fact that 16 states have unemployment rates over 10%? And another 7 states are in the 9% range. Maybe it's the three (3) states with unemployment rates that are below 6%...ND, SD, and Nebraska. Yeah, those will do it.

    Here you go....a picture of the unemployment across the U.S. by state/region, as of June '09. [image removed]
    So what region is going to support a big Christmas? Not the west coast, not anywhere from Florida to Michigan. Maybe the Northeast....which is mired in 8%-or-higher unemployment across the region. Guess it's all down to the Dakotas and Nebraska.


    I'd wager that the Asus CEO is more correct, that Windows 7 will NOT usher in huge computer sales, not to business, not to home consumers. With the huge economic uncertainties that still abound nationwide, I predict this Christmas will be as dismal as last season, if not worse.

    True, there are signs the economy is picking up slowly. But instead of employment picking up, which is always the last thing to recover when coming out of a recession, the still-employed workers are simply being asked to do more, either through longer hours or higher productivity expectations.

    If we're lucky, we'll see a "big" Christmas in 2010, and I use the term "big" in relative terms....big only with respect to the '08 and '09 seasons, not to really big holiday seasons like the mid-'90s.
