PDA

View Full Version : Graphics card market falls again in 4th Q...



Sparky
03-28-2007, 09:20 PM
Did a quick search and didn't find anything, if this is a repost then mods please lock, thanks :)

http://www.xbitlabs.com/news/video/display/20070327224149.html


Despite loud talk about high-performance GeForce and Radeon graphics cards from their respective developers, the add-in graphics adapter market was down both sequentially and annually in the fourth quarter of 2006. Even though there are logical explanations for the sales decline, the add-in-card market still seems to be having a hard time.

Well... maybe if the software caught up with the cards first it might help, and maybe if they didn't price these things so high they'd sell more too! Like that rumored $999 8800 Ultra; that's just stupid :slapass:

justin_c
03-28-2007, 09:37 PM
OK, out of all the people who use a computer, how many understand what a graphics card is? (Say 1 out of 4.)

Now, of that 1 in 4, what percentage knows what DirectX 10 is? (Let's say 1 out of 6.)

Of that 1/6th, who runs their games at max resolution with maxed AA? (Most don't touch their settings, unless it's to turn them down.)

And then, who would pay approximately 600 bucks for a top-of-the-line gfx card, which companies like Nvidia and ATI want to sell?

Don't flame me; let's look at some figures.

Eastcoasthandle
03-28-2007, 10:42 PM
This is Nvidia and ATI's doing. In other words, they wanted it, they got it.
If they continue to sell graphics cards for the price of a laptop, the market will continue to "thin out". The problem is that we will not have too many choices in the near future. It looks to me like we have "peaked" at the highest level this market will achieve and it has been on a descent for some time now. However, let's see what Intel does before declaring doom. I would also be interested to know if IBM will get involved as well. What ATI and Nvidia need to do is open the market up a bit more to allow for some healthy competition. It just may give the market a good jolt for discrete and integrated solutions.

vitaminc
03-28-2007, 10:43 PM
Duh. The AIC market should be declining anyway. Who the hell needs a graphics card besides gamers and some workstation folks? Not everyone needs a Ferrari when public transportation is available.

theteamaqua
03-28-2007, 11:06 PM
Yay, that means cheaper stuff ... yet I haven't seen a high-end GPU for less than $500 ... when it comes out.

The GeForce 3 Ti 500 was like $399.99 ..... just saying.

Aerou
03-29-2007, 01:22 AM
Yay, that means cheaper stuff ... yet I haven't seen a high-end GPU for less than $500 ... when it comes out.

The GeForce 3 Ti 500 was like $399.99 ..... just saying.

Maybe the X1950XTX?

But this is getting sick ... GPUs are bigger, hotter, more expensive, more power-hungry ...

DTU_XaVier
03-29-2007, 02:42 AM
Duh. The AIC market should be declining anyway. Who the hell needs a graphics card besides gamers and some workstation folks? Not everyone needs a Ferrari when public transportation is available.

What about a good Mercedes or BMW? Not everybody wants a Ferrari, but I wouldn't settle for what you call "public transportation" no matter what... ;)
Let's not forget, add-in graphics cards aren't only the top end of the line... it's everything in between as well...

K404
03-29-2007, 02:54 AM
This comment is neither here nor there, but the feeling of holding a top-of-the-range graphics card is much more reassuring than holding a top-of-the-range CPU.

A lot of people subconsciously expect a graphics card to be a big-ass lump of wooooow.

Seriously... if the best-performing GPU was the size of a 5200... I'd feel cheated somehow.

I felt depressed when I first picked up a C2D, especially with an Opteron 150 in my other hand...

Chewbenator
03-29-2007, 06:11 AM
Well, maybe if SOMEONE had their video card out like Nvidia does, we would see more competition, lower pricing, and better sales. But with people holding out for the ATI counterpart and nothing to compete against the 8800, this was expected.

VulgarHandle
03-29-2007, 06:24 AM
Well, maybe if SOMEONE had their video card out like Nvidia does, we would see more competition, lower pricing, and better sales. But with people holding out for the ATI counterpart and nothing to compete against the 8800, this was expected.
QFT

Super strokey
03-29-2007, 07:35 AM
I didn't read the interview, so this may be out in left field, but I think this has to do with the gradual shift to notebooks.

vitaminc
03-29-2007, 08:41 AM
What about a good Mercedes or BMW? Not everybody wants a Ferrari, but I wouldn't settle for what you call "public transportation" no matter what... ;)
Let's not forget, add-in graphics cards aren't only the top end of the line... it's everything in between as well...

People on XS wouldn't settle for public transportation. But if you're building your grandparents a computer, are you going to drop an additional $600 on an 8800, or even an extra $50 on an X1300?

deathman20
03-29-2007, 09:04 AM
Well maybe if SOMEONE had their video card out like Nvidia we would see more competition, lower pricing, and better sales. But, with people holding out for the ATI component and nothing to compete against the 8800 this was expected.

Agreed, and that's what I'm truly waiting for before I decide. I want to see what's out there.

7he ]-[0rr0r
03-29-2007, 09:09 AM
Darth Sidious...
Good... good.
Now that they can feel the market's anger, they may start reversing the price trend.
I feel like we've been getting ripped off on most component costs over the past few years; it all just crept up.

awdrifter
03-29-2007, 09:24 AM
This market will shrink even more if AMD can really pull off Fusion. By then maybe even casual gamers won't need an add-in card.

vitaminc
03-29-2007, 09:32 AM
This market will shrink even more if AMD can really pull off Fusion. By then maybe even casual gamers won't need an add-in card.

And Intel is making CPU + GPU in 2009 also.

J-Mag
03-29-2007, 09:33 AM
But this is getting sick ... GPUs are bigger, hotter, more expensive, more power-hungry ...

Why does everyone whine about this? You are thinking about it incorrectly, IMO.

On a performance-comparison basis, GPUs are getting cooler and using LESS power. Low-end cards today perform like high-end cards of yore.

The reason you see bigger, badder, more powerful GPUs sucking up more power and creating more heat is that enthusiasts are driving new market niches at the high end that just didn't exist before.

Sparky
03-29-2007, 09:39 AM
And Intel is making CPU + GPU in 2009 also.

Considering Intel's past graphics capabilities - good for MS Word and not much else, really - I don't care a whole lot. Even for AMD + ATI, an integrated CPU+GPU doesn't sound all that great. But it's a good idea for a basic grandparent PC, I suppose.

vitaminc
03-29-2007, 09:42 AM
Why does everyone whine about this? You are thinking about it incorrectly, IMO.

On a performance-comparison basis, GPUs are getting cooler and using LESS power. Low-end cards today perform like high-end cards of yore.

The reason you see bigger, badder, more powerful GPUs sucking up more power and creating more heat is that enthusiasts are driving new market niches at the high end that just didn't exist before.

With that mentality, CPUs are also getting way cooler and using way less power. An undervolted/underclocked Athlon 64 X2 can do so much more compared to a K5.

Face it. GPUs are getting much hotter and consuming way more power. No one cares about performance comparisons or performance-per-watt if the TDP is getting out of whack.

vitaminc
03-29-2007, 09:48 AM
Considering Intel's past graphics capabilities - good for MS Word and not much else, really - I don't care a whole lot. Even for AMD + ATI, an integrated CPU+GPU doesn't sound all that great. But it's a good idea for a basic grandparent PC, I suppose.

A grandparent PC, business desktop PC, living room PC - all PCs besides gaming rigs and workstations.

The AIC graphics card market is going to shrink further.

XS people won't care, because we all use and will continue to use graphics cards; we play games or fold.

And those who think the R600 will make a difference are pretty naive to think the highest-end card, costing $600, will make up the 2% unit shortfall. That's like saying new car sales dropped 2% in Q4 because Ferrari did not push out its brand-new F series in time...

J-Mag
03-29-2007, 10:11 AM
With that mentality, CPUs are also getting way cooler and using way less power. An undervolted/underclocked Athlon 64 X2 can do so much more compared to a K5.


Yeah, exactly: tech is getting better all the time and people continue to whine about it.



Face it. GPUs are getting much hotter and consuming way more power.

Sure, the high-end enthusiast GPUs use more power, but there are far more price points in the graphics card market today than there were just a few years ago. The high end of a couple of years ago sat at the same price point as the upper mid-range of today. GPU manufacturers are just responding to demand for this new niche.

vitaminc
03-29-2007, 10:24 AM
Yeah exactly, tech is getting better all the time and people continue to whine about it.

And the actual whining is reflected in the unit sales. :p


Sure, the high-end enthusiast GPUs use more power, but there are far more price points in the graphics card market today than there were just a few years ago. The high end of a couple of years ago sat at the same price point as the upper mid-range of today. GPU manufacturers are just responding to demand for this new niche.

Not true. There are more price points for video cards, but the market is shrinking. ATI ran away from the dying market and Nvidia is still adjusting. The truth of the matter is that the majority of AIC users (who represent less than 33% of total computer unit sales) don't care about the X2900/8800, but about the mainstream, affordable X2600/8600 or below, and the lowest-grade GPUs are getting squeezed hard by integrated chipsets. I am eager to see how Nvidia will change its strategy to meet this trend.

DilTech
03-29-2007, 10:27 AM
It's because the mid-range DX10 cards are nowhere to be found.

So many people are waiting and not buying anything that doesn't support DX10, and for some people $100-$200 is the absolute limit they'll spend.

That, plus ATI not having its hand on the table: there are a lot of ATI loyalists who won't buy anything but ATI, and those of them who also happened to be waiting for DX10 ended up not buying anything during 2006.

J-Mag
03-29-2007, 10:30 AM
Not true. There are more price points for video cards, but the market is shrinking.

Yes, it is true. Niches of a market can be created while the market as a whole declines. Many times companies will expand into new market niches in an attempt to boost floundering sales in a declining market.

fan
03-29-2007, 11:05 AM
Anyone who plays the latest games will have an add-in card (minus CS 1.6). Look at the ratio of gamers to non-gamers.

7he ]-[0rr0r
03-29-2007, 11:09 AM
Yeah, exactly: tech is getting better all the time and people continue to whine about it.

Sure, the high-end enthusiast GPUs use more power, but there are far more price points in the graphics card market today than there were just a few years ago. The high end of a couple of years ago sat at the same price point as the upper mid-range of today. GPU manufacturers are just responding to demand for this new niche.

So you think today's GPUs run today's games that much better than older GPUs ran the older games, enough to justify the price difference? That's the bottom line to me, and when I think back to the 9700 Pro, and the GeForce 3 before that, and the lower cost of the CPU and motherboard combos I used, I have to beg to differ.

J-Mag
03-29-2007, 11:20 AM
So you think today's GPUs run today's games that much better than older GPUs ran the older games, enough to justify the price difference?

I never said anything of the sort. I was analyzing without that variable (the game or application) because it complicates things: you then have to bring in coding efficiency and the increased perceived value of "new" games. Both of those concepts are almost impossible to define numerically and can vary from person to person.

thephenom
03-29-2007, 11:23 AM
Well, what you REALLY need to consider is how many graphics-demanding games we have seen since the summer.

No new, exciting game to drive people to upgrade is the problem here.

The 8800 is an awesome card, but it's overkill for the games we have today. There isn't a point for most people to upgrade until they NEED the extra power; without new, exciting games, most gamers will not upgrade. Getting 150fps on a new top-end card isn't necessarily an improvement over your current one giving you 100fps.

Not only are we limited by software, our LCD technology is also partly to blame: a 19" LCD still runs at 1280x1024 (you certainly don't need a pair of 8800s to drive that resolution), and only recently have 1680x1050 LCDs become more affordable. These are the same resolutions we've been gaming at for years, and most cards are capable of driving them.

24" 1920x1200 screens are getting cheaper, but until they get cheaper still, it's hard to justify a $500+ video card to drive a 12x10 or 16x10 screen.

Jimmer411
03-29-2007, 11:34 AM
Why does everyone whine about this? You are thinking about it incorrectly, IMO.

On a performance-comparison basis, GPUs are getting cooler and using LESS power. Low-end cards today perform like high-end cards of yore.

The reason you see bigger, badder, more powerful GPUs sucking up more power and creating more heat is that enthusiasts are driving new market niches at the high end that just didn't exist before.



I'm sorry, but the 7300GT and X1300 are nowhere near as powerful as my X850XT. Hell, even the X1600 in my laptop gets smoked by my X850XT in anything but SM3, and only because the X800 series doesn't support SM3 at all.

I think you mean mid-range and up, like the 7900GS, X1900 Pro, etc... :stick:

I've been waiting for this to happen. I used to upgrade my video card with every cycle - GeForce 2, 3, 4, 5 - then jumped on the ATI bandwagon with the 9800XT, X800XT and X850XT, skipping the X1800 and X1900/50. Though they're nice cards, there just isn't anything out there that needs more than what I have, especially at the price level these cards have been at. SM4 just came out 6 months ago and Nvidia is already working on SM5? When video game developers start releasing GAMES instead of long TECH DEMOS, I'll get back into the cycle. It's just sad that the only games that have held my attention for more than 30 minutes in the past 2 years are COD2 and UT2K4. All that money I've wasted buying the latest and greatest games the past few years :(

thephenom
03-29-2007, 12:09 PM
I'm sorry, but the 7300GT and X1300 are nowhere near as powerful as my X850XT. Hell, even the X1600 in my laptop gets smoked by my X850XT in anything but SM3, and only because the X800 series doesn't support SM3 at all.

I think you mean mid-range and up, like the 7900GS, X1900 Pro, etc... :stick:
You're comparing the wrong generations. Today's low-end cards match our top-end cards from two generations ago (so an X1300 or 7300GT would be faster than a 9800XT or a GeForce 4, skipping the GF5 because it sucked).

And I hope you realize that, as much as a notebook GPU is named the same as its desktop counterpart, a mobile X1600 is crippled compared to a desktop X1600. A desktop X1650XT (since the X1600XT no longer exists and the X1600 Pro is really an X1300XT) would offer similar performance to your X850XT and have SM3 support.

awdrifter
03-29-2007, 12:19 PM
Well, if Fusion (or Intel's version of it) can match a 7600GT or 7900GS in performance, it'll be enough for a lot of people.

J-Mag
03-29-2007, 12:23 PM
I'm sorry, but the 7300GT and X1300 are nowhere near as powerful as my X850XT. Hell, even the X1600 in my laptop gets smoked by my X850XT in anything but SM3, and only because the X800 series doesn't support SM3 at all.


LOL, where did I say the 7300GT is as powerful as an X850XT? You must be significantly younger than I am, because "days of yore" doesn't mean two years ago in my book.

7he ]-[0rr0r
03-29-2007, 12:55 PM
I never said anything of the sort. I was analyzing without that variable (the game or application) because it complicates things: you then have to bring in coding efficiency and the increased perceived value of "new" games. Both of those concepts are almost impossible to define numerically and can vary from person to person.

OK... OK, true enough, it is a per-person thing. I guess I'm starting to get old; I miss the days of wicked-overclocking chips at $100 on the current architecture, a motherboard that overclocked well at under $150, and a $300-400 GPU, for a setup that could run a lot of games at 1600x1200.
I spent almost as much on those three items plus an OCZ 600W PSU as I did on my entire Athlon XP setup. It hurt.

vitaminc
03-29-2007, 12:58 PM
Yes, it is true. Niches of a market can be created while the market as a whole declines. Many times companies will expand into new market niches in an attempt to boost floundering sales in a declining market.

In the case of Nvidia, they expanded into, uh, the higher-end GPU market because sales in the GPU market were slowing down...

It makes no sense.

The Lexus pricing analogy fits Nvidia better. Initially there was only the LS class; then Lexus pushed out the GS at the price of the LS and raised the price of the LS. It then pushed out the ES at the price of the GS and bumped the prices of both the GS and LS. And it created separate LX and SC lines for an SUV and a sports car.

The niche market is the workstation line of products (which is not really growing). The rest of the GPU cards are not niches but line extensions.

J-Mag
03-29-2007, 01:19 PM
The niche market is the workstation line of products (which is not really growing). The rest of the GPU cards are not niches but line extensions.

Whatever you want to call it. The bottom line is that there weren't nearly as many price points for GPUs just a few years ago, and having a wide range of price points is usually good for the consumer.

vitaminc
03-29-2007, 01:47 PM
Whatever you want to call it. The bottom line is that there weren't nearly as many price points for GPUs just a few years ago, and having a wide range of price points is usually good for the consumer.

Wrong. Compared to the highest-grade GPU you could buy 5 years ago, the price of the highest-grade GPU has more than doubled.

It would be fine if they added more price points in between the existing lines. But they are expanding price points to $600 and above. Good for the consumer? Bullcrap. Price point expansion/stacking hurts the consumer.

J-Mag
03-29-2007, 02:09 PM
Wrong. Compare to the highest grade GPU you can buy 5 years ago, the prices for the highest grade GPU has more than doubled.


You are obviously not getting it... the highest price point is irrelevant to the number of price points.

Here are the options we had in April '02:
http://img181.imageshack.us/img181/6122/q3a1600xa0.gif (http://img181.imageshack.us/my.php?image=q3a1600xa0.gif)

And here are our options now, 5 years later (and this still doesn't include many others that are available, like the 8800GTS 320MB, 7600GS, 7300s, etc.):
http://img129.imageshack.us/img129/2935/quake41600qm2.gif (http://img129.imageshack.us/my.php?image=quake41600qm2.gif)


Price point expansion/stacking hurts the consumer.

Explain to me how having more options is bad for the consumer.

vitaminc
03-29-2007, 03:30 PM
You are obviously not getting it... The highest price point is irrelevant to the number of price points.

Explain to me how having more options is bad for the consumer.

Useless graphs, taken out; they showed nothing. A Civic is not more expensive just because it has more horsepower. $100 today buys you a dual core instead of last year's single core. I could show you charts of CPU benchies; that doesn't mean jack sh1t.

Which part of PRICE POINTS is too hard for you to comprehend?!

Now, take those two graphs of yours, take out the performance, and put in the spot-market price at their respective times. I would bet you the highest-end GPU has more than doubled in price. And for this exact reason they are being investigated by the FTC for potential duopoly price fixing.

lol @ "the highest price point is irrelevant to the number of price points". Could you please tell Nvidia to add 50 more designs to fill out the $50-$800 price range, with the difference between price points shrinking to less than $10?

J-Mag
03-29-2007, 03:35 PM
Ti 4600 launch price: $399
8800GTS launch price: $449

Adjusted for inflation, they are on par. So what if the highest of the high-end options are more expensive...

Anyway, you still didn't explain to me how more options is bad for the consumer. Either way, you are getting worked up for no reason; I am done. Maybe you need a multivitamin, because that vitamin C isn't helping your mental stability. ;)
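[Editor's note: the inflation claim above roughly checks out. A minimal sketch, assuming an average US CPI inflation rate of about 2.4% per year over 2002-2007 (an illustrative figure, not an official one):]

```python
# Rough check of the "adjusted for inflation they are on par" claim.
# The 2.4% average annual inflation rate is an illustrative assumption.
ti4600_launch = 399.0   # GeForce4 Ti 4600 launch price, 2002
gts_launch = 449.0      # 8800GTS launch price, late 2006
years = 5
inflation = 0.024

# Compound the 2002 price forward five years.
adjusted = ti4600_launch * (1 + inflation) ** years
print(f"${ti4600_launch:.0f} in 2002 is about ${adjusted:.0f} in 2007 dollars")
```

At that assumed rate, $399 compounds to roughly $449, which is why the two launch prices land "on par".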

vitaminc
03-29-2007, 03:49 PM
You are obviously not getting it... The highest price point is irrelevant to the number of price points.


You are getting worked up; I am done. Maybe you need a multivitamin, because that vitamin C isn't helping your mental stability. ;)

Want a math lesson?

Highest price point = price gap x number of products.

The price gap halved over the past 5-7 years, from $100 to $50.

The number of products increased at least 3-fold.

=> The highest price point increased by at least 50%.

More options? Sure. Ridiculous prices? Hell the fkn ya. Now may the FTC order them to bring down those ridiculous pricing schemes.
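[Editor's note: that back-of-the-envelope model is easy to sanity-check. A minimal sketch under the poster's own assumptions (a uniform price gap between products; the tier counts here are purely illustrative):]

```python
# Sketch of the "highest price point = price gap x number of products" model.
# Assumes evenly spaced price tiers; the specific counts are made up.
def highest_price_point(price_gap, num_products):
    return price_gap * num_products

then_top = highest_price_point(100, 6)   # e.g. six tiers, $100 apart, years ago
now_top = highest_price_point(50, 18)    # gap halved, product count tripled
increase = now_top / then_top - 1

print(f"top price went from ${then_top} to ${now_top} ({increase:.0%} higher)")
```

Halving the gap while tripling the count multiplies the top price by 3/2, which is the 50% increase claimed above.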

vitaminc
03-29-2007, 03:53 PM
Ti 4600 launch price: $399
8800GTS launch price: $449

Adjusted for inflation, they are on par.

You are getting worked up; I am done. Maybe you need a multivitamin, because that vitamin C isn't helping your mental stability. ;)

Adjusted for inflation, Extreme Edition and FX processors have seen price declines, then.

Time will tell. The AIC market will shrink further and eventually turn into something like the sound card or physics card market. Maybe bigger, but certainly nothing like its present-day glory.

situman
03-29-2007, 04:16 PM
When a mainstream card costs $300 and an outdated last-gen card still costs around $200... this is the result. Maybe people are wising up and have stopped paying big bucks for minimal increases in the top-end cards.

justin_c
03-29-2007, 07:24 PM
I think the best way for them to make money is to make better integrated graphics. Come on, some teenager who wants to play Stalker (name any game) and just bought a brand-new computer doesn't want to fork out another $300 from his allowance for a graphics card. But if they do this, the big guns (8800GTX) will suffer. They should market more to general, less tech-savvy consumers.

Edit: maybe not Stalker; let's say HL2.

serialk11r
03-29-2007, 07:40 PM
I think the best way for them to make money is to make better integrated graphics. Come on, some teenager who wants to play Stalker (name any game) and just bought a brand-new computer doesn't want to fork out another $300 from his allowance for a graphics card. But if they do this, the big guns (8800GTX) will suffer. They should market more to general, less tech-savvy consumers.

Edit: maybe not Stalker; let's say HL2.

A lot of kids these days play games that aren't even worth playing, like Runescape, Maple Story, AdventureQuest; it's disgusting. For those games, any computer from 8 years ago will play them no worse than a fancy computer, so a lot of people don't even care about having a good computer...

Daveb2012
03-30-2007, 12:08 AM
Well, maybe if SOMEONE had their video card out like Nvidia does, we would see more competition, lower pricing, and better sales. But with people holding out for the ATI counterpart and nothing to compete against the 8800, this was expected.

Intel needs to get into the GPU business..

deathman20
03-30-2007, 04:38 AM
Intel needs to get into the GPU business..

They are in the GPU business. They have nearly half the total market, but it's all low-end solutions.

perkam
03-30-2007, 05:14 AM
They are in the GPU business. They have nearly half the total market, but it's all low-end solutions.

Yep... 40%, to be precise: http://www.channelregister.co.uk/2006/12/06/q3_06_graphics_market/

Perkam

Jimmer411
03-30-2007, 08:36 AM
LOL, where did I say the 7300GT is as powerful as an X850XT? You must be significantly younger than I am, because "days of yore" doesn't mean two years ago in my book.



Well, today's low end is the 7300GT and X1300, unless you were referring to the yet-to-be-released 8300 and X2300 as today's low end. So how many years is "days of yore" in your book? Sorry, but 2 years in GPU cycles is a lot. Ask anyone what they consider the 9800XT today. Ancient.






If I'm going to spend $500+ on a computer component, it had better be something that isn't outdated 6 months to a year from now, especially to the extent that GPUs are. My 4400+ X2 is still comparable to a C2D in everyday use and gaming and it's going on 2 years old, whereas most video cards definitely show a larger performance gap after 2 years.

Ouchy
03-30-2007, 08:50 AM
A lot of kids these days play games that aren't even worth playing, like Runescape, Maple Story, AdventureQuest; it's disgusting. For those games, any computer from 8 years ago will play them no worse than a fancy computer, so a lot of people don't even care about having a good computer...

Yeah, gamers valuing gameplay over graphics; how absolutely disgusting. I WANTZ DA EYECANDYZZZ!!1!! :rolleyes:

DTU_XaVier
03-30-2007, 09:23 AM
Well, today's low end is the 7300GT and X1300, unless you were referring to the yet-to-be-released 8300 and X2300 as today's low end. So how many years is "days of yore" in your book? Sorry, but 2 years in GPU cycles is a lot. Ask anyone what they consider the 9800XT today. Ancient.

If I'm going to spend $500+ on a computer component, it had better be something that isn't outdated 6 months to a year from now, especially to the extent that GPUs are. My 4400+ X2 is still comparable to a C2D in everyday use and gaming and it's going on 2 years old, whereas most video cards definitely show a larger performance gap after 2 years.
The funny thing is, the 9800XT isn't 2 years old... it's 4 ;)
And 2 years old isn't long at all... that's only the 7800 series...

J-Mag
03-30-2007, 09:40 AM
Well, today's low end is the 7300GT and X1300, unless you were referring to the yet-to-be-released 8300 and X2300 as today's low end. So how many years is "days of yore" in your book? Sorry, but 2 years in GPU cycles is a lot. Ask anyone what they consider the 9800XT today. Ancient.


Essentially, I was reminiscing about my years building PCs (the first rig I built personally was a 386) and how much more value I get for my money these days. I was primarily looking at the days when 3D GPUs became standard (right after 3dfx left the market, IMO).



If I'm going to spend $500+ on a computer component, it had better be something that isn't outdated 6 months to a year from now, especially to the extent that GPUs are.


I'm not quite sure how you plan on doing that; it's impossible to have a GPU that isn't outdated in 6 months. Actually, I just sold my 8800GTX in preparation for new high-end GPUs hitting the market, because I know its value will drop.



My 4400+ X2 is still comparable to a C2D in everyday use and gaming and it's going on 2 years old, whereas most video cards definitely show a larger performance gap after 2 years.

It really depends on what you're doing: if encoding, rendering, encrypting, compressing, or scientific-type apps are your primary use for your CPU, it can become outdated just as fast as GPUs do in their respective game performance.

However, in my personal circumstance, I'd agree...