
View Full Version : Why are the top of the line ATi cards so cheap??



TheCarLessDriven
08-18-2009, 01:57 PM
Like this one....http://www.newegg.com/Product/Product.aspx?Item=N82E16814102847

native HDMI, Display Port, 2gb of GDDR5 RAM, insane core clock speed. none of the Nvidia cards even have native HDMI or Display port or GDDR5, yet my GTX 285 2gb cost $400 from newegg. Only way they best the ATi is in memory speed and they have 512bit cards. (Why does ATi never make 512 bit cards?)

So besides the 256bit why are they so cheap?

zanzabar
08-18-2009, 02:11 PM
ati has small chips so they are cheaper to make, and the 256bit bus + gddr5 gives 512bit gddr3 bandwidth with half the chips, so that saves too. amd also wants volume. then the ati dx11 cards are out in a month (but the normal 4890s were around $150-160 a few weeks ago so who knows how it's priced.)

and for a side note, don't buy something with display port, it will mess up the HDCP flags on dvi/hdmi, and i don't think display port is a good idea or viable format for non-laptop use
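The "256bit gddr5 = 512bit gddr3 bandwidth" claim follows straight from the transfer rates: GDDR5 moves 4 bits per pin per base clock cycle, GDDR3 moves 2. A quick sanity-check sketch (the 900 MHz base clock is an illustrative round number, not any particular card's spec):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# GDDR5 is quad-pumped (4 transfers per base clock), GDDR3 is double-pumped (2).

def bandwidth_gb_s(bus_bits: int, base_clock_mhz: float, transfers_per_clock: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    bytes_per_transfer = bus_bits / 8
    transfers_per_s = base_clock_mhz * 1e6 * transfers_per_clock
    return bytes_per_transfer * transfers_per_s / 1e9

# Same (illustrative) 900 MHz base clock on both:
gddr5_256bit = bandwidth_gb_s(256, 900, 4)   # -> 115.2 GB/s
gddr3_512bit = bandwidth_gb_s(512, 900, 2)   # -> 115.2 GB/s
print(gddr5_256bit, gddr3_512bit)
```

So half the bus width (and half the memory chips) delivers the same peak bandwidth, which is exactly the cost saving being described.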

TheCarLessDriven
08-18-2009, 02:13 PM
Well, eventually DisplayPort is going to take over from DVI / HDMI, so it should get more support in the future.

Thanks for the info.

SoulsCollective
08-18-2009, 03:53 PM
Display Port should die in a pit of burning fire.

Overfiend
08-18-2009, 11:06 PM
Maybe because they will be at the top for no longer than 3 weeks now? ;)

[XC] Oj101
08-19-2009, 02:19 AM
(Why does ATi never make 512 bit cards?)

My HD2900XTs are 512 bit ;)

r1rhyder
08-19-2009, 04:06 AM
It's because nvidia cards are so overpriced, it makes ati cards appear cheap.

labs23
08-19-2009, 04:45 AM
It's because nvidia cards are so overpriced, it makes ati cards appear cheap.

Bullseye! LOL. Agreed.:D

But, things are about to change to ATI's side, coming Sept.

Splave
08-19-2009, 04:49 AM
+2
285 gtx prices are ridiculous imo

larrabee
08-19-2009, 08:40 AM
nVidia charges that much, merely because they can. If they didn't have the performance crown, they would be priced much more competitively. Lots of people must still be buying their cards, although I hear ATI gained significant growth in the discrete graphics area recently.

WrigleyVillain
08-19-2009, 08:58 AM
It's because nvidia cards are so overpriced, it makes ati cards appear cheap.

+3. A pity $250 for a graphics card is considered "cheap".

texasreefer
08-19-2009, 11:40 AM
My HD2900XTs are 512 bit ;)

We're talking bandwidth here, not memory. 2900s are not 512 bit, are you kidding?

demonkevy666
08-19-2009, 01:41 PM
+3. A pity $250 for a graphics card is considered "cheap".

lol
2gbs 4890 is $250
while a
2gbs 285 gtx is $389

ELItheICEman
08-19-2009, 01:46 PM
We're talking bandwidth here, not memory. 2900s are not 512 bit, are you kidding?

Yes they are. I had one myself. 512-bit GDDR3.

The reason they don't usually use it is because 512-bit memory chips are :banana::banana::banana::banana:ing expensive.

ownage
08-19-2009, 01:48 PM
We're talking bandwidth here, not memory. 2900s are not 512 bit, are you kidding?

PWNED! :D
They are/were 512bit!

zanzabar
08-19-2009, 02:00 PM
nVidia charges that much, merely because they can. If they didn't have the performance crown, they would be priced much more competitively. Lots of people must still be buying their cards, although I hear ATI gained significant growth in the discrete graphics area recently.

the 4890 is the fastest; it will win factory oc v factory oc, and 4x 4890 is the king of 3dmark. only suckers think that NV is winning in performance

texasreefer
08-20-2009, 07:56 AM
PWNED! :D
They are/were 512bit!

Sucks to be me... my bad, OJ

perkam
08-20-2009, 08:04 AM
HD 5870 @ $299 will see an interesting drop in GTX 285 price.

Perkam

_G_
08-20-2009, 06:47 PM
Like this one....http://www.newegg.com/Product/Product.aspx?Item=N82E16814102847

Interesting, I have seen some spy pics of the backplane on the new 5xxx series and it looks identical to that one (including the display port lol). I wonder if that's the reference pcb for the 5 series?

zanzabar
08-20-2009, 07:47 PM
Interesting, I have seen some spy pics of the backplane on the new 5xxx series and it looks identical to that one (including the display port lol). I wonder if that's the reference pcb for the 5 series?

sapphire has used that IO panel on their custom cards for a year or 2 now, so the one that was shown was probably sapphire's. i will personally not be buying anything with DP for a non-laptop, and then only as an internal connector

zsamz_
08-20-2009, 11:54 PM
just be thankful ati is makin great cards or we'd be payin $400 for a renamed 8800gt:rofl:

Lightning98
08-21-2009, 01:12 AM
Totally agree... if there wasn't any competition we'd still be paying a premium for last-gen cards... just remember the huge price drop nVidia was forced to make when the 48xx series was released.
These days you can get 2x4870's for about 570$ (Europe), which means that's about 200$ is the US :D

So as far as i'm concerned, i'm totally not complaining.

Stijndp
08-21-2009, 01:28 AM
I've been using NVIDIA since the gf256 and I stayed with them until I saw the 4870. I just couldn't find reasons anymore to stay with NVIDIA. Now going on to 2x4890. I used to buy simple 6800ultra's at €650, but thank god ATI/AMD came out with good cards, and NVIDIA's pricing is /fail now.

I don't know about the USA, but over here NVIDIA is making zero effort with their upcoming celebration of 10 years of the GeForce brand (31st august). If I were a manufacturer who wasn't in deep sh*t, I'd be throwing competitions all over the world.

Security
08-21-2009, 07:13 AM
These days you can get 2x4870's for about 570$ (Europe), which means that's about 200$ is the US :D.

What?

3Z3VH
08-21-2009, 07:50 AM
the 4890 is the fastest; it will win factory oc v factory oc, and 4x 4890 is the king of 3dmark. only suckers think that NV is winning in performance

I guess it all comes down to what you want in a card... good benchmarks, or good gameplay :shrug:

Personally, of the five ATI cards I have owned, all five had texture clipping issues, and improperly drawn shadows. Of the seven GeForce cards I have had, only one had issues and it was due to faulty memory on the card. You are correct, the ATI cards all outperformed their Nvidia counterpart at the time in synthetic benchmarks (which is why I bought them instead of the Nvidia at the time) but when it comes to gameplay, ATI still needs driver work. I have simply been burned with less than stellar video anomalies by too many ATI cards to trust them anymore.

SoulsCollective
08-21-2009, 08:22 AM
I guess it all comes down to what you want in a card... good benchmarks, or good gameplay :shrug:

Personally, of the five ATI cards I have owned, all five had texture clipping issues, and improperly drawn shadows. Of the seven GeForce cards I have had, only one had issues and it was due to faulty memory on the card. You are correct, the ATI cards all outperformed their Nvidia counterpart at the time in synthetic benchmarks (which is why I bought them instead of the Nvidia at the time) but when it comes to gameplay, ATI still needs driver work. I have simply been burned with less than stellar video anomalies by too many ATI cards to trust them anymore.Personally, of the five nVidia cards I've owned, three have died on me, with accompanying image quality issues and glitches, the other had a cooler that was louder than my desktop fan (no prizes for guessing the series), and the last is still going strong. Of the three ATi cards I've owned, all have OC'd extremely well and none have shown any problems, except for a few isolated specific driver package issues which were fixed in the next release.

Be that as it may, I was recommending nVidia cards when the 4-series, 6-series and 8-series were king (and briefly the 260 216 when its price/performance was excellent), and ATi cards in the days of the Rage, 6-series, 9-series and 1900-series, and now the 4-series. Today I wouldn't dream of recommending nVidia. All anecdotes like yours and mine will do is confuse the issue with arbitrary data points.

Particle
08-21-2009, 08:39 AM
ATI cards are inexpensive for two reasons:

- The dies are physically smaller, which is a principal determinant of unit cost for silicon. They get more done with less silicon.

- The nVidia brand is bigger. Overall, I'd expect nVidia to do well even if their products were obviously inferior, and ATI and nVidia are pretty close with the current batch of hardware. It's just like how people kept buying Intel CPUs during the Athlon 64 vs P4 days even when they were inferior. A bigger brand can charge more, and we see this in both the CPU and GPU arenas: an Intel product offering similar performance to an AMD part usually costs more, and likewise with nVidia versus ATI GPUs.

HuffPCair
08-21-2009, 08:45 AM
ATI cards are inexpensive for two reasons:

- The dies are physically smaller, which is a principal determinant of unit cost for silicon. They get more done with less silicon.

- The nVidia brand is bigger. Overall, I'd expect nVidia to do well even if their products were obviously inferior, and ATI and nVidia are pretty close with the current batch of hardware. It's just like how people kept buying Intel CPUs during the Athlon 64 vs P4 days even when they were inferior. A bigger brand can charge more, and we see this in both the CPU and GPU arenas: an Intel product offering similar performance to an AMD part usually costs more, and likewise with nVidia versus ATI GPUs.

Perfectly said.

Carfax
08-21-2009, 09:32 AM
One thing I've always noticed, is that if you go to Newegg's open box section, most of the video cards there will be ATI.. :shrug:

dum
08-21-2009, 05:40 PM
Display Port should die in a pit of burning fire.

agreed

dum
08-21-2009, 05:43 PM
+3. A pity $250 for a graphics card is considered "cheap".

well, they were over $500 ten years ago, so if you're young you might not think so. I swear Voodoo's last piece of unsupported crap was 7 bills

T2k
08-21-2009, 06:57 PM
One thing I've always noticed, is that if you go to Newegg's open box section, most of the video cards there will be ATI.. :shrug:

Because nobody buys NV nowadays so no chance for "changed my mind" cards coming back... :rofl:

BTW I still believe the best card on the market is mine, this dual-chip 4850 X2 2GB, which I bought for ~$290 (late last year or early this year)... too bad they don't make it anymore (it ate up all the 4870 X2 sales! :D)

It will serve me fine until the end of this year - I might get a good 5800 or even an X2 around Christmas...:cool:

NaMcO
08-22-2009, 12:43 AM
One thing I've always noticed, is that if you go to Newegg's open box section, most of the video cards there will be ATI.. :shrug:

Smart people tend to run away from trouble...

Armitage
08-22-2009, 04:16 PM
Half the reason I buy nVidia is because of eVGA.

tiro_uspsss
08-22-2009, 04:44 PM
ATi cards are cheaper because...

they want u to buy them

pretty smart business move, eh?

:D

B.E.E.F.
08-22-2009, 08:19 PM
ATi cards are cheaper because...

they want u to buy them

Because they're cheaper to produce. Smaller dies + less complex manufacturing = more good chips per wafer = less cost per individual chip.
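The "more good chips per wafer" arithmetic can be sketched with the standard dies-per-wafer approximation and a Poisson yield model. The die areas, wafer cost and defect density below are made-up round numbers for illustration, not actual RV770 or GT200 figures:

```python
import math

def gross_dies(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Standard approximation for candidate dies per wafer (accounts for edge loss)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: fraction of dies that land with zero defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

WAFER_COST = 5000.0   # illustrative cost of one 300 mm wafer
D0 = 0.002            # illustrative defect density per mm^2

for area in (260, 580):  # a "small" die vs a "big" die (round numbers)
    good = gross_dies(300, area) * yield_fraction(area, D0)
    print(f"{area} mm^2 die: ~{good:.0f} good dies/wafer, ~${WAFER_COST / good:.0f} per good die")
```

The smaller die wins twice: more candidates fit on the wafer, and each candidate is more likely to be defect-free, so cost per good chip drops sharply.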

strange|ife
08-22-2009, 09:18 PM
dunno what cheap is to you

but 295's are still hella pricey for my liking.

and the 4870x2 still ain't cheap:ROTF:

largon
08-23-2009, 01:35 AM
Display Port should die in a pit of burning fire.

Why? HDMI = DVI repackaged. Legacy technology with a bundle of useless features like audio channels and... nothing else, really. Can't ignore the fact DP is technically more advanced and more future-proof than HDMI. So, again, which is it that should die?

The reason they don't usually use it is because 512-bit memory chips are :banana::banana::banana::banana:ing expensive.

Of course 16 memory chips cost more than just 8. :stick:

But then again, with 16 chips (512bit bus) they can use lower-density chips to reach the same capacity as an 8-chip solution. More lower-density chips means cost/chip is lower, so the total is not actually :banana::banana::banana::banana:ing more expensive..

SoulsCollective
08-23-2009, 02:35 AM
Why?
HDMI = DVI repackaged.
Legacy technology with bundle of useless features like audio channels and... Nothing else, really. Can't ignore the fact DP is technically more advanced, future proof than HDMI. So, again, which is it that should die?
As I said - DisplayPort.

HDMI has more than adequate "bandwidth" to carry digital data at resolutions even higher than what we consider "extreme" now (ie. 2560), and I didn't think anyone would call the ability to carry full 7.1 audio data over the same cable "useless" (DisplayPort has the same functionality anyway - would you call it useless in that context?). Furthermore, HDMI supports xvYCC, which we should all be using anyway.

But this isn't about HDMI vs. DP, this is about DP itself - the supporters are advocating it as "HDMI for your PC", which sounds neat, but the problem is the DRM - the number of consumer-level displays which are HDCP-certified is tiny compared to the number of otherwise perfectly good displays that will be needlessly and pointlessly rendered useless if DP gains wide acceptance. If Big Content (R) and Big Beige Boxes (TM) get together and enforce this standard on us, we all lose.

The point is, it isn't needed, it isn't even desirable when ordinary DVI works just fine for even the highest resolutions available today, and really it has no purpose other than to restrict consumer rights and choice.
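For what it's worth, the "adequate bandwidth" claim checks out on the back of an envelope (the ~20% blanking overhead below is a rough assumption, not a spec figure):

```python
def video_gbps(h: int, v: int, refresh_hz: int,
               bits_per_pixel: int = 24, blanking_overhead: float = 0.20) -> float:
    """Approximate uncompressed video bandwidth in Gbit/s for a given mode."""
    active_bits_per_s = h * v * refresh_hz * bits_per_pixel
    return active_bits_per_s * (1 + blanking_overhead) / 1e9

need = video_gbps(2560, 1600, 60)
print(f"2560x1600 @ 60 Hz needs roughly {need:.1f} Gbit/s")
```

That lands around 7 Gbit/s, comfortably inside what HDMI 1.3 (roughly 8.16 Gbit/s of video payload after 8b/10b coding) and dual-link DVI can carry, which is the point being made: today's resolutions don't exceed the existing connectors.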

largon
08-23-2009, 04:16 AM
7.1 audio - most video cards don't support it. None from nV, eg.
xvYCC - basically nothing supports it. Including Bluray.
DRM - already on the media, nothing can prevent it from coming to devices.
DP & old device incompatibility - adapters do exist.
consumer rights and choice - there's always a way.

SoulsCollective
08-23-2009, 04:36 AM
7.1 audio - most video cards don't support it. None from nV, eg.

I don't understand why this is even an issue. Audio over the same cable as video is a good thing, even if it's not being used to its full potential yet. Yes?

xvYCC - basically nothing supports it. Including Bluray.

Indeed - but again, support for what is a major step up from our current sRGB colour space is a good thing, and as above, you were arguing that DP is a technically superior and more future-proof connection - it would be nice if it supported future standards, rather than just having a theoretical bandwidth advantage.

DRM - already on the media, nothing can prevent it from coming to devices.

Yes, it can - if Apple and the other owners of content distribution services and stores aimed at PC users are convinced that DRM-infested media will not play well to their market because HDCP-compliant connections like DP aren't common amongst PC users, we don't get DRM-infested media. Case in point: Apple at the end of last year, when they realised that perhaps one in one hundred Macbook owners used the DisplayPort connection on their new laptops, and subsequently eased up on HDCP DRM.

DP & old device incompatibility - adapters do exist.

What? I didn't mention compatibility issues - because you're right, adapters exist.

consumer rights and choice - there's always a way.

Which would you rather have - hackery and fiddling, or DRM-free media? Sure, there might be a way, but that's not a good argument for adopting a cable which causes issues in the first place, when the limits of our current standards haven't been reached yet.

ELItheICEman
08-24-2009, 06:22 AM
Of course 16 memory chips cost more than just 8. :stick:

But then again, with 16 chips (512bit bus) they can use lower-density chips to reach the same capacity as an 8-chip solution. More lower-density chips means cost/chip is lower, so the total is not actually :banana::banana::banana::banana:ing more expensive..

I'd have to think they'd use eight chips for a 512MB card and sixteen chips for a 1GB card - eight on the front and eight on the back. Maybe I'm wrong, but that seems to make a ton more sense to me - that way, to differentiate between the two variations in production, all you'd have to do is solder memory onto the front of the card for the 512MB cards instead of both sides, as you'd do for the 1GB cards. Otherwise you'd need two entirely different PCBs designed, and that's extra project time and too much trouble from a design standpoint. But that's just my thoughts.

Anyway, without looking at the layout I really don't know, but I'd have to assume they use 512-bit memory chips instead of twice as many 256-bit chips. It's a better (albeit more expensive) solution.
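A note on chip "bit widths": GDDR3/GDDR5 parts of this era are 32 bits wide per chip, so a card's bus width is set by how many chips sit on the bus, and its capacity by the chip density. A quick sketch of the trade-off being debated above:

```python
# GDDR3/GDDR5 chips are x32 parts: bus width = chip count x 32 bits,
# capacity = chip count x per-chip density.
CHIP_WIDTH_BITS = 32

def config(bus_bits: int, chip_density_mbit: int) -> tuple[int, int]:
    """Return (chip count, total capacity in MB) for a given bus and chip density."""
    chips = bus_bits // CHIP_WIDTH_BITS
    capacity_mb = chips * chip_density_mbit // 8
    return chips, capacity_mb

print(config(512, 512))    # 16 x 512 Mbit chips -> (16, 1024): 1 GB on a 512-bit bus
print(config(256, 1024))   # 8 x 1 Gbit chips    -> (8, 1024):  1 GB on a 256-bit bus
```

So the same 1 GB can be reached with sixteen cheap low-density chips on a wide bus or eight denser chips on a narrow one; the wide bus costs more in PCB routing and pin count rather than in the raw silicon width of any single chip.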

JonnyG
08-24-2009, 01:33 PM
The prices of the gtx275 (£150) / gtx285 (£230+) & gtx295 (£330+) are all jokes compared to the 4850 (sub £100) / 4870 (£105+) / 4890 (£150+) / 4870x2 (£230+). On price/performance ATI win hands down v nvidia. I'm surprised nvidia are still managing to charge as much. I have nothing against nvidia, I actually prefer having an nvidia card in my system, but you cannot justify the prices.

I was actually surprised yesterday to see a gtx285 at sub £200 at microdirect. Of course it was a typo as it's priced at £247 today *sigh*

Serra
08-24-2009, 01:51 PM
Another reason for more expensive nVidia cards is the additional features and R&D that go in to them. With nVidia, their cards are not just about video performance, they also put R&D money into developing / maintaining / debugging CUDA and such. I don't know exactly how much they fund that, but considering the market that it opens up to them (ie. scientific computing) I would expect it to be a fair amount.

Probably not the #1 reason I expect, but another factor for sure.

zanzabar
08-24-2009, 02:51 PM
there is openCL on the ati cards, but no one has updated the folding clients for it since ati (which only uses openCL) doesn't pay to subsidize software. there are real shader-based transcoders that work on both, and ati is now faster at it when comparing a 260 to a 4770. i would expect openCL to take over now that NV supports it, the spec is finalized, and it can work on both x86 and the gpu

ive also seen more use for the cell than the gpgpu, since the cell is cheap and it can do more kinds of workloads. u need nested fractals for the gpgpu stuff right now to get an advantage, so there are limitations

Serra
08-24-2009, 04:50 PM
ive also seen more use for the cell than the gpgpu since the cell is cheap and it can do more kinds of workloads u need nested fractals for the gpgpu stuff right now to get an advantage so there are limitations

What do you do, may I ask, that leads you to believe you are able to support this claim? Just how many of these installations have you actually seen? nVidia has the entire Tesla line and backing from the likes of Supermicro. The cell is an awesome chip, but unless you go to a PS3 farm the hardware costs are sky-high and you lose the price:performance ratio. Even the PS3 farms lose when you take into account the labor one has to put into setting it up and supporting it (as, obviously, there is no-one who will offer you any kind of technical support).

A Tesla-based computer, on the other hand, has nVidia support from day 1. That's why many hospitals use them in imaging machines, and why so many well-funded scientific projects use them (though the ones who have tons of free grad students will, of course, have the grad students put together the PS3 networks for them - provided their grads are knowledgeable in computers).

Of course there are limitations of CUDA, but that does not change the fact that there is a large market for it in industry and that nVidia has likely dumped a ton of money into R&D and promotion of it.


Again, I'm not saying that it alone is why prices are so high, but it is a factor. This post has largely been OT and I apologize to anyone I have derailed.

zanzabar
08-24-2009, 06:10 PM
ive seen 4 cell-based setups, 3 with ps3s and 1 with ibm boxes, and 1 tesla; the cells were for space work, the tesla for proteins. the cell is more versatile from my understanding of it, it has 6 cores per chip and is good with floats and brute force. the gpgpu systems use a lot of cores and need very specific uses where they can use fractals to simulate an assembly.

im not saying that one is better all the time, it's a matter of what u are doing. one of my favorite demos that i was shown was from MS, where they have an ati streamGL setup with an rv670 that runs through an sql database as fast as it can get data, and it's multiple times faster than the cpu alone.

i do like the folding and crunching clients for cuda, but i dont see the point in buying just for that (at least for most people) since the power consumption doesnt seem worth it. and i would think there would have been an openCL client if NV didnt push the cuda clients out with huge support and contributions as a "go look what we do" marketing move


also its clearly visible that i dont like NV (they have killed parts in my computers, shipped defective parts that i had to replace, have bad biz practices), but im also not a real fan of the gpgpu (at least right now, especially for gaming). it just seems like a huge marketing scam when i cant get a game that uses more than 2 cores, but all of a sudden we need more gpu coding to take load off the cpu even though it doesnt do much anymore and sits idle. i do try to stay fair though for the gpus

TheCarLessDriven
08-27-2009, 03:52 PM
lol at what this turned into.

railmeat
08-27-2009, 10:09 PM
my friend bought his sapphire 4870x2 last year for $500, sold it last week for $275, put up another $225 and grabbed a single-pcb 295 for $500... why? because it's smoother, drivers are damn near flawless next to the 4870x2. he was having different weird issues in games. he said he could not imagine going xfire over quad sli. some ppl want hassle-free gaming and will pay for it. plus the 295 is smoldering fast as a single-slot watercooled solution.

mcmeat51
08-28-2009, 01:24 AM
I've been using NVIDIA since the gf256 and I stayed with them until I saw the 4870. I just couldn't find reasons anymore to stay with NVIDIA. Now going on to 2x4890. I used to buy simple 6800ultra's at €650, but thank god ATI/AMD came out with good cards, and NVIDIA's pricing is /fail now.

I don't know about the USA, but over here NVIDIA is making zero effort with their upcoming celebration of 10 years of the GeForce brand (31st august). If I were a manufacturer who wasn't in deep sh*t, I'd be throwing competitions all over the world.

Me too. SOOO glad I jumped ship. my last nVidia cards sucked ass


I guess it all comes down to what you want in a card... good benchmarks, or good gameplay

Personally, of the five ATI cards I have owned, all five had texture clipping issues, and improperly drawn shadows. Of the seven GeForce cards I have had, only one had issues and it was due to faulty memory on the card. You are correct, the ATI cards all outperformed their Nvidia counterpart at the time in synthetic benchmarks (which is why I bought them instead of the Nvidia at the time) but when it comes to gameplay, ATI still needs driver work. I have simply been burned with less than stellar video anomalies by too many ATI cards to trust them anymore.

I have found ati drivers brilliant, apart from initially having to spend a day getting crossfire working. I found nvidia drivers terribly unoptimised.