
View Full Version : The ATI Radeon 5XXX Thread




saaya
09-03-2009, 10:26 PM
http://www.semiaccurate.com/2009/09/03/and-cypress-shader-count/

and there's a rumor going around that the 5870 will cost 100$ less than a 295, which would be 399$ then? that's pretty high... either yields are bad or ati wants to cash in as long as nvidia can't compete... i have to say, i really don't like what i'm hearing about prices now... it really spoils the anticipation of the new 800 series :/

i really hope those are only the launch prices and they will drop notably after a few weeks...

EDIT: and about gt300... it should easily beat a 5870, i don't think anybody doubts that, the question is more about how much it'll cost... gt300 will definitely bring the perf crown back to nvidia once it comes out, but that seems to be a while away and it won't be cheap...

i can't help but feel things actually got worse this time around... ati won't offer great price/perf anymore because nvidia has even less to compete with their mainstream parts, and the high end will be nvidia dominated as before, but prices will probably be even higher than before... sigh...

Utnorris
09-03-2009, 10:35 PM
Very interesting, but until it debuts it's all speculation. Even if the HD5870 comes in at $400, that will still be a good deal if it beats the GTX295 in performance, which means Nvidia will have no choice but to drop the price of the GTX295 down to around $350, leading the rest of the GTX series to follow suit. That's far from bad news from a consumer's standpoint.

onethreehill
09-03-2009, 10:38 PM
Repost :)
http://www.xtremesystems.org/forums/showpost.php?p=3994346&postcount=418

labs23
09-03-2009, 10:55 PM
-Very interesting, but until it debuts it's all speculation.

-Even if the HD5870 comes in at $400, that will still be a good deal if it beats the GTX295 in performance, which means Nvidia will have no choice but to drop the price of the GTX295 down to around $350, leading the rest of the GTX series to follow suit. That's far from bad news from a consumer's standpoint.

-Exactly, something not to be taken as is until Sept. 10.

-I'm gonna love that http://www.operationsports.com/forums/images/smilies/graemlins/woot.gif. Should still be great for non-early DX11 adopters, ie those who wait for DX11 competition.

saaya
09-03-2009, 11:16 PM
Repost :)
http://www.xtremesystems.org/forums/showpost.php?p=3994346&postcount=418

i think this deserves its own thread?
it's pretty unlikely that charlie got those numbers wrong so close to the launch...

this can only mean 2 things though... either going from 10.1 to 11 takes a lot more transistors than we thought, or those gpus actually contain some extra shader blocks and other blocks... those could be for redundancy, since it's a rather big chip and 40nm wafers are still somewhat the swiss cheese of semiconductors :D i'm curious though if those redundant parts can be enabled in the near future when yields improve, and we will see a 5890 with actually more logic and not just higher clocks...

about pricing:
GTX295 500$
GTX285 350$
GTX275 225$
GTX260 175$
GTS250 125$

4870x2 400$
4890 200$
4870 175$
4850 125$
4770 125$

if the 5870 is 399$ and it beats the 295, veeery likely :p:, then what about the 5850?
it definitely beats the 285, let's assume for 100$ less as well? then that's 299$... quite a jump from 5850 to 5870... but it's possible...
now with the 5850 at 299$, what happens to the 4800 series? do they get replaced by juniper aka 5830 at about the same price?

while ati could definitely position their cards that way, i doubt they will... it wouldn't generate a lot of drive for people to buy new cards...
make it 349, 279, 200, or even 299, 249 and 175 instead, and people will suddenly feel a big urge to upgrade :D

jaredpace
09-03-2009, 11:27 PM
I don't think an HD5870 is going to be faster than a GTX295 on average - perhaps only in a few instances. Computerbase.de has some average performance ratings of the cards relative to the GTX295 based on averages from all the listed benchmarks. I expect 5870 to be 95-98% and 5850 to be 80-85% of a GTX295's overall performance (at this resolution). Less than one week away!

http://i31.tinypic.com/i2sl6t.jpg
http://i28.tinypic.com/j0f7s4.jpg
Source:
http://www.computerbase.de/artikel/hardware/grafikkarten/2009/test_13_grafikkarten/18/#abschnitt_performancerating_qualitaet

A B3D member made this post with a list of his predictions for the future card's performance in each game. He calls it his "5870 fake review" :rofl: It can be seen here:
http://img196.imageshack.us/img196/5589/20090904065937.jpg
Source:
http://forum.beyond3d.com/showpost.php?p=1329964&postcount=2223

LordEC911
09-03-2009, 11:39 PM
i think this deserves its own thread?
it's pretty unlikely that charlie got those numbers wrong so close to the launch...

this can only mean 2 things though... either going from 10.1 to 11 takes a lot more transistors than we thought, or those gpus actually contain some extra shader blocks and other blocks... those could be for redundancy, since it's a rather big chip and 40nm wafers are still somewhat the swiss cheese of semiconductors :D i'm curious though if those redundant parts can be enabled in the near future when yields improve, and we will see a 5890 with actually more logic and not just higher clocks...

Well, it only took him 2-3 weeks to get the codenames right...
Also, what are you talking about with the die size/transistor count?
Nothing seems too out of line.

Also, redundant parts of the chip are never going to be "enabled"; it doesn't work that way, unfortunately.

flopper
09-03-2009, 11:58 PM
A B3D member made this post with a list of his predictions for the future card's performance in each game. He calls it his "5870 fake review" :rofl: It can be seen here:
http://img196.imageshack.us/img196/5589/20090904065937.jpg
Source:
http://forum.beyond3d.com/showpost.php?p=1329964&postcount=2223

His predictions are on the low side too.
7 days to go

570091D
09-04-2009, 12:13 AM
does anyone else remember rumblings from investors after the 48xx launch? "we're pricing this card too low"? i believe this pricing scheme is aimed at restoring investor confidence, and truly, if amd has a superior product they should be paid for it. that's business.

saaya
09-04-2009, 12:14 AM
charlie had the codenames wrong? just hemlock afaik, and that's probably because he listened to sylvie :D

jared, why would the 5870 be that much faster than the 5850? those numbers look weird, i can't figure out what he based his perf boost over the 4890 on...
what shader count did he expect when he made that graph? it doesn't make any sense that with about double the hw he only predicts a 10% boost in some games? :eh:

and about the 5870 beating the gtx295...
4870 to 4870x2 scaling goes from 55% of a 295 up to 90% of a 295... considering that a 5870 appears to have the same hw specs as a 4870x2 but isn't castrated by having it spread over 2 chips, with no overhead, i'd expect a 5870 to be around 110% of a 295, like a 4870x2 would be if it didn't have the overhead caused by being based on 2 chips... and that's assuming the 5870 comes clocked at 750mhz core and 3600mhz mem... according to the rumored specs it looks like 800-900 clocks for the gpu and 5000mhz for the mem... so it will DEFINITELY beat the 295...
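saaya's back-of-the-envelope estimate can be written out explicitly. This is just the post's own arithmetic; the CrossFire overhead figure is an assumed number chosen to be consistent with the quoted ~90%-of-a-GTX295 scaling, not a measured value, and all the inputs are forum rumors:

```python
# Rough estimate of 5870 vs GTX295 from the rumored figures above
# (none of these are confirmed specs).

x2_vs_295 = 0.90      # 4870X2 at ~90% of a GTX295 in well-scaling games
cf_overhead = 0.18    # assumed CrossFire/AFR loss a single die would avoid

# Removing the dual-GPU overhead from the 4870X2's result:
single_die = x2_vs_295 / (1 - cf_overhead)   # ~1.10, i.e. ~110% of a 295

# Optional extra factor for the rumored 800-900MHz core vs the 750MHz baseline,
# assuming performance scales linearly with core clock (optimistic):
clock_bump = 850 / 750

print(f"overhead-free estimate: {single_die:.0%}")            # → 110%
print(f"with rumored clocks:    {single_die * clock_bump:.0%}")
```

The second number is an upper bound, since memory bandwidth does not rise with the core clock.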

Smartidiot89
09-04-2009, 12:16 AM
Charlie says that Juniper, at 180mm2, will be 800 shaders.

Will there be a "CypressLE" or something, the way the HD4830 (RV770LE) had 160 shaders disabled, leaving 640 shaders instead of 800?

Lightman
09-04-2009, 12:22 AM
I don't think an HD5870 is going to be faster than a GTX295 on average - perhaps only in a few instances. Computerbase.de has some average performance ratings of the cards relative to the GTX295 based on averages from all the listed benchmarks. I expect 5870 to be 95-98% and 5850 to be 80-85% of a GTX295's overall performance (at this resolution). Less than one week away!

...

I think equaling the GTX295 shouldn't be a problem for a 5870 specced as the rumors say. Remember that for the first time in recent ATi history we are getting not only 2x the shader count and 2x the RBEs but also 2x the ROPs. All this on one die instead of 2 dies. On top of that, a better scheduler plus other tweaks will mean much better scaling compared to the older 4870X2 (no CF overhead). The only limiting factor can be memory bandwidth in some cases.

I'm looking forward to these new cards! :D


EDIT:
saaya you're typing too fast! We made the same point in two different ways and two different posts :p

saaya
09-04-2009, 12:24 AM
does anyone else remember rumblings from investors after the 48xx launch? "we're pricing this card too low"? i believe this pricing scheme is aimed at restoring investor confidence, and truly, if amd has a superior product they should be paid for it. that's business.
and those investors know more about running amd than amd does?
i'm not saying amd has been managed perfectly lately, FAR from it, but it's ridiculous to think investors and analysts could do better...

you have to balance out demand, price per unit and units sold... investors aren't happy that amd is losing money, but how retarded do you have to be to think the solution is to simply increase prices? :slap:

seriously :D it's like they have no concept of commerce at all...

sure, let them try how high they can price their cards and still sell some...
but i'm pretty sure if they price them as high as rumored lately, they will sell a lot less and overall make less profit than they would if they'd sell them cheaper...

if you increase the price by 10% and lose 10% of your customers as a result because the pricing is too high for them, you actually make less money than selling at the original price... if you reduce the price by 10%, in most cases you end up with a more than 10% increase in volume, sometimes up to 30%... that's why the real money is made selling mainstream and entry level parts, aka juniper and below, at tight margins, so why would ati price a highend part, which either way won't make them rich, at such high levels?

it has nothing to do with making money; if they price it that high it's probably because it costs them that much to make, and they'd lose money if they'd sell it for less...
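The volume/price tradeoff in the post above is easy to sanity-check with a toy revenue calculation (the baseline price and unit count are made-up illustrative numbers, not AMD data):

```python
# Toy check of the elasticity argument: a +10% price hike that loses 10% of
# buyers earns less than the baseline, while a -10% cut that gains 30% volume
# earns more.

def revenue(price, units):
    return price * units

base = revenue(399, 1000)                  # hypothetical baseline
hike = revenue(399 * 1.10, 1000 * 0.90)    # raise price 10%, lose 10% of buyers
cut = revenue(399 * 0.90, 1000 * 1.30)     # cut price 10%, gain 30% volume

print(base, hike, cut)   # hike < base < cut
```

The exact percentages are the post's assumptions; the point is only that revenue is price times volume, so a hike that shrinks volume by the same fraction is a net loss.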

blindbox
09-04-2009, 12:33 AM
Since Chiphell's picture of Cypress is pretty spot-on, I bet their rumours of 1600sp would be correct too. That article just confirms it further, IMO.

LordEC911
09-04-2009, 12:37 AM
and about the 5870 beating the gtx295...
4870 to 4870x2 scaling goes from 55% of a 295 up to 90% of a 295... considering that a 5870 appears to have the same hw specs as a 4870x2 but isn't castrated by having it spread over 2 chips, with no overhead, i'd expect a 5870 to be around 110% of a 295, like a 4870x2 would be if it didn't have the overhead caused by being based on 2 chips... and that's assuming the 5870 comes clocked at 750mhz core and 3600mhz mem... according to the rumored specs it looks like 800-900 clocks for the gpu and 5000mhz for the mem... so it will DEFINITELY beat the 295...


I think equaling GTX295 shouldn't be a problem for 5870 specced as rumor says. Remember that for the first time in recent ATi history we are getting not only 2xShader count, 2xRBE but also 2xROP. All this on one die instead of 2 dies. On top of that better scheduler plus other tweaks will mean much better scaling compared to older 4870X2 (no CF overhead). The only limiting factor can be memory bandwidth in some cases.

That last bit was exactly what I was going to say to saaya.
We are doubling, if not more than doubling, almost every aspect of the chip, yet they are only, as far as we know, increasing the bandwidth by 39% (160 vs 115GBps).
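The 39% figure falls straight out of the two rumored bandwidth numbers:

```python
# Rumored memory bandwidth: RV770 (4870) vs RV870 (5870)
old_bw = 115.0   # GB/s
new_bw = 160.0   # GB/s

increase = new_bw / old_bw - 1
print(f"bandwidth increase: {increase:.0%}")   # → 39%
```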


Will there be a "CypressLE" or something like HD4830(RV770LE) had 160 shaders disabled to 640 shaders instead of 800?
Eventually. Remember, it isn't AMD/ATi's strategy anymore to rely on salvaged parts.
The last few generations they have been releasing those salvaged parts, seemingly, when the GPU is EOLed, once they know they have a decent stock and need to clear inventory without worrying about too many more salvaged parts flowing in.

purecain
09-04-2009, 12:45 AM
cant wait... this is truly a very exciting release..... :)

Smartidiot89
09-04-2009, 12:51 AM
Eventually. Remember, it isn't AMD/ATi's strategy anymore to rely on salvaged parts.
The last few generations they have been releasing those salvaged parts, seemingly, when the GPU is EOLed, when they know they have a decent stock and need to clear inventory w/o worrying about too many more salvaged parts flowing in.
True, but there is a large gap between Juniper and Cypress, so there should be something in between... at least I hope so. Yields are never perfect, so using salvaged parts is better than throwing away chips which are semi-broken.

saaya
09-04-2009, 12:55 AM
That last bit was exactly what I was going to say to saaya.
We are doubling, if not more than doubling, almost every aspect of the chip but yet they are only, as far as we know, increasing the bandwidth by 39% (160 vs 115GBps).
that is assuming the 4870x2 has more bandwidth than the 4870... which it effectively doesn't... both gpus mirror their counterpart's memory afaik, and hence the extra memory bandwidth is completely useless... if anything, it's a minor boost...

dual gpu cards would be way less memory bandwidth limited than single gpu cards if the added bandwidth were really useful... that's not the case; afaik it's even the contrary, dual gpu cards are more bandwidth limited than single gpu cards... that's why there has been all this talk about getting gpus to share their memory capacity and bandwidth. because right now, there is double the mem and double the mem bw on a dual gpu card, but it doesn't result in higher overall bw and higher overall memory capacity...

in sfr there might be a boost from the higher overall bandwidth of a dual gpu card... but even that is unlikely... and even if true, most of the time xfire and sli run afr because it brings a notably higher boost...

LordEC911
09-04-2009, 01:10 AM
that is assuming the 4870x2 has more bandwidth than the 4870... which it effectively doesn't... both gpus mirror their counterpart's memory afaik, and hence the extra memory bandwidth is completely useless... if anything, it's a minor boost...

dual gpu cards would be way less memory bandwidth limited than single gpu cards if the added bandwidth were really useful... that's not the case; afaik it's even the contrary, dual gpu cards are more bandwidth limited than single gpu cards... that's why there has been all this talk about getting gpus to share their memory capacity and bandwidth. because right now, there is double the mem and double the mem bw on a dual gpu card, but it doesn't result in higher overall bw and higher overall memory capacity...

in sfr there might be a boost from the higher overall bandwidth of a dual gpu card... but even that is unlikely... and even if true, most of the time xfire and sli run afr because it brings a notably higher boost...

As to your first parts, you are correct: dual GPU cards like the 4870x2 don't increase the bandwidth vs a single GPU card, but they also don't decrease it.
Each 800SP/40TMU/16ROP RV770 has 115GBps of bandwidth vs a 1600SP/80TMU/32ROP RV870 that has 160GBps.
Basically you are, possibly, more than doubling the potential throughput of the 5870 while only increasing bandwidth 39%, which was my original point.
Sure, you don't have the inefficiencies of AFR and a dual GPU setup, but you also run into other potential problems/bottlenecks.

I just have a feeling that we are in for a few surprises when we actually get to take a look at the architecture and performance; there are obviously quite a few tweaks and bits of info that we haven't heard about/thought about.

PS- Before those Vantage scores were leaked, there was talk about Cypress running near 4890CF, which would put it in the same league as a GTX295.

jaredpace
09-04-2009, 01:20 AM
charlie had the codenames wrong? just hemlock afaik and thats probably cause he listened to sylvie :D

jared, why would the 5870 be that much faster than the 5850? those numbers look weird, i cant figure out what he based his perf boost over 4890 on..
what shader count did he expect when he made that graph? it doesnt make any sense that with about double the hw he only predicts a 10% boost in some games? :eh:

and about 5870 beating gtx295...
4870 to 4870x2 scaling is 55% of a 295 to 90% of a 295... considering that a 5870 appears to have the same hw specs as a 4870x2 but isnt castrated by having it on 2 chips and there is no overhead, id expect a 5870 to be around 110% of a 295 like a 4870x2 would be, if it wouldnt have any overhead caused by being based on 2 chips... and thats assuming the 5870 comes clocked at 750mhz core and 3600mhz mem... according to the rumored specs it looks like like 800-900 clocks for the gpu and 5000mhz for the mem... so it will DEFINATELY beat the 295...

well, it may beat the 295. like everyone else i don't know - just guessing. I did read on CH that the internal workings of the dual core provide 200% efficient scaling, so it's like a 4870x2 with perfect scaling. if that were the case it would be 110% of the 295 like you said. but i don't think they're actually going to get 200% of a 4870 1GB, based on the rumor that cypress = 16000-18000 in vantage and a 4870 1GB is like ~10,000.


dual gpu cards are more bandwidth limited than single gpu cards... thats why there has been all this talk about getting gpus to share their memory capacity and bandwidth. cause right now, there is double the mem and double the mem bw on a dual gpu card, but it doesnt result in higher overall bw and higher overall memory capacity

I wonder if they have this fixed for the 5870x2? I think there will be an operational sideport to provide more bandwidth directly between the gpus, but I wonder if they will be sharing the memory?

Calmatory
09-04-2009, 01:21 AM
True, but there is a large gap between Juniper and Cypress, so there should be something in between... at least I hope so. Yields are never perfect, so using salvaged parts is better than throwing away chips which are semi-broken.

Right now the yields are lower than they will be in the future. When the yields mature, the demand for salvaged chips is still higher than for the fully functional ones, so they have to cripple good chips and sell them at lower margins.

In the short run, around launch, having salvaged chips can be beneficial, but once the yields improve, salvaging the chips becomes more of a problem.

I guess AMD learnt this when they sold fully functional Barton cores as Durons; they had to sell the expensive core cheaper and cripple it because there was huge demand for the Durons (mainly because it was possible to mod the Duron to an Athlon, 64k L2 -> 256k L2. Those were the times. :))

Nedjo
09-04-2009, 01:28 AM
seriously :D its like they have no concept of commerce at all...

sure, lets them try how high they can price their cards and still sell some...
but im pretty sure if they price them as high as rumored lately, they will sell a lot less and overall make less profits than they would if theyd sell them cheaper...


Seriously saaya, to make your price analysis worthy you ought to present some type of research to back up your assumptions about the equilibrium price point of market demand and production output...

Yeah I know it sounds difficult, and perhaps even AMD didn't do any research; perhaps they just looked at the performance of their new parts, and the performance of NV's offerings, and decided to put some price tag on (not saying that what Theo and Charlie are claiming are true prices - hell, I'm pretty sure there's no way they could have official pricing leaked)

saaya
09-04-2009, 01:30 AM
As to your first parts, you are correct, dual GPU cards like the 4870x2 don't increase the bandwidth vs a single GPU card but it also doesn't decrease it.
Each 800SP/40TMU/16ROP RV770 has 115GBps of bandwidth vs a 1600SP/80TMU/32ROP RV870 that has 160GBps.i dont think thats how you can calculate it... by your math a 4870x2 has 1600sps 80TMUs 32RBEs and 230GBps bandwidth... but thats not the case... look at how xfire and sli actually scale and youll see that the extra tmus dont scale, and extra memory bandwidth doesnt scale either you only render half the scene or only every 2nd frame, but in either case you still have to fetch the same data into local memory as you dont know what texture will be on what part of the frame or if the texture will show on the even or odd or both frames in afr... the only thing that really scales with sli and xfire from what ive seen in reviews is: shader power. everything else is working exactly the same in single gpu and multi gpu mode... the gemoetry has to be calculated the same way, even if only half of it or only second frame gets rendered, and all the textures need to be fetched cause by that time you dont know yet if this gpu will render this texture or not... the diference in the rendering pipeline between single and multi gpu mode are so late that most of the work is being done twice with 2 gpus, and 4 times with 4 gpus... games that scale well with xfire and sli are shader power limited... or the gpus run in sfr, in which case geometry and texturing perf gets a boost too iirc... but sfr usually only gives you a 50% boost over single gpu... and you need to work through object dependencies which is a huge pita, thats why nvidia and ati push afr as much as they can cause it gives them better perf...

read what i said: the bw is actually used to fetch the very same textures and the very same info twice... once for each gpu... the overall effective bandwidth for the 4870x2, or any dual gpu card, is still the same bandwidth each gpu has, and actually even less for some reason in real world tests... like i said, dual gpu cards seem to actually be slightly more bw starved than single gpu cards of the same model, clocks and specs; most likely it has something to do with afr/sfr overhead or how sli/xfire work...

according to your math the 4870x2 has double the bandwidth of the 4870, true, but effective bw is the same as the 4870, just like effective memory capacity is the same as the 4870 even though you've got twice the mem on the card.

so a 5870 has 40% MORE effective bandwidth than a 4870x2 AND higher gpu clocks AND a tweaked gpu AND no xfire overhead... it will clearly be notably faster than a 4870x2...

correct me if i'm wrong, but that's the impression i got from testing myself and from reading reviews and articles about sli and xfire...

Foamy
09-04-2009, 01:32 AM
399$ = 275€ = cheap to me.

eric66
09-04-2009, 01:34 AM
bs rumours everywhere; ati does a great job of keeping specs hidden

BenchZowner
09-04-2009, 01:37 AM
A birdie said "the 5870 is beating a GTX 295 in some cases, and is following the GTX295 closely in most cases".

jaredpace
09-04-2009, 01:39 AM
yah, it should be faster than a 4870x2 - and probably trade blows with the 295, which puts it near the speed of 4890 CF.

It's like you guys are saying: 4870x2 specs (shader/tmu/rop) plus 40% greater memory bandwidth, minus the associated crossfire overhead. but instead of having multi-gpu across two dies communicating through the plx chip, it all happens on one die, so no overhead or bottlenecks.

saaya
09-04-2009, 01:42 AM
Seriously saaya, to make your price analysis worthy you ought to present some type of research to back up your assumptions about the equilibrium price point of market demand and production output...

Yeah I know it sounds difficult, and perhaps even AMD didn't do any research; perhaps they just looked at the performance of their new parts, and the performance of NV's offerings, and decided to put some price tag on (not saying that what Theo and Charlie are claiming are true prices - hell, I'm pretty sure there's no way they could have official pricing leaked)
you can spend an infinite amount of time and money on researching the perfect price points in a given market... the easiest and safest way is still this:

start at the top, then lower your prices and monitor demand; if it stops picking up, keep the price the same or even increase it a bit again (read: stop giving special discounts to certain customers), then repeat this as time goes by, since the market continues to evolve and mfg prices drop...

it's really very similar to fishing :D

so yeah, ati's approach makes sense, if that's what they are doing... i think they are starting at too high a price point though... and if you do that, you risk losing a lot of people's attention, and you risk seriously pissing off customers who bite early and find the same piece of hw for half the price a few weeks later... that's what nvidia ran into with their 8800 and 280 series...

if you start too high and then drop prices more and more, people become more reserved and sit back and wait to see how low you're going to go... and the price point where demand then picks up is lower than it would have been if you hadn't started too high to begin with...

yeah, this really is so much like fishing... it's pretty obvious why so many businessmen enjoy fishing hehe :D

so, if you start too high you either have to drop prices too fast, which means people wait and you end up dropping the price a lot, or you play the waiting game and drop prices steadily, but then you're risking that somebody else will throw in their worm at the right spot at the right time and steal your attention :D

saaya
09-04-2009, 01:51 AM
399$ = 275€ = cheap to me .
399$ != 275€... if hardware launched at 299$ worldwide in the past, it was usually 299€ in europe, thanks to VAT...

so 399$ will be 399€ or 349€, not 275... ;)
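The mismatch saaya points out is mostly VAT: a naive exchange-rate conversion of the US MSRP ignores the sales tax baked into EU street prices. The exchange rate and VAT percentage below are rough 2009-era values, assumed purely for illustration:

```python
# Why a 399$ MSRP doesn't land at 275 EUR on EU shelves.
usd_msrp = 399
eur_per_usd = 0.69   # rough 2009 exchange rate (assumption)
vat = 0.19           # e.g. German VAT at the time (assumption)

naive = usd_msrp * eur_per_usd   # the "275 EUR" figure, pre-tax
street = naive * (1 + vat)       # closer to an actual EU shelf price

print(f"naive: {naive:.0f} EUR, with VAT: {street:.0f} EUR")
```

Retailer margins and launch scarcity push the shelf price further toward the 349-399€ saaya expects.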

A birdie said "the 5870 is beating a GTX 295 in some cases, and is following the GTX295 closely in most cases".

was it a turaco or a flamingo? ^^

Xoulz
09-04-2009, 01:56 AM
As to your first parts, you are correct: dual GPU cards like the 4870x2 don't increase the bandwidth vs a single GPU card, but they also don't decrease it.
Each 800SP/40TMU/16ROP RV770 has 115GBps of bandwidth vs a 1600SP/80TMU/32ROP RV870 that has 160GBps.
Basically you are, possibly, more than doubling the potential throughput of the 5870 while only increasing bandwidth 39%, which was my original point.
Sure, you don't have the inefficiencies of AFR and a dual GPU setup, but you also run into other potential problems/bottlenecks.

I just have a feeling that we are in for a few surprises when we actually get to take a look at the architecture and performance; there are obviously quite a few tweaks and bits of info that we haven't heard about/thought about.

PS- Before those Vantage scores were leaked, there was talk about Cypress running near 4890CF, which would put it in the same league as a GTX295.


Yeah dude, you^^ nailed it!



Secondly, I think people are pricing these too high. The 5850 will debut @ $269, dropping to $225 by Xmas. The 5870 will debut @ $369 and carry a $299 price tag till next summer.

The X2 line will get its own specialized treatment, starting @ $400...!


Some very interesting stuff coming though, especially since they are getting closer to dual-GPU (ie: dual core)... the cross-communication is getting sophisticated, and whoever's lithography is most advanced and robust will be the class leader of this next gen!

Expect the X2 to gobble up Crysis ~ ARMA 2 and give a really immersive experience. We know nVidia's is going to outperform and cost more, but the X2 has the potential to upset nVidia's business plans.

The rest is already a foregone conclusion, given that they are already marketing the 300 series as re-branded cards. nVidia can only compete in the high-end market, and if ATi does this X2 right... nVidia might have a shake up!

jaredpace
09-04-2009, 02:06 AM
About the press invites for the September 10 event:
quote:
Just an update... if you're sitting on the fence about going, I would recommend deciding quickly, I'm down to only a few spots left. In case you didn't Google it and figure it out, the event's being held on the USS Hornet aircraft carrier across the bay from SF in Alameda.

If you've already dropped me an e-mail you should have heard back from me by now. (If you didn't, PM me or write me again... Gmail was pretty wonky yesterday)
[...]
Bring a camera, Flip video camera, whatever you like. The whole joint will have wifi, so you can post pics and vids from the event live.

http://gathering.tweakers.net/forum/view_message/32521301

And there was the other message from the "AMD girl @ gmail.com" that said there would be plenty of chips & cookies to snack on. :ROTF: So it sounds like it's going to be a fun time; hopefully they show us some comparison benchmarks and reveal some official specifications.

Farinorco
09-04-2009, 02:09 AM
I don't find those prices "too high" at all. If they're nearly twice the power of the previous gen, with DX11 support, they'll be basically an unrivaled product, with no competition at all. That increases the value considerably, because they are unique features: do you want a DX11 card? Buy that one, because there is no other. Do you want the most powerful? The same. And so on.

We have had the chance to see this with the latter NVIDIA launches, with their 600$ 8800GTX / 800$ 8800Ultra, and their 650$ GTX280, and all the people paying those prices.

So I don't really think a 399$ top card would be anything exorbitant. I'd say it's actually very mild pricing given the circumstances, probably due to AMD wanting to recover market share as fast as possible.

Of course, it will be a completely different thing when NVIDIA enters the market again. Then, prices will depend on what each one has to offer, and for how much. If NVIDIA releases a faster and cheaper product than that, then AMD will have to lower their prices.

jaredpace
09-04-2009, 02:16 AM
We have had the chance to see this with the latter NVIDIA launches, with their 600$ 8800GTX / 800$ 8800Ultra, and their 650$ GTX280, and all the people paying those prices. So I don't really think a 399$ top card would be anything exorbitant. I'd say it's actually very mild pricing given the circumstances.

:up: That's a good point. Those were all the fastest cards at the time. Ati having the fastest cards ATM and charging 299, 399 and 599 is actually quite a bargain for the consumer, considering the alternative :bananal:. :rofl:

BenchZowner
09-04-2009, 02:23 AM
399$ != 275€... if hardware launched at 299$ worldwide in the past, it was usually 299€ in europe, thanks to VAT...

so 399$ will be 399€ or 349€, not 275... ;)
was it a turacu or a flamingo? ^^

Think it was a red macao :p:

Xoulz
09-04-2009, 02:28 AM
I don't find those prices "too high" at all. If they're nearly twice the power of previous gen, with DX11 support, they'll become basically an unrivaled product, with no competence at all. That increases the value considerably, because they are unique features: do you want a DX11 card? Buy that one because there is no other. Do you want the most powerful? The same. And so on.

We have had the chance to see this on the latter NVIDIA launches, with their 600$ 8800GTX / 800$ 8800Ultra, and their 650$ GTX280, and all the people paying those prices.

So I don't really think a 399$ top card would be nothing exorbitant. I'd say it's actually a very mild pricing given the circumstances, probably due to AMD wanting to recover market share as fast as possible.

Of course, other completely different thing will be when NVIDIA enters again to the market. Then, prices will depend on what each one have to offer, and for how much. If NVIDIA releases a faster and cheaper product than that, then AMD will have to lower their prices.


Apparently, you completely missed the global recession going on. ATi would kill themselves if they released their new cards that high. ATi already knows they'll have a 3 month reign (before the holidays) to sell everyone they can make.

While Nvidia plays lip service and slides.

Money talks! ATi doesn't have to play the market, they can completly walk away with it!

ie:
HD5850 (2GB, DX11) for $225 by Xmas ...! :up:





Also remember, there are 20 million people living in New York, half hit by the recession; their buying power alone is 50% of Spain's. People are hurting. ATI isn't so aloof that they would disregard the economy, as you apparently have...!

Farinorco
09-04-2009, 02:41 AM
Apparently, you completely missed the global recession going on. ATi would kill themselves if they released their new cards that high. ATi already knows they'll have a 3 month reign (before the holidays) to sell every one they can make.

While Nvidia plays lip service and slides.

Money talks! ATi doesn't have to play the market, they can completly walk away with it!

ie:
HD5850 (2GB, DX11) for $225 by Xmas ...! :up:

Also remember, there are 20 million people living in New York, half hit by the recession; their buying power alone is 50% of Spain's. People are hurting. ATI isn't so aloof that they would disregard the economy, as you apparently have...!

I don't really think the recession is affecting the prices of computer components in any noticeable way, so I don't see why AMD should be the only company aware of that situation, but if you think so... :shrug:

I've seen the Core i7 release at the usual price for a new CPU, the X58 boards at the usual price for a high end chipset, now they are going to release the P55 platform at the usual prices for the performance segment, hard drives of top sizes at the usual huge prices... the only thing I see going down are SSDs, and way slower than I'd have thought a year or two ago.

BenchZowner
09-04-2009, 02:43 AM
ie:
HD5850 (2GB, DX11) for $225 by Xmas ...! :up:

There's no word to describe how impossible this is...

Wishful thinking doesn't even come close.

For what it's worth, a HD5870 2GB will more than likely cost 350 - 399$.

dan7777
09-04-2009, 03:01 AM
is there going to be a 5870x2 ?

saaya
09-04-2009, 03:06 AM
Think it was a red macao :p:
ohhh hows he doing these days? :D hahahah

i agree with xoulz, i dont think they can keep those prices up, if those prices are correct to begin with... but then again, like others here pointed out, without any competition they can ask for pretty high prices and cash in... but, like i said previously, they shouldnt be too greedy and start with too high prices and then slash them over and over... if prices are too high then a lot of people will enter a wait and see winter sleep until gt300 and larrabee arrive, cause the perf/$ was never as high for vgas as it has been in the past year, and the requirement for games has never been so low as in the past year either... lets face it, nobody really NEEDS a faster vga right now unless they upgrade their displays to a higher resolution first... and nobody NEEDS dx11 with barely any dx10 titles around thanks to consoles holding everything back to dx9...

so i think ati should try to focus on volume and not too much on making high margins with low volume, cause it might end up being really really low volume... then again, whip out the fishing gear and start teasing and we will see how it will work out :D

as long as juniper is priced nicely i wont complain :D

xoulz, i dont think 5870x2 will have any technological advancements regarding multi gpu or dual core gpus etc... thats complete nonsense... before anything like that makes sense you need a high speed low latency bus like HT or QPI and not PCIe, and youd need to do lots of logic reworking and adding to make proper use of both gpus more efficiently... and all that for what? you might get 25% extra performance out of a dual gpu card that way... thats not really worth all the effort i think...


is there going to be a 5870x2 ?
yes, but not at launch...

Olivon
09-04-2009, 03:09 AM
so 399$ will be 399E or 349E, not 275... ;)


You're right saaya, 399€ seems plausible.

But the HD5850 is maybe a better deal. Are frequencies the only difference between those two cards?

Smartidiot89
09-04-2009, 03:20 AM
You're right saaya, 399€ seems plausible.

But the HD5850 is maybe a better deal. Are frequencies the only difference between those two cards?
My guess is a lower binned GPU (obviously) with lower frequencies, and perhaps it will use different/lower binned GDDR5 memory modules? There has to be some difference between the cards imo, if not it will be another HD3850/HD3870 situation where one of them was basically obsolete :shrug:

saaya
09-04-2009, 03:37 AM
largon brought up a good point in the other thread just now... 870 is a big gpu and 40nm yields are far from perfect... they cant throw away bad parts, and the next lower part they could use these gpus for is juniper with 800sps... thats a lot lower... so it makes sense if the 5850 is actually lower in specs and has fewer sps and maybe even other blocks disabled... 1200sps should be enough to beat the 285 easily and gives them enough headroom to get even notably damaged rv870 cores sold without losing a lot of money on them...

Kylzer
09-04-2009, 03:46 AM
Ok so if its 100 less than the gtx285 then what would that be here, cause the gtx285 is about 250, so maybe it will be 199.99?

Calmatory
09-04-2009, 05:37 AM
largon brought up a good point in the other thread just now... 870 is a big gpu and 40nm yields are far from perfect... they cant throw away bad parts, and the next lower part they could use these gpus for is juniper with 800sps... thats a lot lower... so it makes sense if the 5850 is actually lower in specs and has fewer sps and maybe even other blocks disabled... 1200sps should be enough to beat the 285 easily and gives them enough headroom to get even notably damaged rv870 cores sold without losing a lot of money on them...

So HD5850 would go EOL in few months once the process matures? Or would AMD start to cripple the fully functional HD5870 cards to HD5850 cards because of high demand for the cheap 1200 SP cards?

A good plan for the first few months, which will backfire with good yields.

iboomalot
09-04-2009, 05:40 AM
looks like its EBAY time for my 4870x2 and EK nickel block and live on my backup card for a couple weeks :)

oh wait is there a 5870x2 coming out soon ? or just the 5870 ?

informal
09-04-2009, 05:41 AM
So HD5850 would go EOL in few months once the process matures? Or would AMD start to cripple the fully functional HD5870 cards to HD5850 cards because of high demand for the cheap 1200 SP cards?

A good plan for the first few months, which will backfire with good yields.

Nope, they will just lower the clock speed on the GPU, or on both the GPU and memory. Voila, you have 20-30% less performance with the same shader count - HD4850 all over again :)

saaya
09-04-2009, 05:46 AM
Ok so if its 100 less than the gtx285 then what would that be here, cause the gtx285 is about 250, so maybe it will be 199.99?
nobody said 100$ less than GTX285... theres a rumor saying 100$ less than GTX295...


So HD5850 would go EOL in few months once the process matures? Or would AMD start to cripple the fully functional HD5870 cards to HD5850 cards because of high demand for the cheap 1200 SP cards?

A good plan for the first few months, which will backfire with good yields.hw mfcs really dont care about selling fully functional silicon partly disabled... they sell top bin silicon as lowest bin in huge volume all the time as well... the more yields improve the lower the price diference between 5850 and 5870 could get i suppose, they could balance it out that way :shrug:


looks like its EBAY time for my 4870x2 and EK nickel block and live on my backup card for a couple weeks :)

oh wait is there a 5870x2 coming out soon ? or just the 5870 ?
only 5850 for now, then 5870 shortly after and then 5870x2 "soon"

FischOderAal
09-04-2009, 06:13 AM
and those investors know more about running amd than amd does?

Two words: shareholder value.


So HD5850 would go EOL in few months once the process matures? Or would AMD start to cripple the fully functional HD5870 cards to HD5850 cards because of high demand for the cheap 1200 SP cards?

A good plan for the first few months, which will backfire with good yields.

Doesn't matter. That's how it has been all the past years, why should it change? Why can many AMD X3s be unlocked to X4s? Why were there so many X1800 GTOs that could be unlocked (until ATI introduced a laser cut?)?

Bgriffs
09-04-2009, 06:15 AM
So HD5850 would go EOL in few months once the process matures? Or would AMD start to cripple the fully functional HD5870 cards to HD5850 cards because of high demand for the cheap 1200 SP cards?

A good plan for the first few months, which will backfire with good yields.

What I'd like to see, if these specs are true, would be some kind of HD5850+ model once the process improves. Charge a bit more or something and EOL the 5850.

saaya
09-04-2009, 06:17 AM
like the gtx260 216sp? :D
yeah wouldnt be surprised if amd did that... provided the 5850 IS cut down in hw specs...

Calmatory
09-04-2009, 06:46 AM
Doesn't matter. That's how it has been all the past years, why should it change? Why can many AMD X3s be unlocked to X4s? Why were there so many X1800 GTOs that could be unlocked (until ATI introduced a laser cut?)?

It certainly does matter whether you pay $180 for a card and can then sell it for $249 or $299. (And yeah, the values come from my magic hat).

If the yields are low, then of course it is beneficial to salvage some SPs, but I believe this is not the case. I doubt any SPs will be disabled; I'd expect the HD5850 to use lower binned (cheaper) GDDR5 and a lower binned RV870 core.

jmke
09-04-2009, 07:18 AM
and theres a rumor going around that the 5870 will cost 100$ less than a 295, which is 399$ then? thats pretty high... either yields are bad or ati wants to cash in as long as nvidia cant compete... i have to say, i really dont like what im hearing about prices now... it really spoils the anticipation of the new 800 series :/
sorry, but we're still living in a capitalistic society, the reason the current 4xxx series is so affordable is not because AMD/ATI is being nice/friendly; it's because they want to grab market share by offering a better price/performance ratio; if their 5870 performs on par with a more expensive NVIDIA product and they price it lower, they continue the trend. If their product performs better and is priced the same, they also continue the trend. I see no fault in this logic; a fault in the logic would be for them to give away their products at lower prices "just because".

JohnJohn
09-04-2009, 07:28 AM
I have a feeling that the main difference between 5870 & 5850 will be clocks, and 2gb vs 1gb. I think there won't be any 2gb 5850 anytime soon. I think also there will be salvaged parts for 5830 in 3-4 months or so

saaya
09-04-2009, 07:31 AM
that would require a lot of redundant sps and other logic blocks in the gpu... which would explain the large die size... that plus dx11 i guess...
so yes, we might not see logic cut down rv870s... or the few that are damaged beyond redundancy replacement will be cut down and sold as juniper...


sorry, but we're still living in a capitalistic society, the reason the current 4xxx series is so affordable is not because AMD/ATI is being nice/friendly; it's because they want to grab market share by offering a better price/performance ratio; if their 5870 performs on par with a more expensive NVIDIA product and they price it lower, they continue the trend. If their product performs better and is priced the same, they also continue the trend. I see no fault in this logic; a fault in the logic would be for them to give away their products at lower prices "just because".
who said just because?
i think they should continue to capture market share, and if they price their cards that high they wont capture as much... a lot of nvidia fans will camp and wait with prices that high... what won people over from nvidia to ati in the past was better perf at a better price, or worse perf at a much better price... a little better perf at a little lower price wont wow anybody, especially long term nvidia users...

like ive said, who knows what the right price is especially in the industries current situation... so yes, it makes sense to start high... i just hope they drop prices soon after the launch...

Etihtsarom
09-04-2009, 07:48 AM
sorry, but we're still living in a capitalistic society, the reason the current 4xxx series is so affordable is not because AMD/ATI is being nice/friendly; it's because they want to grab market share by offering a better price/performance ratio; if their 5870 performs on par with a more expensive NVIDIA product and they price it lower, they continue the trend. If their product performs better and is priced the same, they also continue the trend. I see no fault in this logic; a fault in the logic would be for them to give away their products at lower prices "just because".

Thats mostly it. But does it not escape you all that the 4850 launched at $200 and the 4870 @ $300? That is to say, these cards "branded" us with the notion that they're mid-range, not high-end; the 4870x2 was high-end @ $500. If they were to launch another XX50 and XX70 set of cards, they ought to retain that pricing structure, or else call the cards by a different name, say, 5880 and 5890. What I'm really saying is, I don't like the sound of 5850 and 5870 @ $300 and $400; but... 5880 @ $300 and 5890 @ $400 sounds better. I know you're all disliking what you're hearing about that pricing, don't lie.

N/e ways, the thing is, with an X2, I'm so NOT excited about these cards coming, b/c there are not that many games out there that demand much more than an X2, even @ 2500 resolution, and especially @ the 1900 res that I have.

Etihtsarom
09-04-2009, 07:53 AM
Yeah dude, you^^ nailed it!



Secondly, I think people are pricing these too high. The 5850 will debut @ $269, dropping to $225 by Xmas. The 5870 will debut @ $369 and carry a $299 price tag till next summer.

The X2 line will get its own specialized treatment, starting @ $400...!


Some very interesting stuff coming though, especially since they are getting closer to dual-GPU (ie: dual core).. the cross-communication is getting sophisticated, and whoever's lithography is most advanced and robust will be the class leader of this next gen!

Expect the X2 to gobble Crysis ~ ARMA 2 and give a really immersive experience. We know nVidia's is going to outperform and cost more, but the X2 has the potential to upset nVidia's business plans.

The rest is already a foregone conclusion, given that they are already marketing the 300 series as re-branded cards. nVidia can only compete in the high-end market, and if ATi does this X2 right... nVidia might have a shake up!
:toast::toast:

antiacid
09-04-2009, 09:08 AM
rumors, rumors and rumors. Let's get our panties in a bunch because it might be true in a parallel universe! Awesome.

I thought that the "news" section was to discuss actual news... This could be moved to the ATI/AMD section perhaps.

Manicdan
09-04-2009, 09:57 AM
like ive said, who knows what the right price is especially in the industries current situation... so yes, it makes sense to start high... i just hope they drop prices soon after the launch...

my rule #1 when shopping, launch prices are never a good price

this goes for technology products and cars particularly, and many other random things i cant think of. but its very expected for a product to come out high, get a nice margin, then shock the world when it comes down 20% and make the next wave of people think they are getting an incredible deal. but no one really knows what the price should be (and im sure those who think they know probably had to sift through 1000 factors like cost, research, advertising, storage, transporting, depreciation, taxes, etc)

Baron_Davis
09-04-2009, 10:01 AM
I think we can all agree that the 58xx series will tear :banana::banana::banana::banana: up...

Dragy2k
09-04-2009, 10:06 AM
http://www.semiaccurate.com/2009/09/03/and-cypress-shader-count/

and theres a rumor going around that the 5870 will cost 100$ less than a 295, which is 399$ then? thats pretty high... either yields are bad or ati wants to cash in as long as nvidia cant compete... i have to say, i really dont like what im hearing about prices now... it really spoils the anticipation of the new 800 series :/

i really hope those are only the launch prices and they will drop notably after a few weeks...

EDIT: and about gt300.. it should easily beat a 5870, i dont think anybody doubts that, the question is more about how much itll cost... gt300 will definitely bring the perf crown back to nvidia once it comes out, but that seems to be a while away and it wont be cheap...

i cant help but feel things actually got worse this time around... ati wont offer great price perf anymore cause nvidia has even less to compete with their mainstream parts, and the highend will be nvidia dominated as before but prices will probably be even higher than before... sigh...

omg.. saaya, any chance you could pass on next week's lotto numbers please... just email them to me... cant we wait till its out in the wild (ati 5xxx) before we start firing off silly comments... cmon

jmke
09-04-2009, 11:32 AM
But does it not escape you all that 4850 launched at $200 and 4870@ $300, that is to say, these cards "branded" us with the notion that they're mid-ranged, Not high-end;
part of my argument is that it definitely did not escape me; they had to launch their 4800 series at those price points to sell; if their 5800 series offers better price/performance than NVIDIA, they will most definitely keep that trend; and since performance will go up, you can't have a working equation if the price doesn't go up also;)

that's why we need competition, because without it, we'll be paying $1000 again for a fast CPU and $700 for a good VGA card. Instead of being able to build a complete gaming rig for $700 as it is now:)

strange|ife
09-04-2009, 12:48 PM
so there will be a 2GB 5870 at launch? or was that a 5850x2?

Sh1tyMcGee
09-04-2009, 12:59 PM
so there will be a 2GB 5870 at launch? or was that a 5850x2?

honestly nothing has been confirmed, everything is all rumors. weve never been so close to launch with no concrete info, ATI must be really tight lipped about their new gem. lets hope the new driver team has made some improvements and fixes things without breaking others like previous ati driver releases. :clap:

Baron_Davis
09-04-2009, 01:34 PM
I hope they upgraded their drivers so that you can run something super sick like tri or quad 5870 X2, basically hexa or octa crossfire...man that would be the best news ever

BenchZowner
09-04-2009, 02:11 PM
I hope they upgraded their drivers so that you can run something super sick like tri or quad 5870 X2, basically hexa or octa crossfire...man that would be the best news ever

LOL :rofl:

villa1n
09-04-2009, 02:20 PM
so there will be a 2GB 5870 at launch? or was that a 5850x2?

I'd be interested in that as well, 4 TFlops in a single card, at the right price... remove the clutter from my antec 900 ^^ If its cooled properly.

I guess they know how to build tension.... I want to see what these have in store for us.:yepp:

initialised
09-04-2009, 03:28 PM
-Exactly, something not to be taken as is until Sept. 10..Or until someone on here gets pre-release hardware...

But with a week to go and no proper leaks it doesn't look likely.

blindbox
09-04-2009, 04:10 PM
Somebody here already has the cards, initialised (of course, bound by NDA). AMD is just being tight-lipped.

labs23
09-04-2009, 08:51 PM
Somebody here already has the cards, initialised (of course, bound by NDA). AMD is just being tight-lipped.

Good, we should send a PM to every friend we have in here. Who knows, one might give a reply, a hint and a teaser. :D J/K.

Seriously, its just within a week guys, and soon "You won't believe your eyes", per what AMD said.

Chrono Detector
09-05-2009, 02:54 AM
Out with the specs already ATI, we're all dying to know.

At least ATI is earlier this year than NVIDIA, and thats good. I wonder what NVIDIA is cooking up.

Helloworld_98
09-05-2009, 03:03 AM
Out with the specs already ATI, we're all dying to know.

At least ATI is earlier this year than NVIDIA, and thats good. I wonder what NVIDIA is cooking up.

Maybe not, Nvidia's press conference is on the same day as ATi's, so they're definitely trying to take a slice of the cake.

Jamesrt2004
09-05-2009, 03:24 AM
this is dumb, why argue about pricing? we all know theyre just keeping prices high for a few months cos they can... theyll be the only dx11 cards and many OEMs and people will be like WOAAA NEW BEST THING....

then all of a sudden, oh, nvidia are releasing their cards... out with the x2 and a big price drop... its so obvious this is going to happen.. just wait a couple months :/

RejZoR
09-05-2009, 03:35 AM
399$ = 275€ = cheap to me .

In your dreams. It'll be more like 399$ = 399€...

FischOderAal
09-05-2009, 04:01 AM
Maybe not, Nvidia's press conference is on the same day as ATi's, so they're definitely trying to take a slice of the cake.

Up to now it seems like ATI will be earlier to the market (and basically, that's nearly all we care about, isn't it?). What NVIDIA does appears to be an act of desperation to me. "Hey, look over here! I won't be releasing new cards anytime soon, but when they arrive they will be xy% faster than the competitor's cards!" Better/wiser than just playing the sitting duck and doing nothing. Of course, this might work for all those who are not in dire need of an upgrade. But missing the holiday business definitely isn't good...

saaya
09-05-2009, 07:36 AM
well thats the thing... if ati is first but the cards cost a lot, the time to market benefit might not work out too well and people will camp and wait for nvidia and intel... i remember several people making comments that they wont upgrade until larrabee is available, and that was when intel had just announced it lol... and others said the same about gt300... :D unless atis price/perf is tempting, being first to market with dx11 might not make such a big difference... at least for end users... for system builders its everything im sure... ati must have loads of system integrator orders piling up :D

003
09-05-2009, 08:08 AM
Up to now it seems like ATI will be earlier to the market (and basically, that's nearly all we care about, isn't it?).

Whether you go with ATI or Nvidia, it would be extremely wise to wait for the G300 line to launch as it will drive down HD5000 prices.

labs23
09-05-2009, 08:21 AM
Whether you go with ATI or Nvidia, it would be extremely wise to wait for the G300 line to launch as it will drive down HD5000 prices.

Its what I'm thinking of doing right now, yeah I'm ready for some waiting game til Q1 2010.

Barys
09-05-2009, 08:34 AM
A birdie said "the 5870 is beating a GTX 295 in some cases, and is following the GTX295 closely in most cases".

Well, so it means it will probably be as fast as GT300, which is said to be about 10-20% faster than the GTX295. Then NVIDIA is going to be in big trouble again (no significant performance advantage, 2-3 months behind AMD, and a more expensive and complex chip).

Hell Hound
09-05-2009, 08:52 AM
Well, so it means it will probably be as fast as GT300, which is said to be about 10-20% faster than the GTX295. Then NVIDIA is going to be in big trouble again (no significant performance advantage, 2-3 months behind AMD, and a more expensive and complex chip).

NVIDIA'S 2900XT :ROTF:

Chickenfeed
09-05-2009, 09:02 AM
I'd hate to wait for that to turn out to be the case :p: Hopefully we get some early insight on how it might compare to the 5XXX line.

RPGWiZaRD
09-05-2009, 09:26 AM
Yea, for it to be successful it would need to be at least 20~30% faster than the GTX 295, assuming the HD 5870 is roughly as fast as a GTX 295. The pressure is clearly on nvidia, big chip, lots of possibilities to fail. Will be nice to see how much the current GTX 2xx cards will be pressed down pricewise cuz of HD 5xxx. Clearly nvidia will have some painful months ahead.

saaya
09-05-2009, 09:30 AM
Well, so it means it wll be probably as fast as GT300 which is said to be about 10-20% faster than GTX295. Then NVIDIA is going to be in a big trouble again (no significant performance advantage, a 2-3 months behind AMD and more expensive and complex chip).actually no, what his source told him is that 5870 comes close to 295 perf but doesnt beat it in the whole field... gt300 is rumored to be a bit faster than a 295, so then gt300 should be 10-40% faster than 5870... considering that its coming at least 1 quarter later and the die size is almost double, it BETTER be that fast :D

orangekiwii
09-05-2009, 09:31 AM
I thought from certain expectations the gt300 was about 40% faster than the gtx295... don't ask for a source, its just in a couple threads where i've seen "leaked" info on expected performance, around a 14000x score in vantage, which is like 40% faster than the 295

ALSO

nvidia is meaning to release an x2 version of the gt300... whether it will in fact happen is unknown, but I would count on it if the gt300 is close to the 5870 in performance... otherwise they have no performance offering thats competitive with a 5870 x2

Smartidiot89
09-05-2009, 09:36 AM
nvidia is meaning to release an x2 version of the gt300... whether it will in fact happen is unknown, but I would count on it if the gt300 is close to the 5870 in performance... otherwise they have no performance offering thats competitive with a 5870 x2
I would say 100%.

Nvidia will think twice now before handing over the performance crown to ATI again like they did last time (HD4870X2)

FischOderAal
09-05-2009, 09:56 AM
Whether you go with ATI or Nvidia, it would be extremely wise to wait for the G300 line to launch as it will drive down HD5000 prices.

Yes, it's more sensible to do so, but not everyone does, especially when your better half needs a new GFX and christmas is coming ;) Just one example.

Or for people like me, who really need a card (or: a whole computer...). I gladly pay 50 Euro plus if I don't have to wait anymore... In the past I never bought a GFX right away, this might actually be the very first time for me.

saaya
09-05-2009, 10:02 AM
Yes, it's more sensible to do so, but not everyone does, especially when your better half needs a new GFX and christmas is coming ;) Just one example.

Or for people like me, who really need a card (or: a whole computer...). I gladly pay 50 Euro plus if I don't have to wait anymore... In the past I never bought a GFX right away, this might actually be the very first time for me.in my experience, either buy right after launch (wait for a few days until prices have settled or even pre order if you dont mind about 25E premium)

OR

you wait for at least 2 months for the prices to go down...

railmeat
09-05-2009, 10:20 AM
@smartidiot89

"Nvidia will think twice now before handing over the performance crown to ATI again like they did last time (HD4870X2)"

80% of the tests show a 295 beats the 4870x2???..... my buddy sold his 4870x2 for $275 and bought a single pcb 295 like me. he said its just a "smoother feeling" card, gaming for hours.


i had clicked to buy a few times on the 4870x2, with the option for xfire 4870x2 on my mobo.... but the 295 in the long run takes it. im NOT pro nvidia, im with whoever is faster.
actually no, what his source told him is that the 5870 comes close to 295 perf but doesnt beat it across the whole field... gt300 is rumored to be a bit faster than a 295, so then gt300 should be 10-40% faster than the 5870... considering that its coming at least 1 quarter later and the die size is almost double, it BETTER be that fast :D

is that info accurate, saaya? :confused: if so im happy to hear...


i knew returning my dual pcb 285 for the single pcb 295 was the right move for many reasons. watercooling = space saved, with them being single slot after u block them. plus theyre just so f'n fast, im gaming at 706 core... 2048x1536 cod4, bf2 sikness. it takes 1 powerful card to stretch its legs running a res that big. not many cards can do it, the 4870x2 is def 1 of them. its VERY fast too.

ive seen the 5870x2 specs and thought for sure this 5870x2 would dominate the 295, but not the gtx 300...

another year for intel/nvidia?..... :shrug: i like options tho.... xfire 5870s on my rig WOULD fly.. would need a q9550 at that point tho :o

NaMcO
09-05-2009, 10:30 AM
399$ = 275€ = cheap to me .

$399 usually means €399+. Don't know where you get your numbers, but they're hardly accurate. Sure, the Dollar is under the Euro, but there's taxes, shipping, customs...

Vit^pr0n
09-05-2009, 10:52 AM
Why are people worrying about the price all of a sudden? Wasn't there a thread earlier where people were saying $299 for the 5870 is too low? Now people are complaining about it being too high?

This isn't like the 4800 vs GT200 launch where ATi was able to launch with great prices. This time around ATi is launching their next-gen cards alone for a good 2-3 months. They're a business out to make money, so why wouldn't they launch at a higher price? Not to mention the 5870 is supposedly performing similar to the GTX295, which costs more than $399.

:shrug:

NaMcO
09-05-2009, 10:56 AM
I think the price is right, people here in local forums are dreaming of 5870's at 199, come on, wakey wakey :rolleyes:

N19h7m4r3
09-05-2009, 11:02 AM
I just want them to be released already, so many rumors about this and that.

We need some clarification.

Calmatory
09-05-2009, 11:08 AM
ive seen the 5870x2 specs and thought for sure this 5870x2 would dominate the 295, but not the gtx 300...

another year for intel/nvidia?.....:shrug: i like options tho....xfire 5870,s on my rig WOULD fly..need a q9550 at that point tho :o

You're telling me that GT300 would be 100% faster than a DUAL GT200? That means it would take FOUR GT200 chips to match one GT300?

Dream on. :rolleyes:

Jamesrt2004
09-05-2009, 11:15 AM
I think the price is right, people here in local forums are dreaming of 5870's at 199, come on, wakey wakey :rolleyes:

+1

and they will hit that price in a year, or maybe by jan, depending on how gt300 is

Smartidiot89
09-05-2009, 11:50 AM
@smartidiot89

"Nvidia will think twice now before handing over the performance crown to ATI again like they did last time (HD4870X2)"

80% of the tests show a 295 beats the 4870x2???..... my buddy sold his 4870x2 for $275 and bought a single pcb 295 like me. he said its just a "smoother feeling" card, gaming for hours.


i had clicked to buy a few times on the 4870x2, with the option for xfire 4870x2 on my mobo.... but the 295 in the long run takes it. im NOT pro nvidia, im with whoever is faster.
So how many months did the HD4870X2 stand as the performance king, unrivaled, before the GTX295 was released? What I meant, obviously, was that Nvidia will have a dual-GPU card ready in advance this time, cause last time they got bit hard by the HD4870X2 and it took the company months to release their sandwich to counter it.

w0mbat
09-05-2009, 11:59 AM
Guys dont freak out pls, only a few days to go. And surfing the web on a hd5870 isnt better than on a x1900 xt.

Origin_Unknown
09-05-2009, 12:17 PM
Guys dont freak out pls, only a few days to go. And surfing the web on a hd5870 isnt better than on a x1900 xt.



PROOF!?

Lightman
09-05-2009, 01:16 PM
Guys dont freak out pls, only a few days to go. And surfing the web on a hd5870 isnt better than on a x1900 xt.

Oh, I'm sure surfing the web on HD5870 is much better than on X1900XT :p:

NOTES: (when you utilize 3 monitor setup to simultaneously read 3 different XS threads each on it's own screen, on X1900 you could do only 2 threads) :D

initialised
09-05-2009, 01:22 PM
Not to mention the 5870 is supposedly performing similar to the GTX295, which costs more than $399.
$299 may be the 5850 price rather than the 5870 price, but knowing ATi they will price a similar performance part slightly lower than the equivalent NV part. ATi don't have to cover the costs of CUDA, PhysX, 3DVision & "The Way it's Meant to be Played", which is why you still pay a premium for NV parts.

Performance wise, if the 5870/Cypress is equivalent to the GTX295, how will Crossfire(X) performance compare with Quad-SLI and Tri-SLI GT200?

Will reference parts in CrossfireX be feasible on boards without a spare slot between the cards?

And most importantly, when will I be able to get my hands on one?

To(V)bo Co(V)bo
09-05-2009, 02:47 PM
We can say for sure that the 5870 X2 will cream the GT300. Nvidia was hoping this wouldn't be the case, as they believed the GT300 was still gonna be faster. They now know the only way to be on top is to release a GT300 X2 card. How much will this cost? I can't see it going for less than around $1000.00. Who in their right mind will drop that kind of cash to play a game you can already nearly max on a 4870 or 4850? Hell, you could buy a PS3 and a ton of games for that, or think of all the extra hardware you can buy going with just a mainstream card. The market isn't heading for top-end "balls to the wall" performance anymore, it's heading for how much performance can you give me for around $200-300, plus or minus.

People have a limit on what they will pay. Look at consoles, they all cost roughly $300, and if a new graphics card costs more than that I'd be better served getting a console and the library of game titles they offer. Look at how poor PS3 sales are for a $400 system compared to the cheaper, nearly identical-performance systems.

If I can score a DX11 high-performance part for about the same cost as a slim PS3 ($300), then that's what I'll buy; it is way more powerful and future-proof than the PS3. All these DX11 parts won't even begin to be utilized until current consoles are replaced. All PC games are just DX9 ports anyways.

The real war between ATI and NVIDIA is in this mainstream category, the $300-and-less segment. ATI has a full lineup ready to push out the door. Nvidia only has an expensive GT300 part that will probably cost $500, and an X2 at $1000, with all current GT200 parts then getting a 40nm refresh and being rebranded as a new lineup. These won't even show up till after Christmas. I myself am waiting till Christmas to get a new card, as it's the only time of the year I can really afford to get something I really want. ATI has all this covered and that is probably what I'm gonna get. Hopefully there will be aftermarket cards by Christmas too. When GT300 hits, ATi will be releasing all their tweaked aftermarket parts to fight with as well. Christmas is huge! If you miss Christmas you lose a lot of the market.

I wouldn't be surprised if the green empire collapses entirely

strange|ife
09-05-2009, 03:09 PM
$299 for a 5850 is stupid expensive; it should be like $269. The 5870 would be nice at around $359 or somewhere around that price

ahh just gotta wait and see

the suspense! oh the suspense!!

Chumbucket843
09-05-2009, 03:12 PM
We can say for sure that the 5870 X2 will cream the GT300. Nvidia was hoping this wouldn't be the case, as they believed the GT300 was still gonna be faster. They now know the only way to be on top is to release a GT300 X2 card. How much will this cost? I can't see it going for less than around $1000.00. Who in their right mind will drop that kind of cash to play a game you can already nearly max on a 4870 or 4850? Hell, you could buy a PS3 and a ton of games for that, or think of all the extra hardware you can buy going with just a mainstream card. The market isn't heading for top-end "balls to the wall" performance anymore, it's heading for how much performance can you give me for around $200-300, plus or minus.

People have a limit on what they will pay. Look at consoles, they all cost roughly $300, and if a new graphics card costs more than that I'd be better served getting a console and the library of game titles they offer. Look at how poor PS3 sales are for a $400 system compared to the cheaper, nearly identical-performance systems.

If I can score a DX11 high-performance part for about the same cost as a slim PS3 ($300), then that's what I'll buy; it is way more powerful and future-proof than the PS3. All these DX11 parts won't even begin to be utilized until current consoles are replaced. All PC games are just DX9 ports anyways.

The real war between ATI and NVIDIA is in this mainstream category, the $300-and-less segment. ATI has a full lineup ready to push out the door. Nvidia only has an expensive GT300 part that will probably cost $500, and an X2 at $1000, with all current GT200 parts then getting a 40nm refresh and being rebranded as a new lineup. These won't even show up till after Christmas. I myself am waiting till Christmas to get a new card, as it's the only time of the year I can really afford to get something I really want. ATI has all this covered and that is probably what I'm gonna get. Hopefully there will be aftermarket cards by Christmas too. When GT300 hits, ATi will be releasing all their tweaked aftermarket parts to fight with as well. Christmas is huge! If you miss Christmas you lose a lot of the market.

I wouldn't be surprised if the green empire collapses entirely
1. you cannot be sure about any numbers or dates, even the ones you made up all by yourself. everything out there is rumours.
2. real enthusiasts like to buy the most expensive hardware around
3. this sounds more like your budget, not the market

zalbard
09-05-2009, 03:26 PM
We should have an official "The Best Rumour" contest.

XS2K
09-05-2009, 03:37 PM
We can say for sure that the 5870 X2 will cream the GT300. Nvidia was hoping this wouldn't be the case, as they believed the GT300 was still gonna be faster. They now know the only way to be on top is to release a GT300 X2 card. How much will this cost? I can't see it going for less than around $1000.00.
I wouldn't be surprised if the green empire collapses entirely

LSD can be a powerful drug :peace::rehab::eleph:

Vit^pr0n
09-05-2009, 03:39 PM
1. you cannot be sure about any numbers or dates, even the ones you made up all by yourself. everything out there is rumours.
2. real enthusiasts like to buy the most expensive hardware around
3. this sounds more like your budget, not the market

Enthusiasts make up less than 1% of the market. Nvidia is not going to make money off of selling an expensive card like the GTX395 (if that's what the name is going to be).

It looks like this gen is going to be a repeat of the last: Nvidia is sticking with big chips, while ATi sticks with smaller chips and doubling it up for enthusiast chips.

Just by looking at previous generations, a GTX380/385 should be 10-30% faster than the GTX295 but have the same price as the GTX280 after its first price drop when the 4800 series was released, which was $450 I believe?

Either way, ATi will be dropping their prices when Nvidia releases their next-gen cards.

That's just my speculation though.

Sh1tyMcGee
09-05-2009, 04:42 PM
Most of these rumors seem to be BS to me, i think we're all going to be fooled once the info comes out on the 10th.

Vit^pr0n
09-05-2009, 05:06 PM
Most of these rumors seem to be BS to me, i think we're all going to be fooled once the info comes out on the 10th.

Would be hilarious if the rumoured specs are actually lower than the real specs. Wishful thinking though :ROTF:

~CS~
09-05-2009, 05:26 PM
I cannot believe there are still no real performance figures. I'm not sure how AMD managed it; makes you think there is more to it....

To(V)bo Co(V)bo
09-05-2009, 05:29 PM
1. you can not be sure about any numbers or dates, even the ones you made up all by yourself. everything out there is rumours.
2. a real enthusiasts like to buy the most expensive hardware around
3. this sounds more like your budget, not the market

A REAL ENTHUSIAST TAKES WHAT THEY CAN AFFORD AND MAXES IT OUT.

getting an Opty 165 over 3.0GHz: priceless
volt-modding a 2900XT: priceless
blowing up a Celeron on dice: priceless
running an AMD K5 without a heatsink till it pops: priceless

some people buy old :banana::banana::banana::banana: to blow up, that's what being EXTREME is

Glow9
09-05-2009, 05:29 PM
Most of these rumors seem to be BS to me, i think we're all going to be fooled once the info comes out on the 10th.

I find a lot of the rumors make more sense than the idiotic conclusions people come up with on here. Someone should chart the rumors, the crap people come up with on here, and then the end results, just to show how idiotic some of this crap is.

003
09-05-2009, 05:51 PM
They now know the only way to be on top is to release a GT300 X2 card. How much will this cost? I can't see it going for less than around $1000.00.

You're delusional if you think nvidia would charge $1000 for a GTX395.

Sh1tyMcGee
09-05-2009, 05:57 PM
I find a lot of the rumors make more sense than the idiotic conclusions people come up with on here. Someone should chart the rumors, the crap people come up with on here, and then the end results, just to show how idiotic some of this crap is.

That would be an interesting chart, hahaha

Heinz68
09-05-2009, 06:26 PM
I don't believe the GTX 395 is going to be $1,000.00; on the other hand, the $829.00 MSRP for the 8800 Ultra was even crazier.
http://www.techarp.com/showarticle.aspx?artno=403

annihilat0r
09-05-2009, 06:57 PM
The 4870 prices were that good because at that time NVidia was practically dominating ATI, and the GT200s and 4000s were released at approximately the same time, so AMD/ATI had lots of reasons to go for a very competitive price. But now things are much better for AMD/ATI, and NVidia's new parts don't come out till Q1, so all ATI has to do with pricing at this moment is provide speeds similar to NVidia's very-high/highest-end products (the 295) and price them considerably lower.

Even if the yields and costs were extraordinarily great, AMD still has no reason to price a 5870 at $300. The price will be around $400 and I can't see it varying much at launch by any stretch of the imagination. Sure, after the GTX 300s come out it's going to be a different matter entirely, but that doesn't happen for at least a couple of months.

btw: daamn, I haven't been posting here for nearly one year! love product launch days, the GTX200 - HD4800 days were aaaaawesome!

003
09-05-2009, 07:39 PM
NVidia's new parts don't come out till Q1

They are going to be out in November or December of 09.

Redsand426
09-05-2009, 07:50 PM
It's wishful thinking to say that the new 5870 is going to outperform a GTX295. I just don't see a single card having that much power just yet. The performance will fall between the GTX285 and the GTX295, and same with the price. If I'm wrong then I'll eat my words and buy one.

To(V)bo Co(V)bo
09-05-2009, 07:57 PM
Maybe not $1000, but I'd think around $900 for sure. That's just for the X2 GT300. Yeah, these are assumptions, but I'm going off these numbers.

The ATI 5870 will probably be close to $400,
so how much for the X2 5870? I'm sure about double at launch: $800.

GT300 will probably be close to $500.
An X2 GT300 would be about double: $1000 ($800 if it's a neutered die, possibly).

The GT300 was designed to be competitive with and better than the 5870X2, but rumors from BSN say it is gonna end up a bit short, thus forcing Nvidia to produce the GT300X2 (GTX395). From what I've read, Nvidia is only gonna make a 40nm refresh of current GT200b cores to fill their lineup. GT300 is gonna be only one single part and not have partly disabled cores like the GTX260 was from the GT200 core. So by this reasoning I can see a hugely expensive $900-1000 part. :yepp:

You also have to remember, whoever has the spoils of the fastest card gets to overprice their top-end parts (NVIDIA TAX).
Look at the GTX285 for $350: how much faster is it really when compared to a 4890? Is it $150 faster? Not even close.

orangekiwii
09-05-2009, 10:28 PM
I think the GT300 will fall about 20% behind the 5870X2.

if the 5870 is about 95% of a GTX295, then a 5870X2 with typical CrossFire scaling would be about 160% of a GTX295...

current spec rumors for GT300 place it around 40% faster than a GTX295...

if and when nvidia releases a GTX395 it basically HAS to be faster than the 5870X2, or nvidia fails horribly and i'm never buying from them again, just because they sat on their ass for 3+ years
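The back-of-envelope math above can be sketched out in a few lines of Python. Note that every number here is a rumor or an assumption (the 0.85 CrossFire scaling factor in particular is mine, not a measurement), and with these inputs the GT300 gap actually works out closer to 13% than 20%:

```python
# Rough relative-performance estimate, GTX295 = 1.0 baseline.
# Every input here is a rumor or an assumption, not a benchmark result.
hd5870 = 0.95          # rumored: one 5870 at ~95% of a GTX295
xfire_scaling = 0.85   # assumed: typical dual-GPU scaling (not a perfect 2x)
gt300 = 1.40           # rumored: GT300 at ~40% above a GTX295

hd5870_x2 = hd5870 * 2 * xfire_scaling   # ~1.6x a GTX295
deficit = 1 - gt300 / hd5870_x2          # GT300 shortfall vs the X2

print(f"5870 X2: ~{hd5870_x2:.2f}x a GTX295")
print(f"GT300 behind the X2 by ~{deficit:.0%}")
```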

Xoulz
09-05-2009, 10:42 PM
Maybe not $1000, but I'd think around $900 for sure. That's just for the X2 GT300. Yeah, these are assumptions, but I'm going off these numbers.

The ATI 5870 will probably be close to $400,
so how much for the X2 5870? I'm sure about double at launch: $800.

GT300 will probably be close to $500.
An X2 GT300 would be about double: $1000 ($800 if it's a neutered die, possibly).

The GT300 was designed to be competitive with and better than the 5870X2, but rumors from BSN say it is gonna end up a bit short, thus forcing Nvidia to produce the GT300X2 (GTX395). From what I've read, Nvidia is only gonna make a 40nm refresh of current GT200b cores to fill their lineup. GT300 is gonna be only one single part and not have partly disabled cores like the GTX260 was from the GT200 core. So by this reasoning I can see a hugely expensive $900-1000 part. :yepp:

You also have to remember, whoever has the spoils of the fastest card gets to overprice their top-end parts (NVIDIA TAX).
Look at the GTX285 for $350: how much faster is it really when compared to a 4890? Is it $150 faster? Not even close.


Dude, lay off teh bad weed... the 5870 will debut at $369 and be $299 by Xmas!

570091D
09-06-2009, 12:12 AM
We should have an official "The Best Rumour" contest.

http://www.tz-uk.com/pics/tinfoilhat.jpg

we could just have this thread re-named....


i really can't see anyone charging more than $800 for a gfx card this year; current economic conditions just don't support it.

ajaidev
09-06-2009, 12:31 AM
I think hamburgers can see the future; i'll just eat one and tell you who will win, the 5870 X2 or GT300.

To(V)bo Co(V)bo
09-06-2009, 12:41 AM
Well, I really hope I'm wrong! :sofa:

Sly Fox
09-06-2009, 01:07 AM
Well, I really hope I'm wrong! :sofa:

Well shortly after release I recall the 8800 Ultra going for ~$900 USD in multiple shops. So assuming a dual GPU G300 chip would be utterly and completely dominant by a wide margin... It could happen. :eek:

Seems kinda unlikely to me that ATI will let themselves be caught with their pants down yet again though. But who knows.

Smartidiot89
09-06-2009, 01:14 AM
They are going to be out in November or December of 09.
Yupp yupp, full stocks everywhere, it will be fully available to everyone and their mother who wants one :rolleyes:

november/december it will "hit the market", i.e. reviewers, and then in very limited quantities around the globe - we can compare this "launch" to the HD4770. January/February will be its real launch.

Barys
09-06-2009, 01:15 AM
Don't you think that if there are GT300 samples already done, there should be some performance leaks this week? Or maybe some people know some interesting numbers.

flopper
09-06-2009, 01:28 AM
november/december it will "hit the market", i.e. reviewers, and then in very limited quantities around the globe - we can compare this "launch" to the HD4770. January/February will be its real launch.

that is also the best-case scenario, without any re-spins and with good yields.

Tim
09-06-2009, 01:30 AM
Don`t you think if there are GT300 samples already done there should be some performance leaks this week ? Or maybe some people know some interesting numbers.

Same applies to ATI's chips right? Still nothing, and that's with one week to go.

saaya
09-06-2009, 01:30 AM
info accurate sayya?:confused: if so im happy to hear...
very reliable about the 5870 performing about the same as a 295; wouldn't trust the "gt300 10-40% above 295" rumor... nvidia doesn't even have their first silicon back, so it must be based on nvidia's calculations/predictions, or they told that to their partners cause that's how fast gt300 HAS to be to keep nvidia partners from becoming nervous, or it's wishful thinking... def not reliable...

doesn't mean it's not true... but it's not reliable...


Oh, I'm sure surfing the web on HD5870 is much better than on X1900XT :p:

NOTES: (when you utilize a 3-monitor setup to simultaneously read 3 different XS threads, each on its own screen; on the X1900 you could do only 2 threads) :D

you can do that with any ati vga since the 680 chipset :D
def nice :) and ati's hydravision finally works right with that, as it now centers everything to the center of the middle display and not exactly on the border between 2 displays like it does with 2 monitors :D

To(V)bo Co(V)bo, you didn't really think nvidia didn't plan to build a dual gt300, right? when they planned to do it is another question, but they def had plans for it all along... will they need a dual gt300 to beat the 5870x2? of course... will their dual card beat a 5870x2? that's the more interesting question!

in theory, yes... in practice they are facing the same power limit as ati, so the only way to beat the 5870x2 is to be more energy efficient... and that's something nobody expects gt300 to be, and for a reason :D

it's interesting: originally ati used to go for the brute-force approach in raw perf and nvidia went for efficiency, then ati managed to outdo nvidia in efficiency with rv670, and since then they have had the lead in perf/transistor and perf/watt (depending on the tdp the advantage isn't very big though)... if nvidia reshuffles their transistors and boosts efficiency only a little it could be enough to beat ati... and then they have double gt200's transistors with a higher efficiency per transistor, which is probably where the "2.x times gt200 perf" rumor comes from

seeing that nvidia has recently not done that well engineering-wise... i mean everything after g92 was just copy-paste patchwork with some tweaks... im not sure if gt300 will really be a new design or just gt200 doubled up with dx11 strapped to the side with tape :D
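The power-cap argument above can be made concrete with a toy calculation. All the numbers below are invented for illustration; the only real constraint in the model is that both vendors face roughly the same board power ceiling, and the simplifying assumption is that performance scales linearly with the power each chip is allowed to draw:

```python
# Toy model: a dual-GPU card must fit under a fixed board power cap,
# so each chip gets throttled until two of them fit. Under the linear
# perf-vs-power assumption, the chip with the better perf/watt wins
# the dual-card fight, not the chip with the higher single-GPU perf.
BOARD_CAP_W = 300.0  # assumed dual-card power ceiling both vendors face

def dual_card_perf(single_perf, single_watts, cap=BOARD_CAP_W):
    per_chip_w = min(single_watts, cap / 2)          # power budget per chip
    return 2 * single_perf * (per_chip_w / single_watts)

# invented chips: B is 20% faster on its own, but draws 40% more power
chip_a = dual_card_perf(single_perf=1.0, single_watts=190)  # efficient
chip_b = dual_card_perf(single_perf=1.2, single_watts=266)  # brute force
print(chip_a, chip_b)  # the efficient chip comes out ahead as an X2
```

Real scaling is of course not linear with power, but the direction of the effect is the point: raw single-chip speed stops mattering once both X2 designs are pinned against the same cap.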

Smartidiot89
09-06-2009, 02:03 AM
that is also best case scenario without any re-spins and having good yeilds.
There won't be another re-spin; the wafers currently in production will hit the market. But Nvidia apparently is doing a few on risk production to make sure they can at the very least paper-launch their DX11 cards in 2009.

BenchZowner
09-06-2009, 02:10 AM
There won't be another re-spin; the wafers currently in production will hit the market. But Nvidia apparently is doing a few on risk production to make sure they can at the very least paper-launch their DX11 cards in 2009.

Just one thing... tape out a few months ago, rev A1.
Production rev A2...

saaya
09-06-2009, 03:41 AM
I cannot believe there are still no real performance figures. I'm not sure how AMD managed it; makes you think there is more to it....

well i guess perf is going to be better than most people expect and ati want to surprise everybody...


You're delusional if you think nvidia would charge $1000 for a GTX395.

not saying the mars dual-gtx285 card from asus sells well... but it does sell... and the 8800 ultra wasn't far off from 999...



i really can't see anyone charging more than $800 for a gfx card this year, current economic conditions just don't support this.

the asus mars dual gtx285... doesn't sell well i think, but there are def people out there who buy them... not many, but they exist...

TheBlueChanell
09-06-2009, 03:47 AM
I think $400 is a good price for the 5870, especially if it does indeed have 1600 SPs. This is the first DX11 card, and maybe the first time ever that ATi has beaten nVidia to a flagship launch.

It's not the $300 people were expecting, but that's because everybody assumed that since the 4870 was $300 at launch, the 5870 was going to be too. That's what you get when you assume. :D

They could gouge the price even higher like nVidia does when they're first to market, i.e. the 8800GTX at $600 and the GTX280 at $650, but they didn't. They're maybe charging $100 more than they did last gen, but so what? They've been on a roll with cards, they're first to DX11, and it will hopefully give the company some breathing room.

I say bring it on. This is the most exciting graphics release for me since G80. As soon as the GT300 is released we will see a price reduction and the launch of the 5870X2, so those of us purchasing a card at launch will be able to add a 2nd card for crossfire on the cheap.

m.fox
09-06-2009, 03:49 AM
i mean everything after g92 was just a copy paste patchwork with some tweaks...

The same goes for ATI since RV670: more SPs, tweaks and shrinks. ;)
Nothing REALLY new from either side since G80/R600. No need to reinvent the wheel... yet. ;)

saaya
09-06-2009, 04:45 AM
The same goes for ATI since RV670: more SPs, tweaks and shrinks. ;)
Nothing REALLY new from either side since G80/R600. No need to reinvent the wheel... yet. ;)

I think G80 to G92 was still some notable tweaking, and R600 to RV670 was some major tweaking...

G92 to G200 was no tweaking at all... was it? just pumping it up...
and G200 65nm to 55nm was no tweaking at all either, was it? just shrinking it down... G200 to G2xx was just gluing 10.1 and gddr5 support on and shrinking... i really hope G300 isn't just G2xx bumped up again...


Just one thing... tape-out a few months ago, rev A1.
Production rev A2...

that's what nvidia says, but they decide what's written on the chip... they could as well print A5 on it... and if they had had a semi-working G300 that long ago im pretty sure they would have used it somehow in PR or shown it to their big partners... but they haven't... so i think that's just nvidia PR... i think the big chip that taped out on 40nm that long ago was a 40nm GT200 direct-shrink attempt, not GT300... of course, they might have doubled up GT200 and called it GT300 if it had worked...

BenchZowner
09-06-2009, 05:02 AM
For what it's worth both companies ( AMD & nVIDIA ) have been very secretive the last few months, and do their best to make sure nothing really useful gets leaked ( months before the G200 launch I had archit. info that I partly released and other people had that and other info as well, this time nada from both companies ).
Does this mean that they both have something good in their hands? Possibly.
But it could be the opposite as well.

Jowy Atreides
09-06-2009, 05:15 AM
Roll on the 10th.

If no big news posts come from the 10th then i'll FFUUUU

saaya
09-06-2009, 05:35 AM
For what it's worth both companies ( AMD & nVIDIA ) have been very secretive the last few months, and do their best to make sure nothing really useful gets leaked ( months before the G200 launch I had archit. info that I partly released and other people had that and other info as well, this time nada from both companies ).
Does this mean that they both have something good in their hands? Possibly.
But it could be the opposite as well.
yeah but i wonder why... the head honcho at amd's graphics division, read ati, just admitted in an interview that gpu specs are completely frozen more than 1 year prior to launch... so what's the big deal then?

what's the reason to be so secretive then? creating hype in the market? idk... if you don't leak anything about perf, how would that get people excited? right now it's really backwards, cause prices have leaked before perf has, which if anything creates anti-hype: prices are higher than expected and nobody knows what perf to expect...

i don't get it, really... i think all this secrecy is just cause some people like playing corporate james bond and are paranoid about the competition knowing their plans... it's almost as if the apple spirit has infected ati and nvidia :lol:

edit, oh and intel too...

Chumbucket843
09-06-2009, 06:21 AM
edit, oh and intel too...

i would laugh my ass off if larrabee launched september 9th.:ROTF:

saaya
09-06-2009, 06:52 AM
i would laugh my ass off if larrabee launched september 9th.:ROTF:

heh, yeah imagine that... :D
won't happen tho, def... it'll launch on 32nm afaik and that will be christmas at the earliest for a paper launch, more like q1/q2...

003
09-06-2009, 07:07 AM
of course, they might have doubled up gt200 and called it gt300 if it would have worked...

Why would that be an issue? That's basically what ATI is doing with RV770 to RV870 (doubling SPs, ROPs, TMUs, etc...).

But nvidia will go through a much more significant change as the SPs are now MIMD as opposed to SIMD.

Chumbucket843
09-06-2009, 07:10 AM
Why would that be an issue? That's basically what ATI is doing with RV770 to RV870 (doubling SPs, ROPs, TMUs, etc...).

But nvidia will go through a much more significant change as the SPs are now MIMD as opposed to SIMD.

one could argue today's GPUs are MIMD. 64 cores is what i am waiting for.

003
09-06-2009, 07:13 AM
one could argue today's GPUs are MIMD. 64 cores is what i am waiting for.

No you can't. Each SP on all cards bar G300 is SIMD only.

orangekiwii
09-06-2009, 07:22 AM
g300 better be new for nvidias sake... if its not i'll probably be a loyal AMD fan until they can make something truly new

I mean after their epic success with g80 they quite literally have done nothing noteworthy architecturally while ATI has stepped up... apparently 3 times in a row

saaya
09-06-2009, 07:42 AM
g300 better be new for nvidias sake... if its not i'll probably be a loyal AMD fan until they can make something truly new

I mean after their epic success with g80 they quite literally have done nothing noteworthy architecturally while ATI has stepped up... apparently 3 times in a row

yepp, definitely... r600 was a big ouch, ati has really done a great job while nvidia has been slacking... :D
gotta give nvidia lots of respect though, their arch was so great they really could slack and screw up for this long and STILL be very competitive... a 40nm gt200 could still compete very well in the mainstream segment, and G92 shrunk to 40nm would be amazing for entry level and laptops... seriously, who needs dx11... but if they want to keep the perf crown they really need to get their stuff right now... if they haven't gotten a proper new chip out by mid 2010, even a theoretical 40nm GT200 and G92 won't keep them alive for long...


Why would that be an issue? That's basically what ATI is doing with RV770 to RV870 (doubling SPs, ROPs, TMUs, etc...).

But nvidia will go through a much more significant change as the SPs are now MIMD as opposed to SIMD.

well is that really such a big jump? not every unit has to be able to do different instructions; they might just add one beefy unit to each SP block, similar to what ati did?

003
09-06-2009, 07:42 AM
g300 better be new for nvidias sake... if its not i'll probably be a loyal AMD fan until they can make something truly new

That's a load of crap. HD2000 and HD3000 were epic failures that didn't even remotely compete with the GeForce 8.
Then came the HD4000 series and GTX200 series. Both were a significant step up from their predecessors.


they quite literally have done nothing noteworthy architecturally while ATI has stepped up... apparently 3 times in a row

That's funny. Now you do of course realize that the RV770 and upcoming RV870 are still using the R600 (HD2000) architecture?

vietthanhpro
09-06-2009, 07:46 AM
Why would that be an issue? That's basically what ATI is doing with RV770 to RV870 (doubling SPs, ROPs, TMUs, etc...).

But nvidia will go through a much more significant change as the SPs are now MIMD as opposed to SIMD.

but is MIMD only good for GPGPU? :shrug:

Smartidiot89
09-06-2009, 07:52 AM
yeah but i wonder why... the head honcho at amds graphics divison, read ati, just admitted in an interview that gpu specs are completely frozen more than 1 year prior to launch... so whats the big deal then?
Agree. AMD said (according to fudzilla) that 2 years prior to launch very little can be done to the architecture, so even if AMD had known about GT300 for 2 years, the changes that could be made would be very minimal, and that 1-year statement surely is very true.

The only thing I can see AMD or Nvidia doing is bumping clock frequencies, I suppose... But I totally agree it is weird that they both keep this info so tight-lipped.

Origin_Unknown
09-06-2009, 07:56 AM
HD2000 and HD3000 were epic failures



in what sense?

Jamesrt2004
09-06-2009, 07:57 AM
That's a load of crap. HD2000 and HD3000 were epic failures that didn't even remotely compete with the GeForce 8.
Then came the HD4000 series and GTX200 series. Both were a significant step up from their predecessors.

That's funny. Now you do of course realize, that the RV770 and upcoming RV870 are still using the R600 (HD2000) architecture?

The 3000 series actually competed very well on price/perf... hd2000 was crap, I'll give you that..

and I agree about the next series, although ati won.. essentially by beating them price/perf-wise

and it may be essentially based on that architecture, but they have changed it and added things; all nvidia has done is a shrink + renaming, which is lame.

orangekiwii
09-06-2009, 08:18 AM
that's what I meant, 003

the 3, 4, and soon-to-be 5 series from AMD appear to be VERY good on price/performance compared to nvidia's offerings at the time

each brought a substantial performance boost over the previous gen (... 5 series unknown, but really it's probably going to be close to double)


Nvidia on the other hand...

8800 gtx --> 9800 gtx... what, 10-20% max? 9800 gtx --> gtx280 was 50-60% (double in some cases, but few), and it wasn't really an improved architecture, just more "schtuff" added to it; the core differences between g80 and g200 are minor compared to the differences between the 2000 series and the 4000 series

Chumbucket843
09-06-2009, 08:19 AM
but MIMD only good for GPGPU ?:shrug:

MIMD is good for everything, especially raytracing and anything branch-heavy.
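Why branch-heavy code favors MIMD can be shown with a toy Python cost model of the two execution styles (this is a sketch of the execution model only, not how any real GPU is programmed; the cycle counts are invented):

```python
# Toy cost model: SIMD runs one instruction stream across all lanes, so
# on a divergent branch every lane pays for BOTH paths (inactive lanes
# are just masked off). MIMD cores each follow only their own path.

def simd_branch_cost(conditions, then_cost, else_cost):
    cost = 0
    if any(conditions):        # some lane takes the 'then' path ->
        cost += then_cost      # all lanes spend those cycles
    if not all(conditions):    # some lane takes the 'else' path ->
        cost += else_cost      # all lanes spend those cycles too
    return cost

def mimd_branch_cost(conditions, then_cost, else_cost):
    # independent streams: time is set by the slowest core's own path
    return max(then_cost if c else else_cost for c in conditions)

# 8 rays hitting a divergent branch (expensive hit shading vs cheap miss):
rays = [True, False, True, True, False, False, True, False]
print(simd_branch_cost(rays, then_cost=10, else_cost=3))  # 13: both paths
print(mimd_branch_cost(rays, then_cost=10, else_cost=3))  # 10: longest path
```

(Real GPUs only pay this penalty within a SIMD group, not across the whole chip, which is why the "one could argue today's GPUs are MIMD" point has some truth to it.)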

003
09-06-2009, 08:39 AM
in what sence?
The GeForce 8 totally destroyed them? The RV670 was quite literally a die shrink of the R600 with a slightly higher stock core clock speed and a smaller memory bus, with DX10.1 support later added in a driver update. Clock for clock, the 2900XT is actually slightly faster than the 3870 due to the 512-bit memory bus.


and I agree about the next series although ati won.. essentially by beating them price/perf wise Yeah in price/performance maybe, although the GTX280 was still faster than the 4870 and consequently the GTX295 is faster than the 4870X2.


8800 gtx --> 9800 gtx... what 10 20% max? 9800 gtx --> gtx280 was 50 - 60% (double some cases but few)

The 8800GTX (G80) competes with the 2900XT and utterly destroys it. Then ATI rolls out the 3870, which is not much faster than the 2900XT, but just much cheaper to produce and sell.

Nvidia responded with the G92 (9800GTX) which was just a tweaked version of the G80, which was also cheaper to produce and sell, and obviously still destroyed the 3870.

Then when ATI released the 4870, it was finally competitive with the 9800GTX and beat it slightly in most cases, to which nvidia responded with the 9800GTX+ which is almost as fast as the 4870, and in some cases, still a bit faster.

Then, nvidia really opened the can of whoop ass and unleashed the GTX280 which is undeniably faster than the 4870. ATI responds with the 4870X2 which was faster than the GTX280, and nvidia responded with the GTX295 which is faster still.

Now as far as single GPU cards are concerned, we have the GTX285 and 4890 which are very close in terms of performance and trade blows most of the time, with some noteworthy outliers on both sides of the fence.

orangekiwii
09-06-2009, 08:53 AM
i'm talking generational improvement on amd's side vs nvidia's side... the jump from the 2000 to the 3000 series was greater than g80 to g92

the jump from the 3000 to the 4000 series was greater than g92 to g200

the 9800gtx+ does NOT perform better than a 4870....
http://www.guru3d.com/article/radeon-hd-4870-review--asus/9

the 4850 and 9800gtx+ are more or less equal... that's why nvidia released the 9800gtx+, because of the 4850, not the 4870... can of whoopass? are you for real? the 4890 and gtx285 are more or less equal

4870 vs gtx280 (can of whoopass my ass) is about a 10-20% performance difference... YEAH, WHAT A CAN OF WHOOPASS: costs twice as much and performs 10% better...

the 4870 is about equal to the gtx260/gtx260 216sp

nvidia's can of whoopass was not a can but more of a sandwich bag full of moldy whoopass

FischOderAal
09-06-2009, 08:53 AM
Then when ATI released the 4870, it was finally competitive with the 9800GTX and beat it slightly in most cases, to which nvidia responded with the 9800GTX+ which is almost as fast as the 4870, and in some cases, still a bit faster.

In which world are you living? The 9800GTX+ was released in order to keep the HD-4850 behind, not the HD-4870. I mostly agree on the rest, though I won't call HD-3000 an utterly failure. RV670 was what R600 should have been. RV770 has proven that the arch isn't bad.

Telperion
09-06-2009, 08:54 AM
Then when ATI released the 4870, it was finally competitive with the 9800GTX and beat it slightly in most cases, to which nvidia responded with the 9800GTX+ which is almost as fast as the 4870, and in some cases, still a bit faster.

Wait... are you serious? Tell me you're joking.

cegras
09-06-2009, 08:59 AM
The GeForce 8 totally destroyed them? The RV670 was quite literally a die shrink of the R600 with a slightly higher stock core clock speed and a smaller memory bus, with DX10.1 support later added in a driver update. Clock for clock, the 2900XT is actually slightly faster than the 3870 due to the 512-bit memory bus.

Yeah in price/performance maybe, although the GTX280 was still faster than the 4870 and consequently the GTX295 is faster than the 4870X2.

The 8800GTX (G80) competes with the 2900XT and utterly destroys it. Then ATI rolls out the 3870 which is not much faster than the 2900XT, but just much cheaper to produce and sell.

Nvidia responded with the G92 (9800GTX) which was just a tweaked version of the G80, which was also cheaper to produce and sell, and obviously still destroyed the 3870.

Then when ATI released the 4870, it was finally competitive with the 9800GTX and beat it slightly in most cases, to which nvidia responded with the 9800GTX+ which is almost as fast as the 4870, and in some cases, still a bit faster.

Then, nvidia really opened the can of whoop ass and unleashed the GTX280 which is undeniably faster than the 4870. ATI responds with the 4870X2 which was faster than the GTX280, and nvidia responded with the GTX295 which is faster still.

Now as far as single GPU cards are concerned, we have the GTX285 and 4890 which are very close in terms of performance and trade blows most of the time, with some noteworthy outliers on both sides of the fence.

Heh. Heheheh.

marten_larsson
09-06-2009, 09:03 AM
The GeForce 8 totally destroyed them? The RV670 was quite literally a die shrink of the R600 with a slightly higher stock core clock speed and a smaller memory bus, with DX10.1 support later added in a driver update. Clock for clock, the 2900XT is actually slightly faster than the 3870 due to the 512-bit memory bus.

Yeah in price/performance maybe, although the GTX280 was still faster than the 4870 and consequently the GTX295 is faster than the 4870X2.

The 8800GTX (G80) competes with the 2900XT and utterly destroys it. Then ATI rolls out the 3870 which is not much faster than the 2900XT, but just much cheaper to produce and sell.

Nvidia responded with the G92 (9800GTX) which was just a tweaked version of the G80, which was also cheaper to produce and sell, and obviously still destroyed the 3870.

Then when ATI released the 4870, it was finally competitive with the 9800GTX and beat it slightly in most cases, to which nvidia responded with the 9800GTX+ which is almost as fast as the 4870, and in some cases, still a bit faster.

Then, nvidia really opened the can of whoop ass and unleashed the GTX280 which is undeniably faster than the 4870. ATI responds with the 4870X2 which was faster than the GTX280, and nvidia responded with the GTX295 which is faster still.

Now as far as single GPU cards are concerned, we have the GTX285 and 4890 which are very close in terms of performance and trade blows most of the time, with some noteworthy outliers on both sides of the fence.

Wow, you've got so many things wrong there. First of all, the 9800GTX+/GTS250 is as fast as a 4850, and thus a 4870 (1GB) is 20-30% faster - actually as fast as the GTX260 (216SP). The fact that Nvidia had to increase the SP count from 192 to 216 to compete with the 4870 has gone unnoticed by you, I assume.

Also, the GTX280 and GTX260 were released prior to the 4870 launch. Nvidia had to drop the prices by $150-200 a month later and gave cash backs to early adopters. The 9800GTX+ was released after the 4850 to give it competition, since the 9800GTX was losing...

So the alleged can of whoop ass from Nvidia was never there. AMD/ATI opened some cans: first the 4850/$199 beating the 9800GTX (don't know the exact price, but the 8800GT seemed to be around $200 at that time), then the 4870 (512MB)/$299 trading blows with the GTX260 (192SP)/$450, and then the 4870X2, which took the performance crown. The GTX275 was also a response to what AMD/ATI did - the 4890.

003
09-06-2009, 09:05 AM
Wait... are you serious? Tell me you're joking.

Are you trying to suggest that the 3870 is in any way comparable to the 9800GTX?


The 9800GTX+ was released in order to keep the HD-4850 behind, not the HD-4870.

Ah you are correct, I got that mixed up. Well most of what I said is still relevant. The 4870 is only ~30% faster than the 4850 and the GTX280 is faster than the 4870 so it mostly holds true.

marten_larsson
09-06-2009, 09:10 AM
Are you trying to suggest that the 3870 is in any way comparable to the 9800GTX?

You should read a few of the replies made here and maybe surf over to Anandtech to get some of your facts straight.

http://www.anandtech.com/showdoc.aspx?i=3341

http://www.anandtech.com/showdoc.aspx?i=3415

http://www.anandtech.com/showdoc.aspx?i=3340

http://www.anandtech.com/showdoc.aspx?i=3334

http://www.anandtech.com/showdoc.aspx?i=3408

http://www.anandtech.com/showdoc.aspx?i=3354

Chumbucket843
09-06-2009, 09:11 AM
So the alleged can of whop ass from Nvidia was never there. AMD/ATI opened some cans, first 4850/199$ beating 9800GTX/(don't know exact price but 8800GT seemed to be around 200$ at that time) and 4870(512MB)/299$ trading blows with GTX260(192SP)/450$ and then 4870X2 which took the performance crown. GTX275 was also a response to what AMD/ATI did - the 4890.

Having good performance per dollar is not opening a can of whoop ass, and using prices of cards from over a year ago is completely inane. It is also important to note that the 4890 is ATI's single-GPU flagship card while the 275 is a crippled 285.

marten_larsson
09-06-2009, 09:29 AM
Having good performance per dollar is not opening a can of whoop ass, and using prices of cards from over a year ago is completely inane. It is also important to note that the 4890 is ATI's single-GPU flagship card while the 275 is a crippled 285.

What cards should I be using then, since the only cards released this year are the 4890 and GTX275... The can of whoop ass that Nvidia opened wasn't a can, since AMD/ATI had similar performance. The GTX280 was good but the 4870 (1GB) wasn't all that much slower (~10% avg at launch according to Anandtech). And the 4870X2 beat it quite easily. The GTX295 isn't far superior to the 4870X2 so... Adding price only magnifies the slap in the face of Nvidia when RV770 launched, but the slap would have been there even if AMD/ATI had priced the 4870 at $350-400...

Wesker
09-06-2009, 09:30 AM
having a good performance per dollar is not opening a can of whoop ass.

It is when you start outselling your competition.


and using prices of cards from over a year ago is completely inane. it is also important to note that 4890 is ati's single gpu flagship card while the 275 is a crippled 285.

He was quoting the correct prices, as they were when the 4000 series launched, AFAIK.

Also, RV790 is almost half the size of GT200 and has one-third fewer transistors too. That was, I think, the most impressive part of RV790.

To(V)bo Co(V)bo
09-06-2009, 10:40 AM
The ATi architecture hasn't changed much at all over the past few generations. It's actually a VERY modular design; more parts can be added without shaking up the whole core logic. That's what keeps making the architecture so much stronger as it shrinks. As it shrinks there is a whole lot more room to add whatever else they need to it. It totally scales with size. Also remember this: ATi already has DX11 features built into the 4800 series. It's already in the architecture, but since Nvidia wouldn't implement all DX10 features, it forced DX10 to be a half-@ss standard. That's where ATi DX10.1 comes from, it's already in the core. DX11 is just the leftover features of DX10 that didn't get a proper release.

Hell Hound
09-06-2009, 10:44 AM
I hope they get their drivers together, I keep getting the atiogl.dll error when I play CoD 1 using CCC 9.8.:mad: :down: :shrug:

LiquidReactor
09-06-2009, 11:24 AM
Then when ATI released the 4870, it was finally competitive with the 9800GTX and beat it slightly in most cases, to which nvidia responded with the 9800GTX+ which is almost as fast as the 4870, and in some cases, still a bit faster.

Then, nvidia really opened the can of whoop ass and unleashed the GTX280 which is undeniably faster than the 4870. ATI responds with the 4870X2 which was faster than the GTX280, and nvidia responded with the GTX295 which is faster still.


Umm...I dont know where you got your info from but you are dead wrong.

HD 4850 =/~ 9800GTX+ or >>>9800GTX

HD 4870 1Gb = GTX 260c216

HD 4870x2 >>> GTX 280 or =/~ GTX 295 (at stock speeds). The GTX 295 does usually have better overclocking headroom though.

LightSpeed
09-06-2009, 11:49 AM
If the 5870 is indeed faster than the GTX295, prices should tumble for all existing cards considerably

Also, nvidia will be in a world of trouble. Funny how they always reacted hastily to every GPU ATI released. The 4850 was met with a 9800GTX+, the 4890 made them put together a GTX275... what are they going to do now? Overclock the GTX295 and call it the GTX299 Ultra? Zzz

gamervivek
09-06-2009, 12:22 PM
with DX10.1 support later added in a driver update

oh hello thar,can I download teh gpu over teh internezz..

Smartidiot89
09-06-2009, 12:23 PM
If the 5870 is indeed faster than the GTX295, prices should tumble for all existing cards considerably

Also, nvidia will be in a world of trouble. Funny how they always reacted hastily to every GPU ATI released. The 4850 was met with a 9800GTX+, the 4890 made them put together a GTX275... what are they going to do now? Overclock the GTX295 and call it the GTX299 Ultra? Zzz
Don't forget the bundled nuclearplant and phasecooler included!

Migi06
09-06-2009, 12:34 PM
If the 5870 is indeed faster than the GTX295, prices should tumble for all existing cards considerably

Also, nvidia will be in a world of trouble. Funny how they always reacted hastily to every GPU ATI released. The 4850 was met with a 9800GTX+, the 4890 made them put together a GTX275... what are they going to do now? Overclock the GTX295 and call it the GTX299 Ultra? Zzz


Don't forget the bundled nuclearplant and phasecooler included!

:ROTF::rofl::scope::rofl::ROTF:

Chumbucket843
09-06-2009, 02:17 PM
Also RV790 is almost half the size of GT200 and has one-third less transistors too. That was I think the most impressive part of RV790.

this is why I like nvidia:
nvidia cards on F@H: 4 petaflops with 16,500 gpu's
ATi cards on F@H: 1 petaflop with 10,000 gpu's

cegras
09-06-2009, 02:41 PM
this is why I like nvidia:
nvidia cards on F@H: 4 petaflops with 16,500 gpu's
ATi cards on F@H: 1 petaflop with 10,000 gpu's

This is why I don't like poor coding.

Chumbucket843
09-06-2009, 02:59 PM
This is why I don't like poor coding.

haha, we all know stanford cant code for sh*t.:rofl:
this is from their FAQ:

Why are the ATI x86 FLOP numbers half of the ATI native FLOP numbers?

Due to a difference in the implementation (in part due to hardware differences), the ATI code must do two force calculations where the x86, Cell, and NVIDIA hardware need only do one. This increases the overall native FLOP count for ATI hardware, but since these are not useful FLOPS in a sense, we did not include them in the x86 count.
Besides this, CUDA with the new MIMD architecture is going to add a huge speedup, due to the fact that MIMD achieves maximum IPC.

Hell Hound
09-06-2009, 03:19 PM
haha, we all know stanford cant code for sh*t.:rofl:
this is from their FAQ:

besides this cuda with the new mimd architecture is going to add a huge speed up due to the fact mimd achieves maximum ipc.
Seems biased to me.:yepp:

003
09-06-2009, 03:25 PM
Also RV790 is almost half the size of GT200 and has one-third less transistors too. That was I think the most impressive part of RV790.

And yet it consumes more power and puts out more heat than the GT200b...

jagvar
09-06-2009, 03:35 PM
http://img196.imageshack.us/img196/5589/20090904065937.jpg
Source:
http://forum.beyond3d.com/showpost.php?p=1329964&postcount=2223

oh my father:shocked:
it's great

Utnorris
09-06-2009, 03:42 PM
You do realize that is fake, right?

Sh1tyMcGee
09-06-2009, 03:56 PM
And yet it consumes more power and puts out more heat than the GT200b...

Very true, hope they improved upon these things on the new chips.

Solus Corvus
09-06-2009, 04:17 PM
haha, we all know stanford cant code for sh*t.:rofl:
this is from their FAQ:

besides this cuda with the new mimd architecture is going to add a huge speed up due to the fact mimd achieves maximum ipc.
I wonder if that will continue to be the case under OpenCL.


And yet it consumes more power and puts out more heat than the GT200b...
At idle.

003
09-06-2009, 05:10 PM
At idle.

And load. The GTX280 uses more power than the 4890, but the GTX285 doesn't.

mindfury
09-06-2009, 05:33 PM
And load. The GTX280 uses more power than the 4890 but not the GTX285.
LoL

http://images.anandtech.com/graphs/radeonhd4890_040209033751/18769.png

LiquidReactor
09-06-2009, 05:37 PM
http://images.anandtech.com/graphs/radeonhd4890_040209033751/18769.png

http://tpucdn.com/reviews/MSI/HD_4890_Cyclone_SOC/images/power_average.gif

http://tpucdn.com/reviews/MSI/HD_4890_Cyclone_SOC/images/power_peak.gif

and of course furmark maximum, where ati cards get over 9000 fps and suck up ungodly amounts of power

http://tpucdn.com/reviews/MSI/HD_4890_Cyclone_SOC/images/power_maximum.gif

AMDDeathstar
09-06-2009, 06:49 PM
http://www.semiaccurate.com/forums/showpost.php?p=4066&postcount=15


There is at least one new name you will hear next week, and it is a neat one.
Now what's the boy talking?
:shrug:

Chickenfeed
09-06-2009, 06:55 PM
I believe Charlie is saying that one of the new cards is called something other than what everyone has assumed/rumored thus far. Whether it's actually a different gpu or just a different model, who knows. But I do agree with him, the level of secrecy AMD has maintained is rather impressive.

As far as the whole power usage debate, I do hope to high hell that these new cards are competitive in power usage, given they are on a new process. If I see a 300 watt single gpu card in the next 6 months I'll cry. Hopefully AMD also has their idle usage down this time. The 4xxx's sucked big time compared to the GT200s as far as that is concerned. I still don't understand why they didn't opt for lower 2D clocks, as manually reducing them doesn't seem to cause any problems (Nvidia's 2D clocks are way, way lower proportionally).

003
09-06-2009, 07:30 PM
LoL

http://images.anandtech.com/graphs/radeonhd4890_040209033751/18769.png

That graph is wrong.

eleeter
09-06-2009, 07:39 PM
That graph is wrong.
Thanks for the detailed explanation.

.....who cares about current soon to be last generation power usage. Bring on the new stuff! The wait is killing me.

Kylzer
09-06-2009, 07:46 PM
Thanks for the detailed explanation.

.....who cares about current soon to be last generation power usage. Bring on the new stuff! The wait is killing me.

Indeed only 3 or 4 days (depending where you are)

OMFG !!! im excited :D

Wesker
09-06-2009, 10:34 PM
this is why I like nvidia:
nvidia cards on F@H: 4 petaflops with 16,500 gpu's
ATi cards on F@H: 1 petaflop with 10,000 gpu's

While it's great for Nvidia to have such a close relationship with the researchers at Stanford, you're completely overlooking in-game performance. :confused:


And yet it consumes more power and puts out more heat than the GT200b...

You have got to be joking...

RejZoR
09-06-2009, 10:39 PM
Of course AMD has such a low FLOPS score if 3/4 of the shader units aren't even utilized because of lame client coding. And AMD doesn't have much to do with that.

tajoh111
09-06-2009, 10:47 PM
You have got to be joking...

It's almost a certainty that the 4870 uses more power at idle, while it's a certainty that the gt200 uses more power at peak.

However, unless you game more than you otherwise use your computer, or bench (in which case you don't care about power in the first place), the energy cost of running a 4870 is much higher.

Solus Corvus
09-06-2009, 11:02 PM
Its almost a certainty that the 4870 uses more power in idle, while its a certainty that the gt200 uses more power in peak.

However unless you game more than you actually use your computer or bench(which you don't care about power in the first place), then the energy usage cost is much higher to run a 4870.
They could be constantly running a GPGPU application of some sort. In that case it would be cheaper to run a 4890 than a 275, 280, or 285. Though you'd probably get more work per energy spent on the NV cards. We will see if OpenCL changes that scene at all, in one direction or the other.

Heinz68
09-06-2009, 11:08 PM
http://www.semiaccurate.com/forums/showpost.php?p=4066&postcount=15


Now what's the boy talking?
:shrug:

Probably something to do with this:
http://www.semiaccurate.com/2009/08/11/evergreen-has-six-members/

Wesker
09-06-2009, 11:10 PM
Its almost a certainty that the 4870 uses more power in idle, while its a certainty that the gt200 uses more power in peak.

However unless you game more than you actually use your computer or bench(which you don't care about power in the first place), then the energy usage cost is much higher to run a 4870.

I know that was the case when the 4870 first launched, but I would like to see recent power consumption figures.

At launch, PowerPlay was broken with the RV770 core only downclocking to 500MHz. Now, even RV790 downclocks to 240MHz and I assume that ATI made changes to idle state definitions through driver updates.

jaredpace
09-07-2009, 12:41 AM
30 watt idle for 5870?:

http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2332709

http://www.phoronix.com/forums/showpost.php?p=88865&postcount=826

Tim
09-07-2009, 01:06 AM
Wow, that would be utterly fantastic. I like it.

30w seems unbelievable though.

Jamesrt2004
09-07-2009, 01:13 AM
*snip the pics*

and ofcourse furmark maximum where ati cards get over 9000 fps and suck up ungodly amounts of power



how do you even measure just the card's power draw?

that 320w in furmark seems unbelievable, seeing as you can run it on pcie 1.1, so that would be 2 adapters @ 75w each, plus the pcie slot only gives 75w, so 225w max... so yeah :confused:

tbh I believe the one above it seems more reasonable. I always believed ati had crap idle power settings this time around and nvidia had good ones, but just for the obvious reason that the nvidia card's die size was nearly twice the size, which is an easy indicator of power consumption :shrug:

CrimInalA
09-07-2009, 01:57 AM
The 5870 cards are said to be slightly longer than a 2900XT. Means I'm out, my case can't take it, since my 2900 only has 1cm of free space.

labs23
09-07-2009, 01:59 AM
Wow, that would be utterly fantastic. I like it.

30w seems unbelievable though.

And 4890 idles at what W?

Origin_Unknown
09-07-2009, 02:16 AM
nvidias can of whoopass was not a can but more of sandvich bag full of moldy whoop ass



i really lol'd @ this

jaredpace
09-07-2009, 02:19 AM
And 4890 idles at what W?

cards:     4850  4870  3870X2  GTX280  9800GX2  9800GTX+  9800GTX
Idle load   123   149     119     127      174       N/A      141
Full load   203   229     289     279      313       N/A      223

Edit: Actual card consumption here:
http://www.xbitlabs.com/images/video/bfg-gf-gtx275oc/gtx275_power.png
http://www.xbitlabs.com/articles/video/display/bfg-gf-gtx275oc_5.html#sect0

A 4890 is slightly less than a 4870 1GB at idle, ~45 watts. The anandtech graphs (that I first posted) are total system power consumption. So you have a 4890 @ 43w idle, 121w load; hexus, xbit, and anand all agree the rv770/790 uses ~80w going from 2D -> 3D. And now supposedly the 5870 is @ 30w idle & 195w load (165w between 2D -> 3D!). Maybe they've fixed the deal with the ram's idle power draw?
Edit again: 80 watts for running 800 shaders and 160 watts for 1600?:ROTF:
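The 2D -> 3D deltas being tossed around are just load minus idle; here's a quick sanity check of the card-only figures quoted in this thread (the 5870 numbers are rumored, not measured):

```python
# Sanity-check the 2D -> 3D power deltas (card-only watts, from the
# xbit-style numbers quoted above; the 5870 values are rumored).
cards = {
    "HD 4890": {"idle": 43, "load": 121},
    "HD 5870 (rumored)": {"idle": 30, "load": 195},
}

for name, w in cards.items():
    delta = w["load"] - w["idle"]
    print(f"{name}: {delta} W going from 2D to 3D")
```

Which lines up with the ~80w delta for rv790 and, if the rumor holds, a 165w swing for the 5870.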

Calmatory
09-07-2009, 02:20 AM
Wow, that would be utterly fantastic. I like it.

30w seems unbelievable though.

Just great. A graphics card using more power at idle than my computer consumes at full load. :shrug:

labs23
09-07-2009, 02:28 AM
jared - Thanks for the graphs.

I'm starting to think this 30W idle is BS. I mean how will they (AMD) do that?:confused:

jaredpace
09-07-2009, 02:40 AM
jared - Thanks for the graphs.

I'm starting to think this 30W idle is BS. I mean how will they(AMD) do that?:confused:

well i updated the post because the hexus and anand numbers were of "total system" consumption. Xbit has numbers from the actual cards.

looks like rv770 consumes about 70w-80w idle, the revised rv790 about 45w idle, and the rv870 25w-30w idle (according to rumors). Sounds possible, 15 to 20 watts less than a 4890 or gtx260.

FischOderAal
09-07-2009, 02:56 AM
I'm starting to think this 30W idle is BS. I mean how will they(AMD) do that?:confused:

The biggest power hog wasn't the GPU but the GDDR5. You could save a lot of energy by downclocking the RAM to 200 MHz. IIRC the consumption decreased by 20 watts at idle after downclocking.


Just great. A graphics card using more power at idle than my computer consumes at full load. :shrug:

That's why you should always have a notebook to surf teh interwebz and do office-tasks. I don't want to know how many do such things with a i7... Energy-costs don't seem to be high enough me thinks :shrug:

jaredpace
09-07-2009, 03:17 AM
I'm wondering, if a 5870 can consume 180-190w, how much can a 5870x2 pull? 360w? I was under the impression they couldn't exceed 300 watts, with 150w from the 8-pin, 75w from the PCI-e slot, and 75w from the 6-pin. I mean, a gtx295 consumes a little more than 2 55nm gtx260's and a 4870x2 consumes 2x a 4870. So I think they're going over 300 watts unless they have some magic going on in the 5870x2. Maybe ati has increased transistor efficiency with 40nm, but that doesn't explain the 5870 max of 180w!!?!
http://www.xbitlabs.com/images/video/evga-geforce-gtx295/gtx295_power.png
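For what it's worth, that 300 watt ceiling is just the sum of the PCI-SIG per-source budgets (75w slot, 75w 6-pin, 150w 8-pin); here's a minimal sketch, with the connector configurations as assumed examples:

```python
# PCI-SIG power budget per source, in watts.
BUDGET = {"slot": 75, "6pin": 75, "8pin": 150}

def max_board_power(aux_connectors):
    """Spec ceiling: the PCIe slot plus each auxiliary connector."""
    return BUDGET["slot"] + sum(BUDGET[c] for c in aux_connectors)

print(max_board_power(["6pin", "8pin"]))  # 6+8 pin card: 300 W
print(max_board_power(["8pin", "8pin"]))  # dual 8-pin (MARS-style): 375 W
```

So a 5870x2 pulling ~360w would need dual 8-pins and would still be outside the 300w spec limit either way.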

Smartidiot89
09-07-2009, 03:23 AM
cards:     4850  4870  3870X2  GTX280  9800GX2  9800GTX+  9800GTX
Idle load   123   149     119     127      174       N/A      141
Full load   203   229     289     279      313       N/A      223

Edit: Actual card consumption here:
http://www.xbitlabs.com/images/video/bfg-gf-gtx275oc/gtx275_power.png
http://www.xbitlabs.com/articles/video/display/bfg-gf-gtx275oc_5.html#sect0

A 4890 is slightly less than a 48701gb at idle ~ 45 watts. The anandtech graphs (that i first posted) are total system power consumption. So you have a 4890 @ 43w idle, 121w load, both hexus, xbit, and anand agree the rv770/790 uses ~80w going from 2d-> 3d. And now supposedly, the 5870 @ 30w idle & 195w load (165w between 2d->3d!). Maybe they've fixed the deal with the ram's idle power draw?
Edit again: 80watts for running 800 shaders and 160 watts for 1600?:ROTF:
40nm and improved architecture:ROTF:

Helloworld_98
09-07-2009, 03:46 AM
Im wondering if a 5870 can consume 180-190w, how much can a 5870x2 pull? 360w? I was under the impression they couldn't exceed 300watts with 150 from the 8-pin, 75w from the pci-e power lanes, and 75w from the 6-pin. I mean a gtx295 consumes a little more than 2 55nm gtx260's and a 4870x2 consumes 2x a 4870. So i think they're going over 300watts unless they have some magic going on in the 5870x2. Maybe ati has increased transistor efficiency with the 40nm, but that doesn't explain the 5870 max of 180w!!?!
http://www.xbitlabs.com/images/video/evga-geforce-gtx295/gtx295_power.png

the 5870 x2 will probably consume about 325w if each 5870 uses 180-190w since the PCB uses a fair chunk of that amount.

and I'm going to guess that the 5870 X2 will have dual 8 pin connectors like the ASUS MARS.

annihilat0r
09-07-2009, 07:28 AM
about that 5870 card and its SPs.... :D

I don't understand why everyone takes it for granted that the 5800s will have 1600 sps each... Don't forget that the 4800s were rumored to be 320sp and turned out to be 800. I think the reverse of that might just happen here. And still no numbers from any sources makes me think that this card won't live up to the GTX 295-killing hype. I won't be surprised if that's the case; in fact I'll be surprised if the 5870 can pull within 10% performance of the GTX 295

I'll buy it anyway.

Helloworld_98
09-07-2009, 07:42 AM
about that 5870 card and its SPs.... :D

I don't understand why everyone takes it for granted that the 5800s will have 1600 sps each... Don't forget that the 4800s were rumored to be 320sp and turned out to be 800. I think the reverse of that might just happen here. And still no numbers from any sources makes me think that this card won't live up to the GTX 295-killing hype, I won't be surprised if that's the case, in fact I'll be surprised if 5870 can pull within %10 performance of GTX 295

I'll buy it anyway.

there's also rumours about 1200SP's and 2000SP's but the general consensus is that no matter how many SP's it has, it's going to make up for it in clock speed.

Chumbucket843
09-07-2009, 07:48 AM
there's also rumours about 1200SP's and 2000SP's but the general consensus is that no matter how many SP's it has, it's going to make up for it in clock speed.

the problem with this will be the heat. look at the 4890. the power density on it is insane.

Miwo
09-07-2009, 08:02 AM
I'm looking forward to seeing what the 4770 replacement specs will be. Hopefully at least 256bit bus this time around

orangekiwii
09-07-2009, 09:12 AM
I'm looking forward to launch. Period.

spursindonesia
09-07-2009, 09:12 AM
I'm looking forward to seeing what the 4770 replacement specs will be. Hopefully at least 256bit bus this time around

Not gonna happen; if Cypress only gets a 256-bit memory interface, 128-bit for mainstream will stay. We can expect faster mem ICs though, such as 5 Gbps chips to replace the older 4 Gbps ones used in RV740.

Chickenfeed
09-07-2009, 09:13 AM
Im wondering if a 5870 can consume 180-190w, how much can a 5870x2 pull? 360w?

Dual 8pin perhaps? I wouldn't be surprised but I'm hoping they don't use much more than the current gpus if not at all.

LordEC911
09-07-2009, 03:11 PM
At launch, PowerPlay was broken with the RV770 core only downclocking to 500MHz. Now, even RV790 downclocks to 240MHz and I assume that ATI made changes to idle state definitions through driver updates.
Dave Baumann (http://forum.beyond3d.com/showpost.php?p=1283275&postcount=1036) begs to differ, PowerPlay was not "broken" at launch.



The biggest power-hog wasn't the GPU but the GDDR5. You could save much energy when you downclock the RAM down to 200 MHz. IIrc the consumption decreased by 20 Watt in IDLE after downclocking.

That's why you should always have a notebook to surf teh interwebz and do office-tasks. I don't want to know how many do such things with a i7... Energy-costs don't seem to be high enough me thinks :shrug:
Also, it wasn't just the GDDR5 ICs but also the memory controllers, since the individual ICs simply couldn't be consuming such large amounts of wattage.

Edit- There was talk of no PCI-e stickers if they go over the 300w TDP, so it doesn't look like something AMD/ATi themselves will be doing anytime soon.

Hornet331
09-07-2009, 03:55 PM
Dave Baumann (http://forum.beyond3d.com/showpost.php?p=1283275&postcount=1036) begs to differ, PowerPlay was not "broken" at launch.



So it just sucked... :ROTF:

Even worse, especially since it was very impressive with the R600 and totally sucked with R700... :rolleyes:

Manicdan
09-07-2009, 04:05 PM
Edit- There was talk of no PCI-e stickers if they go over the 300w TDP, so it doesn't look like something AMD/ATi themselves will be doing anytime soon.

explain this to me pls, i have no idea what you mean by stickers

jaredpace
09-07-2009, 04:08 PM
maybe he means if it pulls over 300 watts it can't be certified as a safe "genuine" PCI-e peripheral in the labs, and it doesn't get the pci-e sticker on the box? lol I dunno.

LordEC911
09-07-2009, 04:17 PM
explain this to me pls, i have no idea what you mean by stickers
http://www.pcisig.com/specifications/pciexpress/high_power_graphics/

To be able to make your product PCI-e "compatible" you have to meet certain requirements/specifications set forth by PCI-Sig.

Manicdan
09-07-2009, 08:42 PM
so if it's over 300W, does it void your warranty if you blow up your mobo?

FischOderAal
09-07-2009, 09:56 PM
Also, it wasn't just the GDDR5 ICs but also the memory controllers, since the individual ICs simply couldn't be consuming such large amounts of wattage.

Ah :) Thanks for clearing things up! :up:

Tim
09-07-2009, 11:29 PM
Every day I check back here. With 2 days to go and no real leaks, it's quite incredible.

I don't think we've ever seen a more closely guarded launch!

eric66
09-08-2009, 01:16 AM
wth its annoying i want some benchs :(

largon
09-08-2009, 01:30 AM
so if its over 300W, does it void your warranty if you blow up your mobo

No, because the motherboard has hardly anything to do with the power consumption of a GPU. That is, as long as the card has an external power source, the wattage the PCIe slot delivers is never used to feed the GPU, but rather those secondary loads like VRAM and onboard 3rd party chips that combined draw something like 20-50W.

zalbard
09-08-2009, 01:43 AM
Someone should spill some beans, seriously!
Where are all those people saying "What is NDA? LOL!"?! :D

T.Goat
09-08-2009, 01:44 AM
What are the chances that the 5870/nVidia equivalent will be the last card you need for 1680x1050 until something big changes? I know we're close with the GTX275/4890.

flopper
09-08-2009, 02:15 AM
What are the chances that the 5870/nVidia equivalent will be the last card you need for 1680x1050 until something big changes? I know we're close with the GTX275/4890.

5870 be enuff for 1920x1200 for a long time.

CrimInalA
09-08-2009, 02:33 AM
Does anyone think that the 5850 will also be launched at Sept 10th ?

And what length will that card be compared to a 2900XT?
I have read the 5870 will be a tad longer than a 2900, but screenshots of a 5850 made me think that card will be shorter.

Is anything known about that ?

zalbard
09-08-2009, 02:45 AM
Does anyone think that the 5850 will also be launched at Sept 10th ?

And what Length will that card be compared to a 2900XT ?
I have read the 5870 will be a tad longer than a 2900 , but screenshots of a 5850 made me think that card will be less lengthy .

Is anything known about that ?
No one knows anything, or at least they don't want to tell us.

Farinorco
09-08-2009, 02:51 AM
I don't think they're going to launch any card on Sept 10th; it's only the date of a press event under NDA in the US (and I think that today is the date for the same event in Europe). More than probably, both the HD5850 and HD5870 are going to be launched at the same time, as usual, and there are rumors about more cards being launched simultaneously to cover other market segments, but I don't know. The fact is that very little is known about these cards; the secrecy around this launch is surprising to say the least...

Regarding the question about this gen of cards being enough "until something big changes", I think that consoles are what limit that aspect most: there are practically no PC-exclusive games aside from online multiplayer (because of their low profitability), and online multiplayer games are usually targeted at very affordable hardware specs (the more players, the better the value of a multiplayer game, so publishers have a higher-than-usual need to target the widest possible audience). Chances are that there aren't going to be any substantial changes until the next console generation is released.

Lightman
09-08-2009, 02:53 AM
The 10th is not a launch.
So don't get your hopes up ...

labs23
09-08-2009, 03:55 AM
No one knows anything, or at least they don't want to tell us.

A fellow member of a forum here in the Phils. mentioned an 18K-peso price for a 5870, roughly $370+.
Prices are way out of proportion here due to taxes:mad:, so you can expect a much lower price where you live.

No word on the release date though. But why would they wait for October?
Sept. 10 is near anyway; we'll soon find out everything.;)

Origin_Unknown
09-08-2009, 04:33 AM
But why would they wait for October?

stock building possibly?

Chrono Detector
09-08-2009, 05:00 AM
So on September 10th will we see reviews, specs and benchmarks? I just wanna know already.

Vinas
09-08-2009, 05:30 AM
Regarding the question about this gen's cards being enough "until something big changes", I think consoles are the main limiting factor:

there are practically no PC-exclusive games aside from online multiplayer titles (because of their low profitability)... Chances are there won't be any substantial changes until the next console generation is released.

False and false. :rofl:

Not to be rude, but consoles aren't that important -- this is going to be a great year for PC gaming. :up: DX11 and all the new titles set to release have me very excited!!!

What are the consoles using... DX9 still? Blah... consoles are really showing their age these days. Don't get me wrong, if they got a technology bump I'd game on them. Until then I'll get that 5870 so I can be ready for L4D2, Diablo 3, Fallen Earth, StarCraft 2, Dawn of War 2, Empire: Total War, Demigod, the new CoH expansion, BF3?, Half-Life 2: Episode Three?, Mass Effect 2, etc...

but the consoles have "The Beatles: Rock Band", I guess I'll pick that up too... meh

Chickenfeed
09-08-2009, 05:49 AM
but the consoles have "The Beatles: Rock Band", I guess

LOL :p:

As for the notion of consoles holding back PC hardware, the two aren't directly related. The issue is that many current engines are developed with consoles in mind. If an engine were developed properly from the ground up with modern x64 CPUs and multi-GPU setups in mind, current hardware could decimate consoles. The problem is this makes no sense financially when you can spend less money on a fixed platform and get a much higher return on the overall investment.

I still like to have my PS3 for the odd exclusive (Ratchet, MGS, etc.), but for the most part I stick to the PC side of things.

I can't wait to see what these cards can do... and hopefully AMD can pull up their socks driver-wise; then I'll have no reason not to get a 5800.

largon
09-08-2009, 06:22 AM
I blame MS and Sony for sitting on their hands, selling ancient hardware with the Xbox 360 and PS3 and keeping game devs from making real progress.

generics_user
09-08-2009, 06:26 AM
LOL :p:

As for the notion of consoles holding back PC hardware, the two aren't directly related. The issue is that many current engines are developed with consoles in mind. If an engine were developed properly from the ground up with modern x64 CPUs and multi-GPU setups in mind, current hardware could decimate consoles. The problem is this makes no sense financially when you can spend less money on a fixed platform and get a much higher return on the overall investment.

I still like to have my PS3 for the odd exclusive (Ratchet, MGS, etc.), but for the most part I stick to the PC side of things.

I can't wait to see what these cards can do... and hopefully AMD can pull up their socks driver-wise; then I'll have no reason not to get a 5800.

QFT

the only reasons why I didn't sell my X360 were Halo and day-long LAN parties on 8 consoles + flat screens with 20 or more players :D

Farinorco
09-08-2009, 06:29 AM
False and false. :rofl:

Not to be rude, but consoles aren't that important -- this is going to be a great year for PC gaming. :up: DX11 and all the new titles set to release have me very excited!!!

What are the consoles using... DX9 still? Blah... consoles are really showing their age these days. Don't get me wrong, if they got a technology bump I'd game on them. Until then I'll get that 5870 so I can be ready for L4D2, Diablo 3, Fallen Earth, StarCraft 2, Dawn of War 2, Empire: Total War, Demigod, the new CoH expansion, BF3?, Half-Life 2: Episode Three?, Mass Effect 2, etc...

but the consoles have "The Beatles: Rock Band", I guess I'll pick that up too... meh

I'd be glad to be wrong about that: I've never owned a console; I've always been (and always will be) a PC guy.

I agree that consoles are showing their age nowadays, and that current (and coming) PC hardware is far ahead of them. But it's not a matter of technology, it's the market.

I don't think any of the games you've named are going to be any kind of graphical or technical revolution. Some of them have to be playable on current-gen consoles too, and some aren't targeting super-high-end hardware...

Even companies like id Software and Crytek are developing their next engines to be multiplatform (and therefore usable on consoles).

I don't think we will see the new hardware shining in games up to its true potential until we get rid of those ageing consoles that rule the videogame market nowadays. Maybe a little cantrip here and there in PC versions, maybe a few PC-exclusive games built on PC-exclusive engines (only a few, because it's a not-so-profitable market to invest development resources in)... but I hope I'm wrong. :shrug:

Wesker
09-08-2009, 06:39 AM
Oh, hey Lord, long time no see! :D


Dave Baumann (http://forum.beyond3d.com/showpost.php?p=1283275&postcount=1036) begs to differ; PowerPlay was not "broken" at launch.


I read the B3D thread and there were a lot of unhappy RV770 customers (when it came to idle power consumption).

Dave, as an employee of ATI/AMD, had to say that RV770's PowerPlay was functioning at launch. I'm not denying that PowerPlay was working (as I mentioned, the core did downclock from 750MHz -> 500MHz), just that it wasn't working to the best of its potential.

The VIDs, clock speeds and heat output of RV770 cards in the idle state around launch time all pointed to PowerPlay not being very effective.

As I mentioned before, it's much better now, with the GPU downclocking to only 240MHz.

Smartidiot89
09-08-2009, 06:39 AM
So at September 10th will be see reviews, specs and benchmarks? I just wanna know already.
Specifications most likely. Benchmarks and reviews will come later ;)

Revv23
09-08-2009, 06:48 AM
I hope I'm wrong but typically when there is such a small amount of info around pre-launch it's because there isn't much exciting data around to leak.

w0mbat
09-08-2009, 06:49 AM
Yeah, like the HD4800 series wasn't exciting at all oO

Calmatory
09-08-2009, 07:10 AM
I hope I'm wrong but typically when there is such a small amount of info around pre-launch it's because there isn't much exciting data around to leak.

Talk about 1600 shaders isn't exciting? Yeah, they keep everything closed; Nvidia can almost rest assured that their GT300 will be significantly faster. And BOOM: 2000 shaders @ 995MHz with a $299 price tag, full availability from day one.

Now I sit back and wait for the first "omg lolz no :ROTF:" postings from people with no sense of humour. (1600 shaders / 773MHz is my real guess.)

labs23
09-08-2009, 07:12 AM
Specifications most likely. Benchmarks and reviews will come later ;)

I agree, it might turn out that way. And of course they'll announce the date for the hard launch.
Who knows, the hard launch might just be 2 weeks after the 10th.:shrug:



I hope I'm wrong but typically when there is such a small amount of info around pre-launch it's because there isn't much exciting data around to leak.

More like it, but not this time, it seems.
I bet we won't end up saying "There's nothing special about it." this time; I mean, it's DX11 and all that (hype?). LOL.

flopper
09-08-2009, 07:26 AM
Who needs Viagra when ati is not letting any information slip between their lips about specs?
:D

Jowy Atreides
09-08-2009, 07:31 AM
Does anyone think that the 5850 will also be launched on Sept 10th?

And how long will that card be compared to a 2900XT?
I have read the 5870 will be a tad longer than a 2900, but screenshots of a 5850 made me think that card will be shorter.

Is anything known about that?

Yeah, I noticed a price drop today at a retailer on all 4-series cards.

The HD4890 is selling for 40% of the price of the CHEAPEST GTX 285 ...

jam2k
09-08-2009, 07:43 AM
Yeah, like the HD4800 series wasn't exciting at all oO

It was exciting, mostly because it forced nV to cut prices in half.

I wonder how many people paid $650 for the GTX280 :rofl:

XKaan
09-08-2009, 08:07 AM
Yeah right out of the gate the pricing was a bit absurd!

Hell Hound
09-08-2009, 08:09 AM
The 5870 will be enough for 1920x1200 for a long time.

Yeah right, what games do you play lol.:rofl:

Dainas
09-08-2009, 11:51 AM
Who needs Viagra when ati is not letting any information slip between their lips about specs?
:D

If there's one thing I've learned about the ATI-only crowd, it's that hope springs eternal. Always hoping for a product that smashes Nvidia the way the 9700 Pro did, then dancing like little schoolgirls just because it competes.

Rhys
09-08-2009, 12:15 PM
If there's one thing I've learned about the ATI-only crowd, it's that hope springs eternal. Always hoping for a product that smashes Nvidia the way the 9700 Pro did, then dancing like little schoolgirls just because it competes.

What???
From my understanding the current gen competes pretty darn well?
Guess you must be in the Nvidia-only crowd who cries like little schoolgirls when ATI/AMD competes.