i doubt that fugliness will fit in anybody's sig :ROTF:
First, I don't know how old you are, but it's definitely about time for you to grow up.
Sure, some people were critical and unhappy about the Fermi delays, but I don't remember anyone posting that they were happy about it, as you keep repeating in your post.
Any product being delayed or having higher production costs is definitely not good news for the consumer, so what is there to be happy about?
Anyway, let's wait for November 22 or something more official from AMD. Let's not forget this is Fuad news, the same Fuad from Fudzilla who was reporting Fermi as coming out just about every month since November 2009, when it was actually about 6 months later, in April 2010, that the first Fermi cards arrived. We're still waiting for the dual Fermi cards, which Fuad also reported would arrive in November 2009.
Let's also not forget NVIDIA CEO Jen-Hsun Huang introducing Fermi in October 2009 at the GPU Technology Conference:
"This, Ladies and Gentlemen, this one here is Fermi." He forgot to mention it was a fake with some wood screws.
http://img829.imageshack.us/img829/2908/jensenfermi.jpg
So of course people were disappointed about the many delays. I don't think anything close to that will happen with the HD 6900 cards.
It doesn't even make logical sense for the card to be at single-digit yields. I didn't even believe Fermi was in single digits, and Fermi is much larger and more complicated.
So somehow AMD, which delivered the entire Evergreen family plus Barts on time on 40nm, and which is on its 3rd generation of 40nm products, is having issues with chips much smaller than GF100/GF110... even Charlie couldn't write this story.
Sounds interesting. Way better than in games would be awesome... single GPU > previous dual GPU is awesome when one considers the previous dual GPU was far and away the top dog
come on guys quote the troll nevermore :p:
"Nonetheless, building a 3 billion transistor chip was never going to be easy, a fact clearly demonstrated Nvidia's GF100."
- so is he telling us this chip will be approximately the same size as the GTX 480?
I wonder how true the delay is. There would be a lot less credibility if Fud were the only source, but he wasn't the original source. If there are yield problems, what could they be a result of? Size would be one I could imagine, but how about the memory controller? Isn't 1.5 (or 6 effective) GHz crazy fast for a memory controller?
amd/ati's driver team was caught with their pants down; the AIBs have the cards there, just waiting on BIOS and drivers. apparently the drivers are bad.
AMD has greater transistor density than nvidia on TSMC's 40nm process; they can hit 3 billion transistors and still be below 500mm^2. However, I haven't a clue what kind of complications this could incur. If the GTX 580 is based on GF104 then it's probably below 3 billion transistors.
Edit: GF110 is 3 billion, GF100 is 3.2 billion
Or AMD is waiting to see what kind of performance the GTX 580 gives to finally adjust clocks/voltages and so on. From what can be seen on techPowerUp! it's going to have low clocks and voltages :ROTF:
Sadly the tpu "review" is a mess of drivers xD (compare the 580 preview with the HD6870 review numbers).
the tpu review is kinda messy, but things seem to point out that the gtx 580 is around 5970 performance, beating it in some games and losing in others, while mostly staying within +-5%
i've been reading this thread since page 1 -- actually, other cayman-related threads as well -- and i believe i haven't heard anything about crossfire scalability on caymans yet.
surely, that's something only reviewers and customers with the real cards at hand could attest to, but since this is a rumor and speculation thread ;) ...
in topics (cause i'm not very articulate :p:):
- what got me thinking is that the gtx580 will be tri-sli capable (not sure if this is a cap* or just common sense :confused:)
- but since we all know that despite poor scalability quad xfire is a reality on the cypress series, especially on the hd5970 (x2) -- plus amd will be releasing another dual-chip version (antilles)
- and i've read on hardwarecanucks that the barts cards offer "awesome" scalability in crossfire (x2)
from that i've got 2 questions:
has anyone heard of this sli limitation on the 580 series?
would that awesome xfire scalability be exclusive to the barts series (and so, more hardware-bound, so to speak), or has amd already tweaked the drivers in anticipation, extending its effects throughout the 6k series?
what i mean is this: suppose the 580s are capped at 3-way sli, and suppose the new catalyst installments are "awesome" and got rid of the bottlenecks that affected 4-gpu setups (which even yielded inferior performance against 3x 5870s in some game benchies)... couldn't we assume that the ultimate e-peen rig would be composed of 4x caymans (thus negating the possible 5-10% superior performance of 3x 580s)?
sorry, i'm all over the place and not sure if i got my point across. as you can see, lots of doubts ^^ (maybe i could summarize it by asking: any word/expectation on xfire for caymans yet?)
*my doubts are substantiated by this 'rock solid' argument: i saw that on a gtx580 galaxy box, written on a sticker, posted in the 580 thread (which adds to the vagueness of this limit -- and i also know for sure that quad sli is possible on the 480s)
ps it took me 2 years to get a login, another year to post this. so, my next post will be in 6 months :D back to lurking
edit to add: but then again we've got this new fud of driver issues :confused:
response to (@)nintendork -- cause i do not wish to surpass my 1-post count :D : "ultimate e-peen rig" + IF that tri-sli cap is true (for 580s) then antilles (in xfire, virtually x2, effectively x4) would definitely smoke them ...depending on the drivers, i guess. maybe =p
What would you run on 4 caymans? Just two of them with 2 gigs each must deliver enough for an eyefinity setup, especially with MLAA.
Barts HD6800 cards scale better than fermi in multi-gpu; cayman should be no different.
From my understanding nothing will stop you from using 4x gtx 580 in a setup, although you would need a separate 1200W power supply for that. The barrier also comes from drivers not being too efficient at distributing the load among 4 cards, and it's the same deal with 2x 5970, although less obvious in the latter. All gtx480 owners I've spoken with say the best possible setup is tri-sli and quad is a waste of money.
We don't know how Cayman will come out, but we can definitely expect it to at least close the gap vs the gtx580 better than the 5870 did vs the 480. So having tri 580 or tri 6970 would be a comparable setup, and possibly the same level as 2x Antilles, for the reasons given above.
btw if you wanna see a real epen system it doesnt get much better than this:
http://www.youtube.com/watch?v=-s9LM..._order&list=UL
Even from a pessimistic perspective, i don't buy the low-yield rumor. The chip has been in development for quite a long time and already taped out some time ago, and if there were a low-yield situation it would already have come out, not just leaked right when the product is about to hit the street. But the difficult driver development, that i can somewhat believe, because the chip will be a new architecture that won't be easy to extract performance from early on.
This damaged Nvidia more than the 59xx series cards ever did. As a long-time nvidia user I now think twice before buying anything made by them. I take this deception personally, along with the promises that were broken at the time. Mr. Can of ***** *** now has the contents of it all over his face. This is going to take Nvidia a long time to recover from at the pace they are going. Nvidia's driver support seems to be slipping while ATI's is improving. One can only hope that Intel buys nvidia and turns the name around, and then we might see competition again. Of course ATI could get lazy and let Nvidia catch up, much like AMD's CPU side let Intel catch up and pass them.
Anything and everything at 2560x1600!
I always thought he was presenting a carpenter's sample
http://i.ytimg.com/vi/ei2m-zPHEkg/0.jpg
as well as their new line of mobile-phones....
http://ixbtlabs.com/articles/siemensfactory/pic0.jpg
Come on AMD, you've seen the performance of the GTX 580, now set those clocks, write those BIOSes and drivers accordingly, and get shipping :D
All we need now is for Cayman to be faster than the 5970 by 5% and Antilles by 50%. Is that too much to ask?
word is some reviewer is adding some of AMD's comments to his GTX 580 review article. I have no idea why; this ought to be fun though. :D
His name is apoppin at Alienbabeltech.
This ought to be good. :up:
Quote:
i'm doing that now. i *got* to put in a few more unique images and i should have it up (minus some parts) within the hour. My least late under-NDA article to date.
i think it is better i am getting instead of improving.
- a local Irish country saying
i am adding AMD's reaction to the GTX 580 article from my interview with Stanley Ossias this morning at 6 AM
i need a nap
Why do you "need" Cayman to be 5% faster? The sentence "We just need Cayman to be 5% faster" sounds like it's begging to be followed by "... so that it convincingly beats the GTX 580. Any higher would be fine, but we need it to at least beat the GTX 580."
Rooting for a certain side are we? :yepp:
Strange there is nothing from AMD about this :confused:. I assume that they are going to make some sort of communication.
Of course not. Personally I think I'll probably be buying a Cayman and I hope it's incredibly fast. However, when you put it like "come on AMD we just need 5% faster than the 5970" it sounds too much like "... so that we can beat Nvidia". Or at least it sounded too much like that to my ears.
In a way I do want it to beat the gtx580, as it's been what.. 2 years since AMD last held the single-chip crown? I would find that really exciting. Although I'm slightly more inclined to buy radeons today, I was all over the 8800gt and 6600gt bandwagons long ago. I have seasonal preferences, but my loyalty is with my gaming and my pc.
ps: actually it might have been some 5 years now... last time was the X1950XTPE, no?
See, I was right ;)
Also I must add that I see nothing wrong with having a preference to a certain brand... as long as you don't wish slowness on your "rival" brand and just wish for blazing fast speeds on your preferred brand, there's nothing wrong. Not that there would be anything "wrong" if you wished for a slow Nvidia (like anyone's wishing would make it happen) - it would just be plain stupid :D
Nah, I would never wish for a slower product; that would be stupid. I need faster and faster from both camps, and perhaps a faster Cayman would also trigger more reactions from nvidia, such as price cuts and the fabled dual GF104.
I'm building a rig when bulldozer is out, likely in June 2011, and I will simply take the fastest of everything independent of brand (though I don't think I'm ready to pay $1000 for a processor, I would pay $500). So either SB or BD, nvidia or ati, they'd better impress me!
Yeah. :rolleyes:
Alien Bobble Tech
I guess that makes everything alright, doesn't it? :confused:
Quote:
Since we are using the fastest of the fast single-GPU video cards, it makes sense to test at the highest resolutions and with the most demanding settings. Since we are matching the top single-GPU video cards to each other in a performance showdown, we do not include the dual-GPU HD 5970 nor CrossFire nor SLI configurations.
I suppose he didn't get that AMD email either. Check the card prices... oh looky there, they are the same... the GTX580 AND the HD5970. What a shocker. :shakes:
Another bought-and-paid-for NV reviewer. It's like they think they owe something to their master or something. Just take the f%ckin card and test with a price-equivalent card. *cough* 5970 *cough*
You also have 2 NV OC cards in the review too. Where are the AMD OC cards? Hmm? Where, where? Yeah, nice fair review you got there. :rolleyes:
Don't give your BS excuses why you didn't. At least be honest and say that nVidia told me not to include the 5970... um.. cuz it would make our new card look over a year old. Well, duh! apoppin, you lose! :down:
Charlie has been as wrong as a dog writing an actuary exam. He said reviewers were going to be picked carefully and only reviewers that bent to Nvidia's will were going to get cards, yet pretty much every reviewer out there got one. Are they all paid off now?! The conspiracy.
I think you're taking it a bit far; in terms of releases from AMD, Kyle has been one of the most positive. I think he called out NV about something a few weeks ago.
If you look at the 5970 at the moment, especially at newegg, the quantities are not exactly overflowing: 1 brand and only 1 card (besides a $1200 edition).
If NV wanted to respond like AMD did (a dirty marketing move at that), they could say these prices are only temporary and the price will come back up, or that the 5970 won't be available for sale anymore.
I think stock of the 5970 is going to run really thin soon, and that's a good thing, and hopefully means cayman and antilles are coming.
The only place I see a $499 price is newegg. Provantage, Mwave and anywhere in Canada it is still $600 plus. If one wanted to be a conspiracy theorist like you, one could say AMD gave a special deal to newegg and sent a memo to gtx 580 reviewers saying see, the 5970 is $499, when in reality it is a singular deal for one card at one place.
http://www.mwave.com/mwave/dosearch_...atpromotename=
http://www.provantage.com/scripts/se...x=0&Submit.y=0
http://www.shopbot.ca/m/?m=5970
Low 5970 stock is most likely because it's about to go EOL... after all, it's 2x Cypress, so it's even harder to come by
i hope no one buys a 5970; it just doesn't make sense anymore.
either go with a pair of 6870s or buy a 580.
sure, the 5970 might be a little faster than a 6870 x2 once overclocked, but it would come with a much louder fan.
2x 6850s are a better choice considering that they clock like hell and are barely slower than the 6870 once overclocked (barts is heavily bandwidth-starved, and the additional SPs on the 6870 don't do much once both cards are clocked above 1GHz core)
i hope AMD settles on some insane clocks on the mem
People have to realize that interviews given about upcoming products will ALWAYS end up that way. I'm not sure what everyone was expecting.
At this point in time, AMD is one of two things: very worried or supremely confident. Either one of those things hinges on the public performance of the GTX 580 in relation to their internal numbers on Cayman XT / Pro. They will not even so much as mention which way their thought process is leaning due to the market implications any statement may have.
Think about it:
"Cayman will Rick Roll all over the GTX 580" : NVIDIA releases a higher clocked GTX 580, which they seem more than capable of doing.
"Cayman may have issues with GTX 580": Investors lose confidence.
So AMD (much like NVIDIA) won't say anything because it isn't in their best interest to do so. That's a good way to go about things in my books.
I can't say I agree with the 1st point though. Nvidia is well known for overclocking their cards. We've seen it with the 8800 series, 9800 series, the 200 series and the 400 series enough to know that it will eventually happen with the 500 series. Nvidia has officially released their refresh 1st this go-around. I can't see a reason why AMD shouldn't counter the release if they have just-as-good/better card(s) :shrug:.
Price segment wise, 5870's refresh should be the 6970, not 6870.
I like AMD's approach though. If someone accuses you of misleading people, tell them 6870 is the new version of 5770, not 5870. If someone says Nvidia refreshed their GPU in 6 months whereas you still haven't done so after a year, tell them that 6870 is a refresh of 5870. :D
Who said anything about the 6800 :confused:? I'm referring to the enthusiast end. But it does put into perspective one's opinion based on their perception of current events, regardless of whether it's right or wrong, doesn't it? In any case, the only real clue I got out of it is that they may have known about the 580's performance before release. That may or may not be true though.
When is that exactly? Did they ever release a launch date?
AMD needs to come out with some tidbit of news to keep my saliva running :p
looking for a black-ops bundle! hurry hehe
My point wasn't focused on the 6800 nor the 580 but to say that we should at least get some sort of information about the 6900 series. For example:
-an official release date
-some possible performance leaks
-some information on what features 6900 may provide
-etc
Even if AMD doesn't want to counter the 580 with benches of their own they can at least show the community something as to what one can expect and when to expect it.
Why do you think you are entitled to information about an unreleased product? lol...
Entitled? I don't think that term is used correctly. Or are you baiting me? If a company is suggesting they have a product that may be of interest, I think some basic information should be made available about it. Especially when their competitor has released their version already.
Could be, if company A is scared of company B they let the marketing guys loose in the hope of doing some damage control, this pattern held true for R600 and GF100.
When a company is mostly silent, chances are, they are confident the item will sell itself. When they shout from the roof tops, that product needs marketing support.
Thinking about it, PC hardware is probably one of the few areas when quality will sell itself. Cars, food, clothing will almost always come down to taste, but with PC hardware the superior product sells.
I bet you would only be happy with performance numbers, as there is a lot of other info available: VLIW4, 32 ROPs, 128 Z/Stencils, TDP under 300W and under 225W, more than 1.6K shaders, 2GB GDDR5, UVD3, 3 tessellators.
The VLIW4 shaders take less space and perform a little worse individually, but the saved space means the same area can hold more shaders than a VLIW5 config.
The only thing left is the exact shader count, and thus performance.
I don't know; Nvidia has laid their cards on the table, so now would probably be the most opportune time to steal the 580's thunder by showing something about their new product.
To me it seems the best time for an announcement would be right on the heels of the 580 release.
I don't think that would be in their best interest. This industry is all about price and market protection and releasing details about a product too far in advance gives the competition that much more time to formulate a response.
Then again, if they release info that proves an upcoming card WON'T perform in-line with the competition, it gives an opening for rival marketing gurus to pick the product apart before it even hits the market.
This game is all about NVIDIA and AMD keeping each other guessing.
http://blogs.amd.com/work/fadcodenames/
Says Antilles is Q1 2011.
Seems on track.
Quote:
“Cayman”
Market: Discrete GPUs
What is it? Second-generation DirectX® 11-capable GPU to launch in the “Northern Islands” family, will be branded AMD Radeon™ HD 69XX graphics processors.
Planned for introduction: Q4 2010
From here :).
Well, that's slipped. :(
http://images.hardwarecanucks.com/im...HD6800-202.jpg
The X2 cards have always come a few months after, right? I'm trying to remember how long the 4870x2 and 5970 took after the initial releases.
Cayman is still on track though. In all honesty I think AMD is just giving Nvidia their 15 minutes. Cayman is probably stronger than the 580 and the 5970 by a good margin.
http://blogs.amd.com/work/fadcodenames/
“Cayman”
Market: Discrete GPUs
What is it? Second-generation DirectX® 11-capable GPU to launch in the “Northern Islands” family, will be branded AMD Radeon™ HD 69XX graphics processors.
Planned for introduction: Q4 2010
“Antilles”
Market: Discrete GPUs
What is it? AMD Radeon™ HD 6000 Series graphics card for ultra-enthusiasts that will feature two GPUs on one board.
Planned for introduction: Q1 2011
From Charlie with love:
Do I sense harshness in his tone?
Quote:
Then it was on to the demos. AMD wowed the audience with a demo of a Cayman equipped 8 core Bulldozer playing an HD video while running MS's CPU load meters. At the same time! No, really, they pulled it off! Wow!
Oh wait, that has been doable for several years now. With a 1 core machine armed with an IGP. Call me unimpressed, especially since they showed a Llano doing 5x that at AMD's TFE conference 3 weeks ago. Then again, it was new running silicon.
And more:
Quote:
The short story is that Cayman Pro is called HD6950, Cayman XT is HD6970, and Antilles, the dual Cayman, is called HD6990. We hear three people not paying attention were actually surprised.
seems like my gpu upgrade will come in 2011 :D
Antilles slips to 2011; does that mean AMD is confident Cayman will be good enough to compete with GF110 by itself at launch? :D
Well, Newegg is the place to get them then. I do see many other places a little higher at ~$549.99. A lot at these prices, a LOT. (See Here) Ok, so let's say Newegg is the start of the 5970 deals. I think Newegg sets the trends in a lot of these cases.
Also, why only compare the AMD $300.00 card to the NV $500.00+ card? I can say the same about the 580: it retails at $499.00, but is still selling for $535.00+ in many places.
Don't you think if they were going to compare cards with a $200.00 difference, they could at least have compared a card with a $0 ~ $50.00 difference???? :confused:
This is why I question their method of testing.
Don't get me wrong, the 580 is a great card NOW. It just should have been here at least 6 months ago, maybe a year.
And just because a site says something negative about a company's card doesn't mean they're now "impartial". It just means that the card company is not looking over their shoulder at the moment. Right now NV definitely has their eye on [H]. I mean, look at the reviewers who DIDN'T get cards. They may have done some things NV didn't like and are now on their "list". :yepp:
So that's the crux: [H] compares cards with a $200.00 price difference, but not $50.00. Is that a fair review? If you can say yes with a straight face then.. :down:
He was comparing single-GPU performance. It's not a good comparison in terms of money spent, but it is a good way of showing, if money is not as serious a concern, how much performance you can get from a single-GPU setup.
[H]'s complaint has revolved around the 480 being too expensive while remaining too close to the 5870 in performance. For the time being the 580 will correct that flagrant faux pas, until Cayman comes out.
Newegg's prices on the GTX 580 have spiked to about $459 :(
So 6970 will be slower than 5970?
I would say no, the 6970 should be faster. Read the wording below:
“Barts”
Market: Discrete GPUs
What is it? AMD Radeon™ HD 6800 series GPUs featuring AMD’s second-generation Microsoft DirectX® 11-capable architecture, best-in-class energy efficiency, and a feature set including AMD Eyefinity multi-display technology.
Introduced: 2010
“Cayman”
Market: Discrete GPUs
What is it? Second-generation DirectX® 11-capable GPU to launch in the “Northern Islands” family, will be branded AMD Radeon™ HD 69XX graphics processors.
Planned for introduction: Q4 2010
“Antilles”
Market: Discrete GPUs
What is it? AMD Radeon™ HD 6000 Series graphics card for ultra-enthusiasts that will feature two GPUs on one board.
Planned for introduction: Q1 2011
But why try to read into words when the slide they made very clearly indicates that the 6970 is going to be one class slower than the 5970?
I think you're not on the same page with him. I think he means this:
http://vipeax.nl/slide.png
If you draw it like this, it comes down to:
6990 > 5970 > 6970 > 6950 > 5870 > 6870 > 5850 > 6850 > 5770.
We know that this part is correct: 5870 > 6870 > 5850 > 6850 > 5770.
It's quite logical that the rest is sorted like this too.
The difference between the 5870 and the 6970 is damn huge compared to 6970 -> 6990 / 5970 -> 6990 :(.
Yep. I took it one step further in nerdiness and counted the actual distance in pixels between the placements. The bottoms of the 5850 box and the 5870 box have a 50-pixel vertical difference, whereas 5870 to 5970 is roughly 135. We know for a fact that the HD5870 is around 17 percent faster than the HD5850, and the HD5970 is around 55 percent faster than the HD5850. Doing the math, I can definitely say that the placements of the boxes have a very good level of precision.
So why is 6990 only slightly higher than 5970? Two possible reasons:
1. There was simply not enough space on the slide and it had to be compressed
2. This is something I have been fearing: AMD will actually respect the 300W TDP limit, which would put the HD6990 and HD5970 at the same TDP. That means any increase in performance has to come solely from improvements in performance/watt, which cannot be so great given both products are produced on the same node, 40nm. So the 5970's and 6990's performance won't be much different.
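For the curious, the pixel-counting reading above can be sketched in a few lines of Python. The pixel gaps (50 px, 135 px) and the +17%/+55% baselines are the poster's own measurements, not official AMD data, and the linear scaling is just one possible way to read the slide:

```python
# Back-of-envelope sketch of the slide pixel-counting above.
# All inputs are the poster's measurements/assumptions, not AMD data.

def implied_speedup(pixels, ref_pixels=50.0, ref_speedup=0.17):
    """Linearly map a vertical pixel gap on the slide to a performance
    delta, calibrated on the 5850 -> 5870 gap (50 px ~ +17%)."""
    return pixels / ref_pixels * ref_speedup

# 5850 -> 5970 gap measured at ~135 px:
print(round(implied_speedup(135), 2))  # prints 0.46, i.e. ~46% faster
```

A plain linear reading lands a bit under the quoted ~55%, so how precise the slide really is depends on how you calibrate the scale.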
Seeing as the 6xxx cards seem to overclock a bit better than the 5xxx, it's not improbable that factory-oc'ed 6990s could be more than just 15-20% faster than the current 5970.
It would be a good time for amd to allow 5xxx cards to be xfired with 6xxx cards. If the upgrade in performance is minimal, that would be an enticing way to get users to buy the new cards AND gain a lot more performance ;)
6990 xfired with 5970...... I'd be all in on that one
If the 6970 is 225W and the 6990 has to be 300W, this means an incredible amount of crippling has to be done to two 6970s. And since lowering the frequency of a GPU doesn't magically decrease its cost to you, this would mean AMD selling a card that, at stock speeds, wouldn't be much faster than a HD5970 (or even a single 6970), yet would be sold at something like $700. Everything is performance/watt when it comes to a TDP limit.
Yeah, I know, AMD will "beg us to overclock" the 6990 too, and by "overclocking" them to stock 6970 speeds we'd get a stupidly powerful graphics card; but the fact is that every review site will base its review on the stock speed and will add "Overclocking" as merely another page in the 10-something-page review. It will no doubt make the card look bad in performance and price.
So I don't think that would be very feasible. To me, with Antilles AMD has got two options:
1. Disregard 300W TDP and let the card overwhelmingly break single card performance and power usage records.
2. Build Antilles with 2x 6870 instead of 2x 6970. 2x 6870 is already 12% faster than the HD5970. The 6870 is already 150W; doubling it on one card saves one PCB's worth of wattage, around 20W, which they can use to increase the clocks slightly :D to fill it out to 300W and get around a 15% performance difference from the HD5970. The card would cost less than $500, but on the downside it wouldn't have the (possible) tessellation and DX11 tweaks the new Cayman architecture has.
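For what it's worth, the option-2 wattage math above works out as a quick sketch. All figures (150W per Barts board, ~20W saved by sharing one PCB, the 300W cap) are assumptions from the post, not measured values:

```python
# Back-of-envelope check of the "2x Barts Antilles" option above.
# All figures are the poster's assumptions, not measured values.

TDP_CAP = 300      # single-card power limit, in watts
BARTS_TDP = 150    # assumed HD6870 board power
PCB_SAVING = 20    # assumed saving from one shared PCB/VRM

budget = TDP_CAP - (2 * BARTS_TDP - PCB_SAVING)
print(budget)  # prints 20, the watts of headroom left for a clock bump
```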
You guys are reading WAAYYYYYY too much into PR slides
They have those funny graphs that start at 0.8 for performance slides, why would they put them exactly in a layout like that?
I mean, reading into pixels? If so, then why is the 6850 touching the 5770 when it's obvious by now the 6850 is faster?
And for that matter, why is the 5970 so much higher than the 5870 when the 5870 should be quite big compared to the 5770?
Besides, they always make their dual GPU cards look much better than in actual performance... wouldn't want to make your own drivers look bad :rofl:
Also, on the flip side, look at the 6950... it's placed higher than the 5870, so if the 6970 is 17% faster than the 6950 (like the 5870 is to the 5850), and we say the 6950 is even a conservative 10% faster than the 5870, the 6970 is going to be close to the 580 and probably ~ the 5970 depending on the scenario. But then I'd be reading way too much into this slide like everyone else, so that ought to prove to you how ridiculous it is to use this to guess actual performance ;)
It's pointless to argue this - the drivers already put Antilles as 2xCayman, so using Barts is a non-sequitur
Furthermore, the 5970 was already theoretically crippled: it was 2x Cypress XT cores clocked at Cypress PRO speeds. AMD can take it a step further and make it 2x Cayman PRO cores @ Cayman PRO clocks, with Cayman PRO supposed to be < 225W (my guess is in the 175-200W envelope), which would fit an X2 configuration.
That would actually leave room for AMD to keep the higher-binned XT cores that reach higher clocks, and thus make the 6970 stand out even further, while the ones that couldn't reach such clocks get clocked lower / have units disabled and go into the PRO and X2 configs
Uh, the 6850 is touching it, but is positioned higher. The difference between the 5770 and 5870 is the same as 5870 to 5970, and it's not about % performance. It's about placement....