I decided that I'm better off with a second 6870 right now, as I won't be buying a 30" monitor in the next 4-5 months!
I'm sorry, but that doesn't make sense. If the GTX 560 is to beat the 6950 by a large margin, that would put it in GTX 570 territory...
And also, the reference cooler is NOT loud, and is NOT hot. How about trying the cards out for yourself before spouting regurgitated nonsense. :shakes:
/end rant :)
Sorry, I should have said significant, not large.
I've tried ATI reference coolers enough times - every generation from the X1900 XTX to the 5770. My reference 5770s were louder and hotter than my MSI GTX 460s, bearing in mind that I heavily OC my graphics cards. My current GTX 460s are completely silent when clocked to 900 MHz - no reference cooler, whether ATI or Nvidia, has ever been silent at max OC with acceptable temperatures for me.
The ATI reference coolers may be fine at stock speeds, but once you try to overclock them to their max, they either sound like hairdryers to keep acceptable temperatures, or the temperatures run far too high with them running silent.
What I find even less impressive about the 6900s is how insignificantly better they are than the previous 5800 range.
I play at 1920x1200, and 1 GB is plenty for that. You should never compare video cards by the amount of memory on them. 2 GB isn't going to become a standard requirement for games for a very long time yet, and when it is, you will be able to buy £150 cards with that much RAM.
I agree, I was disappointed too. I remember saying in the rumor thread that if the 6970 is any less powerful than a GTX 580 it would be a fail, and wouldn't make sense. Look what happened... the only thing keeping it from falling into the fail category is the price.
Wasn't the GTX 580 only 20% more powerful than its predecessor as well? Although, AMD has had much longer to produce something better. But overall they have made great progress in tessellation performance and both companies have fixed what needed fixing with their new series (while still being stuck on 40nm).
That makes the statement even worse :confused:

Quote:
Originally Posted by bhavv
Regarding the cooler, you're comparing the past now. Weren't we talking about the cooling performance of the 6900 series? Which is fantastic even when compared to the great cooling on the 5xx series.
This page of Anandtech's article is in line with everything I've found with my card so far (except for power consumption, which I have no way of testing :rofl:):
http://www.anandtech.com/show/4061/a...eon-hd-6950/24
All I asked for was a link, or where you got the info from, since your post was hard to believe.
I Googled "TSMC 32nm canceled" and got many articles about the cancellation, starting with our friend Charlie's rumors. :) Still a good read.
I could not find anything saying the TSMC 32nm cancellation was because of AMD canceling first, nor could I find that TSMC was ready with the 32nm process as you said. Finally I found the AnandTech article, which I believe describes the situation best. Looks to me like Mr. Skynner was not "selectively truth telling".

Quote:
"Contrary to popular belief, TSMC didn't have issues with 32nm."
Does the above sound like "TSMC didn't have issues with 32nm"?

Quote:
With the launch of the Barts GPU and the 6800 series, we touched on the fact that AMD was counting on the 32nm process to give them a half-node shrink to take them in to 2011. When TSMC fell behind schedule on the 40nm process, and then the 32nm process before canceling it outright, AMD had to start moving on plans for a new generation of 40nm products instead.
One more time: does the above sound like "TSMC didn't have issues with 32nm"?

Quote:
The 32nm predecessor of Barts was among the earlier projects to be sent to 40nm. This was due to the fact that before 32nm was even canceled, TSMC’s pricing was going to make 32nm more expensive per transistor than 40nm, a problem for a mid-range part where AMD has specific margins they’d like to hit. Had Barts been made on the 32nm process as projected, it would have been more expensive to make than on the 40nm process
The bottom line is, if there was no issue with the 32nm process, Cayman would be a little beast; actually, the same applies to Barts.

Quote:
Cayman on the other hand was going to be a high-end part. Certainly being uneconomical is undesirable, but high-end parts carry high margins, especially if they can be sold in the professional market as compute products (just ask NVIDIA). As such, while Barts went to 40nm, Cayman’s predecessor stayed on the 32nm process until the very end. The Cayman team did begin planning to move back to 40nm before TSMC officially canceled the 32nm process, but if AMD had a choice at the time they would have rather had Cayman on the 32nm process.
Talking about issues, there are a few rumors that TSMC also has problems with the 28nm process and that there are going to be some delays.
I just purchased a 6950 to replace my nVidia 7900GT. Think I'll see any difference? :p
That's untrue. We're talking about economies of scale when it comes to the 32nm manufacturing process. There wasn't a large client to pick it up so it was dropped in favor of concentrating on 28nm.
As I have said numerous times already: AMD realized that 32nm wouldn't bring them any benefits in terms of power savings or cost offsets for their mid- and lower-end cards, so they decided to port them over to 40nm instead. That left Ibiza dangling at the top end, but without large volumes running through their foundries on the 32nm process, TSMC decided to drop the process altogether. This is also why we didn't see Cayman until December of this year.
I'm not saying TSMC wouldn't have had issues with manufacturing. Rather, they weren't given the chance to actually run into any of the pitfalls since designs were stopped before volume production commenced.
I don't need links or anything else to back this up since I was told it first-hand.
Basically what you're arguing is which came first: the chicken or the egg. I mean naturally TSMC was behind on the 32nm process but that didn't mean they COULDN'T produce a lineup of products based off of it.
I can say with certainty that I will be bothered by the noise levels of my 6870 when spring comes around, never mind summer. I'm just talking about noise levels at stock speeds. I can't imagine a similar cooler on something that draws more juice than my old, power-hungry GTX 280. From what I saw, both the 4870 and 4870x2 stock coolers were unacceptably loud. It would be nice if AMD adopted a better cooling solution for their reference cards.
...I don't understand this talk about loud/inefficient cooling on any Nvidia or ATI gfx card :shrug:
...are these the HardOCP forums...?
...do you guys run your quad core cpus with stock coolers spinning at 4000rpm?:confused:
...do you know that you can actually remove the default crap cooler, get a decent aftermarket cooler, put 2x 120mm fans on it spinning at 1000/1200rpm :up: and reduce your load temps by as much as 30 degrees while staying silent :yepp:
are we still xtremesystems.org/forums:confused:
btw, "no one needs 2 gigs of video RAM" sounds awfully like "no one will ever need more than 640k of system RAM" :rofl:
Show me one that will do a halfway decent job with the memory and VRMs. There are components other than the GPU on the PCB. :yepp: You also have to keep in mind that the extra money spent on aftermarket cooling for a 5970 can buy you a GTX 570 with a nice vapor chamber cooler.
I was going to buy a 6970... assuming it would be $450 and as fast as a GTX 580, or close to that.
I am not impressed by either set of cards. I have a 4870x2 in my websurfing rig and 280 SLI sitting on my desk collecting dust. I do not even own a game that will outpace either set of those cards, so why pay $500 to upgrade when I won't need it? I do not think either of these cards has the staying power the 8800 GTX did. I am the kid that spent almost $700 shipped on launch day for my 8800 GTX. I won't buy any of these cards, neither the GTX 570/580 nor the 6900 series.
4850 is really loud.
6950 and 6970 are fine. Might be a little bit louder than 5870, but still pretty good.
When I had my 4850, it was only noticeable past 55% fan speed, and it only went past that when I tested FurMark, which kept it under 90C using custom fan settings in Afterburner.
It's not that hard to keep fans quiet: just reseat the cooler with proper thermal paste, keep the dust out, and use a custom fan speed configuration. Besides, you're not really going to notice how noisy it is while gaming anyway.
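For anyone wondering what a "custom fan speed configuration" actually is: it's just a piecewise-linear map from GPU temperature to fan duty cycle, like the curve you drag around in Afterburner's fan tab. A minimal sketch of the idea (the breakpoints below are made-up illustration values, not Afterburner's defaults or its config format):

```python
# A fan curve maps GPU temperature (degrees C) to fan duty cycle (%).
# Breakpoints are hypothetical examples: quiet below 40C, full speed at 85C.
def fan_speed(temp_c, curve=((40, 30), (60, 45), (75, 70), (85, 100))):
    """Return the fan duty cycle (%) for a given temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # Floor speed below the first breakpoint.
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linear interpolation between neighbouring breakpoints.
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # Pin at max speed past the last breakpoint.
```

The whole trick to "quiet at idle, cool under load" is just picking a low floor speed and a steep ramp near the temperature you refuse to exceed.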
What's so untrue about it? You did not read my reply completely, or decided to reply selectively. Don't you believe the HD 6000 series would have ended up faster on the 32nm process if TSMC could have delivered it? I mean delivered it at a reasonable price, talking mainly about the 6800 series.
You still believe there were no issues with the 32nm process? As you originally said:
"Contrary to popular belief, TSMC didn't have issues with 32nm."
I do not know about your "first-hand" info, but I could not find any article to confirm the above "no issues" claim.
Anyway, my first reply was to this post of yours, which sounded to me as if it was AMD's fault that TSMC cancelled the 32nm process.
Or at least it was a little misleading.
Sorry to say so, but I believe the AnandTech article described the situation much better. Of course, I am comparing it to the first post of yours that I replied to.
I do not have any desire to talk about "the chicken or the egg" but since you put it this way, I believe the TSMC problem with the 32nm came first.
One more time I Googled "TSMC 32nm cancel"; looks like the "no issues" part is top secret.
This is my last post about the subject.
I have to upgrade my aging dualie too. I was going after the HD 6970, but given its local price tag, which equals an amount of money that would buy roughly $640 at the exchange office,
I think the 6950 for $490 is a more viable choice :D
I guess the performance improvement of the 6970 over the 6950 isn't significant enough to justify the price gap, plus there's the increased power consumption?
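A quick performance-per-dollar check makes the point. The prices are the local ones I quoted above; the ~10% performance gap between the 6950 and 6970 is my rough assumption for illustration, not a benchmarked figure:

```python
# Local prices quoted above (in USD equivalent at the exchange office).
price_6950, price_6970 = 490, 640
# Relative performance (6950 = 1.00); the ~10% gap is an assumed figure.
perf_6950, perf_6970 = 1.00, 1.10

value_6950 = perf_6950 / price_6950   # performance per dollar
value_6970 = perf_6970 / price_6970

price_premium = price_6970 / price_6950 - 1   # ~31% more money
perf_premium = perf_6970 / perf_6950 - 1      # ~10% more performance
print(f"price premium: {price_premium:.0%}, perf premium: {perf_premium:.0%}")
```

At these local prices the 6970 costs roughly 31% more for maybe 10% more performance, so the 6950 wins on performance per dollar by a wide margin.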
I see some US citizens :banana::banana::banana::banana::banana:ing about US price tags. Think about this when you are about to do that again, while your average salaries far exceed my own :)
So the prices are due to rise after New Year?