I decided that I'm better off with a second 6870 right now, as I won't be buying a 30" monitor in the next 4-5 months!
I'm sorry, but that doesn't make sense. If the GTX 560 is to beat the 6950 by a large margin, then that would put it in GTX 570 territory...
And also, the reference cooler is NOT loud, and is NOT hot. How about trying the cards out for yourself before spouting regurgitated nonsense? :shakes:
/end rant :)
Sorry, I should have said significant, not large.
I've tried ATI reference coolers enough times - every generation from the X1900 XTX to the 5770. My reference 5770s were louder and hotter than my MSI GTX 460s, considering that I heavily OC my graphics cards. My current GTX 460s are completely silent when clocked to 900 MHz - no reference cooler, whether ATI or Nvidia, has ever been silent at max OC with acceptable temperatures for me.
The ATI reference coolers may be fine at stock speeds, but once you try to overclock them to their max, they either sound like hairdryers to keep acceptable temperatures, or the temperatures run far too high with them running silent.
What I find even less impressive about the 6900s is how insignificantly better they are than the previous 5800 range.
I play at 1920x1200, and 1 GB is plenty for that. You should never compare video cards by the amount of memory on them. 2 GB isn't going to become a standard requirement for games for a very long time yet, and when it is you will be able to buy £150 cards with that much RAM.
I agree, I was disappointed too. I remember saying in the rumor thread that if the 6970 was any less powerful than a GTX 580 it would be a fail, and wouldn't make sense. Look what happened... the only thing keeping it from falling into the fail category is the price.
Wasn't the GTX 580 only 20% more powerful than its predecessor as well? Although, AMD has had much longer to produce something better. But overall they have made great progress in tessellation performance and both companies have fixed what needed fixing with their new series (while still being stuck on 40nm).
That makes the statement even worse :confused:
Quote:
Originally Posted by bhavv
Regarding the cooler, you're comparing the past now. Weren't we talking about the cooling performance of the 6900 series? Which is fantastic, even compared to the great cooling on the 5xx series.
This page of AnandTech's article is in line with everything I've found with my card so far (except for power consumption, which I have no way of testing :rofl:).
http://www.anandtech.com/show/4061/a...eon-hd-6950/24
All I asked for was a link or the source of your info, since your post was hard to believe.
I Googled "TSMC 32nm canceled" and got many articles about the cancellation, starting with our friend Charlie's rumors. :) Still a good read.
I could not find anything saying TSMC canceled 32nm because AMD canceled first, just as I could not find that TSMC was ready with the 32nm process, as you said. Finally I found the AnandTech article, which I believe describes the situation best. Looks to me that Mr. Skynner was not "selective truth telling".
Quote:
"Contrary to popular belief, TSMC didn't have issues with 32nm."
Does the above sound like "TSMC didn't have issues with 32nm"?
Quote:
With the launch of the Barts GPU and the 6800 series, we touched on the fact that AMD was counting on the 32nm process to give them a half-node shrink to take them in to 2011. When TSMC fell behind schedule on the 40nm process, and then the 32nm process before canceling it outright, AMD had to start moving on plans for a new generation of 40nm products instead.
One more time: does the above sound like "TSMC didn't have issues with 32nm"?
Quote:
The 32nm predecessor of Barts was among the earlier projects to be sent to 40nm. This was due to the fact that before 32nm was even canceled, TSMC’s pricing was going to make 32nm more expensive per transistor than 40nm, a problem for a mid-range part where AMD has specific margins they’d like to hit. Had Barts been made on the 32nm process as projected, it would have been more expensive to make than on the 40nm process
The bottom line is, if there was no issue with the 32nm process, Cayman would be a little beast; actually the same applies to Barts.
Quote:
Cayman on the other hand was going to be a high-end part. Certainly being uneconomical is undesirable, but high-end parts carry high margins, especially if they can be sold in the professional market as compute products (just ask NVIDIA). As such, while Barts went to 40nm, Cayman’s predecessor stayed on the 32nm process until the very end. The Cayman team did begin planning to move back to 40nm before TSMC officially canceled the 32nm process, but if AMD had a choice at the time they would have rather had Cayman on the 32nm process.
Talking about issues, there are a few rumors that TSMC also has problems with its 28nm process and that there are going to be some delays.
I just purchased a 6950 to replace my nVidia 7900GT. Think I'll see any difference? :p
That's untrue. We're talking about economies of scale when it comes to the 32nm manufacturing process. There wasn't a large client to pick it up so it was dropped in favor of concentrating on 28nm.
As I have said numerous times already: AMD realized that 32nm wouldn't bring them any benefits in terms of power savings or cost offsets for their mid and lower end cards, so they decided to port those over to 40nm instead. That left Ibiza dangling at the top end, but without large volumes running through their foundries on the 32nm process, TSMC decided to drop the process altogether. This is also why we didn't see Cayman until December of this year.
I'm not saying TSMC wouldn't have had issues with manufacturing. Rather, they weren't given the chance to actually run into any of the pitfalls since designs were stopped before volume production commenced.
I don't need links or anything else to back this up since I was told it first-hand.
Basically what you're arguing is which came first: the chicken or the egg. I mean, naturally TSMC was behind on the 32nm process, but that didn't mean they COULDN'T produce a lineup of products based on it.
I can say with certainty that I will be bothered by the noise levels of my 6870 when spring comes around, never mind summer, and I'm just talking about noise levels at stock speeds. I can't imagine a similar cooler on something that draws more juice than my old, power-hungry GTX 280. From what I saw, both the 4870 and 4870X2 stock coolers were unacceptably loud. It would be nice if AMD adopted a better cooling solution for their reference cards.
...I don't understand this talk about loud/inefficient cooling on any Nvidia or ATI gfx card:shrug:
...are these the HardOCP forums...?
...do you guys run your quad core CPUs with stock coolers spinning at 4000rpm?:confused:
...do you know that you can actually remove the default crap cooler, get a decent aftermarket cooler, put on 2x 120mm fans spinning at 1000/1200rpm:up: and reduce your load temps by as much as 30 degrees while staying silent?:yepp:
are we still xtremesystems.org/forums:confused:
btw "no one needs 2 gigs of video RAM" sounds awfully like "no one will ever need more than 640K of system RAM":rofl:
Show me one that will do a halfway decent job with the memory and VRMs. There are components other than the GPU on the PCB. :yepp: You also have to keep in mind that the extra money spent on aftermarket cooling for a 5970 can buy you a GTX 570 with a nice vapor chamber cooler.
I was going to buy a 6970... assuming it was $450 and as fast as a GTX 580, or close to that.
I am not impressed by either set of cards. I have a 4870X2 in my websurfing rig and 280 SLI sitting on my desk collecting dust. I do not even own a game that will outpace either set of those cards. Why pay $500 to upgrade when I won't need it? I do not think any of these cards has the staying power the 8800 GTX did. I am the kid that spent almost $700 shipped on launch day for my 8800 GTX. I won't buy any of these cards, neither the GTX 570/580 nor the 6900 series.
The 4850 is really loud.
The 6950 and 6970 are fine. They might be a little louder than the 5870, but still pretty good.
When I had my 4850 it was only noticeable past 55% fan speed, and it only went past that when I tested FurMark, which kept it under 90C with custom fan settings in Afterburner.
It's not that hard to keep fans quiet: reseat with proper thermal paste, keep the dust out, and use custom fan speed configurations. Besides, you're not really going to notice how noisy it is while gaming anyway.
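A custom fan profile is just a piecewise-linear map from GPU temperature to fan duty, by the way; here's a minimal sketch of the idea (the curve points are hypothetical, not Afterburner's actual defaults):
Code:
# Minimal sketch of a custom GPU fan curve: piecewise-linear
# interpolation from temperature (C) to fan duty (%).
# The curve points below are hypothetical, not Afterburner's defaults.
CURVE = [(40, 30), (55, 40), (70, 55), (80, 75), (90, 100)]

def fan_speed(temp_c):
    """Return fan duty (%) for a given GPU temperature (C)."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # interpolate between the two surrounding curve points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at 100% past the last point

print(fan_speed(60))  # 45.0 - quiet at typical gaming temps
print(fan_speed(88))  # 95.0 - ramps hard only near the 90C mark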
What's so untrue about it? You did not read my reply completely, or you decided to reply selectively. You don't believe the HD 6000 series would have ended up faster on the 32nm process if TSMC could have delivered it? I mean delivered it at a reasonable price, talking mainly about the 6800 series.
You still believe there were no issues with the 32nm process? As you originally said:
"Contrary to popular belief, TSMC didn't have issues with 32nm."
I do not know about your "first-hand" info, but I could not find any article to confirm the above "no issues".
Anyway, my first reply was to this post of yours, which sounded to me as if it was AMD's fault that TSMC cancelled the 32nm process.
Or at least it was a little misleading.
Sorry to say so, but I believe the AnandTech article described the situation much better. Of course I am comparing it to the first post of yours that I replied to.
I do not have any desire to talk about "the chicken or the egg", but since you put it this way, I believe the TSMC problem with 32nm came first.
One more time I Googled "TSMC 32nm cancel"; looks like the "no issues" claim is top secret.
This is my last post about the subject.
I have to upgrade my aging dualie too. I was going after the HD 6970, but given its local price tag, which equals roughly $640 at the exchange office,
I think the 6950 for $490 is a more viable choice :D
I guess the performance improvement of the 6970 over the 6950 isn't big enough to justify the price gap, plus there's the increased power consumption?
I see some US citizens complaining about US price tags. Think about this when you are about to do that again, while your average salaries far exceed my own :)
So the prices are due to rise after New Year?
So disappointed. Craptastic. I shouldn't have sold my 5870.
I wanted to get 2x 6850 for two PCs and eventually run them in Crossfire, however I can only get the cards from the UK. Check out these prices:
The 5870 = $300 is way better for me (@1680x1050) than the 6870 = $312, the 6950 = $342, or the 6970 = $444
http://www.overclockers.co.uk/showpr...56&subcat=1502
http://www.overclockers.co.uk/showpr...56&subcat=1866
http://www.overclockers.co.uk/showpr...56&subcat=1752
I really don't see where people are getting this 30% improvement of the 6970 vs the 5870!? In the main games I like, F1 2010, there is as little as 7% between them, DiRT 2 about 2%, Crysis 9%.
http://www.techpowerup.com/reviews/H...D_6970/15.html
http://www.guru3d.com/article/radeon...irex-review/11
http://www.techpowerup.com/reviews/H...D_6970/12.html
One or two games like Metro 2033 maybe 25%, but that's just one game... the rest of the games and the $140-odd difference make it an easy choice, yes? Do I buy 2x 5870?
btw I checked my old lists, the 5870 used to sell for $512!!
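If anyone wants to sanity-check those percentages, the per-game gap is just (fast - slow) / slow. A quick sketch with placeholder FPS numbers (illustrative only, not taken from any review):
Code:
# Per-game percentage gap between two cards.
# FPS numbers are placeholders, not from any review.
fps = {
    "F1 2010":    (72.0, 77.0),   # (5870 fps, 6970 fps)
    "DiRT 2":     (90.0, 92.0),
    "Crysis":     (33.0, 36.0),
    "Metro 2033": (24.0, 30.0),
}

gains = {g: (fast - slow) / slow * 100 for g, (slow, fast) in fps.items()}
for game, gain in gains.items():
    print(f"{game:<11} +{gain:.1f}%")

# A single outlier like Metro can drag an "average gain" well
# above what most individual games actually show.
print(f"average     +{sum(gains.values()) / len(gains):.1f}%")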
If you game at 1680x1050 then either get 2x 58x0 or 2x GTX465 (and unlock them to GTX470).
COD4 4% diff
http://www.techpowerup.com/reviews/H...D_6970/10.html
Really, 30%? I just don't see the "majority of games" increasing by 30%...
6870 vs 5870 is close, with the 5870 winning of course: F1 2010 13%, DiRT 2 9%, Crysis 11%, UT3 10%, and the 6870 is only $12 more than the faster 5870, so maybe they will phase out the 5870 and replace it with the 6870...
Already I notice very few 5870s in online stores...
5870 vs 6970 @1920x1200: F1 2010 9%, DiRT 2 6%, Crysis 10%... Metro 2033 28%. I wouldn't mind seeing some reviews with Dark Void, WRC, Need for Speed: Hot Pursuit (new game, yes?)
I don't really play Batman, BattleForge, HAWX, Far Cry 2 yet... maybe in the future... I just spend more time on racing games, F1 2010, DiRT 2, etc... some CS, some UT3, some Generals Shockwave 1.0, etc... lol
seanx, we get the point... the 5870 to 6870 is a sidegrade, and it was always meant to be.
That said, as drivers improve, the 6800s and the 6900s will improve; they had a fair amount of architectural tweaking done, unlike the GTX 500s, which are basically the same chips as the GTX 400s with tweaks aimed at power efficiency and yield improvement.
Anyone else trying to bench the 6970 subzero and hitting issues with screen corruption below 5C?
If I really thought the 6870 would improve over the 5870 in the long run I would rather get 2x 6870; they are newer and probably use less power... However, does anyone have a clue whether
the 6870 will indeed improve over time vs the 5870? Any technical advantages that have yet to bear fruit? Tessellation, etc.?
edit = it just got harder to make this choice: 6870 for $196
http://www.overclockers.co.uk/showpr...56&subcat=1866
Is this a hint that 68xx prices may drop a little to put the market segments in a clearer light?
Just checked: the 6870 beats the 5870 in Crossfire by about 10% in DiRT 2 and Crysis at the moment...
the "stalker refutation", lol@big bang theory
http://www.techpowerup.com/reviews/H...D_6970/20.html
In Stalker, the 6970 is slower than the 5870 at both 1680x1050 and 1920x1200.
A sidegrade means it is not a replacement and not an increase in speed. What we would consider somewhat more of a sidegrade is the 6870: very similar speed in both cards, and even the naming is more similar.
The 6970 is an upgrade (although a minor one in a lot of circumstances) because it is newer, faster, makes the 5870 line obsolete (hence production on it stopping entirely), and takes the 5870's place as AMD's top chip.
It would be weird if AMD released a new series and its next-generation fastest chip wasn't an upgrade over its previous-generation fastest chip, especially since, in all likelihood, this will be AMD's fastest single chip of the 6xxx series.
What makes you think it is a sidegrade, Stevil?
HD6970 CF video review:
http://www.youtube.com/watch?v=qqWowIQsGiU (Crysis / Metro 2033 runs). FPS fun at minute 11:35.
It was meant to be a sidegrade. Check the transistor budget... about 25% fewer transistors. It was meant to achieve around the same level of performance.
If it were meant to be an upgrade, the transistor count would have gone up. Check how the 6900s have an increased transistor count.
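For what it's worth, the transistor-budget argument roughly checks out against the commonly cited counts (treat the exact figures as approximate):
Code:
# Rough check of the transistor-budget argument, using commonly
# cited (approximate) transistor counts in billions.
chips = {"Cypress (5870)": 2.15, "Barts (6870)": 1.70, "Cayman (6970)": 2.64}

cypress = chips["Cypress (5870)"]
for name, count in chips.items():
    delta = (count - cypress) / cypress * 100
    print(f"{name:<15} {count:.2f}B  {delta:+.0f}% vs Cypress")

# Barts:  ~-21% -> cheaper chip aimed at similar performance (sidegrade)
# Cayman: ~+23% -> bigger chip meant to be faster (upgrade)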
It just keeps changing... 6950 for $318 now
http://www.overclockers.co.uk/showpr...56&subcat=1752
Obviously retailers and suppliers are still catching up to the new releases, yes?
This site shows that the 5870 and 6950 are basically level, with a slight edge to the 6950:
http://www.bit-tech.net/hardware/201...-6950-review/9
It's good times: 2GB of RAM will probably be more "future-proof", plus better tech on the 6950 than the 5870. Seems I've convinced myself to go 6950 lmao.
the analysis is quite informative too
http://www.bit-tech.net/hardware/201...6950-review/11
And the conclusion is even more interesting: they indicate that the GTX 560 will have to be Nvidia's way of meeting this card (aside from ATI's own 5870 imo).
So I just sold my 5850.
Getting a 6970 soon! Just curious which brand to get, HIS or PowerColor? Which is better?
I was gonna get an ASUS, but ASUS is out late with their cards this time.
Should I wait, or get either HIS or PowerColor?
EDIT: I'm waiting for the ASUS EAH6970, since they've got an all-aluminum shroud instead of the plastic reference one!
I have had X1950 XTXs (GDDR4), an HD 4870, and 2x 5870s from HIS, and all were great overclockers and never failed on me (even when I pushed the OC on them).
Of 10 GPUs from HIS, only one has needed an RMA (after I pushed the card too much, one GDDR chip died).
But maybe I'm just lucky. And all my GPUs since 2004 have been watercooled. Plus, normally all reference boards are the same; only the price changes, and that is down to distributor stock and the price they can give shops for xx numbers of pieces.
I am pretty sure HIS is a good vendor, since they have been an ATI partner for about 15 years or something.
Customer service with HIS in North America is worse than bad (they outsource support for their products in NA, and the company doing so is run by useless people who would make subpar crash test dummies). I have nothing good to say about their company. Don't waste your time and money with those jokers. Hint: I don't like them.
If HIS and PowerColor are your two options, take the PowerColor (at least they will respond to their customers in some capacity).
Yeah, I have a HIS 6870 and it has been fine, no issues... haven't tried their customer service because I haven't had to yet... (knock on wood!)
If I were you I'd go with XFX, just because of the double lifetime warranty.
lol
You can really max out any game? That's pretty good and maybe puts things into perspective with regard to the new card releases.
I didn't say any, I said just about any.
There are always those few games that push the limits.
we're not that odd! :p
I think there's another one from Apple that uses the same panel...
And yeah, at high res there's no need to increase AA to crazy levels; I can play every game available maxed out without a problem. I think I'll pass on this gen, or add a third card if I see a used one for sale cheap :p
look at those prices.. so much for cheap :|
Another U2711 lover hehe, best screen I have ever used.
And you are definitely right about AA, 4x is the highest I ever need to go.
Overclock3D HD6970 Crossfire Review
As other reviews pointed out, the HD 6970 performs great in CrossFire.
In this review the HD 6970 CF competed well against GTX 580 SLI in most games, especially in the average and minimum frames, and won in Crysis Warhead.
I was surprised the HD 6970 CF was faster than GTX 580 SLI in Unigine Heaven 2.1.
GTX 580 SLI costs $300 more, $740 vs $1040 (40% more expensive). Guess what I am going to buy. :)
Not really; I'm waiting for the HD 6990, hopefully close in performance but even less expensive. Just hope it's not going to be gutted too much.
EDIT
The calculation was based on today's lowest Newegg prices.
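For anyone checking the math: the premium is (SLI total - CF total) / CF total. Per-card prices below are implied by the quoted totals, not live Newegg listings:
Code:
# Sanity check of the price comparison above. Per-card prices are
# implied by the quoted totals; Newegg prices change daily.
cf_total  = 2 * 370   # two HD 6970s -> $740
sli_total = 2 * 520   # two GTX 580s -> $1040

extra   = sli_total - cf_total          # $300
premium = extra / cf_total * 100        # ~40.5%
print(f"GTX 580 SLI costs ${extra} more (+{premium:.0f}%)")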
Why is HAWX so extremely bad for AMD? In every test I see, it performs incredibly badly.
The game was made in collaboration with Nvidia (same as LP2); let's say they coded it to not look good on AMD cards.
Done with Nvidia hardware, for Nvidia hardware. They probably just checked that it would work on AMD cards and that's about it, no optimization or anything. This applies to stuff like HAWX 2 and LP2; both use DX11 features implemented by Nvidia. Sucks for those who like these games and use AMD. Personally, neither game interests me. They do give a nice perf boost to aggregate FPS results in reviews though. :down:
Not like I care, but yes, I still ask myself why some sites use those benchmarks/games in their reviews, especially after having written articles about their concerns with the results from them. Putting them in a GTX 5xx review to compare against older Nvidia GPUs, why not, but otherwise, what a waste of time.
I miss the days when a new generation of hardware meant a 100% improvement. It seems GPUs have finally become like CPUs: incremental updates with each generation that only a few (enthusiasts) can appreciate; the rest will either be scammed into buying this new stuff or won't care at all. My one-year-old 5970 is still the fastest card out there; it's almost sad... It seems we're nearing the end of an era.
Yeah, but the blame can't be put entirely on hardware designers. I mean, there's only so much you can do when you are up against electrical and physical boundaries. Can't expect miracles time after time, especially when redesigning an architecture on the same node just a year after a shrink. I think the vast majority of the fail is on spoiled end users thinking hardware makers OWE them something.
That's the sad thing: they include those games in the overall performance charts, so the AMD GPUs appear slower on average.
HAWX 2 is an awful game (per review sites, consoles included). And let's not forget sites like Guru3D, PCGamesHardware, ComputerBase, Hexus and others who comply with Nvidia's "reviewer's guide" and set Catalyst IQ to high instead of default, which costs 5-10% FPS in some games without noticeable quality gains.
Even Guru3D had an article, prior to their "updated" version, where they zoomed into the image with Photoshop and stated that even in that scenario it is almost impossible to notice quality loss with the default setting (which enables performance optimizations, same as Nvidia's defaults, which you can't touch - or can you?).
More performance is not needed anyway, since consoles call the shots in the video-game industry. What's sad about this is that techno-evangelists like Kurzweil and co are proven wrong. The future won't be much more advanced than the present; rapid development can only last so long, and in the computer hardware industry it lasted around 40 years.
That was a good run, but now it's dying...
TechPowerUp review, testing these Catalyst versions:
HD 5800: 9.10, 10.3, 10.7, 10.11, 10.12
HD 6900: 10.11, 10.12a
http://www.techpowerup.com/reviews/A...ormance/1.html
A game with clear gains:
http://tpucdn.com/reviews/AMD/Cataly..._1920_1200.gif
And other games :rolleyes:
http://tpucdn.com/reviews/AMD/Cataly...rfrel_1920.gif
Yeah, I can tell from this review that driver updates for the 6900 will clearly improve performance a lot, just like when Nvidia worked out the new Fermi arch (GTX 480).
We will see; I expect that with the next driver revision the 6900 series will see 10% gains.
Just look at the Stalker: Clear Sky results - the 5870 beats a 6970?
The reason lots of sites use them in reviews is that lots of people play them. I'm sure they could use a bunch of random indie games that run pretty evenly across Nvidia and AMD, but nobody plays them...
As much as it sucks that LP2 and HAWX are Nvidia-optimized, people still play those games... and if you're playing LP2 on AMD you're just not going to have the best experience. Nvidia overall seems to have a lot more consistent performance, even in AMD-sponsored games. LP2 is so bad between similarly performing cards that when I recommend a new video card to someone, I ask them if they will be playing LP2. If they are, I lean towards an Nvidia card; if not, then pretty much whatever is cheapest.
From the relative performance graph: GTX 480 faster than GTX 580, eh? :down:
To chime in on games being optimized for Nvidia... I used to be totally against it and got really angered when new TWIMTBP games came out, because I knew they'd suck on my card. I've since realised that Nvidia doesn't have any areas in which they take a massive hit, unlike ATI, so I guess, dirty tactics or not, Nvidia is just the better choice. If ATI were really concerned about us getting good performance in these types of titles, they'd just pay the developers to optimize for them too. It appears they are not though, so two fingers up to AMD for this generation!
Having said that, I did nearly get the HD 6970 instead of the GTX 570 that's going to be my Christmas present; just as good performance, if not a little better in non-Nvidia titles, 2GB of RAM, and Eyefinity are all great reasons. The draw of the folding power on Nvidia cards was too much for me though. Should AMD fix folding performance, it will be entirely the better card, but waiting on them to fix things has not proven to be the best strategy in my experience.
You'd think I would have picked that up! :( Sorry, still on my first cup of coffee today. I guess if I had looked at the 4870 result I could have guessed what was going on :D
Makes me wonder whether those ATI cards will actually get the performance boost people are talking about, though, and whether Nvidia has changed the architecture from 4xx to 5xx significantly enough that there is still performance to be eked out of them with new drivers?
I second that! Avoid HIS unless you can get the card at a SIGNIFICANT discount. I've heard from multiple people that their customer service is horrible, so you'd better hope you never have an issue with their cards. I once called them out of the blue to see if I could get hold of someone, and I could never get through on multiple attempts. Plus, the voicemail system was barely in English in the USA.
Depends on the approach, I guess, and as far as I remember AMD said in an interview that they will never de-optimize games for their competitor.
That's how proprietary standards like PhysX are bad: you can have normal physics in a game, or Nvidia can add a bit of green fairy dust (also known as money) to a title in production, and suddenly normal features like e.g. fog are implemented in a way that will work only on NV cards.
You can call me biased, but I prefer to have standard features available to everyone. :shakes:
Why does my memory think 9.4 wasn't the first driver released for the 4870? I would have liked to see the first 2-3 months after the card was released, then the newest one, so that it covers the entire history. The 285 and 4870 show basically no gains, while the 5870 and 480 have clear gains, and I wonder if that's just because we don't see the first driver releases.
I have a HIS card and have never had any problems with the card itself. I usually think video cards are pretty stable unless you OC them to the maximum for long periods lol
I get some graphical glitching in Metro, mostly when outside. It's something to do with the shadow/lighting system; it's all over the place. When I move the camera around, the shadows move too, consistently.
Apart from that it looks and plays great on very high settings. I don't bother playing in DX11 (I know it's there), but I see no real visual difference, and FPS is quite a bit lower compared to DX10. That's with just tessellation enabled; with advanced DoF it halves my framerate.
Object motion blur on the very high preset is an absolute must :up: