You've tested the majority of them?
No, they are NOT poor overclockers, but they don't overclock by themselves. I don't think I can count more than 2-3 reviews where the vGPU was raised beyond 1.162v with ATI's OV tool. We've seen the reviews, now let's see how they'll do in the HWBot rankings (the overclocker average graph is what I am looking for). Once overclockers get their hands on them and put them to work, we'll know exactly how most of them clock...
You make the snowblower rev at 5000 RPM instead of the normal 4600 RPM and expect it to return the same fuel consumption?
You overclock the card, so you should at least over-voltage it a bit, as it's a high-end version and not an entry-level or mid-range one.
A 5850 OCed at stock voltage is a good idea, but a 5970 OCed at stock, not so much IMO.
I agree. What I don't agree with is how ATI marketed the card to the press and their board partners. I am taking issue with some of the statements here and other places because they basically repeat ATI's marketing slides verbatim. Slides that hold claims which are on shaky ground due to the unpredictable nature of overclocking. Simple as that.
Memory, yes it overclocks well. The cores? Not so much.
..just because some people don't know how to really push these cards, doesn't mean they are bad overclockers, it simply means that some people don't know what they are doing..
When the 4870X2 came out, almost all review sites I saw were under 800MHz for the cores, usually 780, 30MHz over stock. But when you put the card into the hands of people that know what they are doing, 800MHz seems like the minimum, and I've seen 830 (usual) to 900 (xtreme). Me personally, I'm at 850 core.
so ill take review sites and overclocking with a grain of salt :)
EDIT: I do see you understand it takes more voltage to get anything nice from the cores, but that's to be expected with this card since it's underclocked and undervolted to meet the 300W TDP
Core voltage increases were needed for the X1800, X1900 and X1950 in order to overclock them. Therefore, it should be no surprise if we still have to do that now. Unless some have forgotten we actually had to do that? Some of you act like this is new or something :shakes:
A low-voltage CPU also sucks at overclocking, but somehow anyone who gets one knows to first bring the voltage up to normal levels before OCing. Why is this such a hard concept here?
Yeah, I bought it along with the other 2000 people that enjoyed ATI's huge stock dump just before the launch =))
Quote:
Originally Posted by cowie
And yes, you are right, they are not very good clockers out of the box, they are average clockers for the high-end segment. But that is one thing, and "they are not good clockers" period is a totally different thing. Overclocking in general is a completely different thing.
Voltage scaling and negative temperature scaling are the cornerstones of overclocking. You seem to have the feeling that every chip scales with voltage in the same way, and you keep repeating that. I don't even think I have to address this idea...
Quote:
Originally Posted by SKYTML
Ok, let's put it like this - for daily use, for average IT-educated buyers (which is the vast majority of them), you are right, the HD 5970 is not a bright overclocker, with its stock voltage and all that. But since this is XtremeSystems, not mildly-stock-volts-systems, and overclocking usually means something a little bit different here, try to be more specific when you address issues like overclocking.
I don't think anyone is saying that they don't clock when volt-modded.
Bah, dual-core cards, you know what to expect anyway.
thanks monstru for reply
btw
my comment was toward eastcoasthandle
Put these puppies on water! Until then I'm reserving judgement on their clocking abilities because, face it, the stock cooler isn't up to the task of mastering 2 R870 cores at high clocks (using stock cooling, I could burn one up in the Vantage feature tests, I assure you). Are the EK blocks shipping yet, by the way?
cowie - I know, we posted at the same time :)
chickenfeed - this is the funniest part. This is the only high-end card where I reached the maximum voltage that can be adjusted by software, on air, and I still have scaling :) Of course, this is not for daily use, I am talking about benching and such. Anyway, this week I'll make some time to vmod the card and put it under some LN2, to see exactly how badly it clocks.
How does one just decide to vmod the card?
Is it the same on all different models of cards or just easy to trace out what you need to do by looking at the board?
Depends on the card.
If the tech documents are available for the VREGs then it's pretty easy. If not, then you get out the probes..
Bingo.
The card was marketed as having a large amount of overclocking potential AT STOCK VOLTS. However, it does not overclock well at STOCK VOLTAGE. Is this so hard to understand? No one is saying that the clock speeds will still be bogged down once additional voltage is applied.
If anyone recorded ATI's presentation, fast forward to ~minute 40 and listen. I can't remember which slide they were presenting at the time.
Heh, I like the standing red, and knocked down green king in the slide :D Never noticed it before
Quote:
Originally Posted by SKYTML
Hmm...:rolleyes:
Quote:
Originally Posted by AMD
Were you part of the actual press presentation / teleconference? If so, please refer to the section where they verbally mention that most users can see 850-900MHz on stock volts, and 900+ with the overclocking tools provided by ATI and their board partners. I believe it was during slide 16, as Wiz posted.
I'll upload the audio file that was sent to me once I get approval.
you need to visit tpu more often :) and read my reviews
http://img.techpowerup.org/091125/Capture277.jpg
My friend, nobody really cares how ATI markets their products TO THE PRESS. It is your job, or my job, or the job of anybody else who writes articles, to filter the marketing BS and deliver a clean report. Now, why do you keep repeating over and over again how ATI is lying about their product? Everybody tries to fill our heads with BS in teleconferences; that is why I do not participate in this kind of thing anymore, even if I am invited. I am tired of hearing 45 minutes of marketing BS about some technology and 15 minutes about the board itself. I can remember how the conference went for the GTX 275... 55 minutes of CUDA and PhysX, 5 minutes of GTX 275.
Anyway, our job is to inform our readers in a correct manner. That is why, unlike other sites, you will not find 3-4 pages about DX11, Eyefinity and other stuff like that in one of my ATI articles, and you will also not find 3 pages of CUDA and PhysX BS in every Nvidia card review I did this year (unlike other sites - if I want to inform readers about PhysX or CUDA or Eyefinity or DX11, I will make a separate article, ONCE, not repeat it in every test for that manufacturer). So... who is telling the BUYERS that the ATI HD 5970 has enormous OC potential with stock volts? Because ATI sure as hell does not. They tell that to you, not to the audience.
I am just happy to see an ATI review from someone that knows how to configure catalyst..:rofl::rofl:
Dude, that PCIe scaling review rocks. It's scary how little difference there is between 4x on 2.0 and 16x. I was hoping to see at least a 20% drop from 16x to 8x.
I'm wondering if there is a common link between this and why we don't have 4GB as the standard for cards, since the textures being moved around are so small (compared to how much performance has changed in the last few years).
I would like to see something like GTA4 in the same test, or a game with heavily modded textures and effects which should only be run on a top-end system.
Newegg had the Asus version available just moments ago http://www.newegg.com/Product/Produc...82E16814121357
I missed that as well :mad:
:(
Actually, cards like the HD 5970 can become bandwidth-starved at PCI-E 2.0 x8 and PCI-E 1.1 x16 speeds in some games. Far Cry 2 is a prime example of this, as is STALKER: CoP. It really makes me wonder when cards will saturate the x16 2.0 interface. 18 months? Less? Considering the raw power of the HD 5970 in its stock form and the fact that there will be dual 8-pin versions pushing 900MHz+ quite soon, things could get interesting.
Especially considering PCI-E 3.0 with its 128b/130b encoding probably won't see the light of day next year either... :down:
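For reference, here is a quick Python sketch of how the raw interface numbers work out; the transfer rates and encoding efficiencies are the published spec figures, and the output is theoretical per-direction link bandwidth only (real throughput is lower due to protocol overhead).
Code:
# Theoretical per-direction PCI-E bandwidth for the generations discussed above.
GENERATIONS = {
    # name: (transfer rate in GT/s, encoding efficiency)
    "PCI-E 1.1": (2.5, 8 / 10),     # 8b/10b encoding
    "PCI-E 2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "PCI-E 3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def lane_mb_per_s(rate_gts, efficiency):
    """Per-lane bandwidth in MB/s: GT/s * efficiency * 1000 / 8 bits per byte."""
    return rate_gts * efficiency * 1000 / 8

for name, (rate, eff) in GENERATIONS.items():
    lane = lane_mb_per_s(rate, eff)
    print(f"{name}: x16 = {lane * 16 / 1000:.1f} GB/s, "
          f"x8 = {lane * 8 / 1000:.1f} GB/s, x4 = {lane * 4 / 1000:.1f} GB/s")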
That is what I am referring to. You are bashing them for some BS marketing to the press. Well, everybody delivers BS marketing to the press; what is the problem, as long as they don't try to BS the buyers?
Quote:
Originally Posted by SKYTML
This looks like the AMD HD 5970 press presentation:
http://www.youtube.com/watch?v=vhTb2DRPUoI
Of course, it might not be the same one you are talking about.
Anyway, slide 16 starts at about 14 min and there is nothing in the presentation saying that 850-900MHz can be done on stock volts. In fact, in that presentation the slide didn't even reach 900MHz. Plus, right on the slide it says "Complete Control - at stock voltages or higher".
I did see some reviews with 900MHz OC including the LegionHardware review which claim the 900 MHz was stable. If I can get that with stock cooler with adjusted voltage I'll be very happy.
Of course, first I have to get the card. I ordered it from NCIX Canada on November 18 when it was in stock; the order was confirmed and my CC debited, but it never shipped.
Nope. That wasn't the one. :confused:
Yes, and then they posted THIS.
Quote:
I did see some reviews with 900MHz OC including the LegionHardware review which claim the 900 MHz was stable. If I can get that with stock cooler with adjusted voltage I'll be very happy.
Anyways, overclocking or not it should be interesting to see what the next step will be. :up:
Oh snap. I was probably going to get the 5970, but after reading about the throttling even at STOCK... i'll wait even longer to see how Fermi compares
and there's no easy fix thanks to the 3-phase/GPU PWM
the 5970 should sport a minimum of 4 phases/GPU.. the 5870 does, and it's a single GPU
in order to keep up to 5870 stock clocks: 4 phases/GPU
in order to run dual 1GHz cores stable and throttle-free: 5 phases/GPU
why did AMD go with 3 phases/GPU?? a 4-phase/GPU 5970 would be in my system right now
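Rough numbers on why phase count matters: each phase has to carry the core's current divided by the number of phases, so fewer phases means hotter phases. Quick Python sketch below; the per-core wattage and core voltage are just illustrative assumptions, not measured values.
Code:
# Back-of-the-envelope per-phase current estimate. The input numbers are
# illustrative assumptions only: real per-core load power and vGPU vary
# by sample and by workload.

def amps_per_phase(core_watts, vcore, phases):
    """Current each VRM phase must deliver, assuming an even split."""
    total_amps = core_watts / vcore   # P = V * I  ->  I = P / V
    return total_amps / phases

core_watts = 110.0   # assumed load power per Cypress core (illustrative)
vcore = 1.04         # roughly the HD 5970 stock core voltage

for phases in (3, 4, 5):
    amps = amps_per_phase(core_watts, vcore, phases)
    print(f"{phases} phases/GPU -> ~{amps:.0f} A per phase")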
great review, too bad they're out of stock every time I check.
would custom pcb 5970s fix this?
if so by the time they come out fermi should also be out... great timing lol
http://anandtech.com/weblog/showpost.aspx?i=657
Quote:
Radeon 5970 Overclocking: The VRM Temperature Bottleneck
In our Radeon HD 5970 review, we ran in to some issues when trying to overclock the card to 5870 speeds of 850MHz/1200MHz. At the time this is something we attributed to the VRMs, meanwhile AMD suggested that it was cooling related, and that we should manually increase the fan speed.
As it turns out, we were both right, we just didn’t have the tools at the time to properly identify and isolate the issue. Late last week we got our hands on a beta version of Everest Ultimate, which added preliminary support for the 5970. With that, we could read and log the voltages and temperatures of the various components of the 5970, and properly isolate the issue.
From that, we’ve discovered a few interesting things about the 5970.........
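If you want to check for the same behaviour on your own card, the idea is simple: log clocks and VRM temps while the card is loaded, then look for samples where the core clock falls below what you set while the VRMs are hot. A minimal Python sketch of that check is below; the CSV column names and the temperature threshold are assumptions for illustration only, so match them to whatever your monitoring tool actually exports.
Code:
# Flag likely VRM-throttle events in a monitoring-tool CSV log.
# Column names and thresholds are assumptions -- adjust to your tool's export.
import csv

TARGET_CLOCK_MHZ = 850   # the overclock you dialed in
VRM_TEMP_LIMIT_C = 110   # illustrative threshold, not an official spec

def find_throttle_events(log_path):
    events = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            clock = float(row["GPU Clock [MHz]"])         # assumed column name
            vrm_temp = float(row["VRM Temperature [C]"])  # assumed column name
            if clock < TARGET_CLOCK_MHZ - 10 and vrm_temp >= VRM_TEMP_LIMIT_C:
                events.append((row["Time"], clock, vrm_temp))  # "Time" column assumed too
    return events

if __name__ == "__main__":
    for time, clock, temp in find_throttle_events("hd5970_log.csv"):
        print(f"{time}: core at {clock:.0f} MHz, VRMs at {temp:.0f} C -> likely throttling")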
lol at pcper. The cards are getting a 100 fps average in World in Conflict and for some reason they turn off water-reflecting clouds, lol, idk, seems funny.
Then in the screenshot of the game benchmark you see whatever is being tested getting 16 fps, yet this number fails to show up on any of the graphs.
ammm... you were talking about stock voltage and max clock; they talk about max voltage, max clock and the impact of this on the VRMs...
You concluded AMD lied and that 5870 frequencies are not achievable; they said the 5970 can achieve 5870 speeds, but a more accurate voltage tool is required.
No. The Dnet client can begin throttling at STOCK. It doesn't take much (as Ryan noted) to push the VRMs up that additional 19°C. It bodes extremely ill for other GPGPU apps which may push things beyond what DNet does.
Part of their conclusion:
This has been my point for the last few pages...
Quote:
but based on how the 5970 was promoted and presented the fact of the matter is that the card can’t meet its advertised capabilities
If only they had built the stock cooler a little differently, they wouldn't have had this issue.
Advertised??? Advertised to whom? Do they have a banner on the net saying "buy this, it clocks like hell"? Do they write on the box "super OC-able board inside"? Where is the ADVERTISING? Are we confusing marketing BS fed to the press with advertising? Oh boy, oh boy...
You're splitting hairs. Anything marketing does can also be considered advertising. In a way, reviews are an advertisement for a product since they are giving said product face time in the public's eye. As such, what companies communicate to reviewers, they expect will be communicated to the public in some form. In this case, it was communicated and then later debunked. THAT is a reviewer's job: to communicate information and then look into the claims and refute them if necessary.
As for advertisement:
http://www.hardwarecanucks.com/forum...eb3385ed3e.gif
Did they transmit the information directly to the buyers? NO
Quote:
advertising - A paid, mediated, form of communication from an identifiable source, designed to persuade the receiver to take some action, now or in the future.
Did they offer to pay you to transmit that information to the buyers? I like to think not.
Advertising is not the same thing as PR actions, like press conferences. Period.
Woaaa... great, so it says it has headroom... Well, it does. How much? It does not say. Anyway, please start bashing all hardware companies for their banners then; everybody is marketing overclocking these days, with much more aggressive selling lines than the one in the banner above. Why aren't you upset with Asus or Gigabyte or Nvidia or any other manufacturer for their banners? :)
Quote:
As for advertisement
A... sorry... they can expect whatever they want, but a reviewer's job is to test the product, to analyze it, and then report what he found. A review is NOT advertising, since it can very well be negative. A review is putting the product to the test and reporting the findings, be they bad or good. Otherwise, just give the public the PR PDF, and your job is done...
Quote:
In a way, reviews are an advertisement for a product since they are giving said product face time in the public's eye. As such, what companies communicate to reviewers, they expect will be communicated to the public in some form. In this case, it was communicated and then later debunked. THAT is a reviewer's job: to communicate information and then look into the claims and refute them if necessary.
YES.
http://www.amd.com/us/products/deskt...-overview.aspx
Quote:
Unlocked, this graphics card has massive headroom so you can take control and push your hardware to its full potential!
What a nasty claim, I have never seen any hardware manufacturer lying like that, almost without shame... :rolleyes:
But wait, YOU can take control and push it to the max. Which brings us to the very beginning of the conversation - it does not overclock itself :)
You can't buy the card directly from AMD anyway, I see no Buy it now button. Joe blow won't be purchasing this card anyway, they'll be purchasing the newly released 310 GT from Nvidia ;)
I think you're giving AMD too much benefit of the doubt.
If you think these marketing slides were only meant to be seen by the press, I don't think you are giving the public enough credit.
Reviewers and websites are the most likely to see through the BS; however, a lot of reviewers will post these slides, which will also be seen by readers.
Additionally, people are naive; there are plenty of people that take AMD's word as accurate and tell others that this card has plenty of overclocking headroom.
The overclocking headroom has been particularly stressed with this card. It's why there are so many websites disproving it, big ones like Anandtech, because there is real controversy here. I haven't seen a card stressed for its overclocking potential by the GPU manufacturer as much as this card. And the opposite is happening: we are getting worse overclocking potential than most cards in general on stock voltages.
It overclocks a bit better with higher volts, but as Anandtech noted, with certain applications, said overclocks are useless when the card throttles down.
Not to mention overvolting a card voids the warranty if anything goes wrong, with most if not all manufacturers.
The way AMD is advertising the overclocking ability of this card, it seems like it's idiot-proof and you've got massive headroom. This clearly isn't the case. It involves finding an overvolting tool and requires tweaking your fan profile to possibly impractical levels.
Secondly, it's this type of hassle that might sway people to buy 5850 CF over this solution. It's cheaper and more overclockable.
Excuses, excuses. You say that it can overclock safely without issues. I called shens. Anandtech and LH have said the same. It throttles performance due to VRM temps once clock speeds / voltage are increased, and that has now been proven. As such, it was NOT meant to have "massive overclocking headroom".
You asked for proof of ATI claiming this card has massive overclocking headroom. I showed you just that.
It doesn't matter what the end user does. What matters is that ATI is CLAIMING their card can do something that it CAN'T because of current cooling limitations. That isn't me BSing you or anything else; it is written as plain as day on the ATI website. It is those same statements that AIBs take and write directly onto their packaging and send out as information to be posted as a product description on Newegg, NCIX, etc. Marketing, advertising, it doesn't matter because in the end it will be filtered down to consumers who don't know it is false information.
SKYMTL=Anti-AMD crusade lately for some reason. :shrug:
I saw several 5970s reaching 1000 MHz, so this conversation is starting to bore me. Yawn.
Well, have you seen any Nvidia, Gigabyte or MSI slides lately? I have. Maaan, there are some interesting claims there. Why don't I see anybody bashing them?
Quote:
I think you're giving AMD too much benefit of the doubt.
If you think these marketing slides were only meant to be seen by the press, I don't think you are giving the public enough credit.
Reviewers and websites are the most likely to see through the BS; however, a lot of reviewers will post these slides, which will also be seen by readers.
Big sites like Anandtech said that high VTT will kill your Wolfdales, and they were wrong. Yes, people are naive indeed :)
Quote:
The overclocking headroom has been particularly stressed with this card. It's why there are so many websites disproving it, big ones like Anandtech, because there is real controversy here. I haven't seen a card stressed for its overclocking potential by the GPU manufacturer as much as this card. And the opposite is happening: we are getting worse overclocking potential than most cards in general on stock voltages. It overclocks a bit better with higher volts, but as Anandtech noted, with certain applications, said overclocks are useless when the card throttles down.
Overclocking is not idiot-proof; it has never been and it will never be. The amount of overclocking is not guaranteed by the manufacturer. Overclocking to high clocks is not within everybody's reach. If it were, we would all be kingpin by now :)
Quote:
Not to mention overvolting a card voids the warranty if anything goes wrong, with most if not all manufacturers.
The way AMD is advertising the overclocking ability of this card, it seems like it's idiot-proof and you've got massive headroom. This clearly isn't the case. It involves finding an overvolting tool and requires tweaking your fan profile to possibly impractical levels.
That is not a bad idea at all. But the difference is not that big. Let's end this conversation and see what we are actually talking about. Of course, keep in mind that this data is based on the samples I have; it does not represent the overall OC potential of any of the cards.
Quote:
Secondly, it's this type of hassle that might sway people to buy 5850 CF over this solution. It's cheaper and more overclockable.
So, the HD 5870 is by far the best of them all. Rightfully, this one should have had no limits in CCC and should have been marketed as the VGA with massive OC headroom. However, the BBA only gave me 925MHz GPU with stock volts. The study below was done with an Asus HD 5870.
http://lab501.ro/wp-content/uploads/...1/HD5870OC.jpg
This is the HD 5870's VRM, by far the best of the three cards.
The HD 5850 I used was the reference sample from ATI. It shows pretty good voltage scaling. The vGPU is lower than the HD 5870's, but higher than the HD 5970's.
http://lab501.ro/wp-content/uploads/2009/11/HD585OC.jpg
This is HD 5850's VRM.
I have tested 3 HD 5970s in overclocking, one HIS and two BBAs. They all clock within a 10-20 MHz range. This was done with one of the BBAs. As you can see, the HD 5970 has the lowest vGPU.
http://lab501.ro/wp-content/uploads/...1/HD5970OC.jpg
Of course, as we all know, this is how HD 5970 looks naked.
Now, taking into consideration that we have two full Cypress GPUs on one PCB, with all 1600 stream processors and all that, and we have the lowest vGPU, does it clock badly? No, it seems normal to me and in line with what its siblings are getting. The HD 5850 has a slightly simpler GPU and a higher vGPU, while the HD 5870 goes way out in front due to its strong VRM design. Is there anything wrong here? No, this looks normal to me.
Now, did ATI do a smart thing by marketing the HD 5970 as the unlocked board with OC headroom? No, this is marketing BS; the HD 5870 should have had an unlocked CCC and should have been marketed like this. Are they lying to the buyers? No, it is just everyday marketing BS that we see in all the slides and presentations, from all manufacturers. They just wanted to add some value to the product with this claim.
So yeah, marketing the HD 5970 as the "fierce overclocker" is not a very bright idea, but it just falls under the marketing BS category we see every day; it is not something to get that excited about.
Some last words. The data above is from an article I am working on about overclocking on the HD 5xxx series. That is "work in progress" data, since I did not go to the same max voltage with all the cards, I did not use the same steps, and so on. Also, this is just the part about air cooling using software to adjust the voltage; the cards will also be vmodded to find their true max. Of course, that will happen on LN2. So, this is work-in-progress stuff; I know it's not an apples-to-apples comparison since the vGPU values are different between the 3 cards.
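To put the "scaling" comparison into numbers, this is the kind of simple calculation behind graphs like the ones above: how many MHz you gain per step of extra vGPU. Quick Python sketch below; the sample points are placeholders for illustration only, NOT the measured results from my cards.
Code:
# Average clock gained per +0.05 V of vGPU, from (volts, max stable MHz) pairs.
# The runs below are hypothetical placeholders -- substitute your own results.

def mhz_per_step(points, step_v=0.05):
    """Average clock gain per step_v of extra vGPU, from (volts, MHz) pairs."""
    points = sorted(points)
    v0, f0 = points[0]
    v1, f1 = points[-1]
    return (f1 - f0) / (v1 - v0) * step_v

hypothetical_runs = {
    "card A": [(1.05, 750), (1.15, 820), (1.30, 900)],  # placeholder numbers
    "card B": [(1.00, 725), (1.10, 790), (1.25, 860)],  # placeholder numbers
}

for card, points in hypothetical_runs.items():
    print(f"{card}: ~{mhz_per_step(points):.0f} MHz per +0.05 V")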
Sorry bro, but somebody is looking for excuses, and it is not me. I am happy, fine and dandy with how the HD 5xxx overclock, and I have screenies to prove that. If it does not work for you, that does not mean it is bad ;)
Quote:
Originally Posted by SKYTML
I'm not going to drag up your post history, but you have a clear anti-AMD bias going on. I have no problem with that, but don't expect some people to believe otherwise.
You spoke highly of Batman and PhysX, but downplayed Eyefinity. You said that AMD doesn't have to do anything when it comes to the open standard community, and can just "sit back" and let the community do everything. You didn't have anything good to say about the C3 steppings, stated that hardware tessellation "isn't needed", stated that any DX11 lead AMD gets would be "washed away in no time", then said that AMD is "spinning their wheels" as Fermi "gets close", then said that days of supply problems can "turn into months", then said all Nvidia has to do is leak some info and people will just completely forget about ATI cards and wait.
Need I go on? Okay I will...
You also said the sales of the 57xx cards are weak, even though you admitted it was "circumstantial evidence". Then, in what I thought was a very bizarre statement, you said the 5970 probably has a "2x128-bit memory interface", which defied logic.
Maybe you don't realize it, but you most often have a downbeat tone when you talk about AMD, not so much with Nvidia. On the other hand, Hardwarecanucks.com is an excellent site and that tone doesn't translate into the articles from what I've seen, props for that. :)
SKYTML - Yeah man, I know, that is exactly what I am talking about. We see so much marketing BS lately that we just have to ignore it, no matter who it comes from. I really do not care about their claims, I just run my own tests and that is that. The GTX 295 was the absolute king of VGAs for almost a year, and I had no problem with that (man, I love that board). Now the HD 5970 is the mother and father of them all, and I am happy to play with it and I love it too. If GT300 is the next big thing, I will also be happy. For me it's about toys that are more and more powerful, and I like them all. Whoever has the best product at the moment is my favourite for that moment. But the BS I ignore from all of them, since I am sick of it.
And believe me, I am not praising ATI or AMD. I just love the HD 5970 for what it is. ATI and AMD got enough bashing from me when I was not happy with the HD 5870 sampling and availability, straight on the site, a full page of it :)
eleeter - :up:
Telling the truth isn't the same as bias. Unlike many other people in my place, I don't strive to be politically correct, since it wouldn't benefit anyone if I did. I hate glossing over things. However, I have no issue at all with AMD or ATI.
Guilty. Proudly so. The PhysX effects in B:AA are amazing. However, I don't speak highly about PhysX. Rather, I avoid it altogether. Hence why you never saw an article about it from us. And probably never will other than some testing of the EVGA Co-Op.
Quote:
You spoke highly of Batman and PhysX, but downplayed Eyefinity.
And they do. Which is a good thing and is what the Open Source community is all about. I mentioned that as well. Plus, you're going to tell me Stream has actually come a long way since its inception? That ATI talks a lot about DirectCompute, OpenCL, etc. and has actually made a huge investment in any of these areas when it comes to development? Heck, one of their main talking points in the last few months was the Bullet Physics library... which is being developed with the help of Nvidia.
Quote:
You said that AMD doesn't have to do anything when it comes to the open standard community, and just "sit back" and let the community do everything.
This isn't an issue of not wanting to do something. Rather I have always felt it is due to a lack of funding on AMD's part. That's the truth...as I see it. ;)
My comments were regarding overclocking in general. How no two chips overclock the same. My posts in this thread have stated the exact same thing in reference to HD 5970 cards.
Quote:
You didn't have anything good to say about the C3 steppings
This I will ask for a link to. Must have been a REALLY bad day or something. :shrug:
Quote:
stated that hardware tessellation "isn't needed"
No, I said they COULD be washed away if the supply problems continue. Once again: the truth.
Quote:
stated that any DX11 lead AMD gets would be "washed away in no time", then said that AMD is "spinning their wheels" as Fermi "gets close",
And they have, haven't they? Try to find STABLE stock of the HD 5800 series. I dare ya.
Quote:
then said that days of supply problems can "turn into months",
Truth again. Nature of people's buying habits and all of that. Did it myself when I was waiting for a Mazda Protege and the Mazda 3 was announced.
Quote:
then said all Nvidia has to do is leak some info, and people will just completely forget about ATI cards and wait.
And I stand by this. 100%. Heck, 110%.
Quote:
You also said the sales of the 57xx cards are weak, even though you admitted it was "circumstantial evidence"
No. People in the thread were commenting about lackluster performance at certain resolutions and all sorts of ideas were thrown around. Nothing was stated as fact. Monstru and I were publicly hashing out ideas.
Quote:
you said the 5970 probably has a "2x128-bit memory interface" which defied logic.
As you said: Need I go on?
I DO understand where you are coming from, and maybe I need a stricter filter on my thoughts, but I hate brushing something under the carpet just for the sake of trying to make everyone happy. Around here, I cut loose by talking about the way I see things instead of being politically correct. I'll continue to do that since I don't think you guys deserve to have stuff sugar-coated. This has nothing to do with Nvidia, ATI, AMD, Intel or any other company, since I compliment what I think is right for consumers. So, when something comes up that I think is trying to pull the wool over your eyes, I say so.
Yeah, I am sure my posts would look so much better if I was playing nice... :D
Here it is, just because you requested it.
http://www.xtremesystems.org/forums/...09#post4078209
But we can drop this back and forth; it's not really relevant to this thread, agreed?
Yes it can, but at the CPU's cost; same as physics, the more tessellation there is, the harder it is for the CPU to handle. Faces and complex objects are tessellation monsters, and tessellating them at 60 fps with a high pixel rate is no joke.
This thread has great code for tessellation use:
http://www.gamedev.net/community/for...opic_id=531164
Try the code in both a software and a hardware environment. I used RenderMonkey and rewrote the code back when my brother had a 2900 GT, and it worked.
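If you want to see why doing it in software hammers the CPU, here is a tiny illustrative Python sketch (not the code from that thread): uniform subdivision multiplies the triangle count by 4 at every level, so the per-frame work explodes.
Code:
# Illustrative only -- not the code from the linked thread. Uniform tessellation
# splits each triangle into 4 smaller ones per level, so the triangle count
# (and the per-frame CPU work, if done in software) grows as 4^n.

def subdivide(tri):
    """Split one triangle (3 vertices) into 4 by adding edge midpoints."""
    a, b, c = tri
    mid = lambda p, q: tuple((pi + qi) / 2 for pi, qi in zip(p, q))
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    for _ in range(levels):
        tris = [small for tri in tris for small in subdivide(tri)]
    return tris

base = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]
for level in range(6):
    print(f"level {level}: {len(tessellate(base, level))} triangles per source triangle")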
You guys are fighting nicely, but I wanted to say something.
I got two. One clocks like crap, the other is broken. So much for all the press reviews... stores are the real world.
I know that doesn't seem to have anything to do with what you're talking about, but it really does.
Well, I'm sorry if I don't shed any tears for you, since the press guys don't have to buy them :p: but that does suck, don't it :(
And what's with the reviews (some really think and spew that :rolleyes:) that say it uses less power at load than a 295... that's plain outright BS, as you well know ;)
It does use less power than the GTX 295; the meter is saying that, not the review. Now think about it and tell me why it does that, since the TDP is higher. C'mon, it is not that hard... :)
This is not the thread for that and I don't want to go into it here. If you don't believe me, that's OK.
The only thing I'll say is that the one that works OK needs added volts to run stable 3D at stock speeds; the other won't even do that, it's just a 2D queen. Which is hard to get a Furmark SS from if it doesn't last even 1 second lol
but I'm not going further into it here in this thread :)
This is not the place, I don't think, right?
I can't afford a good meter to test with; all I can go by is software, and that's not the best way.
I just did not do well with my two cards, which sucks for me; others are having a field day clocking and benching.
I'll go grovel in my self-pity now.
I think that you should back your negative opinions with proof. Show us screenshots, pics with the card, etc. Which manufacturer is it, where did you buy it from, etc, etc.
There are 2 situations here. One, you are a die-hard Nvidia fan who really cannot accept that the HD 5970 takes a big, nice pee on the GTX 295 and simply mocks it. Well, it's hard to accept, but it does. It rapes it without lube, so to speak :) Because of that, you make these allegations here. No problem with that, you would not be the only one, we know the type.
Two, you are not what I said above, and you are an honest buyer. You bought 2 HD 5970s and you have problems with both of them. One is DOA and the second is also defective, since it needs extra volts for stock clocks. Now, if this is the case, before you RMA your cards, you should really document this. Because you might not be the only one. And if ATI is having problems with HD 5970s, with TSMC yields or whatever, this is the best way to find out. You document your case here, with pics, screens and everything that is needed, and two days from now another guy will do the same, and 3 days later another one, and so on. That is called proof, and if that kind of proof gets posted here by a number of users, well, needless to say any one of us that writes for a hardware site will pick up the story, and ATI is in big trouble. But we all need proof for that, not just a few words in a post.
I called off the photo shoot after rigging up both a PCP&C 1200W and a Giga 1200W Pro to what is still a beat card.
This is my SS of quadfire in 2D:
http://img136.imageshack.us/img136/9329/iaintlying.jpg
As for being an NV fanboy, hell yeah I am... but I am a VGA fanboy before that. I mean, I did spend 1250 on these just for 3D benches, since I don't game much.
You said one was DOA...I see 4 cores enabled there...
No, I said it came broken.
Oh, in 2D it works fine; anything in 3D, forget about it. And yes, I've tested it by itself.
Please, this is the news section, let's just move on, OK? This has caused me to take plenty of Xanax the last few days lol
Ok, as you wish. I don't see anything wrong there, just somebody that really hates these cards. Anyway, if you have problems, RMA them; that's why you have a warranty.
No, I don't hate the cards, just Diamond QC.
You kidding? I was hitting 130k in 03 with one @ 950/1080 @ 1.28V, CPU @ 4.2.
It just sucks that I can't return the one for another brand. Just because the Gigabyte one I have does not clock that great on RAM and core is by no means a reason to return it... hell, parts cost us enough already.
Diamond? Is that a brand? Oh Jesus, this reminds me of the old days of Lucky Star K5 boards...
Well, you said one does not work (broken) and the other one needs volts for stock speeds. Now how is it? Does it need volts for stock speeds? If yes, that is a reason to return it. Or does it just not clock well? And how did you get a 950MHz core when you say it clocks like crap and needs volts for stock speeds? I do not understand, sorry.
Oh great, now you tell me lol
No, the Diamond one won't work in 3D at all, added volts or not. The Giga needs 1.10 to run 775, and 1.13ish for 800, in a 3D bench, game or Furmark.
1.30ish is not so good for 950, is it? That's crappy I think, because a fellow Guru member can easily run 1000 at 1.25ish. But I ain't worried about the Giga one; I'll add volts and don't run stock anyway. If it breaks I'll send it back, but for now it's keeping me busy tinkering.
If the cards ran 03, 05, 06 and Vantage I couldn't care less about anything else. I have fun clocking cards most of the time, but come on, don't lie, you would be a tad miffed too.
I'm being good, not bashing anything, thanks to the Xanax. I'm usually over the top about this sort of stuff lol
Finally, the info I've been waiting for..
the VRMs get hot as hell
According to AT, the card is really only good for a 775MHz OC without water.
I'm glad to see somebody is finally looking at the VRM temps. My old 4870X2's VRMs got hot even with a full-cover block; stock cooling could not handle the VRMs.
Radeon 5970 Overclocking: The VRM Temperature Bottleneck
I was curious about these VRM failures on the 4800 series... so I just tested my 4870X2 (stock clocks) in Furmark, even though I do not like it, and I was shocked :eek:... even my Arctic Cooling Xtreme wasn't up to the job, guys. We are talking about a cooler that keeps 2 GPUs at 40 idle / 67 load for hours... that stock cooler on the 5970 WILL fail at anything that puts a huge 3D load on it, that's for sure.
Ok this run goes fine... 1024x768 windowed Xtreme burn mode
http://img205.imageshack.us/img205/8...indowed.th.jpg
Now 1920x1080 fullscreen, Xtreme burn mode... here comes doom.
I cannot provide a screenshot since I did not have time (seconds) to take one before my computer crashed.. big time hang.
Results: 300 seconds = 70C max shown in the Furmark window, which stabilized after 100 seconds, BUT my SS allowed me to see the VRM temps while in fullscreen... zomg, 144 Celsius... :eek::eek: Wow, that cooler is really good but still isn't able to cool these mofo VRMs! Imagine what the 5970 cooler can achieve! AMD must really release 2-3-fan coolers before claiming "overclock headroom". Too bad it failed; I'll try again later and provide an SS.
Asus HD 5970 Voltage Tweak Edition Review
http://www.bjorn3d.com/read.php?cID=1741
What are the VRMs like on the 5870, rather than the 5970? There's a very large difference in temps between GPU1's VRMs and GPU2's VRMs on Anand.
Needs a full cooler replacement to cool them. The 3rd-party coolers often didn't do the VRMs (properly, or at all) on the 4000 series, so I hope the ones this round are better (but tbh I'm expecting rehashes).
The VRMs have been extremely fuc*ed up since the 4th series and unfortunately the trend continues. The VRMs on my 4890 @ 1030/1150 with 1.4125 volts reach 114 degrees with no problem, and after that temp the card just locks up. I tried to cool them with a fan placed above them but with no success; something like the Thermalright VRM cooler solutions would solve the problem, but it's kinda expensive...
I haven't had an Nvidia card for some time now; is the same problem present on the GT2xx series?
VRM temps have been an issue when pushing cards from both ATI and Nvidia. It's often a limiting factor when overclocking them. I think EK Water Blocks was designing a water block for the 5970 that actually puts water channels over the VRMs... it's about time aftermarket cooling companies addressed this problem.
If I up the volts on my 280 GTX and go for high clocks, my VRMs will get close to or over 100 degrees, depending on what I am using to stress it.