Type: Posts; User: Dark-Energy; Keyword(s):
With an environmental fee of $100 for contributing to global warming.
That was kind of a stupid move to sell it, honestly. You never want to sell a card unless you have proof or are absolutely sure that the newer card coming out is going to be a lot better than it's...
They probably gain the extra DP-FLOPS/W for the same reasons AMD does because of doubling the shaders and dropping the hotclock. AMD has a lot more TFLOPS power than Nvidia, but the cards still...
My question is, why would they do something like that? Cutting the shader clock in half, tying it to the core clock, and then doubling the shaders (like ATI) doesn't really make sense to me because their...
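To put rough numbers on the shaders-versus-hotclock trade-off being discussed, here is the usual peak-throughput arithmetic as a sketch. The card specs used (HD 6970: 1536 shaders at 880 MHz; GTX 580: 512 CUDA cores at the 1544 MHz hotclock) are the published figures; the function name itself is just illustrative.

```python
def theoretical_gflops(shaders, clock_mhz, flops_per_clock=2):
    """Peak single-precision GFLOPS: shaders x clock x 2 ops/clock (FMA)."""
    return shaders * clock_mhz * flops_per_clock / 1000.0

# HD 6970: 1536 shaders running at the 880 MHz core clock
print(theoretical_gflops(1536, 880))   # ~2703 GFLOPS
# GTX 580: 512 CUDA cores running at the 1544 MHz hotclock
print(theoretical_gflops(512, 1544))   # ~1581 GFLOPS
```

This is why AMD's theoretical TFLOPS figure comes out much higher even though real-game performance is close: many lower-clocked shaders inflate the peak number, but utilization differs between the architectures.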
I'm not surprised at all, really. We know Nvidia pulls cheap tactics like this all the time; it's nothing new. You'd have to be delusional to deny something like that. Just look at Unigine and...
My guess is the 150W means max board power. To speculate, typical consumption would be less than that of a 550 Ti if they are the same speed.
How does it get P33520? A pair of 580s at stock clocks gets that same amount; there's no way it would get that at 600 MHz.
Good christ, what the hell is this, it's a freaking abomination
I'd feel embarrassed just by owning this
I never really expected it to be faster than 580 SLI, but at least it's practically 6970CF performance, which I am rather glad of.
http://www.anandtech.com/show/4209/amds-radeon-hd-6990-the-new-single-card-king
Damn you, ninja
There's a setting called "Reduce DVI frequency on high-resolution displays" in the digital flat panel properties in CCC. Tick it on.
Also, are you sure you are using a dual-link DVI cable? Check...
Nah, it's because half of the AMD employees got abducted by aliens and couldn't design a faster card in time. I think that's a more probable cause :rofl::ROTF:
Both of them seem unlikely. Northern Islands (Cayman) was originally supposed to be on 32nm, but after TSMC cancelled that node, AMD resorted to 40nm, which still seemed to work fine. It's hard...
I think 2012 is a typo; they probably mean 2011.
Here it says 2011....
If you're running mid 80s during Linpack then that's not an issue, because nothing other than LinX will stress the CPU to that temp, since it maxes out its TDP. It's normal to have LinX/IBT run at...
All 6950/6970 cards have that nick at the power connector; it's a cooler design flaw and every card has it, so don't worry. As for the dirt on the PCB, that happens all the time; I don't know what it is,...
A BSOD 124 on LGA775 platforms means Vcore 90% of the time, but it can sometimes be caused by VTT being too high or too low (I've tested this myself).
No, C1E/EIST is not on. I don't think it was 1.105, I think maybe 1.15 instead.
Yeah, I think FSB is probably the culprit here. It takes a lot of work to get these damn things to 500 FSB.
Well, my Q9550 had a VID of 1.105 (I think), but she needs 1.36v for 500x8 (4GHz), so VID doesn't always imply it will overclock well.
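The 500x8 figure is just FSB times multiplier; a trivial sketch of that arithmetic, using the Q9550's published stock settings (333 MHz FSB, 8.5x multiplier) for comparison:

```python
def cpu_freq_mhz(fsb_mhz, multiplier):
    """On LGA775 chips the core clock is simply FSB x multiplier."""
    return fsb_mhz * multiplier

print(cpu_freq_mhz(500, 8))    # 4000 MHz, i.e. the 4 GHz overclock
print(cpu_freq_mhz(333, 8.5))  # ~2830 MHz, the Q9550's stock 2.83 GHz
```

Dropping the multiplier to 8x and pushing the FSB to 500 MHz is what makes the board work so hard; the FSB, not the CPU, is usually the limit, which matches the earlier point about getting these chips to 500 FSB.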
Nice results, the cards seem to scale very well. :up:
Can you also test them at 880/1375? I'd like to compare the scaling of the cards to 6970x2 Xfire scores.
Has MSI Afterburner been updated yet, or has there been a new beta to allow voltage adjustment on these cards? I'm getting extremely impatient :confused:
No, I think
1920x1080
4xAA
16xAF
Normal tessellation
6950 and flash it to 6970
I can confirm this works on my XFX 6950 I recently purchased. It runs FurMark, 3DMark 11, and Unigine without any problems, so I think it's good.
Has it fixed it completely?
Maybe this is also the solution to my issue.