Funny the only time I see DilTech around in News section is at a GPU launch, either defending Nvidia or speaking negatively about ATI/AMD. ;)
When a large group constantly speaks negatively, and wrongly, about something, anyone impartial will end up looking biased as they correct those who are incorrect. I also correct those who say something wrong in the other direction, but generally those people are corrected well before I have a chance to say anything.
If you look around this forum, a lot of people here are practically a lynch mob against Nvidia, for whatever reason. That's not an opinion either, that's an actual fact. It actually got so bad that I was asked to stay out of the GPU threads entirely for quite a while, merely because correcting people made me LOOK completely biased even though my information was correct. It's the same reason most of the staff on this site stay out of GPU threads, as I'm sure you'll notice. It was funny to me, because at the time there was a shiny 4850 sitting in my gaming rig, yet people said I was biased against AMD.
Think about that for a second, and things will begin to make more sense to you. :up:
I generally stay out of CPU discussion unless there is something big on the horizon, because frankly, if you have a C2Q or better, most of what's out there isn't a big enough improvement to be worth your time. I mean, I'm still on a Q6600, and I haven't seen enough of an improvement over this CPU to warrant buying a new mobo, RAM, and CPU yet. Before that Q6600, I only used AMD chips during the P4 vs A64 era, and once the X2s came out I pretty much would NOT touch an Intel chip until the C2D showed up.
I go by what works best for what I want to do. If I see a glaring issue (like, say, DX9 performance on the GeForce FX series), I steer clear of it until it is proven to be a non-issue. If people aren't informed of that possible issue, I tend to point it out to them. At present, it appears AMD may still not be up to par on tessellation performance. This isn't a sure thing, but it appears to be a possibility, which is why I stated that only time will tell whether it's a driver or hardware issue.
I just can't believe that Cayman draws more juice than a GTX280. Where are all of the hot and power hungry complaints now?
Just in case no one gave you any links, Charlie sums it up rather nicely in this article:
http://www.semiaccurate.com/2010/12/...thern-islands/
It actually backs up everything SKYMTL said, just in more detail. And I really don't care about any bias toward the author; he actually admits in this article that he was wrong a lot and that he f*d up, but he explains why.
It's actually an interesting read... Quote:
"That explains Cayman, but why weren't Cozumel and Kauai shrunk too? Why does NNI/NI-40 have four members while NI-32 only have three? That one comes down to pricing. Even before the 32nm knifing, AMD ran the numbers, and realized that for a mid-range chip, the VLIW4 architecture, while completely doable on 32nm, would not be as cost effective as one made from selectively borrowing from NI-32, and backporting to 40. Cost per area equivalent bit TSMC, and cost them a lot of wafer starts.
Long before TSMC surprised the world about 32nm, or lack thereof, Cozumel and Kauai were killed and replaced by Barts, Turks and Caicos. In a nice bit of synergy, when 32nm was killed and the decision to backport Ibiza was made, a lot of the key pieces of the chip were already ported, or well into the process of being ported to 40nm. Give and ye shall receive."
I think everyone expected it this time to be honest. We all knew this was going to be AMD's hottest and most power hungry chip since the HD 2900xt.
That's what I was pointing out in my little summary. Initially people were mad because, for whatever reason, they expected GTX 580-like performance, even though people like SKYMTL were telling them that expecting that much would lead to their disappointment, thus my statement.
Also, 1920 and above was covered in that too. :up:
The poor showing in some titles may be due to tessellation power (HAWX 2, for example), but we won't know for sure until we see whether AMD can correct performance there in future drivers.
The 6950 2GB really is the sweet spot for Cayman, though.
Anyway, we shouldn't complain so much, because right now there's great competition in every price segment, which hasn't been the case for a very long time, and prices have changed quite a lot for the better if you look back only a couple of months. :D
^ Yeah, I'm sure many others here see it like this; I know I do. I also know it will lead to unnecessary arguing/thread-crapping and possibly flame-baiting, etc., so I think it's just better to let the biased people think what they do; most don't care whether they are wrong or right anyway.
The 6950 is best in class in performance per watt, only beaten by the lowly 5450 in TechPowerUp's test, at least in 1680x1050 and up:
http://techpowerup.com/reviews/HIS/R...D_6950/30.html
It draws less power than the GTX 280 in all tests by 17-67w, except for BD playback, where it draws 6w more:
http://techpowerup.com/reviews/HIS/R...D_6950/27.html
The 6970 is quite a bit less efficient though, drawing 12w less in idle and up to 38w more under load:
http://techpowerup.com/reviews/HIS/R...D_6970/27.html
The GTX 280 is not in the performance-per-watt charts, but I think it is safe to say that it would be way below either Cayman: it draws only around 20% less power than the 6970 while performing far worse.
I think, all in all, that the 6950 is a pretty solidly performing card, priced at around 80% of the 6970 and GTX 570, which are priced equally locally.
Edit: I don't know where you read that the Cayman GPUs draw more power than a GTX 280. I just checked the results in a second review I had read earlier (AnandTech), and there the GTX 285 (which is even better than the GTX 280 in this regard) draws more power than both the 6950/70 in all tests:
http://www.anandtech.com/show/4061/a...eon-hd-6950/24
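Just to show the arithmetic behind that claim, here's a minimal sketch of how I'm eyeballing perf/watt; the relative-performance and load-power figures below are rough placeholders of mine, not TechPowerUp's or AnandTech's measurements:
Code:
# Perf-per-watt sanity check. All numbers are illustrative assumptions,
# not measured data from any review.
cards = {
    # name: (performance relative to the HD 6950, load power in watts)
    "HD 6950": (1.00, 200),
    "HD 6970": (1.15, 240),
    "GTX 280": (0.60, 230),
}

base_perf, base_watts = cards["HD 6950"]
base_ppw = base_perf / base_watts
for name, (perf, watts) in cards.items():
    print(f"{name}: {(perf / watts) / base_ppw:.2f}x the 6950's perf/watt")
Even granting the GTX 280 roughly 20% less power draw than the 6970, its much lower performance leaves its perf/watt far behind either Cayman, which was my point.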
GTX280 is also two and a half years old.
This I can agree with. The 6950 looks like a hell of a card. The 6970, on the other hand: I can't see one reason to buy one over a GTX 570. Quote:
I think, all in all, that the 6950 is a pretty solidly performing card, priced around 80% of the 6970 and GTX 570 which are priced equally locally.
I know, you were the one saying it draws less power than Cayman; I was just trying to say that the 6950/70 are still pretty efficient after all. The 6970 is also beating all GTX 400/500 series cards in performance per watt in 1680x1050 and up, except the GTX 460. Also notice my edit of the last post.
I don't know; it's a little more efficient, priced about equally here, and it has more memory than the 570. It really depends on what you would use them for.
I agree, there is no one reason to buy a 6970 over a GTX570. Similarly, there is no one reason to buy a GTX570 over a 6970. They are about evenly matched, so it comes down to individual usage/preference.
If you are running high resolution, multiple cards, and/or multiple monitors then a 6970 might be the card to get. If you run lower resolution, want to save a few bucks, want physx, or play games with very high tessellation then a GTX570 might be what you want to get instead. These "one point to rule them all" arguments might sound nice, but they rarely mirror how we actually use cards in our daily lives.
Are there other reviews like Rage3D's? They added power-usage measurements for multi-monitor setups. For a single monitor the 6950/70 draw 10 watts less, but the GTX 570/580 draw 22 watts less with multiple monitors. That's hurting the 6950/70 in my eyes (you can guess which way I'm leaning for a monitor upgrade). I'd be interested in hearing whether that power usage could be tweaked down.
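To put that 22-watt multi-monitor gap in perspective, a quick back-of-the-envelope sketch; the hours per day and electricity price are my assumptions, not figures from the Rage3D review:
Code:
# What a 22 W idle-power gap adds up to over a year.
# Only the 22 W figure is from Rage3D; the rest are assumptions.
watts_gap = 22        # GTX 570/580 draw ~22 W less at multi-monitor idle
hours_per_day = 8     # assumed time spent at the desktop each day
price_per_kwh = 0.12  # assumed electricity price in USD/kWh

kwh_per_year = watts_gap * hours_per_day * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * price_per_kwh:.2f}/year")
Small in dollar terms, but it's a constant idle draw, which is why it weighs on me more than the load numbers do.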
It depends on which monitor upgrade you're talking about: if it's for use with Eyefinity or Surround (3 monitors), you need 2 GPUs in SLI on the Nvidia side.
As for media and multi-monitor (plus media playback), I'd ask whether they were able to enable the same video-processing features on the Nvidia cards as on the AMD ones (deblocking, noise reduction, mosquito-noise removal, dynamic contrast, 0-255 range, brilliant white, flesh tone, etc.). Those features make a big difference in GPU usage when video playback is GPU-accelerated (VLC, Flash Player, CyberLink, MPC-HC).
Remember that the HD 6000 series had to stay on 40nm, just like the HD 5000 series.
So I guess AMD was unable to break the laws of physics and produce a chip with more transistors/performance while also decreasing the power draw.
But hey, HD 6000 still leads the performance/watt charts, so you shouldn't be complaining. :shrug:
The HD 6870 maybe leads in perf per watt, but the 6900 series is just a TINY bit better than the GTX 5XX's perf per watt.
Looks like both GPU manufacturers needed to up the power consumption to get the extra DX11 performance.
Looking at things now, AMD and Nvidia have extremely comparable GPUs: both offer strong DX11 performance, are priced similarly, use almost the same amount of power, and have half-decent stock coolers... It really comes down to whether you want a larger frame buffer (HD 6970) or PhysX and genuinely useful GPGPU support (GTX 570).
What I am disappointed with is the fact that we never really got the price war we were hoping for...
+1
There is a large group of people talking negatively about AMD, including you. Are they all impartial and unbiased? One more time: including you? I doubt it.
Sure, it is your opinion, but since when is your opinion a fact?
Just maybe some people have good reason to be unhappy with some of Nvidia's tactics, some of which don't seem to bother you at all.
Dear Mr. DilTech,
Although we all appreciate any effort to correct facts and obvious mistakes on our behalf, I guess that most of us already have a mother and/or a father and are in no need of another one. Anyway, I'm sorry to hear about your experience with us, and I do hope that we will behave in a more restrained/correct manner in the time to come.
At the same time I feel the need to state the obvious, as there are, in my view, some very good reasons why Nvidia has gotten its fair share of criticism over the last couple of years. What actually triggered this post was your wording "...for whatever reason...", and I started to wonder where you have been lately.
The disrespect that Nvidia has shown us, the buyers of their cards and software, over the last couple of years is, to the best of my knowledge, breathtaking and unprecedented for any "Fortune 500" company; I have actually never seen someone :banana::banana::banana::banana: their pants as many times and in as many ways as JHH has over the last couple of years.
I'm sure that you're aware of most of his :banana::banana::banana::banana:ty pants, but I believe that listing some of them will help us all:
1. The renaming of the renaming of the renaming of the renaming of the renaming of the renaming …
2. Woodscrew gate
3. The announcement of the announcement of the announcement of the announcement of the announcement…
4. "Business practices" worthy of an Italian Godfather combined with Jim Carrey's impersonation of someone who has completely lost their mind – big arse ego, Assassin's Creed, etc. come to mind +++.
5. “Can of whoop arse”…
6. For some more examples please have a look at Saaya’s signature… :-)
I honestly believe that one has to have a JHH mobile on vibrate stuck up one's arse not to be capable of grasping the above, or at least agreeing with it. However, there has been a change over the last few months, and I can only hope that the Board of Directors, the CEO and the administration understand how this has hurt the company's reputation and credibility, and have changed their practices.
We are sick and tired of "whoop arses" and just want JHH to shut the :banana::banana::banana::banana: up and deliver the goods; no more talk, just walk the walk. Then, and only then, do I believe that Nvidia will be taken seriously and the lynch mob will disappear. I know for sure that's when I'll start considering buying their stuff again.
Rob
PS. I don't look at AMD/ATI as an innocent company, although they are called "the peasants' company" in the Far East due to their lack of market professionalism. They have, however, kept their mouths shut and delivered to the best of their capabilities. Merry Xmas to you and your family.