
Thread: AMD Radeon HD6950/6970(Cayman) Reviews

  1. #276
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Found something interesting...
    Quote Originally Posted by Anandtech
    Finally, the 6800 series also introduced support for HDMI 1.4a and support for color correction in linear space. HDMI 1.4a support is fairly straightforward: the 6000 series can drive 3D televisions in either the 1080p24 or 720p60 3D modes. Meanwhile support for color correction in linear space allows AMD to offer accurate color correction for wide gamut monitors; previously there was a loss of accuracy as color correction had to be applied in the gamma color space, which is only meant for use for display purposes. This is particularly important for integrating wide gamut monitors in to traditional gamut workflows, as sRGB is misinterpreted on a wide gamut monitor without color correction.
So the 5xxx series handled wide-gamut monitors incorrectly, and now it's fixed? Or does this only apply to monitors connected via HDMI, while those connected via DVI have no gamut issues with either card?
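As a rough illustration of what "color correction in linear space" means here (a minimal sketch; the 2.2 gamma and the 3x3 gamut matrix below are placeholder values, not AMD's actual ones):

```python
# Toy illustration of gamma-space vs linear-space color correction.
# The 2.2 gamma and the 3x3 gamut matrix are placeholders, not AMD's values.

GAMMA = 2.2

# Hypothetical sRGB -> wide-gamut correction matrix (illustrative only).
M = [
    [0.70, 0.25, 0.05],
    [0.05, 0.90, 0.05],
    [0.02, 0.08, 0.90],
]

def apply_matrix(rgb):
    return tuple(sum(M[i][j] * rgb[j] for j in range(3)) for i in range(3))

def correct_in_gamma_space(rgb):
    # What the 5000 series reportedly did: matrix applied directly to
    # gamma-encoded values.
    return apply_matrix(rgb)

def correct_in_linear_space(rgb):
    # What the 6000 series adds: decode to linear light, correct, re-encode.
    linear = tuple(c ** GAMMA for c in rgb)
    corrected = apply_matrix(linear)
    return tuple(c ** (1.0 / GAMMA) for c in corrected)

if __name__ == "__main__":
    pixel = (0.5, 0.5, 0.5)  # mid-grey, gamma-encoded
    print("gamma-space :", correct_in_gamma_space(pixel))
    print("linear-space:", correct_in_linear_space(pixel))
    # The two results differ; that mismatch is the accuracy loss Anandtech describes.
```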
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  2. #277
    Xtreme Member
    Join Date
    Sep 2010
    Posts
    139
    Quote Originally Posted by SKYMTL View Post
    Regarding efficiency (or lack thereof), I just want to make it clear that we do everything possible to exclude outside variables from our power consumption test.

    The power supply is plugged in to a Tripp-Lite line conditioner that regulates input voltage to a constant 120V. If this isn't done, input voltage WILL impact the consumption of the system and let's be honest; no one likely lives in a house with constant live voltage.

    In addition, the 3DMark test we use puts a very minimal load on the CPU. This is important since the CPU's fluctuating load patterns in most of the apps out there can really mess up results.

    I am also surprised to see quite a few sites still using FurMark, etc to test power consumption. AMD admits to that program being capped by PowerTune.
But any program that exceeds a certain power draw will be capped on the 69xx series. So either you tested with PowerTune set to +20% or something else is skewed in your test.

  3. #278
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Xope_Poquar View Post
    Are there ANY reviews with the Catalyst 10.12 drivers??
    Quote Originally Posted by z3d. View Post
Well, considering that most reviews used the 10.10 or 10.11 drivers, don't you think the real performance will only be clear when AMD releases 10.12? I mean, the big Cayman could run nearly as fast as the GTX 580 with a 5-10% driver increase!
Most people seem to be dreaming in technicolor here. If you think that AMD released drivers to reviewers that don't deliver full performance, you are sadly mistaken. These hopes and dreams crop up with EVERY AMD launch and they NEVER come to pass.

    The drivers given to the press (both 10.11 and 10.12 RC2) are "full speed" drivers, make no mistake about that.

    Having used both, I can tell you that they are identical in performance other than a small increase in 3DMark 11 scores and a patch that fixes some random crashes I was having in F1 2010.

    For all of you harping on the drivers used, I think that is in poor form simply because there won't be one lick of difference between the two packages in terms of overall single GPU in-game performance.


    Quote Originally Posted by Nintendork View Post
In the end, the one to blame for these "new generations" should be TSMC.
I want to address this right away because blaming TSMC doesn't hold up once you understand the situation.

As I said in my article, AMD had already begun taping out their cards on 32nm when it was realized that the power and space savings were minimal while the pricing was too high. So AMD decided to use the existing 40nm process for their entry-level and mid-range cards. That left Ibiza alone at 32nm, and since the volume from the lower-end cards would no longer be there, TSMC decided to cancel the 32nm node altogether. Ibiza then had to be ported to 40nm and eventually became Cayman.

    So, it wasn't TSMC's fault but rather a combination of economic decisions by both parties which sunk 32nm.

    Quote Originally Posted by tajoh111 View Post
I can't believe a lot of you guys were slamming SKYMTL last week for trying to temper your expectations and telling you not to expect so much. He was probably the closest to giving accurate performance figures for the 6970, and he was under NDA.
    I tried.

  4. #279
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by flyck View Post
    But every program that will exceed a certain power will be capped on the 69xx series. So either you tested with powertune set on +20% or something else is scewed in your test.
Nope. PowerTune only caps when the inferred TDP reaches a certain point. If a program doesn't push the core to that point, PT won't kick in.

It is completely possible that the program I use pushes TDP closer to the PT barrier than what Anand and some other sites use.
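To make that behaviour concrete, here is a minimal sketch of a PowerTune-style limiter; every number and the power model below are made up for illustration and are not AMD's actual algorithm:

```python
# Minimal sketch of a PowerTune-style limiter: clocks are only pulled back when
# the *estimated* board power would exceed the configured cap. Every number
# below is illustrative; this is not AMD's actual model or algorithm.

POWER_CAP_W = 200.0       # e.g. the HD 6950 PowerTune maximum quoted later in the thread
BASE_CLOCK_MHZ = 800.0

def estimated_power(load_fraction, clock_mhz):
    """Toy activity-based power estimate (stand-in for the GPU's internal one)."""
    return 30.0 + 220.0 * load_fraction * (clock_mhz / BASE_CLOCK_MHZ)

def powertune_clock(load_fraction):
    clock = BASE_CLOCK_MHZ
    # Step the core clock down only while the estimate sits above the cap.
    while estimated_power(load_fraction, clock) > POWER_CAP_W and clock > 100.0:
        clock -= 10.0
    return clock

for load, label in [(0.7, "3DMark-like load"), (1.0, "FurMark-like power virus")]:
    clk = powertune_clock(load)
    print(f"{label}: {clk:.0f} MHz, ~{estimated_power(load, clk):.0f} W")
# The lighter load never reaches the cap, so it runs at full clocks; only the
# power-virus workload gets throttled -- which is the behaviour described above.
```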

  5. #280
    Xtreme Enthusiast Kai Robinson's Avatar
    Join Date
    Oct 2007
    Location
    East Sussex
    Posts
    831
Seriously - did no one READ what I wrote? The only reason these cards are disappointing is that you CREATED all the hype, with rampant, pointless speculation, and expected a giant-killer - which is NOT what AMD has been trying to make. They wanted Cayman on a smaller node, TSMC let them down, and they had to re-configure for 40nm - what about that DON'T you get? Sometimes I swear you people ask for too much.

    Main Rig

    Intel Core i7-2600K (SLB8W, E0 Stepping) @ 4.6Ghz (4.6x100), Corsair H80i AIO Cooler
    MSI Z77A GD-65 Gaming (MS-7551), v25 BIOS
    Kingston HyperX 16GB (2x8GB) PC3-19200 Kit (HX24C11BRK2/16-OC) @ 1.5v, 11-13-13-30 Timings (1:8 Ratio)
    8GB MSI Radeon R9 390X (1080 Mhz Core, 6000 Mhz Memory)
    NZXT H440 Case with NZXT Hue+ Installed
    24" Dell U2412HM (1920x1200, e-IPS panel)
    1 x 500GB Samsung 850 EVO (Boot & Install)
    1 x 2Tb Hitachi 7K2000 in External Enclosure (Scratch Disk)


    Entertainment Setup

    Samsung Series 6 37" 1080p TV
    Gigabyte GA-J1800N-D2H based media PC, Mini ITX Case, Blu-Ray Drive
    Netgear ReadyNAS104 w/4x2TB Toshiba DTACA200's for 5.8TB Volume size

    I refuse to participate in any debate with creationists because doing so would give them the "oxygen of respectability" that they want.
    Creationists don't mind being beaten in an argument. What matters to them is that I give them recognition by bothering to argue with them in public.

  6. #281
    Registered User
    Join Date
    Nov 2006
    Location
    Earth
    Posts
    55
    Wooww.... Cayman... all the way...
    Two Caymans... oh My God....
    Two Caymans All the way.... So beautiful....So Powerful.....
    wWwooooohhhoo..hoo..hooo.....
    O.. my godd... Oh.. my goood..... Whoooooo.......
    What does this mean? OOoooohhh.... Ooooohhhhh......

    Sooo Powerful..... so beautifullllll..... Like double Rainbowww....

    Oooooohhhh..... OOoohhhhhhh....... oooh.. my gooddd.....

    Two Caymans.... oooohhh...... ooooooohhh.....

    /JK



Performance-wise, Cayman is a failure.

    AMD is getting HATTONed this round.

  7. #282
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,412
    Holy . What were you guys expecting AMD to do being stuck on 40nm? They clearly weren't going to create a 300w monster as that would go against what they've been doing since the HD3870. The HD6970 looks to me like a test of the architecture while they wait for the 28nm node to mature for mass production. On top of everything it isn't like the HD6970 is a GTX 480. It is a minor improvement on the HD5870 and it paves the way for a killer 28nm core.

    Also, it is slightly faster than the GTX570 with a little over half a gig more memory and priced slightly higher in what I feel is the proper performance/price slot. What do you want, AMD to bleed money so you can get your high-end GPUs for $200?

I'm so disappointed in you guys, not AMD. AMD kept their lips sealed, you guys worked yourselves into a frenzy, and then turned on AMD when they didn't deliver what you had hyped yourselves into expecting.

Just so we are ready for the HD6990... "omg it comes with 2560 SPs, 128 TMUs, two men/women (depending on the bundle you purchase) to wait on you hand and foot, and it consumes negative 400 watts!"

    I will agree though that this naming scheme is horrible. What a trainwreck that is.

  8. #283
    Xtreme Member
    Join Date
    Sep 2010
    Posts
    139
    Quote Originally Posted by SKYMTL View Post
Nope. PowerTune only caps when the inferred TDP reaches a certain point. If a program doesn't push the core to that point, PT won't kick in.

It is completely possible that the program I use pushes TDP closer to the PT barrier than what Anand and some other sites use.
That is what I'm saying. But your results indicate a 150W difference between idle and load for the 6950, which has a power cap of 140W. Add the idle draw and you are getting close to 200W for the 6950, which is the +20% on PowerTune. So either you set it at +20% or it is not kicking in when it should be.

Or the CPU is loaded more than you expect. But that won't change the strange difference between the 5850 and 6950. If the 6950 were capped at 140W, that would mean the 5850 uses 90W during 3DMark, which seems a bit optimistic.

Note that most numbers look correct when you compare idle versus load for the same card. I do, however, see something unexpected for the 6950 and the 58xx/68xx: the difference is too big while their power consumption should be in the same range.
    Last edited by flyck; 12-15-2010 at 07:26 AM.

  9. #284
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
6950 CrossFire beats 580 SLI at 5760x1080, so I'm not sure why people think Nvidia is faster.
    http://www.hardwareheaven.com/review...ty-vs-sli.html

So for me, playing at that resolution, I save half the cost by choosing the AMD 6950 - two AMD cards against one GTX 580, if those were even in stock. How is it cost effective to buy two Nvidia cards for more money and worse performance? Twice the failure, wouldn't you think?
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  10. #285
    Xtreme Member
    Join Date
    May 2009
    Location
    Italy
    Posts
    328
    A good review from Xbitlabs:
    http://www.xbitlabs.com/articles/vid...70-hd6950.html


Impressive: HD6970 vs HD6950 at the same clock (charts)

vs GTX570 (charts)

Power consumption (charts)

Other power consumption tests (HD6970 = GTX570):
    http://www.computerbase.de/artikel/g...stungsaufnahme

    http://www.hardware.fr/articles/813-...ossfire-x.html
    Last edited by Gilgamesh; 12-15-2010 at 07:27 AM.

  11. #286
    Xtreme Member
    Join Date
    Oct 2005
    Posts
    354
Battlefield: Bad Company 2 @ 1920x1200, a 6fps advantage over a 5870 - you have got to be kidding me.

I am really disappointed.
    i7 920 @ 4.0GHz
    Scythe MUGEN-2 with Push/Pull
    Gigabyte EX58 UD5
    3X2GB G-Skill DDR3
    Sapphire 5870 1GB Vapor-X
    OCZ Agility 120GB
    24" Acer HDMI LCD
    Corsair TX850
    Lian-Li PC-V1000

  12. #287
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    Quote Originally Posted by flopper View Post
    6950 crossfire beats the 580 sli in 5760x1080 resolution, so not sure why people think Nvidia is faster?
    That seems weird, really weird. I mean everybody has Crossfire or SLI these days. Not to mention three monitors.

  13. #288
    Xtreme Member
    Join Date
    Oct 2009
    Posts
    241
And that's only playing F1; we already knew the 6970 does very well in that game. Now try playing more games.
    .:. Obsidian 750D .:. i7 5960X .:. EVGA Titan .:. G.SKILL Ripjaws DDR4 32GB .:. CORSAIR HX850i .:. Asus X99-DELUXE .:. Crucial M4 SSD 512GB .:.

  14. #289
    Xtreme Member
    Join Date
    Aug 2007
    Location
    Montenegro
    Posts
    333
According to Anandtech, the 6970 goes head to head with the GTX 480... I think this is pretty impressive. I don't know why some of you folks in here find this card to be an under-achiever; it was intended to compete against the 480 and so it does. And remember, in CF mode these cards are very powerful. What did you expect to see from these cards anyway? They are still on 40nm technology. Blame the semis for the delay, of course.
    Internet will save the World.

    Foxconn MARS
    Q9650@3.8Ghz
    Gskill 4Gb-1066 DDR2
    EVGA GeForce GTX 560 Ti - 448/C Classified Ultra
    WD 1T Black
    Theramlright Extreme 120
    CORSAIR 650HX

    BenQ FP241W Black 24" 6ms
    Win 7 Ultimate x64

  15. #290
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Quote Originally Posted by Vardant View Post
    That seems weird, really weird. I mean everybody has Crossfire or SLI these days. Not to mention three monitors.
I don't care about your setup. I care about value for my money. There the 6950s kick the GTX 580s.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  16. #291
    Xtreme Member
    Join Date
    Oct 2009
    Posts
    247
// Just woke up... All reviews from the topic and PMs have been added. Special thanks to the PM guys, it makes adding easier :P
I've also started adding video reviews for those who hate reading a bazillion pages... like me xD
    Game Rig:
    Intel Core i7 920 (3.0Ghz) || EVGA X58 Classified (E760)|| 3x2 Gb A-DATA 1333Mhz Triple Channel + 3x2 Gb Patriot 1333Mhz Triple Channel || WD500GB + WD750GB + Hitachi 1TB || PowerColor Ati Radeon 5850 1024MB GDDR5 CrossFireX|| Chieftec 1020W || Acer 24" P243 (1920 x 1200) || Razer Copperhead Blue || Microsoft Reclusa || SteelSeries Seberia 7.1 || CoolerMaster CosmoS

    Water cooling:
    WC HeatKiller 3.0 || 2x 120mm Koolance || Koolance RP-980BK || Koolance nozzles

  17. #292
    Xtreme Member
    Join Date
    Sep 2010
    Posts
    139
    Quote Originally Posted by Gilgamesh View Post
    A good review from Xbitlabs:
    http://www.xbitlabs.com/articles/vid...70-hd6950.html


    Impressive HD6970 vs HD6950 same clock
Basically, 10% more shading power results in a 1-2% performance increase? And those 10% extra shaders apparently cause an 80W difference??

    Why doesn't the math work out?
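For what it's worth, here is a back-of-envelope Amdahl-style model of how that can happen; the shader-bound fractions are guesses, and only the stream-processor counts (1536 vs 1408) are real figures:

```python
# Back-of-envelope model of why ~10% more shaders can show up as only ~1-2% FPS.
# The "shader-bound fraction" values are made-up assumptions, not measurements.

def speedup(shader_bound_fraction, shader_ratio):
    """Overall speedup when only part of the frame time scales with shader count."""
    return 1.0 / ((1.0 - shader_bound_fraction) +
                  shader_bound_fraction / shader_ratio)

shader_ratio = 1536 / 1408   # HD 6970 vs HD 6950 stream processors (~9% more)

for frac in (0.2, 0.5, 0.8):
    gain = (speedup(frac, shader_ratio) - 1.0) * 100.0
    print(f"{frac:.0%} of the frame shader-bound -> {gain:.1f}% faster overall")
# If most of the frame is limited elsewhere (setup, ROPs, memory bandwidth),
# the extra SIMDs barely move average FPS, even though they still cost power.
```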

  18. #293
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    Quote Originally Posted by flopper View Post
    there 6950 kick 580 gtx.
    I'm not disputing that. But your wording was, hmmm, let's say unfortunate.

The point is, it's probably one of the rarest cases you can think of. So asking why everybody else sees things in a different light could be considered trolling.

  19. #294
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by flyck View Post
That is what I'm saying. But your results indicate a 150W difference between idle and load for the 6950, which has a power cap of 140W. Add the idle draw and you are getting close to 200W for the 6950, which is the +20% on PowerTune. So either you set it at +20% or it is not kicking in when it should be.
    You have your numbers wrong.

    HD 6950 PowerTune Max = 200W

    "Typical" Gaming Power (whatever that is...) = 140W

So yeah, I AM getting close to 200W, which is the upper limit of PowerTune and basically justifies the methodology.
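Purely as a sanity check on the arithmetic being traded here, using only the numbers from this exchange plus one guessed idle figure (a sketch, not a measurement):

```python
# Quick arithmetic on the figures in this exchange. The idle GPU draw is a
# guessed placeholder; 200 W is the PowerTune maximum quoted above, and the
# whole idle-to-load delta is treated as GPU draw, as in the earlier post.

powertune_max_w = 200.0                       # HD 6950 default PowerTune limit
powertune_plus_20_w = powertune_max_w * 1.2   # with the slider pushed to +20%

idle_to_load_delta_w = 150.0                  # system-level delta under discussion
assumed_idle_gpu_w = 25.0                     # assumption: rough idle draw of the card

approx_load_gpu_w = idle_to_load_delta_w + assumed_idle_gpu_w

print(f"Approximate GPU load power : ~{approx_load_gpu_w:.0f} W")
print(f"Default PowerTune limit    : {powertune_max_w:.0f} W")
print(f"Limit with slider at +20%  : {powertune_plus_20_w:.0f} W")
# ~175 W lands close to, but still under, the 200 W default limit, so the test
# can stress the card without PowerTune ever stepping in.
```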

  20. #295
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by flyck View Post
Basically, 10% more shading power results in a 1-2% performance increase? And those 10% extra shaders apparently cause an 80W difference??

    Why doesn't the math work out?
    Adding geometry horsepower for high-level DX11 rendering adds die size and increases TDP.

  21. #296
    Xtreme Member
    Join Date
    Sep 2010
    Posts
    139
    Quote Originally Posted by SKYMTL View Post
    Adding geometry horsepower for high-level DX11 rendering adds die size and increases TDP.
Considering the 6950 and 6970 have the same die, the same tessellation hardware, and only minor differences such as fewer shader clusters, I don't understand your comment :-)

But a 6950 run at 870MHz with a slightly higher memory clock than the 6970 (so slightly lower core clock, a bit more memory bandwidth, 10% fewer shaders, etc.) showing a performance difference of only 1-2% and an average power difference of 80W (between the 6970 and the overclocked 6950) makes me wonder what is wrong.

    You have your numbers wrong.

    HD 6950 PowerTune Max = 200W

    "Typical" Gaming Power (whatever that is...) = 140W

So yeah, I AM getting close to 200W, which is the upper limit of PowerTune and basically justifies the methodology.
    If that is the case then yes.

  22. #297
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Posts
    970
    Looks like an awesome card. A few fps here and there are meaningless, especially since AMD targeted these price points directly.
Pretty good engineering IMO. Also, PowerTune looks like killer technology. Imagine the implications for mobile (a la Fusion, Mobility 6900) and Antilles. Like Anandtech says, "this is just the beginning for Cayman."

  23. #298
    Xtreme Enthusiast
    Join Date
    Jan 2010
    Posts
    533
    They didn't. There were last minute price drops. GTX 570/580 surprised them quite a bit.

  24. #299
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by flyck View Post
Considering the 6950 and 6970 have the same die, the same tessellation hardware, and only minor differences such as fewer shader clusters, I don't understand your comment :-)
Ah. Well, I am not sure what reviews you have been reading, but here is the rundown.

The move from VLIW5 to VLIW4 did save ALU space on the die, but AMD didn't just use this to shrink the core. They took the "saved" space and expanded upon it by adding additional SIMD engines. Those engines, with their associated TMUs, cache, etc., naturally take up more die area than the reduction to VLIW4 freed up.

The graphics engine itself (which contains the fixed-function units like the tessellator) was cloned, so instead of one there are now two. This also adds to the transistor count.

Finally, there are the GPGPU compute changes, which necessitated adding another direct memory access engine along with some other bits.

All of this of course added to the transistor count and thus increased overall TDP.

  25. #300
    Xtreme Member
    Join Date
    May 2009
    Location
    Italy
    Posts
    328
    Quote Originally Posted by flyck View Post
Basically, 10% more shading power results in a 1-2% performance increase? And those 10% extra shaders apparently cause an 80W difference??

    Why doesn't the math work out?
That math didn't work out for the HD5850 vs HD5870 either, a year ago (chart):

3% for 320-5D vs 288-5D.

And power consumption (chart):
    Last edited by Gilgamesh; 12-15-2010 at 08:29 AM.
