
Thread: AMD Cayman info (or rumor)

  1. #3176
    Xtreme Mentor
    Join Date
    Nov 2005
    Location
    Devon
    Posts
    3,437
    Quote Originally Posted by [XC] hipno650 View Post
    I will wait for the 15th so we can figure out exactly what this whole PowerTune AM/FM-radio-switch power-saving maybe-turbo-boost stuff is, because for AMD to set the card to a TDP and raise or lower clocks to stay within that limit would be nothing short of pure stupidity, considering most people who buy this card have no interest in OCing and know little more than how to plug in a stick of RAM, let alone do the research to find out that to get the most performance out of their brand-new $400+ video card they need to hit some stupid switch. I appreciate AMD trying to make the card as efficient as possible, but at a cost to performance like that it's way too far over the line...
    So why aren't you complaining about the GTX580 throttling in Furmark, then? Apparently it limits performance in that excellent game by quite a margin.

    Look at this feature from a different angle -> Perlin Noise from Vantage takes the HD5870 almost as high as Furmark does. This new PowerTune will most likely affect game performance only very slightly, in very specific conditions, or not at all. I have not come across a game that puts a load on the GPU comparable to Perlin Noise or Furmark (be assured I tested quite a lot of them when my HD5870 was fresh).
    The good thing about this tech is that it is microcontroller based and not dependent on profiles.
    I can set my card's power to -20%, and when playing older/less demanding games the card will take care of staying at a reasonable power draw while still delivering outstanding performance.
    Imagine you have card X with a sustained gaming power of 200W and card Y with a variable power of 100-200W. Now you start Quake III TA with no vSync so you can frag better. Card X will draw 200W and give you 2500FPS; card Y might also reach 2500FPS at 200W, but you would rather save 100W and be happy with 1500FPS. The side effect of that power saving will of course be lower noise as well.
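
    In pseudo-code terms, what I imagine the microcontroller doing is a simple feedback loop. A toy Python sketch (my guess at the logic, not AMD's actual firmware, and all the constants are invented):

    # Toy PowerTune-style governor -- an illustrative guess, not AMD's
    # real algorithm; step, floor, and cap values are made up.
    def governor_step(power_w, cap_w, clock_mhz,
                      base_mhz=880, step_mhz=10, floor_mhz=500):
        if power_w > cap_w:
            # Over the cap: shed clocks until the power estimate drops below it.
            return max(clock_mhz - step_mhz, floor_mhz)
        if clock_mhz < base_mhz:
            # Headroom again: climb back toward the stock clock.
            return min(clock_mhz + step_mhz, base_mhz)
        return clock_mhz

    # A -20% slider setting simply lowers the cap the loop enforces:
    cap_w = 190 * (1 - 0.20)   # 152W on a hypothetical 190W TDP

    In a light game the loop never engages; in a Furmark-class load it trades a few MHz for a hard power ceiling.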

    So instead of spinning new power features as a bad thing, try to look at them from a different angle.
    Last edited by Lightman; 12-13-2010 at 02:19 PM.
    RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SSD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W

    RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingston HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU

    SmartPhone Samsung Galaxy S7 EDGE
    XBONE paired with 55'' Samsung LED 3D TV

  2. #3177
    Xtreme Member
    Join Date
    Oct 2010
    Location
    192.168.1.1
    Posts
    221
    Which makes sense again... like I said, the earlier bench results that put the card at 5870-like levels could be because of the default TDP limit...

    Once the limit was increased to 250W (or whatever), we got the real performance of the card, which is slightly below the GTX 580.

    Although, secretly, I am wishing that the current scores we have ARE with the default TDP limit, and that if we remove the limit, the GTX 580's butt will be kicked... that'd be:

    1. Awesome
    2. My next card

  3. #3178
    Xtreme Addict
    Join Date
    Dec 2007
    Location
    Hungary (EU)
    Posts
    1,376
    Quote Originally Posted by SpuTnicK View Post
    Here is the Nvidia dual-GPU Monster to kick the AMD HD 6990's butt:
    That is Galaxy's dual GTX 470.
    -

  4. #3179
    I am Xtreme
    Join Date
    Oct 2004
    Location
    U.S.A.
    Posts
    4,743
    Quote Originally Posted by hurrdurr View Post
    Which makes sense again... like I said, the earlier bench results that put the card at 5870-like levels could be because of the default TDP limit...

    Once the limit was increased to 250W (or whatever), we got the real performance of the card, which is slightly below the GTX 580.

    Although, secretly, I am wishing that the current scores we have ARE with the default TDP limit, and that if we remove the limit, the GTX 580's butt will be kicked... that'd be:

    1. Awesome
    2. My next card


    That is my hope; the last thing we need is ATI not competing with Nvidia, leaving us stuck at last year's graphics speeds.


    Asus Z9PE-D8 WS with 64GB of registered ECC ram.|Dell 30" LCD 3008wfp:7970 video card

    LSI series raid controller
    SSDs: Crucial C300 256GB
    Standard drives: Seagate ST32000641AS & WD 1TB black
    OSes: Linux and Windows x64

  5. #3180
    Xtreme Enthusiast
    Join Date
    Jun 2008
    Posts
    660
    Quote Originally Posted by Oliverda View Post
    Thanks
    An unfortunate person is one who tries to fart but sh1ts instead...

    My Water Cooling Case Build (closed)

  6. #3181
    Registered User
    Join Date
    May 2009
    Posts
    22
    Quote Originally Posted by [XC] hipno650 View Post
    [...] considering most people who buy this card have no interest in OCing and know little more than how to plug in a stick of RAM, let alone do the research to find out that to get the most performance out of their brand-new $400+ video card they need to hit some stupid switch.
    As someone whose interest in this thread is pure tech curiosity* and the inner world of graphics whores -- a breed that would confound an international symposium of psychiatry -- I just love the image of you all tossing and turning through sleepless nights, fretting vicariously on behalf of the Neanderthal Sixpack. Oh Noes!!11, NVIDIA have re-branded again... Oh Noes!!11, there's a switch on the card. Think of the children (sixpack), ban this sick filth now!

    Fortunately windows aren't taxed these days because glass is less transparent.

    Anyway, do carry on... not you specifically, the breed. /research

    [*] my pc won't even run Aero

  7. #3182
    Xtreme Enthusiast
    Join Date
    Jun 2010
    Posts
    588
    Here is the Nvidia dual-GPU Monster to kick the AMD HD 6990's butt:
    Yo boy..... no, no, no, this is not an Nvidia card. It's from GALAXY, and it doesn't use the GF110 core; it uses GF100 (2x GTX 470).

    Look here: [image of the Galaxy dual GTX 470]

    And look here, this is the Nvidia dual-GPU (GF110): [image]
  8. #3183
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Both are 2x 8-pin, so they won't go retail in that config. They're probably ES cards, or I guess in Galaxy's case, a "special edition" card.

  9. #3184
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    I'd prefer it if I could make it a TDP floor.

  10. #3185
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    Quote Originally Posted by hurrdurr View Post
    Which makes sense again... like I said, the earlier bench results that put the card at 5870-like levels could be because of the default TDP limit...

    Once the limit was increased to 250W (or whatever), we got the real performance of the card, which is slightly below the GTX 580.

    Although, secretly, I am wishing that the current scores we have ARE with the default TDP limit, and that if we remove the limit, the GTX 580's butt will be kicked... that'd be:

    1. Awesome
    2. My next card
    I doubt AMD would set the TDP throttle at a level that would hinder game performance at default clocks. That's the performance we get at 880MHz, and that's it. If you want more, you'll have to overclock and raise the throttling limit toward 250W. Some people seem to have gotten the idea that raising the limit is somehow different from normal overclocking, but it isn't. The benefit of having a slider that goes to 250W is that it might be higher than what AMD would otherwise set as the max TDP. Of course, there's always the risk of bricking your card, too.
    "No, you'll warrant no villain's exposition from me."

  11. #3186
    Xtreme Member
    Join Date
    Sep 2010
    Posts
    139
    Quote Originally Posted by hurrdurr View Post
    Which makes sense again... like I said, the earlier bench results that put the card at 5870-like levels could be because of the default TDP limit...

    Once the limit was increased to 250W (or whatever), we got the real performance of the card, which is slightly below the GTX 580.

    Although, secretly, I am wishing that the current scores we have ARE with the default TDP limit, and that if we remove the limit, the GTX 580's butt will be kicked... that'd be:

    1. Awesome
    2. My next card

    Well, the latest results, where the 6970 beat the GTX 570 by 25%, were apparently with the 190W cap... we still have to wait 36 hours or so to know for sure.

  12. #3187
    Xtreme Cruncher
    Join Date
    Apr 2006
    Posts
    3,012
    Quote Originally Posted by zshadow View Post
    Most people who buy a $400 card have no interest in OC'ing?

    I'd say it's the other way around..

    Also, the switch seems to be some sort of BIOS reset mechanism. I don't think it's related.
    Not at XS, maybe, but you need to think about the majority of the market here. Of all the people I have ever met who call themselves computer enthusiasts, fully build their own systems, and have a decent amount of knowledge about computers and hardware, only two besides myself overclock their system's CPU, let alone their GPU. Most of the time I end up doing all the overclocking for them anyway, because they don't have the knowledge or the interest. The same applies to the majority of guys who work in the local hardware shops: most of them buy parts, assemble them, and call it a day. So let's say that, overall, only 10% of the people I have met who would be in the market for a 6970 will overclock it (and that's being very generous). It's not like I lack exposure to this crowd, either; custom-built computers are my hobby and my job, so they take up a good amount of my life. There's more to the custom-build and gaming scene than what is on XS, though I would have a hard time calling anyone but XS "enthusiasts".

    @Lightman I guess it comes down to a difference in viewpoints. For example, I couldn't care less about using an extra 100 watts to play old games (I own a 480, after all, haha). I have also noticed, when monitoring my GPU usage in games old and even some newer ones, that it is very rare for me to be using 100% of my GPU while gaming; most older games only use 30% of my GPU, so my temps are much lower than when I am at 100%, and the same goes for my power usage anyway. The issue I have is that out of the box, when you need all the power your GPU has (playing Crysis, for example), your GPU could potentially be downclocking itself for the sake of power consumption, which raises the question of why you bought such a high-end GPU in the first place if demanding games will slow it down at stock settings. Now, granted, I doubt it will be that drastic, but we won't know until the 15th comes around.

    As to why I have not complained about the 580/570 in Furmark: I tend to think that any 3DMark test is much closer to real gameplay than Furmark. Furmark is made for three reasons: stability testing, heat testing, and power consumption testing. 3DMark is at least meant to provide some level of game performance representation (even though it fails to do so more often than not). I have yet to see the power throttling on the GTX 580 come into play in games, because if I remember correctly it is driver based, not hardware based... and for the record, I am not a fan of what Nvidia did there, which was done deliberately to make the gains in power consumption look better than they really are. However, power consumption is at the BOTTOM of my concerns when buying a new video card or recommending one to someone.

    Who knows, maybe there will be no games that trip the downclocking on the 6970, but if AMD does limit game performance then I will have a problem with it.
    CPU: Intel Core i7 3930K @ 4.5GHz
    Mobo: Asus Rampage IV Extreme
    RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
    GPU: EVGA GTX Titan (1087Boost/6700Mem)
    Physx: Evga GTX 560 2GB
    Sound: Creative XFI Titanium
    Case: Modded 700D
    PSU: Corsair 1200AX (Fully Sleeved)
    Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
    Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's

  13. #3188
    Xtreme Mentor
    Join Date
    Nov 2005
    Location
    Devon
    Posts
    3,437
    @[XC] hipno650

    I see where you're coming from.

    I could make a car analogy, but I will refrain.
    Let's wait for official benches to judge this new power-saving tech.
    RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SSD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W

    RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingston HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU

    SmartPhone Samsung Galaxy S7 EDGE
    XBONE paired with 55'' Samsung LED 3D TV

  14. #3189
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    115
    I guess it comes down to a difference in viewpoints. For example, I couldn't care less about using an extra 100 watts to play old games (I own a 480, after all, haha). I have also noticed, when monitoring my GPU usage in games old and even some newer ones, that it is very rare for me to be using 100% of my GPU while gaming; most older games only use 30% of my GPU, so my temps are much lower than when I am at 100%, and the same goes for my power usage anyway.
    GPU usage is a useless metric for correlating with power consumption. Far Cry 2 used to show constant 100% usage, while Crysis jumped around 70-95%, yet temps while playing Crysis were much higher.
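
    If anyone wants to measure this instead of arguing about it, log both readings side by side while gaming. Something like this works on NVIDIA cards, assuming a driver new enough to expose board power through nvidia-smi:

    # Log GPU utilization, temperature, and board power once a second,
    # so the usage-vs-power correlation can actually be checked.
    import subprocess
    import time

    QUERY = ["nvidia-smi",
             "--query-gpu=utilization.gpu,temperature.gpu,power.draw",
             "--format=csv,noheader,nounits"]

    while True:
        # Take the first line only, in case the box has more than one GPU.
        out = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
        util, temp, power = out.split(", ")
        print("util=%s%%  temp=%sC  power=%sW" % (util, temp, power))
        time.sleep(1)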

  15. #3190
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Location
    Lima, Peru
    Posts
    600
    Moving this to the extreme: [image]
    UK prices? (Gibbo @ the overclockers.co.uk forum), found on the B3D forums.
    6950 2048MB = £215-£230
    6970 2048MB = £280 - £320
    Last edited by Nintendork; 12-13-2010 at 03:22 PM.
    Athlon II X4 620 2.6Ghz @1.1125v | Foxconn A7DA-S (790GX) | 2x2GB OCZ Platinum DDR2 1066
    | Gigabyte HD4770 | Seagate 7200.12 3x1TB | Samsung F4 HD204UI 2x2TB | LG H10N | OCZ StealthXStream 500w| Coolermaster Hyper 212+ | Compaq MV740 17"

    Stock HSF: 18°C idle / 37°C load (15°C ambient)
    Hyper 212+: 16°C idle / 29°C load (15°C ambient)

    Why AMD Radeon rumors/leaks "are not always accurate"
    Reality check

  16. #3191
    Xtreme Cruncher
    Join Date
    Apr 2006
    Posts
    3,012
    Quote Originally Posted by Lightman View Post
    @[XC] hipno650

    I see where you're coming from.

    I could make a car analogy, but I will refrain.
    Let's wait for official benches to judge this new power-saving tech.
    I just hope the wait does not kill all of us
    Quote Originally Posted by gamervivek View Post
    GPU usage is a useless metric for correlating with power consumption. Far Cry 2 used to show constant 100% usage, while Crysis jumped around 70-95%, yet temps while playing Crysis were much higher.
    In my experience, usage while playing games has been directly related to the temps experienced, across a wide range of games. Granted, there will be differences, and yes, Crysis does seem to get cards hotter and make them draw more power than most games, but usage still indicates a certain level of GPU power consumption. A GPU running at 50% uses less power than one at 100%; that's pretty much common sense...

    Just like when I run WCG on my CPU: it uses more power than when I'm surfing the web...

    I have also noticed that most modern cards will auto-downclock when the GPU drops below a certain level of usage, which is fine for the most part because you don't need the extra power anyway... up until now, however, this has been driver based, not hardware based.
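
    To put the difference in code terms: driver-based clock management is basically a coarse poll-and-pick-a-state loop, something like this sketch (the states and thresholds are invented for illustration):

    # Rough sketch of driver-side downclocking: poll usage at a coarse
    # interval and jump between a few fixed clock states.
    P_STATES = {"idle": 157, "video": 400, "3d": 880}   # core MHz, made up

    def pick_state(utilization_pct):
        if utilization_pct < 10:
            return "idle"
        if utilization_pct < 50:
            return "video"
        return "3d"

    ...whereas PowerTune supposedly reacts in hardware, on a much finer timescale, against a power estimate rather than a usage percentage.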
    CPU: Intel Core i7 3930K @ 4.5GHz
    Mobo: Asus Rampage IV Extreme
    RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
    GPU: EVGA GTX Titan (1087Boost/6700Mem)
    Physx: Evga GTX 560 2GB
    Sound: Creative XFI Titanium
    Case: Modded 700D
    PSU: Corsair 1200AX (Fully Sleeved)
    Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
    Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's

  17. #3192
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Posts
    970
    So this switch is like AMD saying: 'this is what we can do if we use the same TDP as the competition'.
    Last edited by flippin_waffles; 12-13-2010 at 03:28 PM.

  18. #3193
    Xtreme Enthusiast
    Join Date
    Mar 2009
    Location
    Toronto ON
    Posts
    566
    Gibbo's post at the OverclockersUK forum:
    NVIDIA's & ATI's latest pricing as of the end of this week:
    GTX 460 768MB = £105 - £115
    5830 1024MB = £115-£130 **EOL** ***Practically 5850 performance - BARGAIN***
    6850 1024MB = £130-£140
    GTX 460 1024MB = £140-£160
    5850 1024MB = £130-£150 **EOL** ***Stock Non-Existent***
    6870 1024MB = £170-£190
    5870 1024MB = £180-£200 **EOL** ***Nothing sub £200 beats this***
    GTX 470 1280MB = £190-£220 **EOL**
    6950 2048MB = £220-£230 ***Expect £10 price increase in January***
    GTX 480 1536MB = £250-£270 **EOL**
    GTX 570 1280MB = £250-£290
    6970 2048MB = £285 - £320 ***Expect £20 price increase in January***
    GTX 580 1536MB = £350-£450 **Supply & Demand will keep this high**
    6990 4096MB = £450-£500
    Core i7-4930K LGA 2011 Six-Core - Cooler Master Seidon 120XL - Push-Pull Liquid Water
    ASUS Rampage IV Black Edition LGA2011 - G.SKILL Trident X Series 32GB (4 x 8GB) DDR3 1866
    Sapphire R9 290X 4GB TRI-X OC in CrossFire - ATI TV Wonder 650 PCIe
    Intel X25-M 160GB G2 SSD - WD Black 2TB 7200 RPM 64MB Cache SATA 6
    Corsair HX1000W PSU - Pioneer Blu-ray Burner 6X BD-R
    Westinghouse LVM-37w3, 37inch 1080p - Windows 7 64-bit Pro
    Sennheiser RS 180 - Cooler Master Cosmos S Case

  19. #3194
    Xtreme Member
    Join Date
    Nov 2008
    Location
    London
    Posts
    300
    Things are starting to get interesting, for sure. We're starting to see the big picture now, as well as the intentions behind it.

    If the 6950 comes in at just above £200, I'll buy it at launch.
    -
    Core i7 860 @ 3.80GHz, 1.28v | GA-P55A-UD4 | G.Skill Ripjaw 4GB DDR3 @ 1900MHz 7-9-8-24 1N, 1.57v | HIS HD 6950 2GB, 1536sp @ 900/1400, 1.10v | Samsung F3 500GB | Thermaltake 750W | Windows 7 64bit | Air

    Crunching away...

  20. #3195
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Great pricing for 2GB cards

    FWIW, it looks like those would be close to 58xx launch prices.
    Last edited by zerazax; 12-13-2010 at 03:48 PM.

  21. #3196
    Xtreme Enthusiast
    Join Date
    Mar 2009
    Location
    Toronto ON
    Posts
    566
    Quote Originally Posted by zerazax View Post
    Great pricing for 2GB cards

    FWIW, it looks like those would be close to 58xx launch prices.
    Right on.

    Same goes for the 4GB GDDR5 HD 6990
    Core i7-4930K LGA 2011 Six-Core - Cooler Master Seidon 120XL - Push-Pull Liquid Water
    ASUS Rampage IV Black Edition LGA2011 - G.SKILL Trident X Series 32GB (4 x 8GB) DDR3 1866
    Sapphire R9 290X 4GB TRI-X OC in CrossFire - ATI TV Wonder 650 PCIe
    Intel X25-M 160GB G2 SSD - WD Black 2TB 7200 RPM 64MB Cache SATA 6
    Corsair HX1000W PSU - Pioneer Blu-ray Burner 6X BD-R
    Westinghouse LVM-37w3, 37inch 1080p - Windows 7 64-bit Pro
    Sennheiser RS 180 - Cooler Master Cosmos S Case

  22. #3197
    Would you like some Pie?
    Join Date
    Nov 2009
    Posts
    269
    If they are anything like the 5870 launch prices, I am pretty sure I will end up with three 6970s.
    Xeon W3520 @ 4.0Ghz ~ 3x 7970 ~ 12GB DDR3 ~ Dell U2711

  23. #3198
    Xtreme Mentor
    Join Date
    Nov 2005
    Location
    Devon
    Posts
    3,437
    Quote Originally Posted by zerazax View Post
    Great pricing for 2GB cards

    FWIW those would be close to 58xx launch prices it looks like
    Yes, I paid £287 + £5 shipping for my HD5870 at launch!
    These prices are better than I expected!
    My dilemma now: one HD6970 or two HD6950s?
    But my conscience is saying one HD6970 is plenty for 1920x1200.
    RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SSD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W

    RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingston HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU

    SmartPhone Samsung Galaxy S7 EDGE
    XBONE paired with 55'' Samsung LED 3D TV

  24. #3199
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    I bet one 6970 is overkill for 1920x1200. 2GB on those guys is asking for 2560x1600 or multi-monitor!

  25. #3200
    Xtreme Member
    Join Date
    Nov 2008
    Location
    London
    Posts
    300
    I only game at 1680x1050, so a 2GB 6950 WILL probably be overkill. But at least I can increase those AA and AF levels indefinitely.
    -
    Core i7 860 @ 3.80GHz, 1.28v | GA-P55A-UD4 | G.Skill Ripjaw 4GB DDR3 @ 1900MHz 7-9-8-24 1N, 1.57v | HIS HD 6950 2GB, 1536sp @ 900/1400, 1.10v | Samsung F3 500GB | Thermaltake 750W | Windows 7 64bit | Air

    Crunching away...
