Page 17 of 30
Results 401 to 425 of 730

Thread: OCCT 3.1.0 shows HD4870/4890 design flaw - they can't handle the new GPU test !

  1. #401
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    The 82A issue and so on seems to suggest an issue with OCP kicking in

  2. #402
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by zerazax View Post
    The 82A issue and so on seems to suggest an issue with OCP kicking in
    There is something strange, i.e. possibly unknown, still going on though, since certain reference cards are not having problems, even some OC'ed cards.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  3. #403
    Xtreme Legend
    Join Date
    Jan 2003
    Location
    Stuttgart, Germany
    Posts
    929
    Quote Originally Posted by Tetedeiench View Post
    I have no idea. I'd guess the protection is in the BIOS, but that's just a guess.
    nope it's configured via a resistor, no software way to control or disable ovp

  4. #404
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    213
    Quote Originally Posted by SNiiPE_DoGG View Post
    I still don't get why this program crashing cards is a problem. If applying AF stops the card from crashing then you're set - just enable AF in your games and all woes are gone.

    I'm not calling this thing a power virus or anything like that, but it is clear to me that this program is loading the card in a certain way that no game would ever do - even if it had the best graphics in the world - because games aren't rendering a mostly static image with little geometry and no AF.
    Well, there are still a few problems:
    • Take an old game, play it with your brand new card. Not a lot of geometry. Simple shaders. Wow, black screen? That could happen. It's worth a try. Any idea of a game or bench? I'd say 3DMark2003 or 06... or any game of that era that uses simple shaders. Can't think of one at the moment.
    • GPGPU applications. They may reach this value also. They don't have to wait for geometry there. ATI will have to lower the values also. Even if I don't think they'll reach this value, my limited experience in the field cannot answer that question for sure.
    • You buy a card that boasts it can do X, yet when you have it, it can't. I'd say that's a problem.


    Now, right now, what ARE the implications of this problem?
    • Limited overclocking margin. You'll reach it quickly.
    • You cannot run any 3D app you want on your card. The list is not that huge, I'd agree, it's not that bad. But morally, ethically, this comes into question. Especially since this list can grow in the future.


    I have checked the logs, and AMD downloaded OCCT yesterday. I guess they must not be taking the problem lightly, or at least, they're taking a look at it.

    It's true it is not a bug that will make everyone ditch their card, or make ATI/AMD recall their cards, IMHO. But still, it's a flaw, a quirk in their design, and worth noting.

  5. #405
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    449
    Quote Originally Posted by Bo_Fox View Post
    Enough of this emotional bickering!

    What I was doing here was trying to clear things up a little bit in all of this mess that you're stirring up. Like when the OP said that Vsync enabled does not make a difference to power consumption, I said that is not what I have been experiencing.

    I had a HIS 4850 1GB card that died on me after exactly 30 days of use. The display became permanently corrupted after decoding hi-def videos for a couple of hours on multiple screens, and I am wondering if it has anything to do with the cheap VRMs... Ever since I had an X1900XTX from the day it came out, I have noticed that the VRM temperatures on ATI cards seem much higher than those on Nvidia cards. I am geared towards buying either a 4870 1GB or a 4890, and am keenly following this thread to learn which brand/make is the best quality.
    Heh, just go for the GTX 285 for the $300 Dell deal. That's what I'm thinking of doing after I play around with this 4870X2 and sell it.
    --lapped Q9650 #L828A446 @ 4.608, 1.45V bios, 1.425V load.
    -- NH-D14 2x Delta AFB1212SHE push/pull and 110 cfm fan -- Coollaboratory Liquid PRO
    -- Gigabyte EP45-UD3P ( F10 ) - G.Skill 4x2Gb 9600 PI @ 1221 5-5-5-15, PL8, 2.1V
    - GTX 480 ( 875/1750/928)
    - HAF 932 - Antec TPQ 1200 -- Crucial C300 128Gb boot --
    Primary Monitor - Samsung T260

  6. #406
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
    I understand the concern many may have, but as shaders get more complex the problem will become less and less relevant, no? I have played games like Star Wars Jedi Academy @ 1600x1200 with all the settings on high, no problem (a great game). It's old and has very simple geometry, but my card has no trouble with it, as the game barely loads the GPU and runs great.

  7. #407
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Tetedeiench View Post
    Well, there are still a few problems:
    • Take an old game, play it with your brand new card. Not a lot of geometry. Simple shaders. Wow, black screen? That could happen. It's worth a try. Any idea of a game or bench? I'd say 3DMark2003 or 06... or any game of that era that uses simple shaders. Can't think of one at the moment.
    • GPGPU applications. They may reach this value also. They don't have to wait for geometry there. ATI will have to lower the values also. Even if I don't think they'll reach this value, my limited experience in the field cannot answer that question for sure.
    • You buy a card that boasts it can do X, yet when you have it, it can't. I'd say that's a problem.


    Now, right now, what ARE the implications of this problem?
    • Limited overclocking margin. You'll reach it quickly.
    • You cannot run any 3D app you want on your card. The list is not that huge, I'd agree, it's not that bad. But morally, ethically, this comes into question. Especially since this list can grow in the future.


    I have checked the logs, and AMD downloaded OCCT yesterday. I guess they must not be taking the problem lightly, or at least, they're taking a look at it.

    It's true it is not a bug that will make everyone ditch their card, or make ATI/AMD recall their cards, IMHO. But still, it's a flaw, a quirk in their design, and worth noting.
    I'm not so sure GPGPU apps would do the same.
    Sure, it will stress the shaders, but not every part of the chip to the max, i.e. TMUs/ROPs.

    The overclocking margin is only limited if you are solely going by OCCT. Most people are going to base their overclocks on their favorite games and benches, since that is where the extra performance might be needed.

    What other 3D app can you not run based on the card failing OCCT?

  8. #408
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    213
    Quote Originally Posted by W1zzard View Post
    nope it's configured via a resistor, no software way to control or disable ovp
    Thanks for the clarification. So it's hard-wired... wow.

  9. #409
    Xtreme Addict
    Join Date
    Jan 2003
    Location
    Ayia Napa, Cyprus
    Posts
    1,354
    I think one of the senior members here should assist Tetedeiench in contacting Macci. That way, someone close to the community who works for AMD can help corroborate these findings.

    Just my 2 cents..................
    Seasonic Prime TX-850 Platinum | MSI X570 MEG Unify | Ryzen 5 5800X 2048SUS, TechN AM4 1/2" ID
    32GB Viper Steel 4400, EK Monarch @3733/1866, 1.64v - 13-14-14-14-28-42-224-16-1T-56-0-0
    WD SN850 1TB | Zotac Twin Edge 3070 @2055/1905, Alphacool Eisblock
    2 x Aquacomputer D5 | Eisbecher Helix 250
    EK-CoolStream XE 360 | Thermochill PA120.3 | 6 x Arctic P12

  10. #410
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by mongoled View Post
    I think one of the senior members here should assist Tetedeiench in contacting Macci. That way, someone close to the community who works for AMD can help corroborate these findings.

    Just my 2 cents..................
    AMD clearly already knows about the situation...

  11. #411
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    One question I keep asking and have had no response on: even though this is a potential problem, how does the performance compare to previous ATI cards and the competing Nvidia cards? To generate that much heat and power, I'd expect this thing to be demolishing all the others.

    I think in the future they will go less with shaders and more in other places; since these things are never being used to their max, they can probably better optimize their cores.

  12. #412
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Manicdan View Post
    One question I keep asking and have had no response on: even though this is a potential problem, how does the performance compare to previous ATI cards and the competing Nvidia cards? To generate that much heat and power, I'd expect this thing to be demolishing all the others.
    Largon concurred with the results I was seeing from other forum members.

    An underclocked 4890 (850/850), at ~83FPS @ 1680x1050 on setting 3, is quite a bit faster than even a GTX285 at ~53FPS @ 1680x1050 on setting 3.

    Don't know how much weight the CPU has on the score but the Nvidia card has the CPU advantage as well...
    Last edited by LordEC911; 05-21-2009 at 03:46 PM.

  13. #413
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    I don't see the issue here; it's a single app out of thousands, and the first one able to show this "problem".

    I think AMD did this on purpose, to prevent the card from burning in case someone loads it with shaders so simple that each ALU is fully utilized, which is impossible under normal circumstances.

    I never EVER saw my card pulling more than 70A.
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  14. #414
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by BababooeyHTJ View Post
    You do know that the vrms on these refrence 4870 and 4870x2s are the same as the ones on the GTX280/260 65nm right? The 55nm versions use cheaper vrms.
    Oh yeah, the Volterra ones... but why such a limitation on the Volterra VRMs? I think the Volterra ones are more expensive only because they offer more digital support (tweaking via RT/BIOS, etc.), and the 55nm NV cards do just fine with analog VRMs. 55nm GT200 cards have a higher reliability rate than the more failure-prone 65nm cards, and a GTX285 can eat just as much power as a GTX280. I remember reading somewhere recently that the analog VRMs are actually better than the digital ones.

    W1zzard, have you done a vmod to disable the OVP on 4870/4890 cards yet? Anybody? I'm curious whether those reference cards can handle 100A or so without any problems. It would give some peace of mind, since there are so many older games out there - literally thousands of them - that we never know which one could actually push the card into the same black-screen crash scenario (which we would probably just dismiss as a "buggy" piece of software when it's really the hardware). This issue is interesting, nonetheless.

    Strange how RT reports it as 40A when GPU-Z reports it as 80A... which one do you think is true?
    Last edited by Bo_Fox; 05-21-2009 at 02:06 PM.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  15. #415
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Someone should grab a fireblanket and run this on a HD2900XT! That card has some craaazy vDDC current capabilities (6× VT1195SF).
    =P

    edit:
    Quote Originally Posted by Bo_Fox
    Strange how RT reports it as 40A when GPU-Z reports it as 80A... which one do you think is true?
    That's what I was wondering too.
    That 80A figure everyone is seeing does not make sense.
    Quote Originally Posted by largon View Post
    I don't get it, what are those amperage figures RT & GPU-Z display?
    It can't be the total vDDC phase amperage, nor can it be a single phase amperage.

    And why does RT give totally different amperage figures than GPU-Z? GPU-Z recorded 83.30A while RT reported 48.79A for the same point of time when I had both GPU-Z and RT write a log during the same OCCT test run...
    Last edited by largon; 05-21-2009 at 02:19 PM.
    You were not supposed to see this.

  16. #416
    Xtremely High Voltage Sparky's Avatar
    Join Date
    Mar 2006
    Location
    Ohio, USA
    Posts
    16,040
    Quote Originally Posted by Tetedeiench View Post
    Well, there are still a few problems:
    • Take an old game, play it with your brand new card. Not a lot of geometry. Simple shaders. Wow, black screen? That could happen. It's worth a try. Any idea of a game or bench? I'd say 3DMark2003 or 06... or any game of that era that uses simple shaders. Can't think of one at the moment.
    I run a myriad of new and old games. No game comes close to stressing the card as much as OCCT does.

    • GPGPU applications. They may reach this value also. They don't have to wait for geometry there. ATI will have to lower the values also. Even if I don't think they'll reach this value, my limited experience in the field cannot answer that question for sure.
    I run Folding@Home on my 4870 whenever I'm not playing games, pretty much ever since I got the card. GPU-Z reports only 45-ish amps for 96-98% GPU load.

    • You buy a card that boasts it can do X, yet when you have it, it can't. I'd say that's a problem.
    I don't know about that - so far, everything the card boasts it can do, it has done

    Now, right now, what ARE the implications of this problem?
    • Limited overclocking margin. You'll reach it quickly.
    • You cannot run any 3D app you want on your card. The list is not that huge, I'd agree, it's not that bad. But morally, ethically, this comes into question. Especially since this list can grow in the future.
    I'm maxed out at the overclock CCC will allow. But we all know that overclocking is a gamble, so even if you couldn't OC the card more than 30MHz, you still got what you paid for.
    Any 3D app does run. Only the OCCT stress test doesn't, but OK...

    I have checked the logs, and AMD downloaded OCCT yesterday. I guess they must not be taking the problem lightly, or at least, they're taking a look at it.

    It's true it is not a bug that will make everyone ditch their card, or make ATI/AMD recall their cards, IMHO. But still, it's a flaw, a quirk in their design, and worth noting.
    And if it is a real issue then I'm sure they'll take a look at it. Seems they already are if they downloaded OCCT, which is a good thing.
    The Cardboard Master
    Crunch with us, the XS WCG team
    Intel Core i7 2600k @ 4.5GHz, 16GB DDR3-1600, Radeon 7950 @ 1000/1250, Win 10 Pro x64

  17. #417
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    So what's the final verdict here, bad hardware or bad software?

  18. #418
    Xtremely High Voltage Sparky's Avatar
    Join Date
    Mar 2006
    Location
    Ohio, USA
    Posts
    16,040
    I wouldn't call it "bad", just something to take note of. I think of it like the redline of a car's engine: sure, you can run up to it, but you can't go over it, and if you keep it at redline too long you could break it. For whatever reason, OCCT is trying to push it over the redline, and that's a no-go.

    Doesn't really bother me too much.

  19. #419
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by SparkyJJO View Post
    I wouldn't call it "bad", just something to take note of. I think of it like the redline of a car's engine: sure, you can run up to it, but you can't go over it, and if you keep it at redline too long you could break it. For whatever reason, OCCT is trying to push it over the redline, and that's a no-go.

    Doesn't really bother me too much.
    Resembles the fuss about FurMark back then; there were people who blew their VRMs with FurMark, yet there have to be cases where this happens with games or GPGPU apps too.

  20. #420
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by LordEC911 View Post
    Largon concurred with the results I was seeing from other forum members.

    An underclocked 4890 (850/850), at ~83FPS @ 1680x1050 on setting 3, is quite a bit faster than even a GTX285 at ~53FPS @ 1680x1050 on setting 3.

    Don't know how much weight the CPU has on the score, but the Nvidia card has the CPU advantage as well...

    My GTX280 pulls just short of 80A with the same settings, with my core at 648MHz. This is much more stressful than FurMark on my card. I was seeing under 60fps, btw, with a Q9650 at 4GHz.

    I'm not sure what point you are trying to prove.

  21. #421
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by BababooeyHTJ View Post
    I'm not sure what point you are trying to prove.
    I wasn't trying to prove a point, I was simply answering a question.
    I can see where he might have been headed by asking that question though.

    The 4890 is what, ~10-15% behind a GTX285 with both at stock, on average, in "normal" games and apps?
    Yet with OCCT the 4890 is ~56% faster, using the 83FPS vs 53FPS numbers.
    However this app is programmed, it stresses every part of the chip to the max, or at least quite a bit more than other "normal" apps/games.

    Also none of the numbers, i.e. FPP, seem to add up.
    4890@850mhz= 1.36Tflops
    GTX285@1476mhz= 1.06Tflops(MADD+MUL), .708Tflops(MADD)

    1.36/1.06= 1.28x greater (1/2 the FPS difference)
    1.36/.708= 1.92x greater (amusing since it doesn't mean anything but = largons power draw increase)

    Simply using max theoretical FPP is not an accurate way to estimate performance, but in this case it seems to be related. Since this app has been said to use simple shaders to completely load the ALUs, you could come to the conclusion that the MUL is only being used ~45% of the time.
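    A quick sketch of that arithmetic, assuming the usual per-clock figures for these chips (800 ALUs x 1 MADD for the RV790, 240 ALUs x MADD+MUL or MADD-only for the GT200, clocks as quoted above):

```python
# Peak theoretical single-precision throughput for the numbers quoted above.
# A MADD counts as 2 FLOPs per clock; MADD+MUL counts as 3.
def tflops(alus, flops_per_clock, clock_ghz):
    """Peak TFLOPS = ALUs x FLOPs/clock x clock (GHz) / 1000."""
    return alus * flops_per_clock * clock_ghz / 1000.0

hd4890 = tflops(800, 2, 0.850)           # ~1.36 TFLOPS
gtx285_madd_mul = tflops(240, 3, 1.476)  # ~1.06 TFLOPS
gtx285_madd = tflops(240, 2, 1.476)      # ~0.708 TFLOPS

print(round(hd4890 / gtx285_madd_mul, 2))  # ~1.28x greater
print(round(hd4890 / gtx285_madd, 2))      # ~1.92x greater
```

    None of this is measured, of course; it's just the same peak-FPP arithmetic written out.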

    Basically, the way this app is programmed, it is able to use the 4890's architecture to the max, and it seems not to fully load Nvidia cards, per se.

    Edit- Anyone know what the stock volts for a GTX280 is under load? 1.3-1.4v?
    Last edited by LordEC911; 05-21-2009 at 04:52 PM.

  22. #422
    Xtreme Member
    Join Date
    Dec 2006
    Location
    Edmonton,Alberta
    Posts
    182
    Quote Originally Posted by BababooeyHTJ View Post
    My GTX280 pulls just short of 80A with the same settings, with my core at 648MHz. This is much more stressful than FurMark on my card. I was seeing under 60fps, btw, with a Q9650 at 4GHz.

    I'm not sure what point you are trying to prove.
    Nvidia may have already put a safety in to prevent reaching peak amperage: instead of shutting down, it restricts the frames per second.

    Which would make the GPU stress test meaningless.

  23. #423
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601
    Quote Originally Posted by LordEC911 View Post
    Edit- Anyone know what the stock volts for a GTX280 is under load? 1.3-1.4v?
    I think it's 1.19V under load at stock volts, 1.11V at idle.
    Link Here
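    For what it's worth, combining that ~1.19V load figure with the ~80A readings people are quoting gives a rough core-rail power number. This is a back-of-the-envelope sketch only; it assumes the GPU-Z reading really is total VDDC current:

```python
# Back-of-the-envelope core (VDDC) rail power: P = V x I.
# Assumes ~80A total core current at ~1.19V load, per the thread.
load_voltage_v = 1.19
core_current_a = 80.0
core_power_w = round(load_voltage_v * core_current_a, 1)
print(core_power_w)  # ~95W on the core rail alone, before VRM losses
```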

  24. #424
    xtreme energy
    Join Date
    Oct 2004
    Location
    Europe, Latvia
    Posts
    4,145
    Quote Originally Posted by AMDDeathstar View Post
    Nvidia may have already put a safety in to prevent reaching peak amperage: instead of shutting down, it restricts the frames per second.

    Which would make the GPU stress test meaningless.
    Good point, but AMD should have put a safety in as well.
    ...

  25. #425
    Xtreme Addict
    Join Date
    Dec 2004
    Location
    Flying through Space, with armoire, Armoire of INVINCIBILATAAAAY!
    Posts
    1,939
    Quote Originally Posted by Shintai View Post
    That's pure BS.

    There is no way you can damage a VIA/AMD/Intel CPU with any software/power virus.

    Because unlike GPUs, CPUs and their VRM designs are made to handle anything. It's all about GPU designers going the cheapskate route.
    Well, that will be useful to tell my friend who had to RMA 3 Wolfdales before he realized he was damaging them with Orthos Prime. Granted, he was overclocking them (and then using OP to test for errors...); the test would not have caused enough stress at stock settings. In any case, he can keep his 4.2GHz overclock under normal operation (games, movies, etc.), while after an hour of OP it will start BSODing until underclocked to 4.1GHz. And so on.
    Sigs are obnoxious.

