Thread: OCCT 3.1.0 shows HD4870/4890 design flaw - they can't handle the new GPU test !

  1. #501
    Xtreme Member
    Join Date
    May 2008
    Posts
    258
    Well, the thing about revving it to no end pretty much makes sense. The ATI cards have a rev limiter to stop you from grenading them. Car manufacturers don't tend to publish their rev limits either, but they are there. This program is asking the card to rev to 10K, but it can only do 8K without damage, so they implemented a hard-cut rev limiter. Nvidia just uses a soft limiter (car manufacturers do this too) to keep the revs at their limit without a noticeable 'cut'.

    I think this is pretty much open and shut, I can't believe we're on page 20. It's obviously been built into the design to help protect the circuitry. So why are we still talking about it? If you get a game that is asking too much of your single card, that's what they have crossfire for!!

    Anytime your game is running your GPU at 99%, it's pretty obvious that you don't have enough graphics horsepower to run the game at the current settings. Add another GPU and your workload per card will be cut nearly in half.

    Can we put this to rest? Surely there has to be other more productive things we could be reading and typing about.

  2. #502
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    213
    Quote Originally Posted by LordEC911 View Post
    And you seem to be in Nvidia's pocket...
    Get off your high horse and take a look at the numbers, they don't make sense.
    You agree they don't make sense, and then dismiss that there is a problem.


    As I stated and showed, it is not loading Nvidia cards 100%, Tet even agreed to this.

    Also, why are you asking him about his code? He obviously has something to hide and doesn't want anyone to know what it is.
    I'm not in Nvidia's pocket. I don't know where you're getting that from. Your wild imagination, perhaps. I don't have anything to hide. I'm still working on the performance issue, and trying to get PerfHUD to work with DXUT to do some debugging.

    Just give me time.

    And I'm tired of fanboys. I already said that if Nvidia had been at stake here instead of AMD, this thread would be here just the same. But perhaps I would have sent them an email: they are responding to the community. AMD just didn't. That's why you see Nvidia's logo on my website, and not AMD's.

    I'm truly tired of answering your posts, which come out of your wild imagination that I'm paid by Nvidia to crush ATI cards. We're not in the X-Files here, friend.

    I've already spent about 8 hours trying to load Nvidia cards better. What else do you want? Give me time.

  3. #503
    Xtreme Legend
    Join Date
    Jan 2003
    Location
    Stuttgart, Germany
    Posts
    929
    Quote Originally Posted by LordEC911 View Post
    you are not receiving 100% of the performance of an Nvidia chip.
    PLEASE stop thinking that any test you are running, hearing about or imagining utilizes any GPU to 100%. Just because the GPU-Z "GPU load" thingie shows 100% doesn't mean it's 100% of the complete GPU.

  4. #504
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    While companies have this many hot-headed fanatic supporters, they don't need to answer or care about this kind of stuff.


    When I'm being paid, I always see my job through.

  5. #505
    Xtreme Legend
    Join Date
    Jan 2003
    Location
    Stuttgart, Germany
    Posts
    929
    I am pretty sure AMD engineers are aware of OCCT now and will use it in their qualification testing, but I doubt we will hear any official word from AMD about this.

  6. #506
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Oxford (England)
    Posts
    191
    It kind of makes me laugh. I have had a 4870X2 for 6 months now, have played just about every game, and have never had any issues other than driver problems etc. Now, all of a sudden, a program comes out that can make the card overheat/fail, and people are worrying about it. At the end of the day, people have been using the 4800 series of ATI cards for months now, and for the vast majority it's been fine for everything they've asked the card to do. Now people are crying about it crashing with this OCCT program.
    At the end of the day, no one was whining before they read this article, so why start now? You're just looking for reasons to find fault with the cards.

  7. #507
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Texas
    Posts
    1,663
    This thread =
    Core i7 2600K@4.6Ghz| 16GB G.Skill@2133Mhz 9-11-10-28-38 1.65v| ASUS P8Z77-V PRO | Corsair 750i PSU | ASUS GTX 980 OC | Xonar DSX | Samsung 840 Pro 128GB |A bunch of HDDs and terabytes | Oculus Rift w/ touch | ASUS 24" 144Hz G-sync monitor

    Quote Originally Posted by phelan1777 View Post
    Hail fellow warrior albeit a surat Mercenary. I Hail to you from the Clans, Ghost Bear that is (Yes freebirth we still do and shall always view mercenaries with great disdain!) I have long been an honorable warrior of the mighty Warden Clan Ghost Bear the honorable Bekker surname. I salute your tenacity to show your freebirth sibkin their ignorance!

  8. #508
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Posts
    881
    With all these car analogies I have to say this: just like you void the warranty if you run your car at the track, this is the same thing. Your video card is designed to play games; if a stress program blows it up, it doesn't mean it's defective. However, if you play a game and it blows up, now that's a problem.

  9. #509
    Xtreme 3D Team
    Join Date
    Jan 2009
    Location
    Ohio
    Posts
    8,499
    Quote Originally Posted by awdrifter View Post
    With all these car analogies I have to say this: just like you void the warranty if you run your car at the track, this is the same thing. Your video card is designed to play games; if a stress program blows it up, it doesn't mean it's defective. However, if you play a game and it blows up, now that's a problem.
    Thank you! No game in the world will ever do that stuff and put that much load on your GPU. 115 A x 1.23 V? It could blow out your (mid-range) PSU too!
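
    (For scale, and assuming those figures are the core VRM output current and the core voltage, that multiplies out to P = I x V = 115 A x 1.23 V ≈ 141 W through the GPU core alone; total board draw from the PSU would be higher still once VRM losses, memory and fan are included.)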

  10. #510
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    213
    Couldn't find anything bad in the code after about 8 hours of work on it.

    And PerfHUD refuses to enumerate its device in my code... it seems I'm not the only one encountering this error. But at least I do have support from Nvidia; I guess I'll have to contact them. However, I won't have time to get back to this issue before Thursday at the earliest.
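
    For anyone following along: the pattern NVIDIA's PerfHUD documentation prescribes for D3D9 apps is to scan the adapter list for the "NVIDIA PerfHUD" adapter and create a REF device on it (PerfHUD substitutes the real HAL at runtime). Below is a minimal sketch of that documented pattern, not OCCT's actual code; DXUT wraps this enumeration in its own device-selection callbacks, which is where it can go wrong.

    Code:
    // Per NVIDIA's PerfHUD docs for D3D9: find the PerfHUD adapter and
    // create a D3DDEVTYPE_REF device on it. The adapter only appears when
    // the application is launched through the PerfHUD launcher.
    #include <d3d9.h>
    #include <cstring>

    IDirect3DDevice9* CreateD3D9Device(IDirect3D9* pD3D, HWND hWnd,
                                       D3DPRESENT_PARAMETERS* pPP)
    {
        UINT adapterToUse = D3DADAPTER_DEFAULT;
        D3DDEVTYPE deviceType = D3DDEVTYPE_HAL;

        for (UINT adapter = 0; adapter < pD3D->GetAdapterCount(); ++adapter)
        {
            D3DADAPTER_IDENTIFIER9 id;
            if (SUCCEEDED(pD3D->GetAdapterIdentifier(adapter, 0, &id)) &&
                strstr(id.Description, "PerfHUD") != NULL)
            {
                adapterToUse = adapter;
                deviceType = D3DDEVTYPE_REF; // intercepted by PerfHUD
                break;
            }
        }

        IDirect3DDevice9* pDevice = NULL;
        if (FAILED(pD3D->CreateDevice(adapterToUse, deviceType, hWnd,
                                      D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                      pPP, &pDevice)))
            return NULL;
        return pDevice;
    }

    If the PerfHUD adapter never shows up in that loop, the usual culprit is that the app wasn't started through the PerfHUD launcher.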

    Still no answer from AMD though.

  11. #511
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    213
    Quote Originally Posted by awdrifter View Post
    With all these car analogies I have to say this: just like you void the warranty if you run your car at the track, this is the same thing. Your video card is designed to play games; if a stress program blows it up, it doesn't mean it's defective. However, if you play a game and it blows up, now that's a problem.
    AAAAAAAAAAAand the 10-page-old debate resurfaces, with the "3D cards are designed to play games, not benchmarks" argument.

    Next, in stores, you'll be able to buy the brand new "movie-encoding-limited" CPU, the "Office 2003 and nothing else" CPU, and the most expensive "do whatever you want with it" CPU.

    Let's be serious: a 3D card is made to display any 3D scene, not just the 3D games out there.

    And let's not pollute this thread, which had been free of such non-constructive arguments for a few pages. The card fails at a perfectly valid 3D scene, and that's the main problem.

  12. #512
    Xtreme Addict
    Join Date
    Dec 2004
    Location
    Flying through Space, with armoire, Armoire of INVINCIBILATAAAAY!
    Posts
    1,939
    why would it be called the 5th gear if there are technically only 4?
    Sigs are obnoxious.

  13. #513
    The Doctor Warboy's Avatar
    Join Date
    Oct 2006
    Location
    Kansas City, MO
    Posts
    2,597
    Quote Originally Posted by iddqd View Post
    why would it be called the 5th gear if there are technically only 4?
    because of companies being cheap. lol.
    My Rig can do EpicFLOPs, Can yours?
    Once this baby hits 88 TeraFLOPs, You're going to see some serious $@#%....

    Build XT7 is currently active.
    Current OS Systems: Windows 10 64bit

  14. #514
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
    Quote Originally Posted by Warboy View Post
    because of companies being cheap. lol.
    Not being cheap; we've seen it's OCP, which is companies being cautious...

  15. #515
    Xtreme Legend
    Join Date
    Jan 2003
    Location
    Stuttgart, Germany
    Posts
    929
    Quote Originally Posted by Tetedeiench View Post
    AAAAAAAAAAAand the 10-page-old debate resurfaces, with the "3D cards are designed to play games, not benchmarks" argument.

    Next, in stores, you'll be able to buy the brand new "movie-encoding-limited" CPU, the "Office 2003 and nothing else" CPU, and the most expensive "do whatever you want with it" CPU.

    Let's be serious: a 3D card is made to display any 3D scene, not just the 3D games out there.

    And let's not pollute this thread, which had been free of such non-constructive arguments for a few pages. The card fails at a perfectly valid 3D scene, and that's the main problem.
    Microsoft wanted to put a "3 applications max running" limit into Windows 7 Starter Edition. They scrapped it, but with cloud computing gaining market share in the coming decades I expect this idea to be implemented at some point.

  16. #516
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by Warboy View Post
    because of companies being cheap. lol.
    No, being cheap would be using less expensive voltage regulator circuitry that is neither software-adjustable nor software-readable, like what nVidia did with the 55nm G200.

  17. #517
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by Tetedeiench View Post
    AAAAAAAAAAAand the 10-page-old debate resurfaces, with the "3D cards are designed to play games, not benchmarks" argument.

    Next, in stores, you'll be able to buy the brand new "movie-encoding-limited" CPU, the "Office 2003 and nothing else" CPU, and the most expensive "do whatever you want with it" CPU.

    Let's be serious: a 3D card is made to display any 3D scene, not just the 3D games out there.

    And let's not pollute this thread, which had been free of such non-constructive arguments for a few pages. The card fails at a perfectly valid 3D scene, and that's the main problem.
    Wait... aren't you referring to the Atom, the mainstream Core 2 Duo, and then the i7?
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  18. #518
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    My 295 ran it...



    About 2 minutes in I had to stop it due to heat...



    I was getting above 80FPS...
    Quote Originally Posted by Solus Corvus View Post
    Not really. It's probably loading the NV cards 100% also. It's just that the ATI cards whip the nVidia cards when it comes to processing this shader code.
    How fast do the ATI cards run this code at 1920x1200?
    Last edited by Talonman; 05-25-2009 at 10:51 AM.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  19. #519
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    213
    Here is a screenshot of a GTX280 at stock values running OCCT, with everything displayed.



    Quite stressful already, isn't it?

    Mind you, I already contacted Nvidia about the difference in FPS on the shader code; it still does not seem right, IMHO. But my test is NOT gentle on those cards. So much for the accusations of being biased.

  20. #520
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    I picked these options for my test...

    Next, the goal is simple: maximise the GPU load. Here is the way to do so on those cards. Be sure to use these settings!

    * Enable Fullscreen Mode
    * Disable Errorcheck Mode (comparing images is NOT effective)
    * Use a high resolution, preferably the native resolution of your screen (e.g. 1920x1200 for a 24" LCD)
    * Shader Complexity 3 for HD4XXX cards


    What do the ATI cards get for FPS at 1920x1200?
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  21. #521
    Banned
    Join Date
    Oct 2006
    Posts
    963
    The 4870 and 4890, when they run the test successfully, get FPS far higher than the GTX285... surely they should be closer... Tetedeiench's findings on this issue should be interesting...

    Even after AMD cripples the test, as with FurMark, I fully expect the ATI cards to keep a healthy FPS lead...

    A £160 card outperforming a £500 card in yet another synthetic test... that's all I'm bothered about...
    Last edited by purecain; 05-25-2009 at 02:38 PM.

  22. #522
    Worlds Fastest F5
    Join Date
    Aug 2006
    Location
    Room 101, Ministry of Truth
    Posts
    1,615
    There have been a lot of posts in the last few pages which have made the assumption that 82 amps current draw cannot be achieved during gameplay (only by this kind of stress test).

    I just wanted to state for the record that, as I game at 2560x1600, virtually all recent games I have played suck this kind of power through my graphics card's VRMs at this resolution.

    I only have Nvidia hardware around right now, so I cannot test with any 4870/4890s, and I am sure the different architectures will cause a different current draw in the same situation between ATI and Nvidia cards. But I can assure you my Nvidia card draws over 82 amps in several games I tested, and some of those tests were just a quick check at the loading/options screen...
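
    (For scale, and assuming a typical ~1.2 V core voltage under load, 82 A through the core VRM works out to P = I x V ≈ 82 A x 1.2 V ≈ 98 W for the GPU core alone, before VRM losses and memory are counted.)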
    X5670 B1 @175x24=4.2GHz @1.24v LLC on
    Rampage III Extreme Bios 0003
    G.skill Eco @1600 (7-7-7-20 1T) @1.4v
    EVGA GTX 580 1.5GB
    Auzen X-FI Prelude
    Seasonic X-650 PSU
    Intel X25-E SLC RAID 0
    Samsung F3 1TB
    Corsair H70 with dual 1600 rpm fan
    Corsair 800D
    3008WFP A00



  23. #523
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
    No crash here with 2 x 4890s @ 900MHz core on Cat 9.5 - a 5 minute run just for testing's sake - will try overclocked further soon




  24. #524
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    640
    Well, I've been following this thread, and after this "issue" surfaced, I looked around at several other forums, searching for people having major issues while gaming at any settings with any of ATI's 4xxx series cards, and I found pretty much none. No rampant, recurrent claims of cards locking up, quitting, etc. Of course, there are the stray complaints of driver issues and Crossfire in certain games, but just as with SLI, neither brand is perfect with multiple-card setups.

    But having seen no similar, recurring complaints spanning hordes of users of these cards, I'm beginning to suspect this is more a tempest in a teapot than a true issue, except to those who wish it to be. Otherwise, how does one dispute the huge customer base that uses ATI 4xxx video cards and has no issues whatsoever with them in gaming?

  25. #525
    Registered User
    Join Date
    Jan 2009
    Posts
    3
    Quote Originally Posted by Humminn55 View Post
    Well, I've been following this thread, and after this "issue" surfaced, I looked around at several other forums, searching for people having major issues while gaming at any settings with any of ATI's 4xxx series cards, and I found pretty much none. No rampant, recurrent claims of cards locking up, quitting, etc. Of course, there are the stray complaints of driver issues and Crossfire in certain games, but just as with SLI, neither brand is perfect with multiple-card setups.

    But having seen no similar, recurring complaints spanning hordes of users of these cards, I'm beginning to suspect this is more a tempest in a teapot than a true issue, except to those who wish it to be. Otherwise, how does one dispute the huge customer base that uses ATI 4xxx video cards and has no issues whatsoever with them in gaming?
    Since you've been following this thread, you should know that the issue here isn't occurring in gameplay as of yet, but in OCCT.
