
Thread: HD5970 Microstuttering tests

  1. #1
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574

    HD5970 Microstuttering tests

Microstuttering is a huge multi-GPU problem, and it's why games felt smoother when I went "down" from an SLI rig to a single 4870 a year ago, even though the SLI rig's FPS was higher. Unfortunately this big, big issue often gets overlooked. It's completely unrealistic to compare AFR frame rates with normal frame rates.

    I had yet to see a microstuttering analysis for the HD5970, but some guys at Donanimhaber (cenova and Source) ran tests with a good selection of games: Crysis, Crysis Warhead, Batman: AA, MW2, GTA4, Dirt 2, Unigine Heaven, World in Conflict, Wolfenstein 2.

    http://forum.donanimhaber.com/m_3702..._/key_//tm.htm

    Out of these popular titles, the only game that isn't noticeably affected by MS is Wolfenstein 2. All the others are significantly affected.

    In the graphs, the X axis is the frame number and the Y axis is the time in milliseconds it took to render that frame.
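
    For anyone who wants to reproduce these graphs from their own logs: the per-frame times plotted here can be pulled straight out of a FRAPS-style frametimes log. Here's a minimal sketch, assuming a CSV with one cumulative millisecond timestamp per frame (the file name and column layout are my assumptions, adjust to your log):

    Code:
    # Sketch: derive per-frame render times (the Y axis in these graphs) from a
    # FRAPS-style frametimes log. Assumes a CSV with a cumulative millisecond
    # timestamp per frame -- adjust the column handling to your actual log.
    import csv

    def frame_deltas(path):
        """Return the list of frame-to-frame deltas in milliseconds."""
        with open(path, newline="") as f:
            rows = list(csv.reader(f))
        # skip the header row, take the last column as the cumulative timestamp
        times = [float(r[-1]) for r in rows[1:] if r]
        return [t1 - t0 for t0, t1 in zip(times, times[1:])]

    deltas = frame_deltas("frametimes.csv")   # hypothetical file name
    avg = sum(deltas) / len(deltas)
    print(f"average frame time: {avg:.1f} ms (~{1000/avg:.0f} fps)")
    print(f"worst frame time:   {max(deltas):.1f} ms (~{1000/max(deltas):.0f} fps)")

    Plotting the resulting deltas against frame number should give graphs of the kind shown below; a well-paced single-GPU run comes out as a nearly flat line, while an AFR run shows the sawtooth pattern discussed in this thread.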

    Crysis@1920 (1680 doesn't display much MS, weirdly)


    Here you can see that the frame-to-frame rate varies wildly between 40 and 75 FPS. In these instances FRAPS most probably reported around 62 FPS, whereas the game only felt as smooth as 40 FPS. This is why you can't trust AFR's FPS numbers.

    Batman:AA@1920


    MS kills this game even more, with FPS varying between 40 and 100(!).

    MW2@1680 (1920 shows less MS in this game)


    There is terrible MS half of the time, with the rate varying between 40 and 200. Don't forget that if momentary FPS goes 10-150-10-150-10-150, the perceived smoothness won't be the average of those numbers; it will be close to the minimum. You will see 10-ish FPS smoothness.
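
    To make the 10-150-10-150 example concrete, here is a quick back-of-the-envelope calculation (the numbers are mine, purely illustrative):

    Code:
    # Rough illustration: instantaneous FPS alternating 10, 150, 10, 150...
    slow_fps, fast_fps = 10, 150
    slow_ms, fast_ms = 1000 / slow_fps, 1000 / fast_fps   # 100 ms and ~6.7 ms

    naive_avg = (slow_fps + fast_fps) / 2                  # "average of that FPS"
    counted_fps = 2 / ((slow_ms + fast_ms) / 1000)         # frames actually delivered per second

    print(f"naive average:        {naive_avg:.0f} fps")    # 80 fps
    print(f"frames per second:    {counted_fps:.0f} fps")  # ~19 fps
    print(f"perceived smoothness: ~{slow_fps} fps")        # every other frame is still 100 ms late

    Whatever number a counter reports sits well above what the motion actually feels like, because the slow frames dominate perception.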

    Dirt 2@1920 (In 1680 there is little MS.)


    WTF. This could be the definition of MS.

    GTA4@1920


    Unigine@1920


    And you thought it couldn't get worse than Dirt2.

    BOTTOM LINE: Never compare dual-GPU FPS to single-GPU FPS. Back when I had a G92 SLI system I concluded that the second GPU, while multiplying FPS by about 1.5-1.6x, didn't contribute anything to smoothness.
    Last edited by annihilat0r; 01-14-2010 at 12:58 AM.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  2. #2
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Thanks... but that's been known forever! People who brag about or use SLI for anything other than pushing extra pixels are just feeding their e-peen (sorry). I have the money, but I've never once considered SLI. That is why I was interested in Hydra, but I feel that technology is still a good year of tweaks away from overtaking SLI/Crossfire.

    So in the meantime, the best single card has always been my choice. So where does that leave me this round?

  3. #3
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    awesome article!

  4. #4
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    151
    This is why dual GPU cards should be communicating through an HT-esque bus, sharing memory and NOT using the same rendering ideas from the ATI Rage Fury MAXX of 10 years ago.
    eXt 4
    Intel Core i5 2500k @ 5GHz | Asus Maximus GENE-Z | Corsair Vengence 16GB DDR3-1600 9-9-9-24 | eVga GTX 580 SC 1000/2300 | SeaSonic Platinum Platinum-860 | Corsir Force GT 60GB/WD Green 1.5TB
    XSPC dual bay res w/ Laing D5/ XSPC Raystorm CPU Block / EK-FC580 GTX+ GPU block / XSPC RX240 / XSPC RX360 / All Yate D12Sm-12 Fans / NZXT case

    St0rage
    AMD Phenom II 965 @ 3.6GHz | Gigabyte MA785G-UD3H | Corsair XMS 8GB DDR2-1066 | 785G integrated | Corsiar TX 750 | WD Green 2.0TB
    Thermalright VenomousX, ALL Yate D12SL-12 FANS

  5. #5
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by annihilat0r View Post
    I had concluded that the second GPU, while multiplying FPS by about 1.5-1.6, didn't contribute anything to the smoothness.
    I agree!
    Bring... bring the amber lamps.

  6. #6
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    How do the captions get FPS figures from the per-frame render times in milliseconds? That's what I'm confused by right now, but it might just be a wording issue.

    Also, did they provide baseline/control graphs? It's one thing to graph results, but what about a single-GPU solution, i.e. "normal" behavior? That would be more illuminating.

  7. #7
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,838
    yea, there should be some type of control.
    i think this may look similar to the type of lag you get with vsync even on a single gpu.
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis

  8. #8
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    640
    Few questions:

    First, as was pointed out, where are the single gpu control graphs?

    Second, where are the FPS graphs? All I see is a graph showing a frame number and time interval, no fps charted.

    Third, when Batman was run, why was it run with AA on? It's well known that AA is intentionally crippled in that game for ATi cards, unless the game's profile was "fixed".

  9. #9
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by NickF View Post
    This is why dual GPU cards should be communicating through an HT-esque bus, sharing memory and NOT using the same rendering ideas from the ATI Rage Fury MAXX of 10 years ago.


    it's beyond me why amd hasn't equipped gpus with an HT interface yet...
    lower latency than pcie = slightly better perf and faster frame times even at lower fps = a better gaming experience even at low fps. and for dual gpu it would make a whole lot of sense... they could use NUMA and save half the memory on the cards, or share the entire memory of both (or more) cards instead of mirroring the data...

    if i read the graphs right, the frame time delta is displayed on the left and the frame number is displayed on the bottom.
    the average frame time delta is around 25ms, which would mean 1000/25 = 40fps.

    frames are rendered in batches, that's the problem with afr... the batches are too large and there is no syncing between them.
    when the frame time delta, the time between one frame and the next, drops below average, the graph goes down: you see several frames within a short period of time, and this part feels great, the game feels fluid. but then the batch is done and the card(s) start rendering a new batch of frames. at this point the time from one frame to the next increases, in many cases jumping up instantly, which causes a spike in the graph going UP. this is the part that feels terrible when playing: you only see a few frames in a certain amount of time, and the delay from one frame to the next gets worse and worse.

    this can, and should, be fixed by ati, but it hurts performance... you'll end up with a lower average fps, naturally...
    games can fix this as well by managing the gpus properly and distributing the work more evenly, making sure the stream of work to and from the gpu is steady and not fluctuating... basically, forcing the gpus to only pre-render a small number of frames SHOULD improve the situation a lot...
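
    one way to put a rough number on this batching behaviour is to measure how uneven consecutive frame times are, rather than how many frames arrive on average. here is a minimal sketch; the metric is just one reasonable choice of my own, not an official definition, and it assumes you already have the per-frame deltas from a log (see the earlier snippet):

    Code:
    # Sketch of a simple microstutter metric over frame-time deltas (milliseconds).
    # It measures how much consecutive frame times differ from each other; a
    # perfectly even cadence scores 0. Illustrative only, not a standard metric.
    def microstutter_index(deltas):
        if len(deltas) < 2:
            return 0.0
        jumps = [abs(b - a) for a, b in zip(deltas, deltas[1:])]
        mean_delta = sum(deltas) / len(deltas)
        return (sum(jumps) / len(jumps)) / mean_delta   # relative unevenness

    even    = [25] * 8                          # steady 40 fps
    batched = [10, 40, 10, 40, 10, 40, 10, 40]  # same 40 fps average, AFR-style pacing
    print(microstutter_index(even))     # 0.0
    print(microstutter_index(batched))  # 1.2 -- same average fps, much worse pacing

    limiting how many frames the gpus pre-render, as suggested above, should show up directly as a lower score here, even if the average fps drops a bit.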
    Last edited by saaya; 01-14-2010 at 03:35 AM.

  10. #10
Since when does a fraps graph represent the experience of a human eye?

    The intervals could be even longer and you wouldn't even acknowledge it, which has been proven many times before.

    Edit: All I'm saying is that I have played the mentioned titles on an X2 card and haven't noticed any MS.
    Last edited by Shadov; 01-14-2010 at 03:38 AM.

  11. #11
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    151
    Quote Originally Posted by saaya View Post


    it's beyond me why amd hasn't equipped gpus with an HT interface yet...
    lower latency than pcie = slightly better perf and faster frame times even at lower fps = a better gaming experience even at low fps. and for dual gpu it would make a whole lot of sense... they could use NUMA and save half the memory on the cards, or share the entire memory of both (or more) cards instead of mirroring the data...
    One would think that after AMD acquired ATI and the 3870 X2 basically flopped, ATI would have at the very least put an HT link on the table for an upcoming generation. With the success of the 4870 X2 and the currently unchallenged performance of the 5970, I fear ATI may have gotten lazy in this regard. I mean, it's very plausible that they could do HT with a 6000 series card, but not likely.
    eXt 4
    Intel Core i5 2500k @ 5GHz | Asus Maximus GENE-Z | Corsair Vengence 16GB DDR3-1600 9-9-9-24 | eVga GTX 580 SC 1000/2300 | SeaSonic Platinum Platinum-860 | Corsir Force GT 60GB/WD Green 1.5TB
    XSPC dual bay res w/ Laing D5/ XSPC Raystorm CPU Block / EK-FC580 GTX+ GPU block / XSPC RX240 / XSPC RX360 / All Yate D12Sm-12 Fans / NZXT case

    St0rage
    AMD Phenom II 965 @ 3.6GHz | Gigabyte MA785G-UD3H | Corsair XMS 8GB DDR2-1066 | 785G integrated | Corsiar TX 750 | WD Green 2.0TB
    Thermalright VenomousX, ALL Yate D12SL-12 FANS

  12. #12
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Shadov View Post
    Since when does a fraps graph represent the experience of a human eye?

    The intervals could be even longer and you wouldn't even acknowledge it, which has been proven many times before.
    the reason this is getting so much attention is that there IS a perceivable problem with multi gpu setups. if you can't see and feel it, good for you... but most people do...

    the problem MIGHT be caused by something entirely different... it's possible that the fraps graphs have nothing to do with this... but then it would be a pretty huge coincidence that the fraps graph results match the perceived problems with multi gpu... :P

    Quote Originally Posted by NickF View Post
    One would think that after AMD acquired ATI and the 3870 X2 basically flopped, ATI would have at the very least put an HT link on the table for an upcoming generation. With the success of the 4870 X2 and the currently unchallenged performance of the 5970, I fear ATI may have gotten lazy in this regard. I mean, it's very plausible that they could do HT with a 6000 series card, but not likely.
    yeah, instead they added sideport, and never used it...

  13. #13
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    151
    Quote Originally Posted by saaya View Post
    yeah, instead they added sideport, and never used it...
    I thought sideport was just a pr stunt for low-end gpus
    eXt 4
    Intel Core i5 2500k @ 5GHz | Asus Maximus GENE-Z | Corsair Vengence 16GB DDR3-1600 9-9-9-24 | eVga GTX 580 SC 1000/2300 | SeaSonic Platinum Platinum-860 | Corsir Force GT 60GB/WD Green 1.5TB
    XSPC dual bay res w/ Laing D5/ XSPC Raystorm CPU Block / EK-FC580 GTX+ GPU block / XSPC RX240 / XSPC RX360 / All Yate D12Sm-12 Fans / NZXT case

    St0rage
    AMD Phenom II 965 @ 3.6GHz | Gigabyte MA785G-UD3H | Corsair XMS 8GB DDR2-1066 | 785G integrated | Corsiar TX 750 | WD Green 2.0TB
    Thermalright VenomousX, ALL Yate D12SL-12 FANS

  14. #14
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Great article! And some people thought this was "fixed" over a year ago! I had a G92 SLI setup as well and the MS issues were awful, but I gave SLI another chance with GT200 and the FPS were smoother and not bothersome!
    ░█▀▀ ░█▀█ ░█ ░█▀▀ ░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░█▀▀ ░█▀▀ ░█ ░█ ░░░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░▀▀▀ ░▀ ░░░▀ ░▀▀▀ ░░▀ ░░░▀░▀ ░▀ ░▀▀▀ ░

  15. #15
    Xtreme Member
    Join Date
    Jan 2009
    Location
    Huyamba
    Posts
    316
    When I had 3x285 in tri-SLI there was no micro or macro stuttering - I can tell you that for sure. It's just that some games work better on a single card... I can't say anything about ATI for now - I simply haven't tried Crossfire yet. But they say it scales marvelously with no stuttering. This microstutter myth has been busted numerous times already.
    i7 950@4.05Ghz HeatKiller 3.0
    EVGA E762 EK WB | 12Gb OCZ3X1600LV6GK
    Razer Tarantula |Razer Imperator | SB X-Fi PCIe
    480GTX Tri SLi EK WBs | HAF X | Corsair AX1200
    ____________________________________________
    Loop1: Double_MCP655(EK Dual Top) - MoRa3Pro_4x180 - HK3.0 - EKFB_E762
    Loop2: Koolance_MCP655(EK Top) - HWLabsSR1_360 - EK_FC480GTX(3x)

  16. #16
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by KingOfsorroW View Post
    When I had 3x285 in tri-SLI there was no micro or macro stuttering - I can tell you that for sure. It's just that some games work better on a single card... I can't say anything about ATI for now - I simply haven't tried Crossfire yet. But they say it scales marvelously with no stuttering. This microstutter myth has been busted numerous times already.
    rofl, what? hahahah
    man, have you ever benched vantage with your cards?
    it stutters so badly with tri sli that it almost gives me headaches...
    you must be BLIND not to notice any stuttering with tri sli...
    unless of course you play at normal resolutions and have such high fps that the stuttering doesn't matter because it jumps from 180fps to 100fps

  17. #17
    Quote Originally Posted by saaya View Post
    rofl, what? hahahah
    man, have you ever benched vantage with your cards?
    it stutters so badly with tri sli that it almost gives me headaches...
    you must be BLIND not to notice any stuttering with tri sli...
    unless of course you play at normal resolutions and have such high fps that the stuttering doesn't matter because it jumps from 180fps to 100fps
    I disagree, saaya; some titles are more prone to this effect than others.

    So you CANNOT make it a general rule for all titles and support it with just a fraps graph.

  18. #18
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Actually a fraps graph is far more useful evidence than people claiming to see / not see something. Microstuttering is a well understood phenomenon, people just stick their heads in the sand to avoid acknowledging that their multi-GPU setups are pumping out useless numbers.

  19. #19
    Xtreme Member
    Join Date
    Oct 2009
    Location
    Santos(São Paulo), Brasil.
    Posts
    202
    oh maan...

    multi GPU has been around for yeeears, but it still doesn't work properly


    ok, AFR is crap, so what would be the "right way" to do multi GPU?
    AMD Phenom II X6 1055T @ 4009MHz
    NB @ 2673MHz
    Corsair H50 + Scythe Ultra Kaze 3k
    Gigabyte GA-MA790X-UD4P
    2X2GB DDR2 OCZ Gold
    XFX Radeon HD5850 XXX @ 900MHz Core
    OCZ Agility2 60GB
    2x500GB HDD WD Blue
    250GB Samsung
    SevenTeam 620W PAF
    CoolerMaster CM690

  20. #20
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by trinibwoy View Post
    Actually a fraps graph is far more useful evidence than people claiming to see / not see something. Microstuttering is a well understood phenomenon, people just stick their heads in the sand to avoid acknowledging that their multi-GPU setups are pumping out useless numbers.
    You can't claim that all people notice microstuttering and are either ignoring it or lying about it.
    They either notice it or they don't.
    Last edited by Final8ty; 01-14-2010 at 05:15 AM.

  21. #21
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Phoenix, AZ
    Posts
    866
    Quote Originally Posted by KingOfsorroW View Post
    When I had 3x285 in tri-SLI there was no micro or macro stuttering - I can tell you that for sure. It's just that some games work better on a single card... I can't say anything about ATI for now - I simply haven't tried Crossfire yet. But they say it scales marvelously with no stuttering. This microstutter myth has been busted numerous times already.
    Give me one link where it has been busted. The links don't exist, because micro stuttering is a real problem. If you have heard it has been busted many times, that's straight-up hearsay from people in denial that the massive money they spent isn't 100% reliable. I tried SLI myself with G92 and it was a wasted experience. And I'm not sure why you said macro stuttering; that's not a term. I hear (no facts on this) that since GT200 this has been less of a problem with SLI and more of a problem with Xfire, but either way it's still an issue. I'm awaiting an advancement in multi GPU options, hence Hydra.

    This is a big reason I am awaiting Hydra. I thought the scaling would be more, but oh well; if it results in no micro stutter and no input lag issues, I'm game.

    Quote Originally Posted by Final8ty View Post
    You can't claim that all people notice microstuttering and are either ignoring it or lying about it.
    They either notice it or they don't.
    This is entirely true as well, though. It's the same thing as with FPS and the people who say you can't see above 40 fps, which is total nonsense.
    Last edited by Decami; 01-14-2010 at 05:26 AM.
    This post above was delayed 90 times by Nvidia. Cause that's their thing, thats what they do.
    This Announcement of the delayed post above has been brought to you by Nvidia Inc.

    RIGGY
    case:Antec 1200
    MB: XFX Nforce 750I SLI 72D9
    CPU:E8400 (1651/4x9) 3712.48
    MEM:4gb Gskill DDR21000 (5-5-5-15)
    GPU: NVIDIA GTX260 EVGA SSC (X2 in SLI) both 652/1403
    PS:Corsair 650TX
    OS: Windows 7 64-bit Ultimate
    --Cooling--
    5x120mm 1x200mm
    Zalman 9700LED
    Displays: Samsung LN32B650/Samsung 2243BWX/samsung P2350


  22. #22
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Folks, don't just read the OP; click on the link and look at all the graphs. There you will find a trend where the fluctuations are more pronounced at 1920 than at 1680. Thus it's easy to see that this is some sort of frame-rate-induced problem more than anything else. It's clear to me that there is some sort of variable added to all this that is for the most part not mentioned. For example, enhanced settings from CCC, lower 3D clock rates, etc.

    Because this is more of a low-frame-rate issue, the graphs have little meaning on their own. Also, as others have said, there are no control results and no frame rate results. However, the current graphs, when both 1680 and 1920 are provided per game, clearly show low frame rates.

  23. #23
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Phoenix, AZ
    Posts
    866
    ...OK, and the fact that they might be using more graphics enhancements still doesn't make the graphs meaningless. So what if they are? It means it's still a problem with more enhancements, which means it's still a problem.

    Poor logic.
    This post above was delayed 90 times by Nvidia. Cause that's their thing, thats what they do.
    This Announcement of the delayed post above has been brought to you by Nvidia Inc.

    RIGGY
    case:Antec 1200
    MB: XFX Nforce 750I SLI 72D9
    CPU:E8400 (1651/4x9) 3712.48
    MEM:4gb Gskill DDR21000 (5-5-5-15)
    GPU: NVIDIA GTX260 EVGA SSC (X2 in SLI) both 652/1403
    PS:Corsair 650TX
    OS: Windows 7 64-bit Ultimate
    --Cooling--
    5x120mm 1x200mm
    Zalman 9700LED
    Displays: Samsung LN32B650/Samsung 2243BWX/samsung P2350


  24. #24
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Decami View Post
    ...OK, and the fact that they might be using more graphics enhancements still doesn't make the graphs meaningless. So what if they are? It means it's still a problem with more enhancements, which means it's still a problem.

    Poor logic.
    The poor logic is found when trying to reinvent a problem when the former is clearly flawed.

  25. #25
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,363
    Quote Originally Posted by Final8ty View Post
    You can't claim that all people notice microstuttering and are either ignoring it or lying about it.
    They either notice it or they don't.
    I agree with you 100%

    Coming from a heavy SLI user, microstuttering is a major problem; it has happened across multiple setups and multiple monitors. I have personally seen it on:

    6800 GS
    GTX260 SLI
    GTX260 192SP SLI
    8800GT (G92) SLI
    8800GTS (G80) SLI
    HD3850 Xfire

    Sometimes it's noticeable, other times it's not. I personally think it's linked to saturation of the PCI-E bus; not necessarily on the GFX end of things in terms of lane saturation, but data saturation at the southbridge.

    The worst games for microstuttering in my personal experience have been MMOs and games with large open worlds that put heavy loads on the CPU/memory/HDD.

    In enclosed areas the microstuttering goes away entirely; however, the moment you step outside it's really pronounced, especially as I get near the 40 FPS mark.

    Given my experiences, once Fermi is released I plan on shifting my main gaming rig's focus from SLI to:

    1 GFX card
    1 PhysX card
    1 PCI-E Raid controller

    (for the reasons mentioned above)
    Last edited by Sentential; 01-14-2010 at 05:38 AM.
    NZXT Tempest | Corsair 1000W
    Creative X-FI Titanium Fatal1ty Pro
    Intel i7 2500K Corsair H100
    PNY GTX 470 SLi (700 / 1400 / 1731 / 950mv)
    Asus P8Z68-V Pro
    Kingston HyperX PC3-10700 (4x4096MB)(9-9-9-28 @ 1600mhz @ 1.5v)

    Heatware: 13-0-0
