
Thread: PCGH: Micro stuttering on multi GPU solutions

  1. #26
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by MuffinFlavored View Post
    Thanks.

I thought multi-GPU was all about rendering the upcoming frames, but that would mean the frames would be ready one after another and there would be no stuttering.

I guess this is not how it works. So how does multi-GPU work then? Do the cards all work to render one frame?
They each render x part of the frame usually. Then they have to share the information about that frame. I have a feeling they don't sync right and the frames get "shared" unevenly to try to keep pace with high FPS when one card is behind. GFX cards and drivers today are a huge mess. It's also quite a joke that we need "optimized" drivers for game x. By standard definitions you would have baseline DirectX drivers and everything would work perfectly. All these hacks and cheats have a flip side.
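To picture what that uneven pacing does to the numbers, here is a quick toy calculation (made-up figures in python, nothing from an actual driver): two cards that each need ~33 ms per frame, but that kick off their frames almost together instead of half a frame apart.

Code:
gpu_render_time = 33.3   # ms each card needs for a full frame (~30 fps per card)
phase_offset = 5.0       # ms: card B starts only 5 ms after card A instead of ~16.7 ms

frame_done = []
for i in range(6):
    pair_start = i * gpu_render_time                      # card A starts its frame
    frame_done.append(pair_start + gpu_render_time)       # card A finishes
    frame_done.append(pair_start + phase_offset + gpu_render_time)  # card B finishes

frame_done.sort()
gaps = [round(b - a, 1) for a, b in zip(frame_done, frame_done[1:])]
print(gaps)   # alternates ~5 ms / ~28 ms instead of a steady ~16.7 ms

The counter still averages out to ~60 fps, but half the gaps are close to 28 ms, which is exactly the micro stuttering the article is about.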
    Crunching for Comrades and the Common good of the People.

  2. #27
    Xtreme Member
    Join Date
    Aug 2006
    Posts
    394
Haha, I just realized I never put my gfx card in my sig because of how shameful it was to me.
Custom case laser cut from a 3/16" thick sheet of brushed Aluminum 8"x80" & cold formed into a box then anodized black with 1/2" Poly-carbonate side panels... fully modular, all aluminum mounting brackets, HD bays, and mobo tray are removable... down to the bare box
    --Asus Maximus V Gene--
    --Intel 3770k @4.2 GHz De-lided and I soldered an Arctic Twin Turbo to the Intel.
    --MSI R7970 3GB @1150, 1500 cooled with an Arctic Accelero Xtreme--
    --G.SKILL Ripjaws @2400 MHz --
    --SeaSonic X-1050 Gold--
--128 GB Sandisk UltraPlus, it was only $59 new! Seagate 1TB HD--
    --Samsung S23A750D 120Hz monitor--
    --Razer Tarantula-- keyboard, yes it is like 8 years old!
    --Corsair M60 mouse--
    --Klipsh Promedia 2.1-- I rock stereo speakers the way they were meant to be rocked
    -- 100% Fun ...

Does it ever shock anyone else when you hear someone use Darwin's "survival of the fittest" to justify genocide?

  3. #28
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Shintai View Post
They each render x part of the frame usually. Then they have to share the information about that frame. I have a feeling they don't sync right and the frames get "shared" unevenly to try to keep pace with high FPS when one card is behind.
yepp, and the result looks like what you also see at high fps with no vsync. i actually thought those things were fixed by now, seeing as we have at least the 4th gen xfire and 3rd gen sli.

xfire 1: external connectors
xfire 2: connection through pciE
xfire 3: connection through internal bridge
xfire 4: support for more than 2 gpus

sli 1: one bridge
sli 2: two bridges
sli 3: support for triple sli

  4. #29
    Xtremely Retired OC'er
    Join Date
    Dec 2006
    Posts
    1,084
There are like 75% (or more) of computer people who use vsync on.
That's what I've seen on one of my forums; that's bad, and people love to play with graphics lag.
Some of them don't know how to play without vsync, lol.

As far as I remember vsync basically keeps the GPU cooler: it limits you to 50/75/120 fps and it won't go any higher, just to _keep fps low = cooler graphics_

  5. #30
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by Ego View Post
There are like 75% (or more) of computer people who use vsync on.
That's what I've seen on one of my forums; that's bad, and people love to play with graphics lag.
Some of them don't know how to play without vsync, lol.

As far as I remember vsync basically keeps the GPU cooler: it limits you to 50/75/120 fps and it won't go any higher, just to _keep fps low = cooler graphics_
Vsync has more benefits. First of all it stops your GPU from rendering more frames than your screen can show, so it runs a little cooler, as you say. It also usually helps keep minimum FPS up. And vsync gives better quality, since you don't waste a lot of frames that never get shown.
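Roughly what vsync does to the timing (just a sketch assuming a 60 Hz screen, not actual driver code): a finished frame waits for the next refresh, so the card never pushes out more images than the screen can show.

Code:
import math

refresh_ms = 1000.0 / 60.0        # ~16.7 ms between refreshes on a 60 Hz screen

def present_time(frame_ready_ms):
    """Time the frame actually appears with vsync on: the next refresh after it is done."""
    return math.ceil(frame_ready_ms / refresh_ms) * refresh_ms

ready = [4.0, 9.0, 21.0, 26.0, 40.0]          # the GPU finishes frames very quickly...
print([round(present_time(t), 1) for t in ready])
# -> [16.7, 16.7, 33.3, 33.3, 50.0]: at most one new image per refresh, so the extra
#    frames would have been wasted work; with vsync the GPU simply waits instead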
    Crunching for Comrades and the Common good of the People.

  6. #31
    Xtreme Mentor
    Join Date
    Nov 2005
    Location
    Devon
    Posts
    3,437
    Has anyone noticed micro-stuttering when Tile Mode rendering is used instead of AFR?

    This mode should not have any stuttering visible, but unfortunately it's slower than AFR....
    RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SDD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W

    RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingstone HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU

    SmartPhone Samsung Galaxy S7 EDGE
    XBONE paired with 55'' Samsung LED 3D TV

  7. #32
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Lightman View Post
    Has anyone noticed micro-stuttering when Tile Mode rendering is used instead of AFR?

    This mode should not have any stuttering visible, but unfortunately it's slower than AFR....
well how? ati doesnt let us force the different modes like nvidia does

  8. #33
    Xtreme Member
    Join Date
    May 2006
    Posts
    313
nvidia does? they don't!

there is a tool that lets you choose the profile, but that does not come from nvidia.

Anyway, you can change it with ATI by just renaming the executable... it's really that easy.
    System : E6600 @ 3150mhz, Gigabyte DS3, 4gb Infineon 667mhz, Amd-Ati X1900XT

  9. #34
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
hmmm i never understood why ati doesn't let people choose the rendering modes themselves... i mean honestly, why?

cause they don't want nvidia to know when they use which modes and how efficient they are? as if nvidia couldn't tell. THEY have ways to find out what mode is running and know how to force a given mode on a xfire system, for sure.

imo it's a typical stupid "customers shouldn't get their dirty paws on this" decision... too bad really...

  10. #35
    Xtreme Member
    Join Date
    May 2006
    Posts
    313
saaya plz tell me where nvidia lets the customers choose the rendering mode.
    System : E6600 @ 3150mhz, Gigabyte DS3, 4gb Infineon 667mhz, Amd-Ati X1900XT

  11. #36
    Xtreme Member
    Join Date
    Aug 2002
    Posts
    215
    Quote Originally Posted by saaya View Post
the tool you're explaining doesn't solve the problem though, it will only be a cosmetic change in that the frame times will look fine, but as i explained above, the game will still stutter... the stutter would actually be perceived as even worse cause you stretch the time between the two nearly identical frames!
Nono, the tool WORKED. I have it here and the stuttering almost disappeared completely.

All the tool did was GUARANTEE a certain amount of time to wait before a certain frame gets displayed.

I'll give you an example:

Frame A wanted to be displayed after 10ms, which is way too fast, because the NEXT frame would then come out after 50ms, because @ 30fps we have ~60ms of time for two frames. This is what i call a 10-50 cadence. The 50ms part is responsible for the stuttering, as it represents a framerate of 20fps (1000/50=20), which is way below 30fps.
So the goal is to get rid of those 50ms.

So when the tool ADDs another 20ms to those 10ms of frame A, it will be displayed at 30ms, AND THE NEXT FRAME will come out in the NEXT 30ms -> a perfect 30-30 cadence. Since we brought the 50ms down to 30ms we will NOT see stuttering.

So this tool does NOT "swallow" frames, it just delays them to shorten the lag of the following frames. ALL frames get rendered!
If frame A took 15ms, the additional time added would only be 15ms.

    This tool, although not finished, IS working.
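A worked version of that 10-50 example as a quick python sketch (my own illustration, not tombman's actual tool): the pair has ~60 ms in total, so the per-frame budget is 30 ms, and a frame that pops out early simply gets held until its budget is used up.

Code:
target_frame_ms = 30.0            # per-frame budget in the 10-50 example above

def delay_fast_frame(actual_ms):
    """Hold an early frame until the per-frame budget has passed; never drop it."""
    wait = max(0.0, target_frame_ms - actual_ms)
    return actual_ms + wait, wait

print(delay_fast_frame(10.0))     # (30.0, 20.0): shown at 30 ms after 20 ms of added wait
print(delay_fast_frame(15.0))     # (30.0, 15.0): matches the "frame A takes 15 ms" case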
    ==
    Now what would make this tool perfect? Intelligence of course

All a perfect tool would have to do is look at the previous 2 frame times, AVERAGE them ((A+B)/2) and then take that as the target frame time for the next 2 frames.

So e.g.: 23ms A + 56ms B = 79ms render time for 2 frames -> 39.5ms render time for 1 frame -> EXPECTED render time for the next 2 frames = 39.5ms each -> waiting time for the next 2 frames = 39.5 - actual frame time -> actual frame times measured: 10ms + 15ms (assuming the game is getting faster) -> waiting time for those 2 NEW frames is 29.5ms (39.5-10) and 24.5ms (39.5-15) -> the expectation has to be adjusted -> the next target is [(10+15)/2 = 12.5ms] minus the actual frame time, and so on and on and on....

So a tool would always look at the previous 2 (SLI), 3 (triple SLI) or 4 (quad SLI) frame times, calculate an average frame time and then set a minimum wait for the next 2, 3 or 4 frames.
It just SLOWS down the frames that come out too fast, to give the following frame(s) enough time. That's the trick.

Disadvantages: of course a tool can only react 2 to 4 frames late to changing framerates, because it has to look at the past to predict new waiting times. So it WOULD react to changing conditions, e.g. looking at the sky in a game will decrease the load and increase fps, but only with a delay of 2 to 4 frames. So this will cost a small amount of average fps during a benchmark session, but gameplay will be 1000 times better than without the tool.
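Here is that averaging logic written out as a sketch (hypothetical python, not the real tool; 2 GPUs assumed): the average of the last pair becomes the target spacing for the next pair, exactly as in the numbers above.

Code:
class FramePacer:
    """Pace frames using the average of the previous pair (2 GPUs / plain SLI assumed)."""

    def __init__(self, gpu_count=2):
        self.n = gpu_count
        self.pending = []        # raw frame times of the pair currently being measured
        self.target = None       # spacing (ms) enforced for the current pair

    def wait_for(self, raw_frame_ms):
        """Return how long to hold this frame before presenting it."""
        wait = 0.0
        if self.target is not None:
            wait = max(0.0, self.target - raw_frame_ms)
        self.pending.append(raw_frame_ms)
        if len(self.pending) == self.n:              # pair complete -> new expectation
            self.target = sum(self.pending) / self.n
            self.pending = []
        return wait

pacer = FramePacer()
for raw in [23.0, 56.0, 10.0, 15.0]:                 # the numbers from the example above
    print(round(pacer.wait_for(raw), 1))
# -> 0.0, 0.0, 29.5, 24.5 : both fast frames are held back to the 39.5 ms expectation
#    from the 23/56 pair, then the target drops to 12.5 ms for the next pair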
    ==
The easiest way would be to implement this kind of smoothing directly into the driver. The driver wouldn't have to measure more than 1 frame to know how long it took to render 1 frame, so it could react perfectly quickly to the next frame.

    p.s: i hope i did not miscalculate anything, but it is late here in vienna and i'm tired

  12. #37
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
Can any of you share that tool? I've been suffering from the problem you (tombman) have described for ages, and it was one of the reasons I dropped Crossfire. If the tool actually works as you say it does, that would be PERFECT!
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  13. #38
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by realsmasher View Post
saaya plz tell me where nvidia lets the customers choose the rendering mode.
    Afaik you can create a custom game profile and in this profile you can select what rendering modes you want the driver to use?

I haven't run sli or xfire for a looong time so i'm not sure.

Btw, i did some research and apparently you CAN set the rendering modes, although not all of them, with xfire as well? you have to set catalyst ai to advanced and then you can select between 2 rendering modes or something?

    Quote Originally Posted by tombman View Post
Nono, the tool WORKED. I have it here and the stuttering almost disappeared completely.

All the tool did was GUARANTEE a certain amount of time to wait before a certain frame gets displayed. [...]
the method you are explaining would prevent flickering but not the stuttering.

the frames would be displayed at a constant frames-per-second rate, but the frame times, the times when the frames were actually calculated and hence the points in time the frames show IN the game, are still not constant.

see it this way, you want to draw your own little cartoon movie.
if you look at a certain amount of pictures per second it looks fluid.
the more pictures per second you flip through, the more fluid.
but drawing extra pictures is a lot of work...
if you take the pictures you already have and just make a copy of each one, you have twice the amount of pictures. so now you can show more pictures per second and have it look more fluid, right? of course not, cause the pictures are still the same, you will just look at the same picture twice, which will not make it the slightest bit more fluid, and all movements still stutter.

that's the problem, the frames get rendered at almost the same time, which means they show almost the same point in time in the game, which means they are almost identical. so yes, reorganizing the times the frames get displayed makes the scene flicker less and look smoother, but it doesn't actually make the game smooth, it will still stutter, actually it will stutter even more cause for an even longer period you will see the same frame, or almost the same frame.

effectively, by changing the frame times you are changing the speed at which time passes in the game.
this would actually reinforce the stuttering and make it even worse.
at least in theory... no idea how vsync and display technology and the human eye's perception etc etc all play into this

if your tool indeed improved the perceived game quality, as in making it look and feel smoother, then we could use it with single card systems as well.
we could simply insert extra frames, up to 100% more frames, by simply re-using existing frames and showing every 10th or even every single frame more than once. this would make the fps go way up artificially. i have my doubts that frame insertion actually helps the smoothness of a game... afaik both nvidia and ati experimented with this but never used it since it doesn't work and has side effects like manipulating the in-game time.
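A tiny illustration of that point (made-up numbers in python): pace the display times perfectly and the on-screen cadence looks fine, but the in-game timestamps baked into the frames are still bunched together, so the motion itself still jumps.

Code:
sim_times     = [0.0, 5.0, 33.0, 38.0, 66.0, 71.0]      # in-game time each frame depicts (ms)
display_times = [0.0, 16.5, 33.0, 49.5, 66.0, 82.5]     # evenly spaced presentation

display_steps = [round(b - a, 1) for a, b in zip(display_times, display_times[1:])]
sim_steps     = [round(b - a, 1) for a, b in zip(sim_times, sim_times[1:])]

print(display_steps)   # [16.5, 16.5, 16.5, 16.5, 16.5] -> looks perfectly paced on screen
print(sim_steps)       # [5.0, 28.0, 5.0, 28.0, 5.0]    -> the game world still moves in jumps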
    Last edited by saaya; 02-11-2008 at 12:37 AM.

  14. #39
    Xtreme Member
    Join Date
    May 2006
    Posts
    313
    Afaik you can create a custom game profile and in this profile you can select what rendering modes you want the driver to use?
there is a tool for setting the rendering mode, but it does not come from nvidia. It's 3rd party.

you can't blame ati for not opening up the rendering modes if nvidia doesn't do it either.

SLI was for a long time (and still is today) much more widely used than CF, so it's understandable why a 3rd party tool exists for SLI but not for CF.



the frames would be displayed at a constant frames-per-second rate, but the frame times, the times when the frames were actually calculated and hence the points in time the frames show IN the game, are still not constant.

you're thinking way too statically.

frame 1 comes at 0ms
frame 2 wants to come at 10ms but is delayed to 30ms
frame 3 comes at 60ms
frame 4 comes NOT at 70ms; since it was delayed the first time and it will also take 60ms to render, it will come at 90ms


-> no problem anymore.
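The same sequence written out in python (my sketch; the key assumption here is that holding frame 2 back also pushes back when that card starts rendering its next frame):

Code:
render_interval = 60.0                    # ms between one card's own frames in this example

frame1 = 0.0                              # shown at 0 ms
frame2 = max(10.0, 30.0)                  # wanted to come at 10 ms, held until 30 ms
frame3 = 60.0                             # the other card, unaffected
frame4 = frame2 + render_interval         # the earlier hold pushes this one to 90 ms

shown = [frame1, frame2, frame3, frame4]
print(shown)                                           # [0.0, 30.0, 60.0, 90.0]
print([b - a for a, b in zip(shown, shown[1:])])       # steady 30 ms gaps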



@ tombman: could you please upload the tool? i don't suppose you have the source code?

Maybe i can help with it. If the method for delaying frames is already implemented it shouldn't be too hard to improve the timing.
    System : E6600 @ 3150mhz, Gigabyte DS3, 4gb Infineon 667mhz, Amd-Ati X1900XT

  15. #40
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by cadaveca View Post
    AMD is too focused on speed...and not on quality...which is too bad.
it's not like nvidia is any different... :P
THEY brought SLI back to life in its current incarnation, which imo is flawed. ati just followed the market demand and took a similar approach cause nvidia's SLI was so successful as a marketing tool.

    Quote Originally Posted by realsmasher View Post
    there is a tool for setting rendering mode, but it does not come from nvidia. It's 3rd party.
    hmmm i didnt know that, this is a shame then...

    Quote Originally Posted by realsmasher View Post
you can't blame ati for not opening up the rendering modes if nvidia doesn't do it either.
yes i can, ati should do it cause it's the right thing to do to give people access to the hardware they spend 1-2 months' worth of their salary on.
regardless of what the competition does!

    Quote Originally Posted by realsmasher View Post
you're thinking way too statically.
and you don't seem to know how a gpu works :P

    Quote Originally Posted by realsmasher View Post
frame 1 comes at 0ms
frame 2 wants to come at 10ms but is delayed to 30ms
frame 3 comes at 60ms
frame 4 comes NOT at 70ms; since it was delayed the first time and it will also take 60ms to render, it will come at 90ms
that's not how it works... the frame doesn't "want" to come at a certain time and then get pushed back. afaik that's impossible. the frame DOES come at the time the gpu finishes it, this tool can't influence when the frame gets rendered, all the tool can do is hold the frame in the frame buffer for a bit longer, hide it from you and show it to you a few milliseconds later. that's all.

the time the frame actually gets rendered at, afaik, can't be controlled by a little tool. you need to know a LOT about how the gpu works and need to influence the entire rendering process to adjust the times the frames actually get rendered at. this is very very complex, and predicting when the gpu is ready to render the next frame and how long it will take, to make sure frame 2 isn't finished before frame 1, or 2 frames aren't finished at the same time, or there isn't a long period without any frames being rendered at all, takes some of the smartest people on this planet to figure out and adjust.
    Last edited by saaya; 02-11-2008 at 03:14 AM.

  16. #41
    Xtreme Mentor
    Join Date
    Mar 2006
    Location
    Evje, Norway
    Posts
    3,419
So delaying the picture after it's rendered won't actually address the problem, since the error has already been made by that time. You would have to delay when the card starts to render the image in the first place, so that when it's done it actually hits 30ms later than the first pic like it's supposed to.
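Roughly what that would look like in numbers (illustrative python only; a real driver would have to do this inside the render queue, not a little tool): delay when the second card is allowed to start, and the finished frames already come out evenly spaced.

Code:
render_ms = 33.0                          # each card needs ~33 ms per frame

start_a, start_b = 0.0, 3.0               # card B kicks off almost immediately
print(start_a + render_ms, start_b + render_ms)          # 33.0, 36.0 -> only 3 ms apart

start_b_paced = start_a + render_ms / 2.0                # let card B start half a frame later
print(start_a + render_ms, start_b_paced + render_ms)    # 33.0, 49.5 -> evenly spaced output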
    Quote Originally Posted by iddqd View Post
    Not to be outdone by rival ATi, nVidia's going to offer its own drivers on EA Download Manager.
    X2 555 @ B55 @ 4050 1.4v, NB @ 2700 1.35v Fuzion V1
    Gigabyte 890gpa-ud3h v2.1
    HD6950 2GB swiftech MCW60 @ 1000mhz, 1.168v 1515mhz memory
    Corsair Vengeance 2x4GB 1866 cas 9 @ 1800 8.9.8.27.41 1T 110ns 1.605v
    C300 64GB, 2X Seagate barracuda green LP 2TB, Essence STX, Zalman ZM750-HP
    DDC 3.2/petras, PA120.3 ek-res400, Stackers STC-01,
    Dell U2412m, G110, G9x, Razer Scarab

  17. #42
    Registered User
    Join Date
    Oct 2005
    Location
    Vienna/Austria
    Posts
    31
    Quote Originally Posted by saaya View Post
the time the frame actually gets rendered at, afaik, can't be controlled by a little tool. you need to know a LOT about how the gpu works and need to influence the entire rendering process to adjust the times the frames actually get rendered at. this is very very complex, and predicting when the gpu is ready to render the next frame and how long it will take, to make sure frame 2 isn't finished before frame 1, or 2 frames aren't finished at the same time, or there isn't a long period without any frames being rendered at all, takes some of the smartest people on this planet to figure out and adjust.

    saaya, you are from germany, so read this thread: http://www.forum-3dcenter.org/vbulle...d.php?t=389258

The tool was programmed by a guy (Nick Ludi, he appeared on page 12) who is an ATI employee. But there are licence problems with a microsoft library he used for his tool. So he removed the tool from his server, and he has no time to create his own library. He also said that he will not give the source code to anyone.
    Last edited by Slipknot; 02-11-2008 at 03:38 AM.

  18. #43
    Xtreme Addict
    Join Date
    Mar 2004
    Location
    Toronto, Ontario Canada
    Posts
    1,433
    This thread should be stickied, everyone should know about it and complain so that nv and ATi will do something.

  19. #44
    Registered User
    Join Date
    Oct 2006
    Posts
    22
    Hello !

this thread is right up my alley... sadly.
I'm experiencing the symptoms of stutter, with fps dropping by 50% and back to 100% in 1-second cycles. The odd thing is that the cards stuttered when I first installed them with my water cooling. Upon fiddling with the waterblocks on the gpu's... and hence the cards themselves... I found that the stutter disappeared. I could still SEE stutter visually although the fps remained near 100% of potential (minor and playable).

Well things were running almost acceptably (with UT3)... but I needed to clean out the water in my cooling loop as I'm still learning the nuances of keeping the loop clean.
Once again... upon reinstalling the blocks and cards... the horrible stutter. This time, fiddling some more with the card seating did not fix it. I gave up and reformatted the whole darned computer. With the new install of Vista Ultimate x64 and ATI's 8.2 drivers, the stutter was back to normal amounts (double the fps of 1 card... just less smooth than 1 card). I was almost happy! 8-0

Well here we are again, with the final reseating of the waterblocks. The CPU never goes higher than 40 degrees at 100% load and the gpu's at max load stay below 45 degrees.
The massive stutter returned, this time with the visual cue of the fps dropping to exactly 1/2 and back to full two-card output in 1-second cycles.

I have used the 8.2 drivers for both the balanced fps result (less stutter, NOT stutterless) and the wildly fluctuating fps. The only difference was actually touching the cards... or reinstalling the drivers and OS. ODD... very odd.

I even tried the Vista multiple-GPU hotfixes that address the issue of the windows desktop manager not allotting data to the second card properly... to no avail.
Hopefully they might help others with the stutter. Here they are...

    KB945149-Multiple GPU Hotfix
    KB945149

    KB936710-DX10 & Multiple GPU Hotfix
    http://www.microsoft.com/downloads/d...DisplayLang=en

    KB940105-Virtual address space usage in Windows game development Hotfix
    http://www.microsoft.com/downloads/d...DisplayLang=en

    Hope they help some.

4 generations of crossfire later, I'm still waiting for ATI to write a comprehensive guide on crossfire use, not to mention a driver that delivers on their promises.
Shameful... in any other industry, the consumer protection agencies would get involved.

1 question about PowerPlay... from the ATI help files that load with CCC, it states: "PowerPlay™ for mobile computers manages the power requirements of the GPU by optimizing the graphics settings for higher performance or longer battery life. Use PowerPlay™ to balance performance and power consumption".
Is this why I can't find any settings for PowerPlay like I am supposed to, under the "Graphics Settings Tree" (ATI's description)? I'm definitely not on anything mobile (the CoolerMaster Cosmos case weighs 35 lbs. when empty). I only see a reference to PowerPlay in my Power Options... with a greyed-out menu for settings.

    Edit: A great article on micro stutter with multiple gpu's from a German site (pre translated for your convenience )
    http://www.pcgameshardware.de/?article_id=631668
    Last edited by Jungle+=; 02-22-2008 at 03:56 PM.
    ASUS Maximus Formula SE
    Intel Core2 Extreme QX9650
    Radeon 2X HD3870 X-fire
    4GB OCZ DDR2-1150 PC2-9200 Flex LC
    Thermochill PA120.3 / Swiftech MCP655 Pump/ D-Tek Blocks
    Samsung SyncMaster 226BW 22" LCD monitor
    SB X-Fi Platinum
    RAIDED Raptors
    OCZ Game X Stream 1010W PSU
    Logitech MX 3000 K/B & G7 2000 DPI mouse
    CoolerMaster Cosmos Full Tower case

  20. #45
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Jungle+= View Post

    Edit: A great article on micro stutter with multiple gpu's from a German site (pre translated for your convenience )
    http://www.pcgameshardware.de/?article_id=631668
    this is the article this thread is about! :P

    tombman, in the thread on 3dc you said that the hl2 integrated frame limiter already fixes this problem?
    can somebody check this? does it just delay frames or does it really help to get more stable fps?
    Last edited by saaya; 02-22-2008 at 05:23 PM.

  21. #46
    Registered User
    Join Date
    Oct 2006
    Posts
    22
    Quote Originally Posted by saaya View Post
    this is the article this thread is about! :P
Blushes......
Now I remember where I first found it!
Methinks that 2 days without proper sleep, wrestling my watercooling loop AND the subsequent crossfire issues, has made me delirious!
    ASUS Maximus Formula SE
    Intel Core2 Extreme QX9650
    Radeon 2X HD3870 X-fire
    4GB OCZ DDR2-1150 PC2-9200 Flex LC
    Thermochill PA120.3 / Swiftech MCP655 Pump/ D-Tek Blocks
    Samsung SyncMaster 226BW 22" LCD monitor
    SB X-Fi Platinum
    RAIDED Raptors
    OCZ Game X Stream 1010W PSU
    Logitech MX 3000 K/B & G7 2000 DPI mouse
    CoolerMaster Cosmos Full Tower case

  22. #47
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    interesting... i read the entire thread and maaaaannn, its full of kids insulting each other and stupid ego battles... geez...

    so in the end, it seems some ati driver dev took the time and wrote a little tool that constantly stalls one of the gpus to keep the frame times of both gpus exactly in sync. the tool was removed since he uses a microsoft dll and either never received a reply from microsoft about using it, or received a reply telling him to stop distributing the dll.

    is this the tool?


  23. #48
    Registered User
    Join Date
    Oct 2005
    Location
    Vienna/Austria
    Posts
    31
Dunno where you got the frame limiter from, but Ludi's program is command line based:

    http://rapidshare.com/files/85231450/fpslimiter_0_2.rar
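For reference, this is the general shape such a command-line fps limiter takes (my own python sketch, not Ludi's code; render() and present() are just stand-ins): enforce a minimum spacing between presented frames so none can pop out early.

Code:
import time

def render():                      # stand-in for the real render call
    time.sleep(0.005)              # pretend the GPU needed 5 ms for this frame

def present(i, t0):                # stand-in for the real present/flip
    print("frame %d at %.3f s" % (i, time.perf_counter() - t0))

cap_fps = 30
interval = 1.0 / cap_fps
t0 = time.perf_counter()
next_slot = t0

for i in range(5):
    render()
    next_slot += interval
    sleep_for = next_slot - time.perf_counter()
    if sleep_for > 0:
        time.sleep(sleep_for)                  # hold the early frame until its slot
    else:
        next_slot = time.perf_counter()        # running behind: don't build up debt
    present(i, t0)
# frames come out ~33 ms apart even though rendering only took ~5 ms each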

  24. #49
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,550
    Looks like tombman is the professor here, everything he says makes perfect sense.

Now, with a single card, I see "tearing" with and without vsync on a 2405FPW LCD @ 60Hz.

However, the tearing noticeably decreases if a 100Hz CRT monitor is used.

Does the refresh rate also matter for micro-stuttering?

    ps: I also vote for a sticky, this is a very important issue

  25. #50
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
i can't get this. how can two cards getting the same clock from the pci-e bus not be synced? and i totally agree with saaya, this tool would also make single card systems smoother too.


    When i'm being paid i always do my job through.
