
Thread: Actual influence of flow rate on system temps

  1. #26
    Xtreme Enthusiast
    Join Date
    Feb 2008
    Location
    Canaduh
    Posts
    731
    I see, Gabe.
    Maybe give MSI Kombustor a try; they have separate apps for single and multi-GPU, so it may load both GPUs better than FurMark does.
    Intel i7 980x / 3001B331
    HK3.0+LaingDDCPRO+XSPCRX360+1xMCR220
    EVGA Classified X58+EK FB
    6GB Corsair Dominator GT 1866 7-8-7-20(TR3X6G1866C7GT)
    ASUS GTX580
    Enermax Revolution+85 950w
    Corsair Obsidian

  2. #27
    Xtreme Addict
    Join Date
    Jun 2007
    Posts
    1,442
    Quote Originally Posted by gabe View Post
    rge, I think you missed the part where we precisely showed the impact of the thermal load from the GPU at idle onto the CPU to be 0.34C, and the impact of reduced flow rate to be another 0.34C, for a total impact of adding 1 GPU to a CPU loop of 0.68C.
    Yep, I missed that you used the difference in water temp to filter out GPU idle heat from flow effects... I hadn't thought of doing that. I haven't played around with testing as much as you have; I would have to test with just the waterblock and no active GPU to confirm that changes in flow don't alter water temps in that circumstance, though you may have already done/know that.

    BTW, are you planning any testing with two 360 rads? One of the most common scenarios is one loop (or two) with two rads.

  3. #28
    Technician
    Join Date
    Jun 2002
    Location
    Merseyside, UK
    Posts
    2,661
    To test both, fire up FurMark 1.8 for multi-GPU to get 100% GPU load. FurMark's affinity is already set to core 0 only, so if you load your other cores with Prime (or LinX for max temps) as suggested, you'll be able to generate a huge heat dump. This is the only situation where my single loop struggles. You can even fire off SuperPi 32M on core 0 simultaneously to get 100% load on that core too, as FurMark only gives it about 33% load.
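    A rough sketch of that core-pinning in script form (the install paths, the Prime95 "-t" torture-test switch, and the SuperPi binary name are my own assumptions, not from the post; the same thing can be done by hand from Task Manager's Set Affinity dialog):

    ```python
    # Sketch only: load all cores except core 0 with Prime95, and core 0 with SuperPi,
    # while FurMark (which pins itself to core 0) handles the GPUs.
    import subprocess
    import psutil

    cores = list(range(psutil.cpu_count(logical=True)))

    # Prime95 torture test on cores 1..N ("-t" switch and path assumed).
    prime = subprocess.Popen([r"C:\Prime95\prime95.exe", "-t"])
    psutil.Process(prime.pid).cpu_affinity(cores[1:])

    # SuperPi 32M pinned to core 0, alongside FurMark's ~33% load on that core.
    superpi = subprocess.Popen([r"C:\SuperPi\super_pi_mod.exe"])  # binary name assumed
    psutil.Process(superpi.pid).cpu_affinity([0])
    ```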

  4. #29
    Xtreme Enthusiast
    Join Date
    Apr 2008
    Location
    France
    Posts
    950
    Quote Originally Posted by gabe View Post
    If your remark in parentheses "(with a CPU in the loop)" signifies that you believe that a dedicated loop will reverse the results, then do I have news for you! We also tested dedicated GPU loop performance, but did not publish the results yet in order to stay sharply focused on the topic. Parallel loop performance of these blocks was also superior to serial in a dedicated loop... this will be discussed in another chapter of this review.
    It was a reservation only, as I never considered a dedicated GPU loop. Good to know then; that means we will finally have solid results on the age-old parallel vs. series debate!

    24/7 running quiet and nice

  5. #30
    Xtreme Member
    Join Date
    May 2005
    Posts
    374
    Several years ago I had a very large and ugly watercooled rig with an adjustable 30/45/60 W pump. Changing pump power from 30 to 60 W lowered the CPU temps by only a measly 1-2°C. The water capacity and radiators were so large that the water temp didn't rise from the higher pump power output.

    The reason is that water has a large heat capacity compared to the CPU wattage; it's easy to calculate, too. The improvement is explained mainly by the higher turbulence in the waterblock, which makes the block work slightly better as a heat exchanger compared to low flow.
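    To put illustrative numbers on that calculation (the 100 W CPU load, 4 L/min flow, and 30 W of extra pump heat below are assumed example values, not measurements from that rig):

    ```python
    # Single-pass coolant temperature rise: dT = Q / (m_dot * c_p)
    # Assumed example numbers, chosen only to show the order of magnitude.
    c_p = 4186.0            # specific heat of water, J/(kg*K)
    flow_lpm = 4.0          # loop flow rate, litres per minute (assumed)
    m_dot = flow_lpm / 60.0 # kg/s, water density ~1 kg/L

    cpu_heat = 100.0        # W dumped by the CPU block (assumed)
    extra_pump_heat = 30.0  # extra W of pump heat going from 30 W to 60 W (assumed)

    dT_cpu  = cpu_heat / (m_dot * c_p)          # ~0.36 C per pass
    dT_pump = extra_pump_heat / (m_dot * c_p)   # ~0.11 C per pass

    print(f"Per-pass rise from CPU heat:   {dT_cpu:.2f} C")
    print(f"Per-pass rise from extra pump: {dT_pump:.2f} C")
    ```

    That per-pass figure is not the steady-state water temperature (which depends on the radiators), but it shows why tens of watts of extra pump heat barely register in a loop with large rads.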
    "I would never want to be a member of a group whose symbol was a guy nailed to two pieces of wood."

  6. #31
    Xtreme Addict
    Join Date
    Jan 2007
    Location
    Michigan
    Posts
    1,785
    Quote Originally Posted by gabe View Post
    I knew this would come up (Folding), and I downloaded Folding for GPU, both in console and command line, but I was unable to place any significant load onto the GPUs. Maybe I didn't wait long enough, maybe I didn't set the correct parameters, but GPU load remained at 1% or less. I think it might be possible, but I am just not sure how to do it.

    If any of the Folding@home folks are willing to guide me on how to place a constant 100% load on the GPUs, I'd be more than happy to do another set of tests, with the reservations noted below.
    Gabe, with BOINC just connect to MilkyWay@home. It can be set to use 100% GPU, and then you can fold with your CPU as normal on WCG or whatever you like. I'm using two loops so I can fold GPU and CPU 24/7... This kind of test data would be very interesting to me since I have three 120.2 rads across two loops.

    Finally, great work on the data above, and many thanks for sharing your findings. I found the data very concise and informative.
    Last edited by Vinas; 06-07-2010 at 07:32 AM.
    Current: AMD Threadripper 1950X @ 4.2GHz / EK Supremacy/ 360 EK Rad, EK-DBAY D5 PWM, 32GB G.Skill 3000MHz DDR4, AMD Vega 64 Wave, Samsung nVME SSDs
    Prior Build: Core i7 7700K @ 4.9GHz / Apogee XT/120.2 Magicool rad, 16GB G.Skill 3000MHz DDR4, AMD Saphire rx580 8GB, Samsung 850 Pro SSD

    Intel 4.5GHz LinX Stable Club

    Crunch with us, the XS WCG team

  7. #32
    Mr Swiftech
    Join Date
    Dec 2002
    Location
    Long Beach, CA
    Posts
    1,561
    Quote Originally Posted by rge View Post
    I would have to test with just the waterblock and no active GPU to confirm that changes in flow don't alter water temps in that circumstance, though you may have already done/know that.
    Yes, it's another way to do it, but unnecessarily time-consuming. The rise in coolant temp due to GPU heat is unequivocally shown by [(ΔT water-to-air, test 2) - (ΔT water-to-air, test 1)]; when you subtract this result from [(ΔT CPU-to-air, test 2) - (ΔT CPU-to-air, test 1)], you isolate the rise in temp due to pressure drop, since it's the only factor left.
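    A worked example of that bookkeeping, using the 0.34C / 0.34C / 0.68C split quoted earlier in the thread (the raw ΔT values below are invented for illustration; only the differences matter):

    ```python
    # Isolating the flow-rate penalty from the GPU-heat penalty, per the method above.
    dT_water_air_test1 = 5.00   # CPU-only loop: water-to-air delta (illustrative)
    dT_water_air_test2 = 5.34   # GPU block added, GPU idle (illustrative)
    dT_cpu_air_test1   = 15.00  # CPU-only loop: CPU-to-air delta (illustrative)
    dT_cpu_air_test2   = 15.68  # GPU block added, GPU idle (illustrative)

    gpu_heat_penalty  = dT_water_air_test2 - dT_water_air_test1   # 0.34 C: warmer coolant
    total_cpu_penalty = dT_cpu_air_test2 - dT_cpu_air_test1       # 0.68 C: total CPU rise
    flow_penalty      = total_cpu_penalty - gpu_heat_penalty      # 0.34 C: flow/pressure-drop loss

    print(gpu_heat_penalty, flow_penalty, total_cpu_penalty)
    ```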

    Quote Originally Posted by rge View Post
    BTW, are you planning any testing with two 360 rads? One of the most common scenarios is one loop (or two) with two rads.
    This describes our Xtreme bench (also mentioned in the body of the article). People have complained that it is too extreme. What we have done (not yet published, though) is test dual loops with a 320 and a 220. I am trying to test configurations that can be easily (without major mods) integrated inside a case, which I believe represents what the majority of people would like to be able to accomplish. The very popular Cosmos S, for example, can integrate a triple and a dual without any major mods.

    Quote Originally Posted by Vinas View Post
    Gabe, with BOINC just connect to MilkyWay@home. It can be set to use 100% GPU, and then you can fold with your CPU as normal on WCG or whatever you like. I'm using two loops so I can fold GPU and CPU 24/7... This kind of test data would be very interesting to me since I have three 120.2 rads across two loops.

    Finally, great work on the data above, and many thanks for sharing your findings. I found the data very concise and informative.
    I did that already, and it is set to use the GPU, but all I get is 57% load on GPU #2. I need to reach 100% on both GPUs in order to have comparable results.
    CEO Swiftech

  8. #33
    Xtreme Addict
    Join Date
    Jun 2007
    Posts
    1,442
    Quote Originally Posted by gabe View Post
    This describes our Xtreme bench (also mentioned in the body of the article). People have complained that it is too extreme. What we have done (not yet published, though) is test dual loops with a 320 and a 220. I am trying to test configurations that can be easily (without major mods) integrated inside a case, which I believe represents what the majority of people would like to be able to accomplish. The very popular Cosmos S, for example, can integrate a triple and a dual without any major mods.
    Actually, that is even better. I have a Lian Li A71F, and it accommodates a 3x120 rad up top and a 2x120 or 2x140 rad in front without mods (buy the Lian Li replacement top, which has screws instead of rivets; easy switch), and that is what I am using.

  9. #34
    Mr Swiftech
    Join Date
    Dec 2002
    Location
    Long Beach, CA
    Posts
    1,561
    Quote Originally Posted by rge View Post
    Actually, that is even better. I have a Lian Li A71F, and it accommodates a 3x120 rad up top and a 2x120 or 2x140 rad in front without mods (buy the Lian Li replacement top, which has screws instead of rivets; easy switch), and that is what I am using.
    Excellent; this validates my hunch that triple + dual is a very common setup.
    CEO Swiftech

  10. #35
    I am Xtreme
    Join Date
    Feb 2003
    Location
    Netherlands
    Posts
    6,421
    Thanks for this very nice test. I would have expected to see better GPU temps with the VGA blocks in series, but your tests proved otherwise here. This convinced me to get a second D-plug to run my video cards in parallel. I owe you a beer.
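    For what it's worth, here is one simplified way to see why parallel can come out ahead; this is my own toy model assuming each block's pressure drop scales with the square of the flow through it, not something taken from Swiftech's article:

    ```python
    # Toy restriction model: dP = k * Q**2 for one block (k and Q in arbitrary units).
    k = 1.0   # per-block restriction coefficient (assumed)

    def dp_series(q_total):
        # Both blocks carry the full flow, so their drops add.
        return 2 * k * q_total ** 2

    def dp_parallel(q_total):
        # Flow splits evenly between two identical blocks.
        return k * (q_total / 2) ** 2

    q = 1.0
    print(dp_series(q) / dp_parallel(q))   # 8.0 -> series is 8x more restrictive here
    ```

    At equal total flow the series pair is about eight times more restrictive, so the whole loop flows faster in parallel, at the cost of each block seeing only half the coolant; whether that nets out to better temps is exactly what the measured data answers.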
    Asus Crosshair V Formula-Z | FX 8350 | 2x4GB Trident-X 2600 C10 | 2x ATI HD5870 Crossfire | Enermax Revo 1050watt | OCZ Vertex 3 60GB | Samsung F1 1TB

    Watercooling: XSPC Raystorm | EK 5870 Delrin fullcover | TFC X-changer 480 w/ 4x Gentle Typhoon | DDC2+ Delrin top | EK 200mm res | Primochill LRT 3/8 tubing

    Case: Murdermodded TJ-07

    sub 9 sec. SPi1M 940BE 955BE 965BE 1090T

  11. #36
    Registered User
    Join Date
    Jan 2010
    Posts
    12
    Gabe: Thanks for the insightful and well-executed tests!

    I may be getting ahead of myself here, based on your comment above about upcoming results for a single vs. dual loop comparison; however, I recently upgraded my system and could use some advice in arranging my loops.

    I have (1) quad radiator and (3) triple radiators in my Mountain Mods Ascension case at the moment. Combined with (2) of your MCP655 pumps, I am attempting to determine the optimal loop configuration to cool (1) i7 930 on a Swiftech XT waterblock and (2) EVGA GTX 480s with aftermarket waterblocks.

    I currently have two dedicated loops (i.e., loop one is CPU only; loop two contains both GPUs in series, separated by a single triple radiator). All three triple radiators are in the GPU loop, and the single quad is cooling the CPU loop.

    Based on the results above, and a recent post Vapor made here, with which you had hinted some agreement, I am considering putting everything in a single loop with the GPUs in parallel. Perhaps all three in parallel?

    I would be grateful for any help you, or any other knowledgeable XS member, could offer.
    Last edited by TradeWind; 06-08-2010 at 04:25 PM.

  12. #37
    Xtreme Member
    Join Date
    Oct 2009
    Location
    Canada
    Posts
    320
    Quote Originally Posted by gabe View Post
    Excellent; this validates my hunch that triple + dual is a very common setup.
    +1 for that

    My current loop is an MCR320, XT, MCP650 (still running after 5.5 years), and MicroRes V1, with 1/2 inch ID tubing. The 320 is mounted with a Radbox and a bracket at the bottom, on the rear of my Armor case.

    I have an MCR220 that I'm thinking of adding internally at the front of the case, and a MicroRes V2 (adding a thermal sensor and a BigNG) and an MCP655 have been ordered.

    So the question for me (once I finally order a block for my 5850) is one loop or two, since I'll have all the components for two.

    Now to decide what to do with my old MCR120 and the 3/8-inch GPU and chipset blocks!

    Nicely presented, useful data, Gabe.
    *in progress*
    AMD FX-8350
    Asus Crosshair V Formula Z
    2X8GB G.Skill Trident X DDR3-2400 C10
    2X Sapphire Radeon R9 290 Tri-X
    D5|EK Res/top|2X Swiftech MCR320XP|EK Supremacy CPU|2X EK 290X Acetal Nickel
    Seasonic M12D 850w
    Fractal Design Arc Midi R2
    T-Balancer MiniNG
    Western Digital Caviar Black 2TB
    Windows 7 Home Premium 64 bit

    My last intel cpu was a celeron 300a. My first computer was a TI-99/4!

  13. #38
    Mr Swiftech
    Join Date
    Dec 2002
    Location
    Long Beach, CA
    Posts
    1,561
    Quote Originally Posted by Grinder View Post
    +1 for that

    My current loop is an MCR320, XT, MCP650 (still running after 5.5 years), and MicroRes V1, with 1/2 inch ID tubing. The 320 is mounted with a Radbox and a bracket at the bottom, on the rear of my Armor case.

    I have an MCR220 that I'm thinking of adding internally at the front of the case, and a MicroRes V2 (adding a thermal sensor and a BigNG) and an MCP655 have been ordered.

    So the question for me (once I finally order a block for my 5850) is one loop or two, since I'll have all the components for two.

    Now to decide what to do with my old MCR120 and the 3/8-inch GPU and chipset blocks!

    Nicely presented, useful data, Gabe.
    One loop will give you both performance and redundancy, thus reliability.
    CEO Swiftech

  14. #39
    Xtreme Member
    Join Date
    Oct 2009
    Location
    Canada
    Posts
    320
    Quote Originally Posted by gabe View Post
    One loop will give you both performance and redundancy, thus reliability.
    Thanks, Gabe. Although it hasn't run 24/7, I know I'll feel better about the 5.5-year-old MCP650 if it has a 655 backing it up. And the extra flow won't hurt! I do plan to monitor the flow with a T-Balancer set up to shut things down if there are any major changes.

    May I suggest that an RPM lead on the non-B 655 would be most welcome?

    I'm assuming that my old 120 probably wouldn't add much in that loop. I'm guessing part of the reason a 320/220 combo is so common is that people have parts left over after upgrading. I know my upgrade path was 120 -> 220 -> 320.

    And thanks for the value for the money over the years. I should send you a pic of my homemade MCW6000 AM3 bracket; I just retired that block and replaced it with an XT.
    *in progress*
    AMD FX-8350
    Asus Crosshair V Formula Z
    2X8GB G.Skill Trident X DDR3-2400 C10
    2X Sapphire Radeon R9 290 Tri-X
    D5|EK Res/top|2X Swiftech MCR320XP|EK Supremacy CPU|2X EK 290X Acetal Nickel
    Seasonic M12D 850w
    Fractal Design Arc Midi R2
    T-Balancer MiniNG
    Western Digital Caviar Black 2TB
    Windows 7 Home Premium 64 bit

    My last intel cpu was a celeron 300a. My first computer was a TI-99/4!

  15. #40
    Xtreme Guru
    Join Date
    Dec 2009
    Location
    Latvia, Riga
    Posts
    3,972
    I never could understand the reason for the RPM lead being soldered only on the 655B/D5 non-Vario. Of course, it's not too difficult a mod to solder one onto the Vario, but why oh why should the more expensive pump have one less feature? I simply don't get the marketing reasoning behind that.

  16. #41
    Xtreme Guru
    Join Date
    May 2009
    Location
    Southfield, MI
    Posts
    4,128
    Nice, Gabe.

    It's nice to see some real-world testing showing that for most systems you don't really need a dual-loop setup.

    It would be interesting to see some test results with the GPUs under load too.

    Thanks for taking the time.
    Project Millertime: The Core I5 build

    Crunching/folding box on air: AMD Athlon X2 7750 Black Edition; Sapphire Radeon HD 4830; Gigabyte MA78GM-US2H; Lian Li PC-V351; Windows 7 RC

  17. #42
    Chasing After Diety
    Join Date
    Jan 2007
    Location
    Absolutely Speachless :O
    Posts
    11,930
    "GPU load tests: We used Furmark in extreme burning mode, windowed in 1920x1050, post processing off to enable 100% load to both GPU’s in SLI configuration, and logged the temperature results at 2 seconds intervals with GPUZ."

    Gabe, it says you ran the SLI test windowed?

    Do you remember at CES I told you SLI will not work unless you run it fullscreen?

    And you did that and saw a difference in temps. Then you said, "Oh, it's fine, let it run for a while," and we walked away.

    Gabe, you need to run fullscreen on SLI setups, like I showed you at CES.

    I don't think FurMark changed this...
    Windowed mode does not allow GPU scaling to occur in Direct3D.
    You need fullscreen for GPU scaling to occur.

    This is why you can't play games with CrossFire or SLI in windowed mode.
    Last edited by NaeKuh; 06-21-2010 at 03:04 PM.
    Nadeshiko: i7 990 12GB DDR3 eVGA Classified *In Testing... Jealous? *
    Miyuki: W3580 6GB DDR3 P6T-Dlx
    Lind: Dual Gainestown 3.07
    Sammy: Dual Yonah Sossoman cheerleader. *Sammy-> Lind.*

    [12:37] skinnee: quit helping me procrastinate block reviews, you asshat. :p
    [12:38] Naekuh: i love watching u get the firing squad on XS
    Its my fault.. and no im not sorry about it either.

  18. #43
    Xtreme Mentor
    Join Date
    Oct 2007
    Location
    USA
    Posts
    2,622
    Quote Originally Posted by NaeKuh View Post
    "GPU load tests: We used Furmark in extreme burning mode, windowed in 1920x1050, post processing off to enable 100% load to both GPU’s in SLI configuration, and logged the temperature results at 2 seconds intervals with GPUZ."

    Gabe, it says you ran the SLI test windowed?

    Do you remember at CES I told you SLI will not work unless you run it fullscreen?

    And you did that and saw a difference in temps. Then you said, "Oh, it's fine, let it run for a while," and we walked away.

    Gabe, you need to run fullscreen on SLI setups, like I showed you at CES.

    I don't think FurMark changed this...
    Windowed mode does not allow GPU scaling to occur in Direct3D.
    You need fullscreen for GPU scaling to occur.

    This is why you can't play games with CrossFire or SLI in windowed mode.
    I was there. I saw the temps later; they were higher.
    All stock for now, no need for more, but it's gonna be soon methinks.
    Giga Xtreme 58 mobo i7 965 ES D0 step Corsair 1600 6 gig
    SLI GTX470 EVGA
    EK HF nickle blue top CPU block (free from Eddie)
    Koolance 470 waterblocks
    One big loop, two 120x3 rads. Pa 120.3 and XSPC RX 120x3. Swiftech 35x pump with V2 restop. GT AP15 fans.
    Banchetto Tech Station
    120 GB SSD, and a few other drives.
    1000W UltraX3 PSU, 900 watt (1500VA) UPS
    23.999" Acer GD235hz and 24" Acer H243H

  19. #44
    Mr Swiftech
    Join Date
    Dec 2002
    Location
    Long Beach, CA
    Posts
    1,561
    Quote Originally Posted by NaeKuh View Post
    "GPU load tests: We used Furmark in extreme burning mode, windowed in 1920x1050, post processing off to enable 100% load to both GPU’s in SLI configuration, and logged the temperature results at 2 seconds intervals with GPUZ."

    Gabe, it says you ran the SLI test windowed?

    Do you remember at CES I told you SLI will not work unless you run it fullscreen?

    And you did that and saw a difference in temps. Then you said, "Oh, it's fine, let it run for a while," and we walked away.

    Gabe, you need to run fullscreen on SLI setups, like I showed you at CES.

    I don't think FurMark changed this...
    Windowed mode does not allow GPU scaling to occur in Direct3D.
    You need fullscreen for GPU scaling to occur.

    This is why you can't play games with CrossFire or SLI in windowed mode.
    No need when you run Xtreme burning mode.
    CEO Swiftech

  20. #45
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    522
    Quote Originally Posted by gabe View Post
    One loop will give you both performance and redundancy, thus reliability.
    Unless you have a leak. When one of my loops got sabotaged, I got lucky that the loop only cooled half the computer and not the entire thing; otherwise it would have been much worse.

    This data is very useful, and I must say I will be using parallel on my multi-block loops from now on. Although back in the day, with my first WC rig, I actually had parallel and serial mixed, lol: parallel to the dual Xeons, then serial for the motherboard and hard drive blocks. The setup for that old system was so strange and had so much pressure that I still can't believe I pulled decent temps off a single 120 mm rad. Then again, I had a 120+ CFM fan on it.
