Thanks for doing this, as always it's great stuff!
Is there any chance you could put this and the other white paper into a PDF for easy printing? This is the kind of stuff I like to have a hard copy of and make notes on.
I have edited my post to say "Jedi Masters" instead of "Trekkies". Sorry for the historical confusion.
If you refer to Part I of this article, you will observe that flow rate has a nominal impact on the water-blocks that we are using in this setup. See below regarding using (1) pump.
As long as you have two pumps in the same loop, irrespective of their position in the loop, you benefit from redundancy, and this is a huge plus for a high-end system.
One pump instead of two, see below:
1. The heat from the pump that goes into the system is less than 3W, compared to ~550W generated by the CPU&GPU devices (in SLI config). It's less than 1% of the overall load, thus negligible.
2. Part I of this article demonstrated the limited effect of flow rate variations on this generation of waterblocks, so the performance loss would also be nominal.
3. In general use, and by simple virtue of the asymmetric load (load ratios of devices vs. heat exchangers), the serial configuration would still pull comfortably ahead.
4. In extreme use, there is no question that dual loops would be ahead, but the limited scope of this usage model does not invalidate the conclusions of this article.
5. Using one pump instead of two certainly reflects the majority usage model, and from a performance and economic standpoint this seems like the sweet spot. On the other hand, one could argue that when people invest so much money in hardware, reliability alone could easily justify the investment in a second pump. Up until now, dual-loop setups have been more complex and often difficult to implement because of space constraints; however, the introduction of the MCR-Drive series of radiators with integrated pump considerably simplifies the integration task, particularly in light of the upcoming new generation of radiators that can now operate horizontally (hint hint hint).
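As a back-of-the-envelope check on point 1 above, the pump's heat contribution can be computed directly. A minimal sketch using the wattages quoted in the thread (the 3 W and ~550 W figures are the estimates stated above, not new measurements):

```python
# Estimates from the discussion above (assumptions, not measurements):
PUMP_HEAT_W = 3.0      # heat one pump dumps into the coolant
DEVICE_LOAD_W = 550.0  # CPU + GPUs at full load in SLI config

# Pump's share of the total heat the radiators must reject
fraction = PUMP_HEAT_W / (PUMP_HEAT_W + DEVICE_LOAD_W)
print(f"Pump contribution: {fraction:.2%} of total loop heat")  # well under 1%
```

At roughly half a percent of the total load, dropping the second pump costs essentially nothing thermally; the trade-off is purely about redundancy and flow rate.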
See Part I, but keep in mind that the effect of flow is highly dependent on the type of water-block that one uses. Unquestionably, F/C blocks with simple channels, for example, will be much more sensitive to flow rate than the micro-pin technology that we employ. On the other hand, current graphics dies have a much lower internal thermal resistance than CPUs due to their huge footprint, which results in lower temps than CPUs. Thus the loss of a few degrees may not be as critical as it would be with CPU devices.
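The flow-sensitivity difference between block designs can be sketched with a toy model. Purely illustrative, with made-up coefficients (not Swiftech data): assume a block's thermal resistance has a fixed part plus a flow-dependent part, R(f) = R_fixed + k/f, where k is large for simple-channel blocks and small for micro-pin designs.

```python
def block_delta_t(load_w, r_fixed, k, flow_gpm):
    """Die-to-water delta-T for a block with resistance R(f) = r_fixed + k/f."""
    return load_w * (r_fixed + k / flow_gpm)

LOAD_W = 250.0  # hypothetical GPU load

# Penalty for halving flow from 1.5 to 0.75 GPM (illustrative numbers only):
channel_loss = block_delta_t(LOAD_W, 0.08, 0.030, 0.75) - block_delta_t(LOAD_W, 0.08, 0.030, 1.5)
micropin_loss = block_delta_t(LOAD_W, 0.08, 0.005, 0.75) - block_delta_t(LOAD_W, 0.08, 0.005, 1.5)
print(f"channel-style block: +{channel_loss:.1f} C, micro-pin: +{micropin_loss:.1f} C")
```

With these placeholder coefficients, halving the flow costs the channel-style block several degrees but the micro-pin block less than one, which is the shape of the argument above, not a measurement.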
The absolute temperature values would differ, but the trends wouldn't.
Done! http://www.swiftnets.com/Technical/T...l_Articles.asp
Last edited by gabe; 06-30-2010 at 11:52 AM.
CEO Swiftech
very nice article, I'm looking forward to a part three for sure (if there will be one?). I guess it's single loops from now on
Outstanding! Thanks gabe!
Just a thought:
I was pretty much able to predict how it was going to turn out before reading past your list of parts. When you have that much radiator power, there won't be much difference between a dual and a single loop.
Can you run this test again, but with a single 120 and a 240, to see the other extreme? While we all know what's going to happen if we run two 470's on a 120, I'm interested to see how the CPU does in a dedicated 120 loop vs. a shared loop.
All in all, this just goes to show that it's more beneficial to spend the extra money on a bigger/extra rad instead of a second pump, res, more tubing, etc...
Last edited by StAndrew; 06-30-2010 at 02:50 PM.
Intel 8700k
16GB
Asus z370 Prime
1080 Ti
x2 Samsung 850Evo 500GB
x1 Samsung 860 Evo 500GB NVMe
Swiftech Apogee XL2
Swiftech MCP35X x2
Full Cover GPU blocks
360 x1, 280 x1, 240 x1, 120 x1 Radiators
One needs to understand that the fundamental law being illustrated in this article is load ratios. With less cooling capacity, temperatures will substantially scale up, but as outlined earlier, given the asymmetric nature of the load, the outcome will remain the same (serial wins).
In normal usage, your CPU-dedicated 120mm radiator will see ~120 Watts at full CPU load, whereas in a serial loop composed of a 120 + a 220, this load will be shared by the ~equivalent of a triple rad, or ~40 Watts per 120mm fan if you will. As far as your triple SLI is concerned, at full load your dual rad currently "sees" ~360W, that's 180W per 120mm fan. If you serialize, this value will drop to 120W!
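The watts-per-fan arithmetic above can be laid out explicitly. A minimal sketch, counting radiators in 120 mm "sections" (a 220/240 as 2 sections) exactly as in the post:

```python
cpu_w, gpu_w = 120.0, 360.0       # full-load figures quoted above
serial_sections = 1 + 2           # a 120 (1 section) plus a 220 (~2 sections) in one loop

# Dedicated loops: CPU on its own 120, GPUs on a dual rad
cpu_dedicated = cpu_w / 1                 # 120 W per 120 mm fan
gpu_dedicated = gpu_w / 2                 # 180 W per 120 mm fan

# One serial loop: each device's load is spread across all three sections
cpu_serial = cpu_w / serial_sections      # 40 W per fan
gpu_serial = gpu_w / serial_sections      # 120 W per fan
```

Serializing drops the per-fan burden for both devices, which is the load-ratio effect the article is illustrating.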
Last edited by gabe; 06-30-2010 at 04:04 PM.
CEO Swiftech
| Completed: Project "Simples" | Custom TJ07 | P67A-UD3 | 2600K | GTX460 | MCR320+MCR220 | DDC 18W+XSPC Res |
| In progress: Project "Weebeastie" | A70B | P6T7 WS | i7-970 | 4xGTX470 | PA120.3+RX240+TFC120 | XSPC Dual-Pump-BayRes |
| In progress: Project "Gemini" | PC-P80B | EVGA SR-2 | 2xX5650 | 7100GS | PA140.3 | EK DCP-4.0 |
Core i7 920 @4.31ghz (HK3.0+MCR320Drive) [DT Air/CPU @ 13c idle, 45c IBT]
MSI GTX 295 Single PCB with EK 295 + MCR320 Drive + 1450rpmGT [DT Air/CPU @ 10c idle]
Asus R3E . Corsair TX850 . OCZ 12Gb 1333mhz RAM
Just what I was going to say. Off topic, but I used to have to clean my system every 3 months using glycol/dye products. Since I switched to straight distilled and some PT Nuke, I've been running the same loop for over a year and the water and UV tubing still look perfectly clear.
You can still run multiple colored tubes if that's what floats your boat...
I've been using Feser 1 for years and not had any major problems apart from staining of some tubing. But recently took a loop apart when switching from LGA 775 to 1366 and the inside of the CPU block had some gunk in the pins.
So I've got a silver coil, some PT Nuke and plenty of distilled
A horizontal MCR-Drive? Sign me up! I have just the use
Sounds great! Will it be possible to mount it both in the bottom and the top? Because the way I envision it, it runs upside down in one of those positions.
“Little expense had been spared to create the impression that no expense had been spared.” - Hitchhiker's Guide
Mondays: It's better to ask dumb questions now than to look stupid later
ha, nice guide man.. Always knew one loop was the way to go.. less tubing and better temps.. What more can you ask for
Another thing I find funny is that AMD/Intel would snipe any of our Moms on a grocery run if it meant good quarterly results, and you are forever whining about what Feser did?
From my understanding, if you are using more radiator capacity than this setup needs, you are removing radiator cooling ability as a factor in the temperature change. The temperature differences between the loops will then be based solely on flow rate and how the blocks are affected by it.
Further, in my experience, many users have temp issues because their radiators are inadequate for their setup. They feel that a dual loop will correct this, but fail to understand that a given amount of radiator area can only dissipate a set amount of heat, regardless of the setup.
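That point can be sketched with rough thermal-resistance arithmetic. The C/W figure here is a made-up placeholder, not measured data; it just shows how the same total radiator area behaves when split versus shared:

```python
CW_PER_SECTION = 0.05  # assumed water-to-air C/W of one 120 mm rad section

def delta_t(load_w, sections):
    """Steady-state coolant rise over ambient for a load on N rad sections."""
    return load_w * CW_PER_SECTION / sections

# Dual loop: CPU (120 W) on a single 120, GPUs (360 W) on a dual rad
cpu_dual = delta_t(120, 1)        # 6.0 C over ambient
gpu_dual = delta_t(360, 2)        # 9.0 C over ambient

# Single serial loop: all 480 W shared across all three sections
shared = delta_t(120 + 360, 3)    # 8.0 C over ambient for everything
```

With these placeholder numbers, splitting the rads cools the lightly loaded CPU loop a bit more but leaves the heavily loaded GPU loop worse off than sharing everything, which is the asymmetric-load argument from earlier in the thread.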
Anyways, I'm way behind the learning curve on the thermodynamics involved; I just wanted to see the opposite end of the spectrum.
I understand your curiosity; sorry, these tests are very time-consuming, and I had to pick what my gut feeling told me was the most commonly used high-end setup. I had the choice between two duals, one dual + one triple, and two triples (not even looking into quads..). I picked the triple + dual because I know it fits in many popular full-tower cases like the Cosmos S and so forth. The fact is, just looking at my sales statistics, I can tell you that single 120's are on the brink of extinction in my market space.
CEO Swiftech
Gabe, can we have another round of testing on the impact of flow rate for a single loop? Is it really necessary to run 2 pumps for multiple blocks and rads?
Phil
No, it isn't necessary, but several factors led me to test with two pumps:
1. apples-to-apples, directly comparable data
2. I do like the redundancy factor for a high-end loop
3. my bench is set up with MCR Drive units; it's so easy to set up, it's a dream come true (for me).
I have removed the XT from the loop for now, as I am testing the new Apogee LP (low profile, for 1U and space-constrained applications), so this will have to wait. But I do see the economic value of it, so I'm not against it.
But to answer your concern more directly: if you now have SLI graphics cards connected in series, you would benefit from parallelizing them.
CEO Swiftech
gabe: btw, such sales statistics IMHO could be very interesting for us as well. Of course, actual numbers might be a trade secret, but how about giving us at least relative amounts/percentages of popularity of the different models? And other data too.. like I recall a few flame wars about the reliability of D5 vs. DDC; I wonder which wins from a vendor standpoint with actual RMA numbers on hand. I'm sure that any big vendor, with real data/statistics several orders of magnitude bigger than what any single LC shop can have, has a lot of interesting information available only to him that could settle several of the flame wars around here
Gabe, thank you very much for looking into this topic. I am actually adding onto my loop pretty soon, which is a cpu-only loop at the moment, and I was really wondering whether it would be worth all of the trouble to add an entirely new loop for 2 new cards, or if I could manage them within one loop with some minor adjustments (will probably add an external 360 rad or internal 240 to my 800D). Perfect timing.