
Thread: Swiftech® releases new Multi-port Apogee™ HD Waterblock & MCRx20 Drive R3 Rads

  1. #26
    Xtreme Member
    Join Date
    Mar 2011
    Location
    SoCal
    Posts
    268
    The actual flow rate you get in each card still depends on the pump's flow rate (i.e. the flow rate you have in the master line). Because a parallel configuration of your 3 GPUs is very low restriction, it results in a higher flow rate in the master line. So where you had, say, approximately 1.3 GPM in the single line with 3 GPU blocks in series, with the 3 blocks in parallel the master-line flow rate will be 2 GPM (thanks to the MUCH lower restriction of the parallel setup). The actual flow rate in each card is then close to 0.65 GPM (2 GPM / 3), which is only half (and not a third) of the flow rate you had in your blocks when they were in series. This is a rough example, but pretty close to actual numbers.
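
    To make that arithmetic concrete, here is a minimal sketch of the split (the GPM figures are just the ballpark examples above, not measured data):

    ```python
    # Rough illustration of the series vs. parallel flow arithmetic above.
    # The GPM figures are the post's ballpark examples, not measured data.
    series_loop_flow = 1.3    # GPM through the loop with 3 GPU blocks in series
    parallel_loop_flow = 2.0  # GPM in the master line with 3 blocks in parallel
    n_gpus = 3

    # In series, every block sees the full loop flow.
    flow_per_block_series = series_loop_flow               # 1.3 GPM each

    # In parallel, the master-line flow divides among the branches
    # (evenly, if the branches are identical).
    flow_per_block_parallel = parallel_loop_flow / n_gpus  # ~0.67 GPM each

    print(f"series:   {flow_per_block_series:.2f} GPM per block")
    print(f"parallel: {flow_per_block_parallel:.2f} GPM per block "
          f"(~{flow_per_block_parallel / flow_per_block_series:.0%} of series)")
    ```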

    Parallel setups have been underrated for years for some reason (most likely because it all started 10 years ago with mostly CPU water blocks - back then, when manufacturers and DIYers started to liquid cool other components, they assumed the same rules applied to GPUs and everything else, but even "if" that was true at some point, it's definitely no longer the case). CPUs are still quite sensitive to flow rate, so it does make sense to maximize the flow in a CPU block, but there is really no reason to serialize anything else. These 2 new products offer a great alternative to splitters and other manifolds for building parallel setups, and on top of that the Apogee HD is designed to stay in series with the master line for maximum flow rate.

  2. #27
    Registered User
    Join Date
    Apr 2010
    Posts
    27
    So when all is said and done... the multiple-outlet feature will only be practical if you're watercooling RAM and/or the mobo, correct? I can see having an outlet of the Apogee HD going to each GPU (if more than one), but that would probably only be possible with GPU-only blocks and not full-cover blocks, because the inlets/outlets between the FC blocks are too close to each other to fit angled fittings.

    Still looks like a great block though, even just by itself. Can't wait to see some independent tests done.

  3. #28
    Xtreme Member
    Join Date
    Mar 2011
    Location
    SoCal
    Posts
    268
    Quote Originally Posted by Cory View Post
    So when all is said and done... the multiple-outlet feature will only be practical if you're watercooling RAM and/or the mobo, correct? I can see having an outlet of the Apogee HD going to each GPU (if more than one), but that would probably only be possible with GPU-only blocks and not full-cover blocks, because the inlets/outlets between the FC blocks are too close to each other to fit angled fittings.

    Still looks like a great block though, even just by itself. Can't wait to see some independent tests done.
    RAM, chipsets, mosfets and GPUs. By design, full-cover blocks already have an easy way to be set up in parallel. That doesn't mean they can't work with the HD: if you just have a CPU and 2 GPUs with full-cover blocks, you won't need to use the additional ports, but if you've got a RAM/chipset/etc. block, you will be parallelizing your GPU stack with whatever other block you've got.

  4. #29
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Atlanta, GA
    Posts
    404
    I too am going to miss the shiny metal tops of Swiftech blocks. I just got my new rev2 top in a few weeks ago (the day before you announced these new blocks), and at first I was a bit miffed that I got caught in the "always something new around the corner" bubble. But now that I look at them, I'd rather have my XT for its dark chrome/gun-metal color scheme. Don't get me wrong, these new blocks are fine looking, but I'm just a metal kinda guy. I'll wait for the next block to see what Swiftech comes out with. But great show on the huge performance gains. I didn't expect anybody to bring to market a new block that would get a temp drop of 3 degrees and increased flow like this anytime soon. At this point in the game, I was expecting 1 degree here, a wee bit of flow improvement there, and a commercial block reaching this point only in a couple more years. Way to go Swiftech!
    At Xtremesystems.org we don't help you save money on your cooling gear; instead, we try to make you feel better about the insane amount you've spent.

  5. #30
    Xtreme Enthusiast miptzi's Avatar
    Join Date
    Feb 2008
    Location
    Not sure if...
    Posts
    595
    Quote Originally Posted by Cory View Post
    I can see having an outlet of the Apogee HD going to each GPU (if more than one), but that would probably only be possible with GPU-only blocks and not full-cover blocks, because the inlets/outlets between the FC blocks are too close to each other to fit angled fittings.
    Remember the BP blocks, with their top-mounted inlets... that could be done.
    Raidmax ATOMIC - Core i7 3770s + Corsair H70 // Gigabyte H77N WIFI // Corsair Vengeance 2x4GB 2133mhz // EVGA GTX1060 ACX2.0// Kingston SUV400 480Gb // Sharkoon SFX500L

  6. #31
    Xtreme Guru
    Join Date
    Dec 2009
    Location
    Latvia, Riga
    Posts
    3,972
    Captain_Harlock: Depends on taste. I, for example, am all for fully black components, or Koolance's "gun-metal grey".
    Others prefer the copper look.
    You - all shiny nickel or chrome. Different people, different tastes. The problem being: too many variations of the same product = more expensive to make, so the vendor has to choose which variations are likely to be more popular on the market.
    Last edited by Church; 10-30-2011 at 03:26 AM.

  7. #32
    Xtreme Member
    Join Date
    Jan 2004
    Location
    Around the corner
    Posts
    175
    Just picked one of these up in black. I think it looks better than the XT rev 2. I can't wait to drop it on the new SB-E system I am building!
    Main Comp (Win 7): ASUS R4E, i7-3930k @ 4.9GHz, G.Skill Ripjaw Z 4x4GB @ 2133MHz, ATI 6950flash/6970 Xfire, Custom Watercooling
    3x Asus VW246H eyefinity; Polk Audio Monitor 40's (L,R), CS10 (C), Klipsch KG^4 (S), Sony STR-DA1000ES; all in a custom built desk
    Server: (Ubuntu 11.04) i7-920 @ 3.8GHz, 24GB G.Skill

  8. #33
    Xtreme Mentor
    Join Date
    Oct 2007
    Location
    USA
    Posts
    2,622
    I think it's a great idea. I knew Gabe would come up with something else; 2 years ago he came up with running everything in series with less rad, because in gaming situations the CPU and GPU aren't both running at max power. It was a neat rig he had at the CES XS party. I'm running in series now with just one pump, and things are great.
    Again, forward-thinking ideas. I like the concept; makes me wanna drop $2000 on a new rig.
    Last edited by Conumdrum; 10-31-2011 at 02:59 AM.
    All stock for now, no need for more, but it's gonna be soon methinks.
    Giga Xtreme 58 mobo i7 965 ES D0 step Corsair 1600 6 gig
    SLI GTX470 EVGA
    EK HF nickel blue top CPU block (free from Eddie)
    Koolance 470 waterblocks
    One big loop, two 120x3 rads. Pa 120.3 and XSPC RX 120x3. Swiftech 35x pump with V2 restop. GT AP15 fans.
    Banchetto Tech Station
    120 GB SSD, and a few other drives.
    1000W UltraX3 PSU, 900 watt (1500VA) UPS
    23.999" Acer GD235hz and 24" Acer H243H

  9. #34
    Xtreme Addict
    Join Date
    Mar 2008
    Location
    Kawasaki, Japan
    Posts
    2,076
    So many have failed to deliver a revolution in cooling, but Swiftech actually has a chance of delivering, unlike other companies that turned out to be all talk and no results...

    Gabe, I'd send you a love letter, but I think you'd prefer me buying Swiftech goods instead of trying to write letters, so that's what I'll do.
    Last edited by naokaji; 10-31-2011 at 08:40 AM.

  10. #35
    Xtreme Member
    Join Date
    Mar 2011
    Location
    SoCal
    Posts
    268
    I'll take the love letter <3

  11. #36
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Posts
    531
    Quote Originally Posted by stephenswiftech View Post
    The actual flow rate you get in each card still depends on the pump's flow rate (i.e. the flow rate you have in the master line). Because a parallel configuration of your 3 GPUs is very low restriction, it results in a higher flow rate in the master line. So where you had, say, approximately 1.3 GPM in the single line with 3 GPU blocks in series, with the 3 blocks in parallel the master-line flow rate will be 2 GPM (thanks to the MUCH lower restriction of the parallel setup). The actual flow rate in each card is then close to 0.65 GPM (2 GPM / 3), which is only half (and not a third) of the flow rate you had in your blocks when they were in series. This is a rough example, but pretty close to actual numbers.

    Parallel setups have been underrated for years for some reason (most likely because it all started 10 years ago with mostly CPU water blocks - back then, when manufacturers and DIYers started to liquid cool other components, they assumed the same rules applied to GPUs and everything else, but even "if" that was true at some point, it's definitely no longer the case). CPUs are still quite sensitive to flow rate, so it does make sense to maximize the flow in a CPU block, but there is really no reason to serialize anything else. These 2 new products offer a great alternative to splitters and other manifolds for building parallel setups, and on top of that the Apogee HD is designed to stay in series with the master line for maximum flow rate.
    I believe your numbers are off... by a long shot, and you're also forgetting a few things and not looking at the whole picture.
    a) It's true that running parallel setups (or parts of them) gets you better flow, but nowhere near the 50% you show there.
    b) Less flow through the blocks means worse performance.
    c) More overall flow means more pump power draw, which means more heat dumped into the loop.

    All in all, there is no reason to run parallel loops "for the sake of it". The gains are marginal... if there are any at all.
    Quote Originally Posted by NKrader View Post
    im sure bill gates has always wanted OLED Toilet Paper wipe his butt with steve jobs talking about ipad..
    Mini-review: Q6600 vs i5 2500K. Gpu scaling on games.

  12. #37
    Xtreme Member
    Join Date
    Mar 2011
    Location
    SoCal
    Posts
    268
    Quote Originally Posted by prava View Post
    I believe your numbers are off... by a long shot, and you're also forgetting a few things and not looking at the whole picture.
    a) It's true that running parallel setups (or parts of them) gets you better flow, but nowhere near the 50% you show there.
    b) Less flow through the blocks means worse performance.
    c) More overall flow means more pump power draw, which means more heat dumped into the loop.

    All in all, there is no reason to run parallel loops "for the sake of it". The gains are marginal... if there are any at all.
    Hi Prava, pardon me for not stating the obvious (which is mentioned on the product page): GPUs are much less sensitive to flow rate than CPUs are. There are several reasons for that, but if you need the details, take a look at the HD's product page: http://www.swiftech.com/ApogeeHD.aspx (just below the diagrams, mid-page)

    I took the example of a loop built around 3 GPUs. Whether you put them in series or in parallel makes a substantial difference in flow rate. Granted, the flow in each GPU block will be lower than it was in series, but overall it's a much smarter way to do it.

    First, with all your GPUs in parallel, they will run at the same temperature. In series, the first one could be up to 3C lower than the last one (probably more if you're considering high end GPUs).

    Next, power draw: feel free to check these numbers against Martin's MCP35X data. The power draw is 4W higher at 2 GPM than at 1 GPM. Let's be honest: when you've got 3 GPUs running, what is 4W? Even using the most conservative numbers, say 300W for 3 GPUs, 4W is just about 1% - i.e. not even measurable.
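
    A one-liner sanity check on that percentage, using the post's own figures:

    ```python
    # Extra pump heat vs. GPU heat (figures from the post: ~4 W extra pump
    # draw at 2 GPM vs. 1 GPM, ~300 W as a conservative number for 3 GPUs).
    extra_pump_heat = 4.0   # W, per Martin's MCP35X data as cited above
    gpu_heat = 300.0        # W, conservative figure for 3 GPUs
    print(f"extra heat into the loop: {extra_pump_heat / gpu_heat:.1%}")  # ~1%
    ```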

    Assuming 3 GPUs in series give 1.3 GPM, setting them up in parallel will get you 2 GPM, and there is nothing wrong with these numbers. Remember that the pressure drop of 3 blocks in parallel is lower than the pressure drop of a single block; now add three of them in series and you'll see that the pressure drop of 3 GPUs in parallel is much, much lower than that of 3 GPUs in series. Intersect these 2 pressure-drop curves with any pump's PQ curve and you will see the substantial difference in flow rate.
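
    For anyone who wants to play with that intersection, here is a minimal sketch assuming a quadratic pump curve and quadratic restrictions. All the coefficients are invented for illustration (they are not MCP35X or Apogee HD data), just chosen to land near the ballpark figures above:

    ```python
    import math

    # Toy model of the PQ intersection: pump head = H0 - a*Q^2, and each
    # loop element drops k*Q^2. All coefficients are hypothetical.
    H0, a = 10.0, 1.0   # pump: shut-off head (PSI) and curve slope
    k_loop = 1.2        # rest of the loop (CPU block, rad, tubing)
    k_block = 1.2       # one GPU block
    n = 3               # number of GPU blocks

    def operating_flow(k_gpu_section):
        # Operating point: pump head equals total loop pressure drop,
        # H0 - a*Q^2 = (k_loop + k_gpu_section) * Q^2
        return math.sqrt(H0 / (a + k_loop + k_gpu_section))

    k_series = n * k_block        # pressure drops add in series
    k_parallel = k_block / n**2   # each branch carries Q/n: dP = k*(Q/n)^2

    q_s = operating_flow(k_series)    # ~1.31 GPM, full flow in each block
    q_p = operating_flow(k_parallel)  # ~2.07 GPM in the master line
    print(f"series:   {q_s:.2f} GPM per block")
    print(f"parallel: {q_p:.2f} GPM master line, {q_p / n:.2f} GPM per block")
    ```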

    One could call this a marginal gain, but not long ago someone with 3 GPUs in series would typically add a second pump to his loop just to increase the flow rate by a little bit. Putting the GPUs in parallel is much smarter: it gives better flow than adding a second pump (with no additional power draw, hehe), it gives better (and even) temps, and the radiator should also work slightly better.


    No one should go parallel or series just for the sake of it. Quite the opposite: now you've got options to make the smartest choices. As we stated, in most instances CPUs should not be parallelized; they should remain in line with the pump, as they benefit the most from high flow rates. But every other component is a great candidate for parallel sub-loops. The Apogee HD is basically a good way to get the best of both worlds.
    Last edited by stephenswiftech; 10-31-2011 at 10:48 AM.

  13. #38
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    are there any actual measurements of flow rates at each "branch" after the CPU block?

    considering flow rate gets better as you add more GPU blocks in parallel, having just 1 fighting for flow with an NB block or RAM block might not get much flow at all

    there's definitely a large number of combinations (some might do 2 GPUs and an NB/mosfet block and no RAM block, like me for example), and making sure that all combinations offer decent results, even if not perfect, would be good to know. and we definitely should be aware of any bad combinations that can arise.
    2500k @ 4900mhz - Asus Maximus IV Gene-Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acetal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  14. #39
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Atlanta, GA
    Posts
    404
    Crunchy, that was kind of the whole point of my post. I was just saying (as an earlier poster did before me) that I happen to prefer the classic all-metal Swiftech look vs. the new style of polycarbonate and whatnot. I only pointed out my liking for the metal tops to let Gabe and crew know that there are still fans of it, and we would like to see an all-metal block again in the future, as long as cost permits.
    At Xtremesystems.org we don't help you save money on your cooling gear; instead, we try to make you feel better about the insane amount you've spent.

  15. #40
    Xtreme Member
    Join Date
    Mar 2011
    Location
    SoCal
    Posts
    268
    Quote Originally Posted by Manicdan View Post
    are there any actual measurement of flow rates at each "branch" after the cpu block?

    considering flow rate gets better as you add more gpu blocks in parallel, having just 1 fighting for flow with a NB block or RAM block might not get much flow at all

    theres definitely a large number of combinations (some might do 2 gpus and a NB/mosfet block and no ram block, like me for example), and making sure that all combinations are offering decent results even if not perfect would be good to know. and we definitely should be aware of any bad combinations that can arise.
    I can't think of any bad combination, really. As long as the CPU remains in the main line... And even that wouldn't be a big issue. If I had a dual CPU motherboard I would probably use two HD's with one in 'reverse' mode to parallelize both CPU blocks.

  16. #41
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by stephenswiftech View Post
    I can't think of any bad combination, really. As long as the CPU remains in the main line... And even that wouldn't be a big issue. If I had a dual CPU motherboard I would probably use two HD's with one in 'reverse' mode to parallelize both CPU blocks.
    i think it can, if there is a restrictive GPU block and then 2 non-restrictive blocks for the motherboard and RAM. since the water will take the easiest way through, it might send 90% through the MB/RAM blocks and only 10% through the GPU, and 0.2 GPM on a 300W GPU could be pretty bad. this case is probably rare, since people who don't spend a lot on blocks for extras like RAM cooling usually only have one GPU, but it's still something people need to think about before they set up such a system, and it's something many might not even realize can happen. it reminds me a lot of old electronics classes in high school, with parallel circuits and resistors and LEDs - many classmates hated that stuff.
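
    the parallel-resistor analogy actually makes this easy to sketch: every branch sees the same pressure drop, and with dP = k*Q^2 each branch carries flow proportional to 1/sqrt(k). with made-up restriction coefficients (a 10:1 GPU-to-RAM ratio, purely hypothetical) you land right around the split i'm worried about:

    ```python
    import math

    # Branch flow split for parallel blocks, using the resistor analogy:
    # equal pressure drop across branches, dP = k*Q^2, so Q ~ 1/sqrt(k).
    # The k values below are made up purely for illustration.
    def branch_split(total_flow, ks):
        weights = [1 / math.sqrt(k) for k in ks]
        s = sum(weights)
        return [total_flow * w / s for w in weights]

    ks = {"GPU": 4.0, "NB": 0.4, "RAM": 0.4}  # hypothetical restrictions
    for name, q in zip(ks, branch_split(1.5, list(ks.values()))):
        print(f"{name}: {q:.2f} GPM")  # GPU branch ends up near 0.2 GPM
    ```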
    2500k @ 4900mhz - Asus Maximus IV Gene-Z - Swiftech Apogee LP
    GTX 680 @ +170 (1267mhz) / +300 (3305mhz) - EK 680 FC EN/Acetal
    Swiftech MCR320 Drive @ 1300rpms - 3x GT 1850s @ 1150rpms
    XS Build Log for: My Latest Custom Case

  17. #42
    Xtreme Mentor
    Join Date
    Mar 2006
    Location
    Evje, Norway
    Posts
    3,419
    I don't think there has been a non-restrictive MB block since the MCW30 (kind of overstating a bit, but you get my point). Also, you can control the restriction in the MB loop a little by using way smaller nipples and tubing (think TT).
    Quote Originally Posted by iddqd View Post
    Not to be outdone by rival ATi, nVidia's going to offer its own drivers on EA Download Manager.
    X2 555 @ B55 @ 4050 1.4v, NB @ 2700 1.35v Fuzion V1
    Gigabyte 890gpa-ud3h v2.1
    HD6950 2GB swiftech MCW60 @ 1000mhz, 1.168v 1515mhz memory
    Corsair Vengeance 2x4GB 1866 cas 9 @ 1800 8.9.8.27.41 1T 110ns 1.605v
    C300 64GB, 2X Seagate barracuda green LP 2TB, Essence STX, Zalman ZM750-HP
    DDC 3.2/petras, PA120.3 ek-res400, Stackers STC-01,
    Dell U2412m, G110, G9x, Razer Scarab

  18. #43
    Xtreme Member
    Join Date
    Mar 2011
    Location
    SoCal
    Posts
    268
    Quote Originally Posted by Manicdan View Post
    i think it can, if there is a restrictive GPU block and then 2 non-restrictive blocks for the motherboard and RAM. since the water will take the easiest way through, it might send 90% through the MB/RAM blocks and only 10% through the GPU, and 0.2 GPM on a 300W GPU could be pretty bad. this case is probably rare, since people who don't spend a lot on blocks for extras like RAM cooling usually only have one GPU, but it's still something people need to think about before they set up such a system, and it's something many might not even realize can happen. it reminds me a lot of old electronics classes in high school, with parallel circuits and resistors and LEDs - many classmates hated that stuff.
    I wouldn't call this a bad combination, because: 1/ you would actually need to find a GPU block that is at least 10 times more restrictive than a chipset or RAM block, and based on the bunch of RAM and chipset blocks that we've tested, that simply isn't the case. 2/ we've tested (our) GPU blocks at flow rates as low as 0.3 GPM and the thermal resistance isn't as bad as you would think. Again, a GPU's thermal design is way different from a CPU's, and the large die surface allows for lower flow rates. I have a couple of unused (for now) 6900s and I could grab one and publish a Thermal Resistance vs. Flow Rate curve (from 0.2 GPM to 3 GPM) if it's of interest to anyone.

    Also note that the minute you put 2 GPUs in parallel (using a bridge, or using 2 full-cover blocks, for example), you will never be in that situation: when parallelized, the GPUs have a very low pressure drop, much closer to that of a chipset or RAM block, which tends to equalize the flow in each sub-line.
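
    To illustrate that equalizing effect with the same kind of toy model as above (dP = k*Q^2 per branch, all coefficients hypothetical): bridging two identical GPU blocks in parallel quarters the effective restriction of that branch, so the split evens out considerably:

    ```python
    import math

    # Same toy branch-split model as earlier (dP = k*Q^2, Q ~ 1/sqrt(k));
    # all k values are hypothetical. Bridging two identical GPU blocks in
    # parallel quarters the branch's effective restriction.
    def branch_split(total_flow, ks):
        w = [1 / math.sqrt(k) for k in ks]
        return [total_flow * wi / sum(w) for wi in w]

    k_gpu, k_ram = 4.0, 0.4        # single-block restrictions (hypothetical)
    one_gpu = branch_split(1.5, [k_gpu, k_ram])
    two_gpus = branch_split(1.5, [k_gpu / 4, k_ram])  # bridged pair vs. RAM

    print(f"1 GPU  vs RAM: GPU branch gets {one_gpu[0]:.2f} GPM")
    print(f"2 GPUs vs RAM: GPU branch gets {two_gpus[0]:.2f} GPM "
          f"({two_gpus[0] / 2:.2f} GPM per card)")
    ```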

  19. #44
    Xtreme Mentor
    Join Date
    Mar 2006
    Location
    Evje, Norway
    Posts
    3,419
    Quote Originally Posted by stephenswiftech View Post
    I have a couple of unused (for now) 6900s and I could grab one and publish a Thermal Resistance vs. Flow Rate curve (from 0.2 GPM to 3 GPM) if it's of interest to anyone.
    Do you even have to ask? There is no such thing as too much information
    Quote Originally Posted by iddqd View Post
    Not to be outdone by rival ATi, nVidia's going to offer its own drivers on EA Download Manager.
    X2 555 @ B55 @ 4050 1.4v, NB @ 2700 1.35v Fuzion V1
    Gigabyte 890gpa-ud3h v2.1
    HD6950 2GB swiftech MCW60 @ 1000mhz, 1.168v 1515mhz memory
    Corsair Vengeance 2x4GB 1866 cas 9 @ 1800 8.9.8.27.41 1T 110ns 1.605v
    C300 64GB, 2X Seagate barracuda green LP 2TB, Essence STX, Zalman ZM750-HP
    DDC 3.2/petras, PA120.3 ek-res400, Stackers STC-01,
    Dell U2412m, G110, G9x, Razer Scarab

  20. #45
    Xtreme Addict
    Join Date
    Mar 2008
    Location
    Kawasaki, Japan
    Posts
    2,076
    Quote Originally Posted by stephenswiftech View Post
    I have a couple of unused (for now) 6900s and I could grab one and publish a Thermal Resistance vs. Flow Rate curve (from 0.2 GPM to 3 GPM) if it's of interest to anyone.
    Please do

  21. #46
    Xtreme Member
    Join Date
    Jun 2011
    Posts
    146
    Words are at such a premium for how that looks....

  22. #47
    Xtreme Mentor
    Join Date
    Feb 2009
    Location
    Bangkok,Thailand (DamHot)
    Posts
    2,693
    Would a nickel-plated base be possible?
    Intel Core i5 6600K + ASRock Z170 OC Formula + Galax HOF 4000 (8GBx2) + Antec 1200W OC Version
    EK SupremeHF + BlackIce GTX360 + Swiftech 655 + XSPC ResTop
    Macbook Pro 15" Late 2011 (i7 2760QM + HD 6770M)
    Samsung Galaxy Note 10.1 (2014) , Huawei Nexus 6P
    [history system]80286 80386 80486 Cyrix K5 Pentium133 Pentium II Duron1G Athlon1G E2180 E3300 E5300 E7200 E8200 E8400 E8500 E8600 Q9550 QX6800 X3-720BE i7-920 i3-530 i5-750 Semp140@x2 955BE X4-B55 Q6600 i5-2500K i7-2600K X4-B60 X6-1055T FX-8120 i7-4790K

  23. #48
    Xtreme Member
    Join Date
    Jun 2011
    Posts
    146
    Why? It doesn't help cooling at all, and you wouldn't see it. Besides, based on their past products, if they did anything it would be chrome, not nickel.

  24. #49
    Xtreme Member
    Join Date
    Mar 2011
    Location
    SoCal
    Posts
    268
    For a Christmas or New Year's gift (or as an engagement present, if your future is into overclocking/liquid cooling), I present to you the Apogee HD 24K Gold Plated (Numbered & Limited!)

    A couple of notes: although we've been wanting to do a limited edition block for some time, the Apogee HD Gold IS also the solution we are going to offer for those of you who preferred full-metal blocks! The picture below shows the Apogee HD with a 24K gold-plated brass housing. The serial number will be laser engraved. No MSRP or ETA yet, but we should have more information about that very soon. There are a couple of reasons behind our choice of gold: one is that the yellow/black combination is our brand color scheme, and gold is just a great choice for a limited edition block!

    As usual with us, your feedback is welcome!
    Cheers.

    Last edited by stephenswiftech; 11-16-2011 at 10:27 PM.

  25. #50
    Xtreme Member
    Join Date
    Jul 2011
    Location
    Kaiserslautern, GE
    Posts
    326
    damn, now THAT's bling - not that i'm into such things per se, but that's a sharp-looking piece of copper =)

    note: so who gets the #001 shown in the picture? lol
    i7 3930@4.5GHz (EK Supreme HF), GTX690@1.2GHz (Koolance NX-690), 128G 4M + 2x128G 4M raid 0, Silverstone TJ07, Custom Enclosure w/MoRa, 18x GT AP-31, 401X2 dual PMP-400

