Power consumption at stock is roughly 150W per card; overclocked, it can rise to 200W+. Max power consumption = max heat dump.
A pair of 8800GTXs in SLI at full load, overclocked = 400W+.
http://www.thermochill.com/PATesting...lowrateLPM.jpg
70cfm fans won't cut it if you want to keep a decent air > coolant differential; temps will be no better than aircooling. At a 10degC air > coolant differential, 70cfm will shift about 225W of heat. To shift 400W+ of heat, the differential climbs to around 20degC, so 40degC water (assuming ambient is around 20 to start with).
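The differential numbers above can be sanity-checked with a quick sketch, assuming (as a first-order approximation, not stated in the post) that heat shifted scales linearly with the air > coolant differential at a fixed airflow:

```python
# Back-of-envelope radiator math. Reference point from the post:
# ~70cfm fans shift 225 W at a 10 degC air > coolant differential.
# The linear-scaling assumption is mine, not the poster's.

def coolant_differential(heat_w, ref_heat_w=225.0, ref_dt_c=10.0):
    """Estimate the air > coolant differential (degC) needed to
    shift heat_w watts at the reference airflow."""
    k = ref_heat_w / ref_dt_c   # watts shifted per degC of differential
    return heat_w / k

dt = coolant_differential(400)   # overclocked SLI heat load
print(round(dt, 1))              # ~17.8 degC, i.e. the "20ish" above
print(20 + dt)                   # coolant temp with a 20 degC ambient
```

Which lands right where the post says: high-30s/low-40s coolant with only 70cfm of air.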
GPU temp (roughly) = coolant_temp + (heatload_per_card x C/W_of_waterblock) x 2
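To show the formula in action, here it is with some illustrative numbers plugged in. The C/W value (0.10) and the 150W stock per-card load are assumptions for the example, not measured figures:

```python
# The post's rough GPU-temp formula:
#   GPU temp ~= coolant_temp + (heatload_per_card * C/W_of_block) * 2
# C/W (0.10) and heat load (150 W) below are hypothetical inputs.

def gpu_temp(coolant_c, heatload_per_card_w, cw_block):
    """Rough GPU temp estimate from coolant temp, per-card heat
    load (W), and the waterblock's C/W rating (degC per watt)."""
    return coolant_c + (heatload_per_card_w * cw_block) * 2

print(gpu_temp(40.0, 150.0, 0.10))  # 70.0 degC with 40 degC coolant
```

This is exactly why the coolant temp matters so much: every degree the radiator can't shed goes straight onto the GPU.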
PA120.2 for JUST the video cards oc'd at full load in SLI, with 100cfm fans, will give "good" temps.
PA120.3 with 50cfm fans would do "nicely".
Now all you need is to talk GPU block manufacturers into paying someone to independently measure the C/W ratings of their blocks, all on the same testbench (preferably BillA's, as the results would then be consistent with the PA-series radiator testing).