perhaps that is my problem

My i7 920 is OC'd to 3.0 GHz atm and I have it set to 100%, while my GPU, a 5850, doesn't seem to use more than 30% of itself (and I'm not sure that's even F@H).

So if I decrease the CPU, will that allow the GPU to gain more?
That could very well be an issue. In a previous post I noted that my old 965E put out just as much PPD on its own as it did when I was also running a GTX 285 (sometimes, disabling the 285 actually INCREASED PPD!). Remember: it takes CPU cycles to push data to the GPU, so you're starving the GPU by running the CPU at 100%. Not only that, every time you push data to the GPU you also interrupt what the CPU is doing, lowering its performance as well. By decreasing the CPU to 75-80% you are effectively allocating that leftover CPU performance to feeding the GPUs. How much is needed? I have no idea; that's why I want to run some tests to see. I know I set my CPU usage to 80% and it was still running at 98-100%.
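
To make the headroom idea concrete, here's a quick back-of-the-envelope sketch in Python (my own illustration, not anything from the F@H client; the 8 threads are the i7 920's 4 cores with Hyper-Threading, and the helper name is just made up):

```python
# Rough sketch: translate a "CPU usage %" target into a thread count for the
# CPU folding slot, so the remainder stays free to feed the GPU and the OS.

def cpu_slot_threads(total_threads: int, target_fraction: float) -> int:
    """Threads to give the CPU slot; the rest are left as headroom."""
    reserved = round(total_threads * (1.0 - target_fraction))
    return max(1, total_threads - reserved)

if __name__ == "__main__":
    # i7 920 = 4 cores / 8 threads with Hyper-Threading
    for target in (1.00, 0.80, 0.75):
        print(f"{target:.0%} target -> {cpu_slot_threads(8, target)} of 8 threads for the CPU slot")
```

At 100% there is literally nothing left over, which is exactly the starvation problem above; at 75-80% the CPU slot gets 6 of 8 threads and roughly two threads stay free for GPU feeding.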

Interesting perspective ...
The concern is about electricity and overclocking (which has the "highest" diminishing marginal utility).
OCing a 3930 and a GTX 690 results in about +220 W power draw (above stock), which comes to roughly $52/mo in Germany and $9.50/mo in the US. Is it worth it? That's entirely up to the individual.
Note: power draw numbers were taken from various reviews across the web.

So I crunched some numbers:

24 × 30 = 720 hrs/mo
stock/OC 3930 = 250/400 W
stock/OC 690 = 270/340 W
stock/OC system power draw = 630/850 W

Germany: $0.33/kWh → $237/mo for a constant 1 kW load (850 W system ≈ $202/mo)

US: $0.06/kWh → $43/mo for a constant 1 kW load (850 W system ≈ $37/mo)
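
If anyone wants to redo the math for their own rates, the arithmetic is trivial to script; here's a small Python version using the wattages above (the 0.33 and 0.06 $/kWh figures are the German and US prices I assumed):

```python
# Monthly electricity cost of a 24/7 folding rig.
HOURS_PER_MONTH = 24 * 30  # 720 h

def monthly_cost(watts: float, price_per_kwh: float) -> float:
    """Dollar cost of running a constant load of `watts` for one month."""
    return (watts / 1000.0) * HOURS_PER_MONTH * price_per_kwh

rates = {"Germany": 0.33, "US": 0.06}   # $/kWh
system = {"stock": 630, "OC": 850}      # total system draw in W (from above)

for country, rate in rates.items():
    for state, watts in system.items():
        print(f"{country}, {state} ({watts} W): ${monthly_cost(watts, rate):.0f}/mo")
    oc_premium = monthly_cost(system["OC"] - system["stock"], rate)
    print(f"{country}, OC premium (+220 W): ${oc_premium:.2f}/mo")
```

That spits out the $202/mo and $37/mo figures for the 850 W system, plus the roughly $52 vs. $9.50 per month overclocking premium mentioned earlier.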

As you can see, things are a bit different here in Germany (about 5-6x the US price). While it's not enough to make me shy away from folding (or base purchases on power consumption rather than the performance I want), it is enough to say it matters. (It just doesn't matter enough to make me stop.)