Nice work! I'm really liking the GPU test; it stresses my video card to the max. Now I know I have a cooling problem with it and will RMA it for sure. I can finally supply some screenshots to EVGA customer support :D
Just tried the 3.0.0.b07 GPU test. ATI Tray Tools showed that only one GPU was utilized (HD3870X2, CrossFire enabled under Server 2003). Both windowed and full-screen modes were tested.
The PSU test is stressing the CPU now, but OCCT only sees 3 of my 4 cores (1, 2, and 4).
Windows Server 2003, 32 or 64-bit?
Did you try making a profile under the ATI driver? I know you need one to get SLI working for Nvidia; I don't know about ATI, though.
I really have to look at those profiles; I wonder if I can tamper with them programmatically without breaking everything.
Too bad I don't have an ATI CrossFire configuration :/
Any reason why b05 monitors the temp of my 3rd core, but b07 doesn't?
That's weird. What happens if you restart OCCT and try to configure it yourself by clicking the big orange button in the options?
I didn't change many things from b05 to b07 regarding the CPU temps...
Do you see it appearing in the dropdown list in the option frame, or is it just not automatically detected ?
Can you double-check that the "90% free mem" option was really checked?
Either my memory-detection algorithm is wrong (though I can't see where), or the option wasn't checked (which sounds plausible: 3.5 GB detected is 50% of 7 GB, with Vista taking 1 GB, which sounds okay-ish).
I spent an hour playing with it and can't see where I'm wrong ;) Can you double-check, and be sure to select the 90% free mem option?
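The arithmetic behind that hunch can be sanity-checked in a few lines. This is only a sketch of the reasoning above; the 8 GB total and the ~1 GB taken by Vista are assumptions inferred from the quoted figures, not values reported by OCCT:

```python
# Sanity check of the reported numbers.
# Assumptions (not from OCCT itself): an 8 GB machine, Vista using ~1 GB.
total_gb = 8.0
os_usage_gb = 1.0
free_gb = total_gb - os_usage_gb      # ~7 GB free for the test

at_50_percent = free_gb * 0.50        # matches the 3.5 GB OCCT detected
at_90_percent = free_gb * 0.90        # what "90% free mem" should have given

print(at_50_percent, at_90_percent)
```

If the detected figure matches the 50% line rather than the 90% one, the option most likely wasn't checked.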
How credible are the voltage graphs?
http://i232.photobucket.com/albums/e...her/asdf-3.jpg
Weird, this time it's not even seeing my 4th core, and the test was set to GPU by default.
I would love a stress test with error checking for dual gpu's. That would be great. I love OCCT, btw.
This seems to be a bug in the SDK I'm using; I'll report it ASAP. I'll also check my autodetection routines, but what may have happened is that autodetection ran while core temp #3 wasn't available in the list for some reason.
Could you read this thread again tomorrow, or send me an email so that I can contact you directly if you don't want to follow this thread? This bug is critical, and since it involves a third party, I may have to send you a debug version so that we can audit the problem, probably by getting a debug function ready ;)
email : iench 'atttt' ocbase dot com.
Yes, but SLI isn't simply "it works / it doesn't work". 3DMark2006 shows good scores because a profile for that app exists in the Catalyst drivers, telling the driver which mode to use for that particular application.
Right now there is no profile at all for OCCT GPU in any display driver; that's why SLI isn't working with OCCT GPU. I have no control over it.
I'm looking into the Nvidia display driver mechanism, and it seems easy enough. I'll have to look into the ATI one to automatically add a profile for OCCT GPU.
I used nHancer to create a profile for it. ATITrayTools should have a similar function.
I've got an Asus P5K and an E8400 here, and OCCT reads the CPU speed as 2 GHz, though the CPU is clocked at stock speed (3 GHz).
OCCT PT 3.0.0.b08 OUT !
Changelog :
- The EXACT Linpack formula for Problem Size <-> Memory used has been found - an OCCT exclusive! :) (This caused "no CPU usage" bugs on this test under Vista - experienced here!)
- OCCT GPU no longer forces the resolution - it can be run in windowed or fullscreen mode.
- OCCT GPU in "error detection" mode is now multithreaded - the checking algorithm will execute on several CPUs.
- If your CPUs cannot check that many frames per second, OCCT GPU will now adapt its speed dynamically throughout the test, instead of only at the beginning.
- CPU usage is now calculated correctly (it was understated in some cases, Linpack mostly).
- CPU usage in Power Supply mode is now the sum of the CPU usage of both tests (Linpack and GPU 3D), not just Linpack.
- The display is now refreshed once per second, instead of four times per second.
- The Linpack default mode is now Max stress instead of Mid.
Enjoy !
http://www.ocbase.com/download.php?fileext=beta