ok i finally got it to complete 3dmark06 again, 18856
sm2.0 8241
sm3.0 10497
cpu 3584
nice, sm3.0 10497
:up:
I'd say there should be more if 4 GPUs are running...
http://img.techpowerup.org/080920/3dmark06 at 19165.jpg
That's what I got when I was running my GX2 before stepping up to the GTX... Man, I miss that card.
Then again, I don't know where the clocks on those cards were at...
wow, I didn't realize those scores were for 2 9800GX2s in SLI... I heard a long time ago from the guy I got mine from that there was no difference adding a second one.
If they could figure that out, man, that would be a killer combo.
well, I dunno what I'm doing wrong. I pulled out the SLI bridge connector with SLI enabled in the NVIDIA control panel; it should be the same as using one standalone card in SLI
14127
sm2.0 6513
sm3.0 6037
cpu 3613
Ouch. OK, I would say leave just one card in and reinstall drivers (use the 177.92 betas), then test your score; it should be upwards of 18k. Then pop the second one in and see if it scales up at all!
do you change any nvidia 3d settings for your setup?
well, if you're worried about squeezing out every last point, you can set everything to performance as opposed to quality, but your problem is bigger than settings
I really feel that you should do as I recommended and reinstall the new betas with only one card in and see how it does
I'll try that soon and see what happens. I've been trying different settings in the NVIDIA 3D settings and a lot of them don't make much difference in 3DMark06. There is one I cannot decide where to set, "maximum pre-rendered frames"; from what I make of it, it sets how many frames the CPU is allowed to queue up ahead of the GPU? I'm unsure what to set it to as I haven't seen much difference. Any ideas?
ok, just did what you said: with 1 card I got 18,6xx; with quad SLI I just got 19002
sm2.0 7951
sm3.0 10801 (it was 9xxx with 1 card???)
cpu 3666 (I clocked it a bit more, to 4.15 from 4.05)
I'm gonna see if it gets any better with my timings tweaked. The cards are peaked at 760 core / 1820 shader / 1120 DDR3 memory; any higher and I either get a driver error, a freeze, or artifacts blotching up in places.
It still isn't what I would expect from quad SLI. What would you be expecting if you get 18xxx marks with one GX2 and a 4 GHz CPU? I would expect around 24k-ish? (rough numbers below)
As for the issue with losing 4.2 GHz after I did a BIOS update: it's down to Vdroop. I had the BIOS vcore set to 1.5 V, but CPU-Z would read about 1.42, and even the BIOS monitor showed around the same despite being set at 1.5. So if anyone with a 790 Ultra is doing their head in over this, that would be why, and if anyone knows a fix PLEASE let me know.
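For reference on what to expect: 3DMark06's scoring (as Futuremark documented it) combines the sub-scores as a weighted harmonic mean, so the CPU score drags the overall number down even when the GPU side scales. A minimal Python sketch of that weighting, using the sub-scores quoted in this thread (the doubled-GPU case at the end is purely hypothetical, for illustration):

# 3DMark06 overall = 2.5 / ((1.7/GS + 0.3/CPU) / 2), with GS = (SM2.0 + SM3.0) / 2
def score_3dmark06(sm2, sm3, cpu):
    gs = (sm2 + sm3) / 2.0                      # graphics score
    return 2.5 / ((1.7 / gs + 0.3 / cpu) / 2.0)

print(score_3dmark06(8241, 10497, 3584))          # single-GX2 run above   -> ~18857
print(score_3dmark06(7951, 10801, 3666))          # quad SLI run above     -> ~19001
print(score_3dmark06(2 * 8241, 2 * 10497, 3666))  # perfect GPU doubling   -> ~28976

So even with perfect GPU scaling the total wouldn't double, and since the SM2.0/SM3.0 scores barely moved, the overall score barely moves either; that's the CPU-bottleneck effect discussed further down.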
Let's run 4 GHz 3DMark 06/05/03/01 tests for a database!
GX2s have different clocks, so we'd normalize to the max stock clock value among the GX2s; cooling wouldn't matter much.
Then we make a comparison table and see all the dominant factors.
Possible inputs: CPU, HDD, OS, Driver, Chipset, Memory Size & Timings, 3D tweaks... (a rough row layout is sketched below)
Another idea is maybe a free GX2 competition with no rulez...
http://img300.imageshack.us/img300/6...y02afulfu9.jpg
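To make the database idea concrete, here's a rough sketch of what one row of the comparison table could look like; the field names and the example values are just placeholders pulled from this thread, nothing official:

import csv

# Proposed fields for the 4 GHz GX2 database (all names are placeholders)
FIELDS = ["user", "cpu", "cpu_clock_ghz", "chipset", "os", "driver",
          "mem_size_gb", "mem_timings", "hdd", "3d_tweaks",
          "3dmark06", "3dmark05", "3dmark03", "3dmark01"]

example_row = {
    "user": "someone", "cpu": "C2D/C2Q", "cpu_clock_ghz": 4.0,
    "chipset": "790 Ultra", "os": "Vista x64 SP1", "driver": "177.92",
    "mem_size_gb": 4, "mem_timings": "8-8-8-24", "hdd": "example",
    "3d_tweaks": "performance", "3dmark06": 19002,
    "3dmark05": 0, "3dmark03": 0, "3dmark01": 0,
}

with open("gx2_database.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow(example_row)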
Hi!
I am having problems playing Crysis in Vista Ultimate x64; I get this message after the screen goes black when loading the first mission:
"Display Driver nvlddmkm Stopped Responding and Has Successfully Recovered"
I have spent the whole weekend downloading and trying different driver versions; Vista has SP1 and is fully updated :mad:
I also have an 8800 GT, and it happens with that card as well.
I have also searched the web for a solution, and the fixes that worked for other people did not work for me :mad:
any help would be appreciated
best regards
flan
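That nvlddmkm message is Windows' Timeout Detection and Recovery (TDR) resetting the display driver after it stops responding for too long. One workaround that gets suggested a lot for this error (no guarantee it helps in your case, and edit the registry at your own risk) is lengthening the TDR timeout. A sketch using Python's winreg, run as administrator; the GraphicsDrivers key and the 2-second default are Microsoft's documented TDR settings, while the value 10 is just a commonly suggested choice:

import winreg

# TdrDelay = seconds Windows waits before deciding the GPU hung (default is 2)
KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, 10)
    # reboot afterwards for the change to take effect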
^did you recently change or adjust your RAM?
I had this same issue in multiple games (after they'd been running for a while) when I upgraded my 2x1GB setup to a 2x2GB setup. I sent the 2x2GB back, found 2x1GB of the same stuff I originally had, and the problem is gone...
I haven't changed my RAM settings, as this is a fresh install
Here are my results from 3DMark06 with 1x GX2:
http://i249.photobucket.com/albums/g...7/3dmark06.jpg
How do those OC or SC GX2s have such high clocks? 680/1674/2100... I can get 760 on the core, but for a mem clock of 1674 I can only get around 1120, and for a shader of 2100 I only get 1820. So do they actually change anything on the overclocked editions?
Nice score, Wilson! I'm getting that with quad GX2 SLI on an E8400, for some reason :(
you're most likely confusing it with the Double Data Rate (DDR) figure people report their RAM clock as: when they say a mem clock of 2000 MHz, they actually mean 1000 MHz
and no, the SC and OC versions of cards are otherwise the same as the regular ones; their BIOSes are just modified to have higher default clocks
ok, so I'm dumb. When you say they're reporting 2000 but the actual value is 1000: if I'm using RivaTuner to clock mine to 1820, is that 3640 in double data rate terms, or the other way round? lol
oh, and another question nflesher87: are you running 2 different cards in SLI? the GX2 AND the 8600??
lol you're not dumb, you're just confusing the numbers :)
the shader clock is the one you're clocking to 1820
the mem clock that people say they're running over 2000 is actually just the regular clock that defaults to 1000 MHz; they're just quoting it in DDR (Double Data Rate) terms to make it sound more impressive
and no I'm not running them in SLI, only identical cores can be run in SLI, however I am running them all in the same system, with each core (2x GX2 cores and the 1x 8600 core) folding 24/7
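To put numbers on the actual-vs-DDR confusion above, a tiny sketch (the roughly 1000 MHz stock GX2 memory clock is the assumption here):

def ddr_effective(actual_mhz):
    # GDDR3 transfers data on both clock edges, so the quoted "effective" rate is double
    return actual_mhz * 2

print(ddr_effective(1000))  # stock-ish GX2 memory -> quoted as ~2000 MHz DDR
print(ddr_effective(1120))  # the 1120 memory clock mentioned above -> 2240 effective
# the 1820 figure is the shader clock, not memory, so no doubling applies to it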
Right, 2000 MHz sounds attractive, that's all...
I found that with my card it really sucked till I OC'ed my CPU to at least 4.0 GHz; a lot of the problems with quad SLI come down to a CPU bottleneck. It's always better to use a higher-clocked dual-core than a quad for games and quad SLI, that is until games take advantage of all 4 cores or you can clock the quad to over 4.0 GHz. When my CPU was at 3.8 GHz my 3DMark score was 16020; with the OC to 4.2 GHz it jumped up to over 19000, and all 3 sub-scores were better, not just the CPU score.
I know there are a lot of games that can't take advantage of all 4 cores, but I didn't think it would have that much of an effect; I thought it was more to do with the FSB speed and cache. Also, are there any tips for Vista Ultimate x64 on OC'd systems? When I clock it to 4.15 it seems a tad unstable. I ruled out the GPUs because I dropped their clocks to stock. My RAM is rated 1800 8-8-8-24 @ 1.9 V and it was running around 1750 with the same timings, but I tried upping the volts to 1.925 anyway. It wasn't the vcore, which was around 1.49 (I used to hit 4.25 on 1.5 but I backed it down), and I'm 90% sure it wasn't the temps, but Crysis was freezing: no BSOD, but the audio looped and I had to force a shutdown?? I've been wondering if all the OC'ing and rebooting has somehow corrupted some DLL files, or any files for that matter. Any opinions?
Also, the highest I managed was at 4.15 GHz: just over 19100 in 3D06.
I'm looking into converting back to liquid cooling, maybe a Danger Den RBX CPU block, a Laing DDC+ 18W pump, and a Cooling Works 32T 120mm rad. Purely CPU cooling; I can't be bothered trying to cool those GX2s, plus the blocks aren't cheap and I'm happy with where they're at. From all your water cooling experience here, does this sound like a decent cooling setup?? I had one experience with it and it went majorly wrong and scared me away from it lmao!
thnx guyz.
Rick.
Pushing a water-cooled CPU (better with a higher-L2 / 45nm chip) gives more of an advantage even without OC'ing the GPU...