LOL, why didn't *I* think of a hacksaw?
Ya know Diverge, I noticed we have about the same clocks... and I also noticed something in your 3DMark window... my SM2.0 score is essentially IDENTICAL to yours, while my HDR/SM3.0 score got a huge boost from SLI (you're running a single card, correct?). Maybe the drivers aren't up to snuff yet and that's causing the really low score... because that's just too coincidental if you ask me :p.
Last edited by GoldenTiger; 11-03-2007 at 10:17 PM.
Yep, single card here. One thing that kinda confuses me is that my CPU and SM2.0 scores fluctuate from run to run... wish I knew why. I took lots of data, incrementing each clock individually, to see how it affects each portion of the 3DMark score... and SM3.0 is the only one that consistently went up with clock increases... SM2.0 and CPU went up and down.
Here's a graph of each individual portion of the 3DMark score vs. GPU/shader/RAM overclocks, each swept individually. The only one that consistently increases with clocks is SM3.0.
Edit: here's a graph of the 3DMark total score... shader overclocks seem to increase the score most consistently.
[graphs: 3DMark06 sub-scores and total score vs. GPU/shader/memory clock sweeps]
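For anyone who wants to reproduce that kind of sweep, here's a minimal sketch (not Diverge's actual script) of how the logged runs could be averaged and plotted. It assumes the results were saved to a hypothetical sweep.csv with columns shader_mhz, sm2, sm3, cpu, one row per 3DMark06 run and several runs per clock step:

```python
import csv
from collections import defaultdict
from statistics import mean, stdev

import matplotlib.pyplot as plt

# Group repeated runs by shader clock so run-to-run jitter can be averaged out.
runs = defaultdict(lambda: {"sm2": [], "sm3": [], "cpu": []})
with open("sweep.csv", newline="") as f:
    for row in csv.DictReader(f):
        clk = int(row["shader_mhz"])
        for key in ("sm2", "sm3", "cpu"):
            runs[clk][key].append(float(row[key]))

clocks = sorted(runs)
for key, label in (("sm2", "SM2.0"), ("sm3", "HDR/SM3.0"), ("cpu", "CPU")):
    means = [mean(runs[c][key]) for c in clocks]
    # Error bars show the run-to-run spread; if a sub-score's trend stays
    # inside its bars, the "gain" from a clock bump is probably just noise.
    errs = [stdev(runs[c][key]) if len(runs[c][key]) > 1 else 0.0 for c in clocks]
    plt.errorbar(clocks, means, yerr=errs, marker="o", label=label)

plt.xlabel("Shader clock (MHz)")
plt.ylabel("3DMark06 sub-score")
plt.legend()
plt.savefig("sweep.png")
```

Averaging a few runs per clock step and plotting the spread makes it easier to tell whether the SM2.0/CPU movement is a real effect of the overclock or just the run-to-run fluctuation described above.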
Last edited by Diverge; 11-03-2007 at 10:36 PM.
Wow, that is interesting... it lends credence to what jason4207 of OCforums posted as a possibility:
From a launch review of the 8800GT (I'm on Vista 64): "Alas, the G92s ain't flawless. We tried to run the 8800GT in SLI mode with an EVGA 680i mainboard and the Asus P5N32E-SLI, but were only greeted with a single card in Vista's device manager despite our various attempts to rectify the problem. All users who wish to buy a couple of 8800GT cards should take note: G92 SLI doesn't seem to be quite working with the 680i chipset in Vista just yet. Nevertheless, we reckon this isn't a major problem and Nvidia should fix it in due time, as we've seen SLI benchmarks of the G92s on nForce 780i floating around the web in recent days."
Just a quick update, new stable clocks.
864MHz core and 2106MHz shader; messing with memory timings at the moment.