I finally broke 16K on a single 8800 GT. :eek:
Card on water with Swiftech mcw60 block and mc14 sinks. :up:
http://usera.imagecave.com/punisher69/16k2.bmp.jpg
Did you do any volt mods, or just change the cooler?
By the way, nice score!
Nice score. Is that a factory-OC model? If so, which one, and any mods?
Nice to know that on liquid this card can go up to 780. Nice score, man!
No volt mods. Stock card watercooled with sinks.
Thanks, it's an eVGA 8800 GT 512MB Superclocked. Link
Considering no volt mods have been done yet, I thought it was pretty good too.
Sweet score! Can you show a screenshot of your 3DMark details? I want to compare some things.
I need a quad running at 4GHz. That's gotta be nice.
2nd on the ORB running Vista with a dual core. Note the guy in front has SLI.
;)
http://i239.photobucket.com/albums/f...alsmith/37.jpg
It'd be great if eVGA would ever give me confirmation on my step-up to the 8800GT, but I guess that won't be happening.
That's awesome! I can't wait to get my 8800GT in to see what kind of clocks I can get with my water cooling setup.
Trying to break 10k in 06 with my broken X2 @ 2.2. lol
EDIT: Forgot to ask...what kind of load temps do you get?
It's weird. I just tried an eVGA 8800GT: stock with a Kentsfield at 3.9GHz it gives 15.3k, and with no mods, clocked at 750 core, 1820 shader, and 1030 mem, it gives 16.1k. Does that seem too high?
Could someone please also post a score @ DEFAULT?
My Dell XPS410 with a Core 2 Duo E6700 @ 2.66GHz gets 10,920 in 3DMark06 running a factory-clocked 650/950 eVGA 8800 GT. Overclocking to 8800 GT SSC specs only brings it up to 11,290.
Not worth overclocking just to gain 370 marks.
Hmm, I ran 651/951 on my 8800GTS SLI with the CPU @ 3.4GHz and scored 16,272...
All air. 783/1944/2160, quad at 3.9Ghz. Duorb cooler, no hard mods.
16290
SM 2.0 Score 6817
SM 3.0 Score 6384
CPU Score 6073
http://img101.imageshack.us/img101/3020/16290zj6.th.jpg
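For anyone comparing subscores: Futuremark's 3DMark06 whitepaper gives the overall score as a weighted harmonic mean of the SM 2.0, HDR/SM 3.0, and CPU scores. A quick sketch (the weights 1.7 / 1.5 / 0.3 and the 2.5 scale factor are from that whitepaper; rounding of the displayed subscores and version differences mean it only lands close to the posted total, not exactly on it):

```python
def overall_3dmark06(sm2: float, sm3: float, cpu: float) -> float:
    """Approximate 3DMark06 overall score from the three subscores.

    Weighted harmonic mean per the 3DMark06 whitepaper:
    the SM 2.0 and SM 3.0 graphics tests dominate, the CPU
    tests carry a small 0.3 weight.
    """
    return 2.5 / ((1.7 / sm2 + 1.5 / sm3 + 0.3 / cpu) / 3.5)

# Subscores from the post above: SM 2.0 = 6817, SM 3.0 = 6384, CPU = 6073
print(round(overall_3dmark06(6817, 6384, 6073)))  # within ~1% of the posted 16290
```

The small 0.3 CPU weight is why a big CPU overclock moves the total far less than a GPU clock bump of the same percentage.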
Very Impressive score!! :)
It's interesting to see how much the SM 2.0 score increases with CPU speed, while the SM 3.0 score really isn't affected as much.
With an E6700 @ 3.5GHz and my GTX at 636/1650/1050, my SM 2.0 is 5900 and my SM 3.0 is 6300. You people are cranking out 6400 in SM 2.0.
Oh, I just love reading these threads about a $200 card beating my $800 one :rolleyes:
Nice results!
What load temps are you getting?