I have a BFG OC 8800gt w/ no mods and stock cooler,
here are my max ATItool clocks:
770/1105/1966
at 776 It froze, 1972 was full of artifacts, and 1109 on memory gave me BSOD. My 24/7 settings are 751/1066/1952
I just got an Albatron 8800GT. Max stable ATITool clocks are 720/1782/1044 MHz.
That's on stock volts with a Zalman VF1000 installed.
Just got my 8800GT RMA replacement from EVGA, as my previous one was DOA.
I was expecting a recertified card, but to my surprise it was a brand spanking new one, sealed in the box. It looks like it has the new cooler: bigger fan, GeForce logo on the side, and not as loud.
It's supposedly the vanilla-clocked one @ 600/900/1500, since the box just said 8800GT, but to my surprise the stock clocks read 650/950/1625.
Here's what I got on the first try: 15 min ATITool artifact-free. With this new cooler, 70% fan is perfectly fine and I'm only getting around 65C load temps.
http://www.techpowerup.com/gpuz/8fusu
Stock volts/BIOS, and the included CD had very recent drivers, the ones right before the 169.09 beta and 169.02 WHQL, which saved my dial-up a download.
I like EVGA support.
I tested my Asus GT a little.
730/1730/1800 so far, stable! :)
Just testing my new 8800GT SC :D
740/1800/2040 stable in Crysis and the Fur benchmark
How do you overclock the shader speed??
Also, when I try to install ATITool, it tells me it can't install the drivers for it because they are unsigned. It lets me continue, but the program can't read my hardware.
I just set up two 8800GT cards in SLI and have been doing some preliminary testing. I get about 17100 in '06 with the specs in my sig and the cards running at 690/1699/1000 with fans at 80%.
My big concern is the huge difference in temperatures between the two cards. After an hour of playing Crysis with SLI enabled (and confirmed working), GPU1 is at 90C while GPU2 is at 60C. That is a gigantic difference in load temps. I have both fans at 80% (at least I think so, based on the setting used in RivaTuner) and both have good airflow around them. If anything, the hotter of the two cards has a PCI-slot fan blower sitting directly beneath it.
Under idle conditions, GPU1 is at 63C and GPU2 is at 50C. What could possibly be causing this? I may try removing the heatsink and applying AS5.
Anyone else seeing such a huge difference in temps between their cards?
Yeah... I use Fur to test; if it locks up, you went too far, lol... I hope to bench mine this weekend though... homework sucks... :shrug:
If Fur is stable with high clocks, I run Crysis. :)
Niceee! :)
I have done the BIOS volt mod to 1.1V, and now I can run my GT @ 750/1850 MHz (GPU/shader)...! Cool!
I got my BFG 8800GT OC today + TT DuOrb. The temps are amazing (screenshot), and so is the OC on the core, but I don't know if I should keep pushing the shaders/memory; I've heard 8800GT memory is not exactly OC-friendly.
750 - 1675 - 1800
Worth the risk of increasing the shader + memory speeds even more?
http://img84.imageshack.us/img84/495...cbfg750fd4.jpg
A few more OC tests...
Really surprised with the performance: playing CoD4 with both the GTX and this 8800GT OC'ed, the GT totally outperforms it @ 1680x1050.
Btw, I am using the default BIOS; wondering when I will reach the core limit :D
http://img86.imageshack.us/img86/980...shaderspk0.jpg
http://img212.imageshack.us/img212/6...ysiscpufo0.jpg
http://img86.imageshack.us/img86/372...gcrysisnm1.jpg
I have an EVGA GT Superclocked that can do 767/1955/1021 stable watercooled (MCW60), with ramsinks on all the memory/VRegs.
It looks like the GTS can produce about 850/2150/1150 watercooled or aircooled, but is it worth $100-120 more?
Finally I decided to OC the memory a little, from 900 to 950 MHz: super stable. Right now the speeds are 740 - 1850 - 1900. I tried to push the core/shaders up a bit more with no success; an excellent OC, to be honest.
I guess it's time to start gathering information about the BIOS vmod; I am not quite sure if the BFG 8800GT already comes with the core at 1.1V.
http://img231.imageshack.us/img231/6...2006qp2.th.jpg
Why was your stock shader clock so low?? Mine came at 1625. And if I tried to OC the shaders along with the core, it would artifact and become unstable. If you lower your shaders, you will be able to push your core/mem further; worry about core/mem speed before shader speed.
My XFX 8800GT XXX: GPU 907 MHz @ 1.49V, MEM 1026 MHz @ stock:
http://img339.imageshack.us/img339/3...hztest1fv2.jpg
Setup in the sig. Low clocks only for now; 4.35 GHz Q6600 later. This card ran 2202 MHz on the mem when I first got it, then 2106, and now I can only run 2052 MHz stable :confused:
:caution: Warning :caution: I'm starting to see lots of memory failures on GTs and some GTS 512MBs. The memory slowly cooks itself once it is overclocked past 2050 MHz, that is, if your memory even goes that high! :explode2: Even with better cooling. The new GTS 512MB has the problem too. I was going to buy a new GTS 512MB but started seeing memory issues popping up due to crappy memory. The memory has been failing badly @ stock on the GT, especially after heavy gaming, even with the fan at 100%; people have posted RMAs due to memory in the Nvidia/BFGTech forums. That was the main reason Nvidia released a new cooler for the GT, to help with the memory overheating issue.
I'm a die-hard Nvidia fan, but I'm going to have to wait for better memory before I grab a GT/GTS 512MB.
I just ordered a 640 SSC, which cost MORE THAN a GTS 512MB, just to hold me over until the issue is under control. I'm only stepping up when a better revision is out, with better or fewer memory issues! :shakes: Even in our forums I've been seeing GT/GTS problems starting to pop up in the form of game corruption/pixelation/cards slowly shutting down. This isn't due to the drivers that plagued the 8800 series in the past; I can safely say it is a poor choice of memory/cooling used for the cards. If you have this problem, I highly suggest you use RivaTuner and underclock the memory, or don't go above 2000 MHz for stability. The memory will fail quickly once it starts to go, as my friend's card did in less than two days of hard gaming at 2100 MHz. You'll see what I'm talking about if it starts putting up weird squares and/or dots, tearing, or the display looks like a jigsaw puzzle. Qimonda memory sucks; the Samsung does 2300 MHz easily. Even my 320 did 2300 MHz without a problem!
If you're going to overclock it, I wouldn't exceed 2000 MHz on the memory. Just crank the GPU and shader. Maybe step up the card in a few months for a better revision or model, that is, if you have that option!
P.S. Sorry, I had to write this. I was looking forward to a new 512MB to overclock; now I'm waiting to see if more memory issues keep popping up. :shakes: This really sucks. NV memory/cooler on the 7900 series... you'd think they would have learned that one :shakes:
still on water, ~1.48V GPU in 3D
http://xs222.xs.to/xs222/07505/928.jpg
I received my vanilla EVGA 8800GT yesterday, and now I've overclocked it a bit, but the weirdest thing is that I get lower scores in 3DMark when I overclock it. :confused:
I haven't got the slightest idea what might be causing this, so I need your help.
I know my processor is bottlenecking the graphics card, but it shouldn't stop me from getting higher SM 2.0 and SM 3.0 scores in 06.
Current system (sig is a bit outdated):
Opteron 165 @ 2.83 GHz
Asus A8N-SLi Deluxe with 1805 BIOS (latest)
2 GB G.skill ZX @ 257 MHz; 3 - 2 - 3 - 5
EVGA 8800 GT "Vanilla"
FSP Epsilon 700 W
I got 10681 points in 06 with the 8800GT @ stock, and a 200-400 lower score depending on the overclock.
How the f*** are you guys hitting 900+ MHz cores!?!? Holy crap!
edit: Omg, your shaders too!! I can't OC my shaders at all, or else it holds my core/mem speeds back.
Oh, and the bandwidth doesn't look right for that much of an overclock. I've got mine at 700 MHz core and 2015 MHz memory, and I'm at 65 GB/s, same as yours, yet you have a much higher OC.
GPU-Z doesn't have a benchmark to measure RAM bandwidth; it just shows the theoretical maximum. The calculation is easy to do yourself:
Bandwidth in GB/s = bus width (in bits) × effective DDR frequency (in MHz) / 8000
256 bit × DDR 2015 / 8000 = 64.48 GB/s
256 bit × DDR 2052 / 8000 = 65.66 GB/s
;)
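If you want to check your own card's numbers, here's a quick Python sketch of the same formula (the 256-bit bus and the two DDR clocks are just the examples from above):
[CODE]
# Theoretical memory bandwidth: bus width (bits) x effective DDR clock (MHz) / 8000
def bandwidth_gb_s(bus_width_bits: int, ddr_mhz: float) -> float:
    return bus_width_bits * ddr_mhz / 8000

# The two examples from above (the 8800GT has a 256-bit bus):
print(bandwidth_gb_s(256, 2015))  # 64.48 GB/s
print(bandwidth_gb_s(256, 2052))  # 65.664 GB/s (~65.66)
[/CODE]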
I managed a run of 06 at 810/2052/1026 yesterday...
only with the 1.2V BIOS mod...
I have better OC results with the GPU not linked to the shader. My shader seems to be limited to 1850 (1836 real, about 740 core linked), but the core goes up a lot more. I'm up to 770, and it seems like I still have room for more.