My Inno3D Nvidia 8800GT
GPU: 700MHz
Memory: 1000MHz
Shader: 1700MHz
13093 3DMarks
GPU: 700MHz
Memory: 1050MHz
Shader: 1800MHz
13407 3DMarks
XFX 8800GT Alpha Dog edition:
Stock clocks:
GPU: 600MHz
Memory: 1800MHz
Shader: 1500MHz
Max ATI Tool artifact-free clocks:
GPU: 690MHz
Memory: 1000MHz
Shader: 1700MHz (selected) = 1674MHz (real)
Max 3DMark06 stable clocks:
GPU: 700MHz
Memory: 1050MHz
Shader: 1782MHz (real)
I think I got a not-so-good card, because I see a lot of people going much higher on the core and shader clocks!
I actually don't really care about being "ATI Tool stable", since I can go quite a bit higher on the shader clock for 3DMark06, but it's not stable in Crysis. I'm still testing, but to be stable in Crysis it seems I need to use the same clocks as in ATI Tool...
Is there a way to avoid jumping in 54 MHz increments on the shader clock?
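The "1700 selected = 1674 real" readout above suggests why: the shader PLL can only land on discrete steps, roughly 54 MHz apart on this card, so whatever you type gets snapped to the nearest reachable clock. A minimal sketch of that snapping, assuming the 54 MHz step size inferred from the posts in this thread (not an NVIDIA spec):

```python
# Assumed granularity of the G92 shader PLL, inferred from the
# "1700 selected = 1674 real" observation above -- not an official figure.
SHADER_STEP_MHZ = 54

def real_shader_clock(requested_mhz: float, step: int = SHADER_STEP_MHZ) -> int:
    """Snap a requested shader clock to the nearest step the PLL can hit."""
    return round(requested_mhz / step) * step

print(real_shader_clock(1700))  # 1674, matching the selected-vs-real readout above
print(real_shader_clock(1782))  # 1782, already on a step
```

If the step size is really fixed in hardware, there is no driver-side way to get clocks in between; you can only pick which multiple of 54 MHz you land on.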
Last edited by r4st4m4n; 11-03-2007 at 08:18 AM.
Q6600@3.6 GHz / GA-X48-DQ6
2x2GB Kingston HyperX DDR2-800
Asus HD 5870
Corsair TX-850
WD Velociraptor 150GB / WD Green power 500GB
Water Cooling: EK res 250 > Swiftec MCP655 > Swiftec MCR320 > EK-FC5870 > HeatKiller 3.0 LT
3DMark06 Score 13285 3DMarks
SM 2.0 Score 6177 Marks
SM 3.0 Score 5802 Marks
CPU Score 3242 Marks
E6850 @ 400x9 = 3600
2GB DDR2 @ 1:1.25 Ratio = 1000 5-5-5-15 2T
PNY 8800GT @ 708/1690/2006
ATI Tool artifact free
Full load 76c
Can run Crysis on all High (Post-processing Low) at a very playable 30fps (Vsync) at 1920x1200. The game is very smooth at these settings, and the lighting is a lot better than on Medium, which I was limited to with my 8800GTS 640.
The same settings drive 3DMark2001 to a nice score of 63,558.
Last edited by Soulburner; 11-03-2007 at 02:28 PM.
System
ASUS Z170-Pro
Skylake i7-6700K @ 4600 MHz
MSI GTX 1070 Armor OC
32 GB G.Skill Ripjaws V
Samsung 850 EVO (2)
EVGA SuperNOVA 650 G2
Corsair Hydro H90
NZXT S340
With my Opteron at 3006MHz I get ~11,070pts in 3DMark06.
I am running my eVGA 8800GT KO at 725/1753/975 (haven't played with the memory yet)... all ATI Tool error free.
Is ATI Tool the Prime95/Orthos of the video card world? Should I be more concerned with passing 3DMark and Crysis instead of ATI Tool? I know mine can pass Crysis at much higher speeds.
I am disappointed at how low my shaders run compared to others, but then again, I am basing this on ATI Tool errors.
I am testing tonight using my window air duct to get better temps. Yesterday I was getting about 58C max at load running the ATI Tool scan, down from the normal 77C I was getting without the window air duct. I wasn't able to test at night when it was colder, though, but I will tonight.
Regards
*CPU: Xeon X5650 @ 4.3 Ghz | Cooler: Thermaltake Water 2.0 Extreme
*Asus Rampage III Formula | RAM: 36GB DDR3 (Tracer LED + Hyper X Savage)
*Video Cards: Gigabyte Aorus 1080ti
Sound Card: Sound Blaster Z | PSU: Corsair HX1000W | Display: BenQ PD3200u | JVC RS520 projector
*Case: CoolerMaster HAF X (932 side panel) | Others: Roccat Kone AIMO | Roccat Alumic | Logitech G15 |Cameras: Sony A7R3 | RX100 V
From what I have seen, Crysis will flake out around the same settings as the ATI Tool artifact checker. 3DMark will run with higher clocks, though, as the errors aren't critical enough to stop it from running until you are at the breaking point.
Well, I will let you know later when I do my testing, but for instance, ATI Tool won't let me get my shaders over 1753, while the first time I OCed my card I didn't do core and shader separately, so I set 725 directly and played Crysis without issues.
Still, I will test tonight for sure.
I have the XFX Alpha Dog edition, 600/900/1500 stock.
OC'd to 700/1000/1725.
How long do you run ATI Tool when checking for artifacts? I feel like this card has a lot more in it (especially the shader), but so far these are the best stable clocks I've gotten.
http://service.futuremark.com/compare?3dm06=3623075
3DMark Score 15,702
SM 2.0 Score (Marks) 6513
SM 3.0 Score (Marks) 6115
CPU Score (Marks) 6099
8800 GT 745/2100/1863
ZR30W | 800D | 2600K | MIVE | 580 HydroGen | X25-160 | Aquaero
It's up to you.
I actually don't care about being ATI Tool stable; I just use it since it's a convenient way to load the card when I test the OC. But it seems that, for my card, Crysis stability results are very close to the ATI Tool stability results. I actually think Crysis loads the card even more than ATI Tool!
retired AMD64 Opteron 170 Toledo CCBBE 0615EPMW 3.1GHz 1.52V with removed IHS
Intel Q9650 L827A 4.2GHz (467*9) 1.288v
G.Skill 8GB F2-8000CL5D-4GBPQ
ASUS P5Q Deluxe BIOS V1406
eVGA GeForce 8800GTS SSC 512MB
3x Western Digital 640GB Black RAID 0+5
Creative SB X-Fi XtremeGamer
Swiftech H2O-220 Compact 35C/58C
Cooler Master Real Power Pro1000 RS-A00-EMBA 1000W
Dell 2405FPW 24" WUXGA
Antec Twelve Hundred
Looking forward to the EVGA 8800GT SSC editions coming out in December... stock clocked at 700/1750/2000.
With those clocks at stock, they're surely cherry-picked cores to guarantee those speeds... would love to see OCs on those!
More info here: http://evga.com/products/pdf/512-P3-N806-A1.pdf
CPU: Intel Q6600 @ 3600Mhz 24/7
GFX: eVGA 8800GTS SSC 512MB
RAM: 4GB Corsair Dominator PC2-8500
MB: DFI LP LT X48-T2R
HDD:150GB WD Raptor X
PSU: Thermaltake Toughpower 1000W
Screen: Dell 2407WFP
I've actually noticed that as well. I have 720/1836/1900 stable on my eVGA SC, load temps 68-69C using ATi Tool. However, in Crysis it would lock up. So I backed it off to 712/1782/1900 and it's completely stable now, load temps 72-73C, after I played the Crysis demo for about 45 minutes. Hopefully once I get my mosfet ram sinks my FC-ZV9 will bring the temperatures down and let me clock back up to 720/1836 again.
I've actually had my core as high as 740MHz artifact free, so I'm guessing my shader loses stability somewhere between 1782 and 1836MHz, because 740/1782 is ATi Tool stable, but 740/1836 gives me a yellow pixel every few seconds. I'll have to try 740/1782 and see if that's Crysis stable later on.
Last edited by NJDevilsFan21; 11-04-2007 at 09:01 AM.
E6600 3.51GHz| P35-DQ6 F7a | 2x2GB | 500GB | 6200LE| HX520 | AL2051W
SLI runs like crap... hopefully future drivers will improve it.
http://www.xtremesystems.org/forums/...&postcount=635
Q6600 @ 3.5 , single GT @ 756/1944/2100
3dmark06 score: 15767
http://i121.photobucket.com/albums/o.../3dmarklf8.jpg
and I needed the Q6600 @ 3.9GHz with GT SLI to get 19146:
http://service.futuremark.com/compare?3dm06=3619031
SLI is not working well....
Does anyone know a pencil mod for vGPU? Because I don't want to void my warranty just for that 100MHz on the ROPs and 100MHz on the shader... not worth it.
Last edited by theteamaqua; 11-04-2007 at 10:50 AM.
E6600 @ 3.6
IN9 32x MAX
EVGA 8800Ultra
750W
Not surprisingly, I'm finding the same results as you. Also, I found that my clocks are more stable if I scale the shader proportionally to the core.
I must say, I'm surprised how well this card handles 1920x1200. I know AA really kills its performance at this res, but it still looks and plays great without AA. I can actually use 2xAA in UT3 just fine at max settings, but now Crysis is doing the white foliage thing even at low settings. Not sure why.
Here you go, the max stable so far for my eVGA.
I run the SM3.0 tests and the Proxycon SM2.0 test perfectly, but Firefly Forest fails at 756/1890/1044...
This is my best so far.
Keep in mind gpu is watercooled and the memory and other components have ramsinks with a fan blowing on them.
Prepare for an E-Rection!
http://www.xtremesystems.org/forums/...1&d=1194203108
Asus P6T, I7-920, 6gb ocz xmp, 4890, Raid 0-1 Terabyte, full watercooled - Triple Loop 5 radiators
I didn't have enough time to test; I was playing with my card in a friend's computer (Core 2 Duo E6750 @ 3.4GHz, Gigabyte DS3-R).
We were just using Crysis to play with the card; got the GPU to 750 stable and the shader to 1850, with memory at 1050.
The question I have is about the memory: does the card change timings on the RAM or something? Going from 999 on the memory to 1050 caused our lowest-FPS score in the Crysis GPU benchmark to go from 16.5 to 15.2 FPS. Can any of you guys confirm? I was going to bench it at 999MHz and then 1005MHz to see if there was a drop in performance, but didn't have time.
I'll have more solid numbers when my new motherboard comes in in 2 days (my original DS3-R's gigabit LAN died after 3 days; I ordered a P5K-E to replace it).
Oh, and the GPU never got over 71 Celsius with fan control at 100%.
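One possible explanation for the lower minimum FPS at a higher memory clock is a timing "strap": if the BIOS applies looser timings above some clock threshold, latency in nanoseconds can get worse even though the clock went up. A toy illustration of that hypothesis; the threshold and cycle counts here are invented for the example, not values read from any 8800GT BIOS:

```python
# Hypothetical strap model: all three numbers below are made-up
# illustrations of the idea, not real 8800GT BIOS values.
ASSUMED_STRAP_MHZ = 1000   # clock above which looser timings kick in
TIGHT_CAS_CYCLES = 18      # assumed timing below the strap
LOOSE_CAS_CYCLES = 22      # assumed timing above it

def cas_latency_ns(mem_mhz: float) -> float:
    """Access latency in nanoseconds under the assumed two-strap model."""
    cycles = TIGHT_CAS_CYCLES if mem_mhz < ASSUMED_STRAP_MHZ else LOOSE_CAS_CYCLES
    return cycles / mem_mhz * 1000.0

print(f"999 MHz:  {cas_latency_ns(999):.2f} ns")   # tighter timings
print(f"1050 MHz: {cas_latency_ns(1050):.2f} ns")  # looser timings, higher latency
```

If something like this is happening, the 999 vs 1005 MHz test proposed above would pinpoint it: a sudden FPS drop across a small clock change is the signature of crossing a strap rather than of raw bandwidth.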
If I'm not mistaken, this is what you are talking about, and it shows as "in stock" for $300:
http://www.evga.com/products/moreinf...6-A1&family=19
Core 2 Duo E6300 @ 2.8GHz (400x7) 1.31v | Gigabyte GA-965P-DS3 rev. 2.0 | Thermaltake TMG i1 | 2GB Geil DDR2-800 (4-4-4-12) @ 1.8v | EVGA 8800GT @ 675/1687/1950 | Thermaltake Toughpower Cable Management 750W PSU
"Fool me once, shame on, shame on you. Fool me...can't get fooled again."
http://www.xtremesystems.org/forums/...d.php?t=163929
Did you do any Vmod on the card ?
Yeah, I read that when the thread started... he said it's impossible. I ask because maybe someone found some link that suggests different...
Last edited by theteamaqua; 11-04-2007 at 08:49 PM.
I was just reading this from JonnyGURU.
I wouldn't discredit it so quickly; we saw the *same exact thing* with the A1 silicon of the R520. It was later attributed to bad metal layering near the memory controller that, when given enough voltage for long enough periods, would nuke the card. I myself had almost 6 separate cards which mysteriously died. Some were volted via ATI Tool; after the first card, the remaining ones were not.
Until this is proven one way or another, I plan on keeping my memory near default speeds; I can always overclock it later.
_____________________________________
My card, a BFG 8800GT OC, purchased from Best Buy:
(bench suicide *artifact/distortion after long use*)
GPU: 738
Mem: 955
Fan: 90%
Idle: 48c*
Load: 66c*
(max stable *long-term use*)
GPU: 715
Mem: 955
Idle: 48c*
Load: 66c*
_____________________________________
I'd like to add, for those reading, that the G92 is finicky like the 8600GTS was. It has a tendency to produce a lot of "false-positive" artifacts as well as "false-stable" results. For example, this card will artifact and distort the image around 680-690 on the GPU if I push my CPU high enough. The card itself is fine; somehow the interaction with the motherboard with an unstable CPU is producing the visual errors, which are noted in both ATI Tool and in 3DMark.
With the processor at 3.0 and 3.1 with the same voltage, the same exact card and settings are stable to 715; 726MHz and higher produces sporadic "snow"-style artifacting. I figured I would add this for people who are having trouble clocking their cards: it may not be their GPU at all, but instead a CPU that is semi or totally unstable.
Last edited by Sentential; 11-06-2007 at 05:33 AM.
NZXT Tempest | Corsair 1000W
Creative X-FI Titanium Fatal1ty Pro
Intel i7 2500K Corsair H100
PNY GTX 470 SLi (700 / 1400 / 1731 / 950mv)
Asus P8Z68-V Pro
Kingston HyperX PC3-10700 (4x4096MB)(9-9-9-28 @ 1600mhz @ 1.5v)
Heatware: 13-0-0