|-------Conner-------|
RIP JimmyMoonDog
2,147,222 F@H Points - My F@H Statistics:
http://fah-web.stanford.edu/cgi-bin/...e=Conman%5F530
Man I feel like I got a dud. Have you volt modded the card at all?
C2Q Q9550 @ 4.03Ghz/4GB OCZ DDR 1066 @ 902Mhz /BFG GTX 260/Gigabyte GA-EP45-UD3P/2 X Samsung F1 1TB/Samsung F1 750GB/Samsung SH-S223Q/Corsair TX650/X-Fi Titanium Fata1ity/Swiftech H20-220/Logitech Z5300 Speakers/Samsung SyncMaster 2253BW Monitor/Win 7 Ultimate x64
Wow?!
What voltages are you pumping into that card?!
It didn't work for me on an 8800GTX; I restarted, with 163.67.
That solved it for me. This worked for me:
Power User>Rivatuner\System>NVAPIUsageBehavior set value to 1
Apply, close out, then reopen and the shader slider should be there.
Gigabyte P35-DQ6 | Intel Core 2 Quad Q6700 | 2x1GB Crucial Ballistix DDR2-1066 5-5-5-15 | MSI nVIDIA GeForce 7300LE
I alpha tested this. It was nice to see the changes it went through. Development was amazingly fast, FYI.
Last edited by xlink; 09-29-2007 at 06:58 PM.
Hmmm... makes me wonder if in the future it will be possible to change the shader clocks independently for the R600...
Core i7 920 D0 B-batch (4.1) (Kinda Stable?) | DFI X58 T3eH8 (Fed up with its issues, may get a new board soon) | Patriot 1600 (9-9-9-24) (for now) | XFX HD 4890 (971/1065) (for now) |
80GB X25-m G2 | WD 640GB | PCP&C 750 | Dell 2408 LCD | NEC 1970GX LCD | Win7 Pro | CoolerMaster ATCS 840 {Modded to reverse-ATX, WC'ing internal}
CPU Loop: MCP655 > HK 3.0 LT > ST 320 (3x Scythe G's) > ST Res >Pump
GPU Loop: MCP655 > MCW-60 > PA160 (1x YL D12SH) > ST Res > BIP 220 (2x YL D12SH) >Pump
I have an 80mm fan blowing over the backside of the GPU, but I didn't notice a difference either.
I've been using the OpenGL Fur Benchmark to judge stability and OC merit (whether stressing my card more is worth it...)...
I like that bench a lot because it is the only one that doesn't give me crappy scores based on my crappy CPU/RAM/motherboard...
I just finished 2 hours of testing running the latest Fur and Surface Deformer benchmarks; I could not beat the scores I've gotten using nTune.
Did you OC the shaders? My score went from 2108, just using nTune, to 2227 after RivaTuning my shaders... XD
Yes, I screwed around with it until I had max clocks on everything. 770 core / 1800 shader was starting to artifact, and I could not beat my nTune score in Fur.
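For context on why those two numbers move together: on G80-era GeForce cards the shader domain is normally linked to the core clock by a roughly fixed ratio, and the driver scales the shaders along with the core unless the slider is unlocked as described above. A minimal sketch of that linked relationship, assuming the 8800 GTX's stock 575/1350 MHz clocks as the reference ratio (an assumption for illustration, not a value stated in this thread):

```python
# Sketch: estimate the shader clock the driver would pick for a given
# core clock when the two domains are linked (G80-era behavior).

STOCK_CORE_MHZ = 575     # assumed 8800 GTX stock core clock
STOCK_SHADER_MHZ = 1350  # assumed 8800 GTX stock shader clock

def linked_shader_clock(core_mhz: float) -> float:
    """Shader clock implied by the stock core:shader ratio (~2.35)."""
    ratio = STOCK_SHADER_MHZ / STOCK_CORE_MHZ
    return core_mhz * ratio

# At the 770 MHz core mentioned above, a linked shader domain lands
# near 1808 MHz -- close to the 1800 MHz shader clock tried here.
print(round(linked_shader_clock(770)))  # -> 1808
```

Unlocking the slider in RivaTuner simply lets the shader domain be set independently of that ratio.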
What resolution?
Cheers, dengyong!
The extra ROPs 'n' shaders on the GTX seem to be scaling nicely; as a comparison, my GTS at the same (ROP/shader) clocks does 13.5k with a 3.6GHz quad and DDR2-802 @ 4-4-4-10.
BTW, I take it that's with a linked shader clock, yes?
You were not supposed to see this.
Yes, with nTune.
Last edited by dengyong; 09-30-2007 at 01:03 PM.