PDA

View Full Version : Just got my 6800 gt



kryptobs2000
01-13-2005, 04:27 PM
I had one of the evga 6800 Limited Editions, and the memory on it sucked; it wouldn't go past 800MHz, so I RMA'd it. 3 weeks later, I finally got my GT.

Detected optimal frequencies came up with 413/1.12; I found it stable at 415/1.14. Then I flashed the BIOS to 1.4v and detect optimal frequencies came up with only 421/1.14, but I found it to be stable at about 430/1.14. I didn't test the core much; it would bench at 440MHz, but during the HL2 stress test it froze for 10 seconds or so, then unfroze and artifacted all over the screen. So I flashed the BIOS to 1.4v 425/1140, and that's what I'm running it at now.

Evga sent me a copper heatsink, and I didn't put AS5 on; I just used the crappy, uh... whatever was already on it, because I was just anxious to get it in and try it out. I'm concerned my temps might be a little high: under full load they peak at 77C, and I idle around 50-51C. Is that too hot? I might go back and replace the crap thermal compound with some AS5. And my 3DMark05 score was 5200. How's that look?

edit: weird thing I noticed, in the HL2 stress test @ 1280x960, everything maxed, 4x/8x, I get 130fps, but at 1600x1200 I only get 49-50fps. What's wrong? With my 6800 LE @ 450/700 I would get, I think, around 90 or so. I should definitely equal or better that now, I would think. And when I benched with the 6800 LE my proc and memory were running stock too.

edit2: Just benched @ 1600x1200 w/o aa/af and I get 120fps.

DazzXP
01-13-2005, 04:50 PM
You RMA'd a vid card because it doesn't overclock the way you wanted it to?

thelostrican
01-13-2005, 04:50 PM
maybe you got some thermal throttling :confused:

shadowing
01-13-2005, 04:55 PM
Maybe not enough air? Or maybe it really was just a bad card.

Lifthanger
01-13-2005, 05:22 PM
1600x1200 is around 56% ((1600*1200)/(1280*960) - 1) more pixels than 1280x960. 50fps/130fps = 0.38

-> 56% more pixels lead to 62% less fps... sounds alright to me.

kryptobs2000
01-13-2005, 05:36 PM
You RMA'd a vid card because it doesn't overclock the way you wanted it to?

Stock memory was supposed to be 1100MHz :rolleyes:

And no, that's not right; if I got 90 with a slower card and proc, I shouldn't be getting 50 now. My friend's x800 Pro at stock gets at least 60 or something. His proc is slower, and yeah, x800s are faster clock for clock, but not the Pro, and not when it's compared to a GT clocked faster than a stock Ultra.

Lifthanger
01-13-2005, 05:42 PM
edit: weird thing I noticed, in the HL2 stress test @ 1280x960, everything maxed, 4x/8x, I get 130fps, but at 1600x1200 I only get 49-50fps.

that's what I was referring to.

IvanAndreevich
01-13-2005, 05:55 PM
>>1600x1200 is around 56% ((1600*1200)/(1280*960) - 1) more pixels than 1280x960. 50fps/130fps = 0.38

ROFL that's just wrong.

kryptobs2000
01-13-2005, 08:37 PM
>>1600x1200 is around 56% ((1600*1200)/(1280*960) - 1) more pixels than 1280x960. 50fps/130fps = 0.38

ROFL that's just wrong.

I dunno, it doesn't seem right to me; it's either 15% or 56%, wish I had a math teacher here lol. It does seem like it's 50% more, but umm... it doesn't matter. Just because you have 50% more pixels doesn't mean your performance is going to drop 50%; that's not how it works.

It draws the same scene no matter what; if you're running at 640x480 it's still drawing every single polygon and the same texture resolution.

Ich
01-13-2005, 08:49 PM
Stock memory was supposed to be 1100MHz
How? If it's 2ns, then 1000MHz; if 1.6ns, then 1200MHz; but not 1100.
Actually mem frequency doesn't really matter much for the GF6800. The difference between 800MHz and 1150MHz is 3 fps in Doom 3 on my rig.

77c, and I idle around 50-51c. Is that too hot?
Same question. After the thermal grease change: 58C idle / 76C burn. Waiting until Newegg has the Zalman VF700-Cu.

kryptobs2000
01-13-2005, 10:26 PM
How? If it's 2ns, then 1000MHz; if 1.6ns, then 1200MHz; but not 1100.
Actually mem frequency doesn't really matter much for the GF6800. The difference between 800MHz and 1150MHz is 3 fps in Doom 3 on my rig.

Same question. After the thermal grease change: 58C idle / 76C burn. Waiting until Newegg has the Zalman VF700-Cu.

2ns is rated for 1000MHz and 1.6ns is rated for 1200MHz, correct, but an evga 6800 Limited Edition (has 1.6ns) is supposed to be clocked at 1100MHz; mine wouldn't go past 800MHz.

I cleaned off the normal gunk and put some AS5 on; max load is 74C or something like that, so only a 3C drop, if that, but in normal gaming the highest it will go is normally about 70C, which is ok. I just don't know the limit; I'd think it's in the 80s, so I wouldn't want my card getting to 77C.

edit: I just realized what you said, and 2.0ns is rated for 1100MHz anyways. :stick:

IvanAndreevich
01-13-2005, 11:59 PM
kryptobs2000
You don't need a math teacher to see that logic is totally wrong. And 2.0ns _is_ 500MHz DDR, aka 1000MHz. To convert from ns to MHz you divide 1000 by the number of ns, then multiply by 2 (for "after-DDR" speeds).
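The conversion above can be sketched in a couple of lines of Python (my own throwaway helper, not anything from the thread):

```python
# Effective "DDR" speed from memory cycle time: 1000 / ns gives the real
# clock in MHz, and doubling it gives the marketing "DDR" rate.
def ns_to_ddr_mhz(ns):
    real_clock_mhz = 1000 / ns
    return real_clock_mhz * 2

print(ns_to_ddr_mhz(2.0))  # 1000.0 -> so 2.0ns memory is 1000MHz, not 1100
```

By the same arithmetic, 1.6ns parts come out at 1250MHz, which lines up roughly with the 1200MHz rating Ich mentions earlier in the thread.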

Lifthanger
01-14-2005, 01:46 AM
>>1600x1200 is around 56% ((1600*1200)/(1280*960) - 1) more pixels than 1280x960. 50fps/130fps = 0.38

ROFL that's just wrong.

If it's wrong, you could just prove your statement or make a counter proposal.
ROFL is not really something that makes me believe what you said.

1600*1200 = 1920000
1280*960 = 1228800

1920000 / 1228800 = 1.5625

-> 56.25% more pixels to render, or 1.5625 times the pixels to render

1600*1200 :50fps
1280*960 :130fps

50/130 = 0.3846

-> 50 is 38.46% of 130... going from 130 it's about 61% less.

result: 56.25% more pixels result in about 38.46% of the fps, or 61.54% less fps than before.

If it's still wrong I'd feel ashamed, but how about some proof this time?
You should know how discussions work, according to your post count.
I don't want to seem harsh, but I took my time to do the last post, and I don't see why I should accept your statement as justified.
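For what it's worth, the arithmetic above does check out; here's a quick Python rerun of it (numbers taken straight from the posts):

```python
# Re-running Lifthanger's numbers from the posts above.
pixels_hi = 1600 * 1200                 # 1,920,000 pixels
pixels_lo = 1280 * 960                  # 1,228,800 pixels

pixel_ratio = pixels_hi / pixels_lo     # 1.5625 -> 56.25% more pixels
fps_ratio = 50 / 130                    # ~0.3846 of the original fps

print(f"{(pixel_ratio - 1) * 100:.2f}% more pixels")  # 56.25% more pixels
print(f"{(1 - fps_ratio) * 100:.2f}% fewer fps")      # 61.54% fewer fps
```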

EnJoY
01-14-2005, 02:06 AM
I had won that same evga copper heatsink. I installed it in my ultra and I load at around 60c at 450core. Your temps suck big time, you need to clean off the core and heatsink with some 93% isopropyl and apply a thin layer of AS5 or AS Ceramique. Do it now. ;)

kryptobs2000
01-14-2005, 04:35 AM
result: 56.25% more pixels result in about 38.46% the fps, or 61.54% fps less than before.


And do you think that logically makes sense lol? 50% more pixels on the screen, so I get a 60% drop in fps? And the number of pixels doesn't directly translate to loss of performance, like I said earlier.



I had won that same evga copper heatsink. I installed it in my ultra and I load at around 60c at 450core. Your temps suck big time, you need to clean off the core and heatsink with some 93% isopropyl and apply a thin layer of AS5 or AS Ceramique. Do it now. ;)

Well, I cleaned it off really well and then applied AS5; my temps only dropped about 2 or 3 degrees. Ultras have a bigger cooler and fan though. My 6800 Limited Edition with 1.4v @ 450MHz (Ultra cooler) would rarely go above 60C.

drunkenmaster
01-14-2005, 04:49 AM
And do you think that logically makes sense lol? 50% more pixels on the screen, so I get a 60% drop in fps? And the number of pixels doesn't directly translate to loss of performance, like I said earlier.

It makes complete sense: the card is doing 50% more work, but also eating up a lot more bandwidth, meaning it can't move info quite as fast as it could before. It's storing a LOT more info in memory that has to get moved around, and lots of calculations have to wait for the results of other calculations. This isn't at all, or even close to, a proportional pixels/framerate thing you'll see.

When doing AA/AF you are not only drawing 50% more pixels, you are then antialiasing all of those extra pixels too. His numbers seem about right. 1280x960 is quite clearly a hugely smaller resolution than 1600x1200.

Grov
01-14-2005, 05:41 AM
You RMA'd a vid card because it doesn't overclock the way you wanted it to?

My thoughts as well.

Some people. :rolleyes:

Dunk
01-14-2005, 06:04 AM
My thoughts as well.

Some people. :rolleyes:

Some people RMA video cards, CPUs, motherboards and RAM until they get the golden clocker :eek:

Mine idles at 45C atm since I did said replacement of gunk with AS5, down from 53 (it used to idle at 47.5, then got worse over time). It barely gets over 53C while gaming (bless g.ward's copper cooler and damn the noise at full speed) :toast:

kryptobs2000
01-14-2005, 06:09 AM
Ugh, no, I didn't RMA it because it didn't overclock well. I RMA'd it because the stock speed is 325/1100 and it would not run 325/1100; it would only run 450/800. The memory was messed up, as you can see.

And no, it doesn't make sense that I would have 60% less fps at 1600x1200. Explain to me why an Ultra that is clocked only 25MHz higher, with memory that is 400MHz slower, gets almost double the fps, and an x800 Pro gets about 50% more fps at the same settings. It's like you guys aren't even reading my posts.

edit: just ran the stress test @ 1600x1200, everything maxed, 0x/8x, and I get 111fps. So by enabling 4x AA I should drop to 50fps? I don't think so.

edit2: and at 2x/8x I get 100fps; that's not right.

edit3: I found the problem. If I force a refresh rate of 85Hz @ 16x12 (my driver only goes up to 75Hz, but that's wrong; the monitor goes to 85Hz), it has problems with 16x12 and any AA over 2xQ. Without forcing it I get 89fps :toast: just like I should, not some 50fps :stick:

IvanAndreevich
01-15-2005, 12:02 AM
Lifthanger
Why do you assume the relationship between FPS and the number of pixels is linear? It's not. Either way, your calculation is wrong. You are subtracting something - why? Here's what it should be -

(1600x1200) / (1280x960) = 1.5625 more pixels
(130 FPS) / (50 FPS) = 2.60 times slower actually

Since FPS is inversely proportional to the number of pixels (according to you), he should get 130/1.5625 = 83 FPS (which would be much more reasonable, btw).

And no, it's not linear. You can have a scene where everything is limited by the CPU and the FPS will be the same in all resolutions. Or you could have a scene that is not fillrate limited, and then the FPS will not scale linearly either. Besides, even if it is limited by fillrate, it's not linear anyway, because these cards have smart compression technologies which work with different efficiency at different resolutions.

And here's empirical proof: 6800GT @ Ultra
1280x960 4xAA 16xAF ~105 FPS
1600x1200 4xAA 16xAF ~75 FPS
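The linear-scaling estimate above works out like this (a rough Python sketch using the thread's own numbers):

```python
# If fps scaled inversely with pixel count, 130fps at 1280x960
# would predict roughly this at 1600x1200:
scale = (1600 * 1200) / (1280 * 960)   # 1.5625x the pixels
predicted_fps = 130 / scale            # 83.2
measured_fps = 50                      # what was actually reported

print(f"predicted ~{predicted_fps:.0f} fps, measured {measured_fps} fps")
```

The gap between ~83 predicted and 50 measured is what suggested something other than resolution was wrong.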

kryptobs2000
01-15-2005, 01:46 AM
Thank you Ivan :)

IvanAndreevich
01-16-2005, 06:34 PM
So you didn't resolve your problem?

HiJon89
01-16-2005, 08:08 PM
So you didn't resolve your problem?
His last edit implies he did; if I understand correctly, forcing a refresh rate of 85Hz was what was causing the problem. By not doing that, his FPS went from 50 to 89.

kryptobs2000
01-16-2005, 08:38 PM
Yeah, I fixed the problem.

IvanAndreevich
01-17-2005, 01:03 AM
That's weird & 60 Hz sucks. Your VSYNC is off, right? Try using Reforce (google for it).

kryptobs2000
01-17-2005, 01:23 AM
No, it's not @ 60Hz, it's at 75Hz, and the problem only happens @ 1600x1200. The drivers that Windows XP comes with are wrong, and I have only found the right ones in one other place. It's driverguide.com or something. You have to sign up for the site, which I have done, but it's not worth it; I barely use 1600x1200 anyways. I'll try this program though. Thanks.

IvanAndreevich
01-17-2005, 09:29 AM
I am running 85Hz in 1600x1200 with reforce and I have no abnormal perf. drops.

reject
01-17-2005, 10:21 AM
Reforce is good, but you can just make a reg entry to enable the refresh rate panel back in the ForceWare drivers. Put this in Notepad and save it as refresh.reg, then open it and go to the drivers. Voila!

REGEDIT4

[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:ffffffff
"NvCplDisableRefreshRatePage"=dword:00000000