Vanilla eVga 285, first run.
Hmm, I have a BFG 285 OC, and if I use the OCX settings the Crysis Warhead demo blue screens the computer pretty quickly. It even eventually did it at 685/1584/1332, but after a much longer time. Running 3DMark06 at 720/1600/1332 with the fan forced to 100% did it as well. I don't think I got a good overclocker, but I'm thinking of eventually doing the trade up to the dual-GPU card at some point anyway.
You should still score/game quite well; however, if you can get it up to 4.2-4.3GHz you will see a gain for sure. A quad will only give you a benefit in 3DM06 and Far Cry 2 (AFAIK); most things are not optimized for 4 cores, and a higher-clocked dual will usually give you better frame rates. Also, don't buy a QX, a Q9650 will clock much better (E0 stepping) and easily hit 4GHz, and it's also much less expensive.
***********************
This is mine with E8400 @ 4.3GHz
3D Mark 06
http://i266.photobucket.com/albums/i...23v-215vHP.jpg
Here is my first try with the XFX 285 OC. CPU: 4750MHz 24/7; video: 756/1584/1350, on air.
3DMark06 (XP x86): 20631: http://img17.imageshack.us/img17/672...0631nf8.th.jpg
Now I'm on water with a vmod already and I'll run some serious benches ;)
I am soon stepping up my 280's to 285's. My Core is already pretty solid but I am looking to gain on the shader and RAM.
Sweet! Thanks for the info, and nice scores.
I'm mostly thinking of going quad because I plan on using this system at least until Westmere (I think I'm going to skip Nehalem) and I don't think my Wolfdale will last me that long, if for no other reason than the insane voltage it requires to hit 4GHz (1.46-1.48V according to OCCT). Also, I do a lot of video encoding and the extra cores would help tremendously. I'm leaning toward a QX because I don't want to give up any clock speed, and the unlocked multiplier should make it much easier to hit 4GHz on a quad. 8x500 is doable on air on a dual-core, but I doubt there are many quads that can handle that FSB... I'd probably pick up a used QX at that, definitely not going to pay $1000 for a processor :p
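For anyone following the multiplier x FSB math there, here's a quick illustrative sketch; the multiplier/FSB pairs are just example numbers, not measured results:
Code:
# Core clock is simply multiplier x front-side bus speed (MHz).
def cpu_clock_mhz(multiplier, fsb_mhz):
    return multiplier * fsb_mhz

# 8 x 500 FSB, the dual-core example from the post above:
print(cpu_clock_mhz(8, 500))    # 4000 MHz -> 4.0 GHz
# An unlocked QX multiplier reaches the same clock on a much lower FSB,
# e.g. 10 x 400, which a quad is far more likely to hold stable:
print(cpu_clock_mhz(10, 400))   # 4000 MHz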
What do you guys think of the new 182.05 betas? They run a lot better for me, but I had my first nvk error yesterday with them :mad:
My GTX 285 is sitting at 756/1656/1455 right now :) I might take it higher later on, but for now, it more than suffices.
It's good to have an Nvidia card again, I must say. After sending my GTX 260 216 back to EVGA for the Step-Up, I was left with my HD 4870 1GB.
Don't get me wrong, the HD 4870 is a very nice card, but after installing my GTX 285, I can definitely notice the full-frame AF compared to the angle-dependent AF of the ATI card.
This card's IQ is extremely good! :up:
This is my XFX GTX 285 XXX on H2O with an EK block.
It shows 752 core, but the real clock was 756.
http://img5.imageshack.us/img5/2254/92925950ps9.jpg
I've noticed that most vanilla GTX 285 cards max out at 1620 on the shaders when using stock cooling.
Even though mine runs very cool (doesn't exceed 65°C) at 756/1656, it has a tendency to lock up... At 756/1620, it's rock solid stable though.
But when you look at the GTX 285 SSC, it's seemingly capable of running at up to 1700+ shader speed.
It's probably a voltage issue: the SSC uses a higher vcore, giving it the ability to hit higher clocks.
It's a slight voltage boost on the SSC which can be duplicated very easily if you check out this thread.
http://www.xtremesystems.org/forums/...d.php?t=215224
What is the difference in benchmarks between a 280 and a 285 at the same clocks?
The 285 does not have the same voltage controller as the 260/280/295, and it does not support software voltage control.
Here's the detailed info on that:
http://www.xtremesystems.org/Forums/...2&postcount=77
I just wanted to update results on my card. 771 core gave artifacts after a good Crysis session, so I had to drop back down to 756. 1620 shader was giving me black screens every once in a while, so I had to drop down to 1584. And here's one I wasn't expecting: 1476 memory was giving me artifacts while watching movies through WMP, so I had to drop down to 1404. :down:
Vmod here I come! :p:
Hey Cryptik, I have a similar system with an E8400 @ 4300, just wanted to share with you guys.
This is mine with E8400 @ 4.3GHz
3D Mark 06
http://img248.imageshack.us/img248/5787/3dmark06dr9.jpg
EVGA SSC watercooled
http://images1.hiboox.com/images/080...91342aeef3.jpg
I just joined the gtx 285 club :up:
But my question is, does overclocking the shader do much, if anything, for the card's performance? Has anyone run a bench at stock vs. with only the shader OC'd?
Core clock is above all else; always has been, always will be.
ex.
evga: ssc vs ftw
pcb: 4808 vs 5008
cooling: stock vs stock
actual precision clocks: 756/1764/1512 vs 799/1692/1494
http://img528.imageshack.us/img528/7699/285dk6.png
http://img15.imageshack.us/img15/427/ftwls3.png
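Just to put numbers on that comparison, here's a small sketch that works out the percentage difference in each clock domain; it only uses the clocks already quoted above, nothing new:
Code:
# Clocks copied straight from the SSC vs FTW comparison above.
ssc = {"core": 756, "shader": 1764, "memory": 1512}
ftw = {"core": 799, "shader": 1692, "memory": 1494}

for domain in ssc:
    delta = ftw[domain] - ssc[domain]
    pct = 100.0 * delta / ssc[domain]
    print(f"{domain:>6}: SSC {ssc[domain]} vs FTW {ftw[domain]} "
          f"({delta:+d} MHz, {pct:+.1f}%)")

# The FTW trades roughly 4% shader and 1% memory for roughly 6% more core.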
Yes, it helps. Some people argue it makes the most difference, but I totally disagree: core speed will give more performance clock for clock than shader speed increases, as evidenced by Napalm's post above and any other benchmarks you choose to run.
However, you do want to concentrate on getting the shader AND core speeds as high as possible before overclocking the memory, as memory speed will have less of an effect on performance than the former two.
You gain very little going from 1620MHz shader speed to 1700MHz, core speed is far more important and makes a much bigger difference. If you don't believe me, run a benchmark increasing only your core speed by 100MHz, then do the same increasing only your shader speed by 100MHz and notice how much more impact core speed has than shader speed.
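If you want a simple way to tabulate that test, here's a rough sketch; the scores below are made-up placeholders (stock 648/1476 clocks assumed as the baseline), so plug in your own 3DMark06 results:
Code:
# Hypothetical example: raise only the core by 100 MHz, then only the
# shader by 100 MHz, and compare the score gains. Scores are placeholders.
baseline        = {"core": 648, "shader": 1476, "score": 19000}
core_plus_100   = {"core": 748, "shader": 1476, "score": 19600}
shader_plus_100 = {"core": 648, "shader": 1576, "score": 19200}

def gain_pct(run, base=baseline):
    return 100.0 * (run["score"] - base["score"]) / base["score"]

print(f"+100 MHz core:   {gain_pct(core_plus_100):+.1f}%")    # +3.2%
print(f"+100 MHz shader: {gain_pct(shader_plus_100):+.1f}%")  # +1.1%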