At 756/1836/2000 ATITool shows artifacts, but Crysis plays for an hour, no artifacts, nothing. I'm using Vista x64. Is ATITool a reliable tool for NVIDIA cards???? I mean, it is an ATI tool???? What do you guys think? I need some expert advice.
ATITool has always been the utility I trust most, as it's the most stressful test for a graphics card I've been able to find. Like you say, it's even better at finding artifacts than playing Crysis; if it's ATITool stable, it's stable in all games too. For example, my 7900GTO would be artifact-free in 3DMark at a 700~705MHz core setting but required 690 to be ATITool artifact-free. Great utility for artifact scanning and temperature measurement. And the good thing is you don't need to run it for long to know whether a clock is unstable; even 15 minutes is a very good indication. Testing for over an hour seems totally unnecessary with this utility.
Last edited by RPGWiZaRD; 11-06-2007 at 03:26 PM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
I'm the opposite: I can be artifact-free in ATITool and yet get the occasional in-game glitch. E.g. at 720/1800/1950 in TF2 it will play fine for, say, 15-30 minutes and then I'll get a fatal glitch.
But maybe it's different with your OC right on the edge of stable.
Does Orthos stress graphics cards?
What other graphics-card-specific tests are there?
Anyway, if it's stable in the games you want to play, who cares? The real stress test is the actual app...
theteamaqua says "Crysis stable" - I think it's an apt term.
Last edited by adamsleath; 11-06-2007 at 03:35 PM.
i7 3610QM 1.2-3.2GHz
Yeah, lol, I ran ATITool @ 740/1890/2000 and it gave me an error in like 2 minutes, so I then tried Crysis... which didn't crash, no artifacts, and Sandbox was OK... all the games I played were OK...
But the funny thing is that @ 757/1836/2000 it passed ATITool for like 10 minutes, then I played Crysis and it crashed, like a hang in just 1 minute; the entire screen had lots of green dots everywhere and it just hung there... so I decided to go with a higher shader clock instead.
Seems to me that if you OC your core too high it crashes, while mem/shaders give you artifacts?? Dunno... lol
Last edited by theteamaqua; 11-07-2007 at 12:51 AM.
E6600 @ 3.6
IN9 32x MAX
EVGA 8800Ultra
750W
Why bother with ATITool artifact scanning for hours? It just makes the GPU generate maximum heat; I don't find it much more interesting than that.
As you'll almost certainly get slightly higher clocks in 3DMark than in your games, use your games to test stability. Just alt-tab out to increase the clocks and go back into your game to test, until it fails. Stability in the games you play is what you're looking for, isn't it?
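The alt-tab routine above is essentially a step-and-test loop: raise the clock a notch, play, and keep the last setting that didn't fail. A minimal sketch of that logic, where `test_stable` is a hypothetical stand-in for "go back into the game and watch for artifacts or a crash" (no real overclocking tool exposes exactly this API):

```python
def find_max_stable_clock(start_mhz, step_mhz, limit_mhz, test_stable):
    """Step the clock up until test_stable() fails; return the last clock
    that passed, or None if even the starting clock failed."""
    best = None
    clock = start_mhz
    while clock <= limit_mhz:
        if not test_stable(clock):
            break          # artifacts/crash: back off to the last pass
        best = clock
        clock += step_mhz
    return best

# Illustration only: pretend the card artifacts above 756 MHz core,
# roughly matching the numbers discussed in this thread.
print(find_max_stable_clock(700, 5, 800, lambda mhz: mhz <= 756))  # 755
```

The same loop describes ATITool's "find max core" scan; the only difference in this thread is what plays the role of `test_stable` (artifact scan vs. an actual game).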
ASUS P8P67
i7 2600K 3.4GHz @ 4.6GHz
Twintech 8800 GT 512Mo Samsung (vgpu modded)
Crucial Ballistix DDR3 C7 2 * 2Go
2 * WD VelociRaptor 150Go RAID 0
2 * Samsung Spinpoint F3 1To RAID 0
Creative Sound Blaster Audigy 2 ZS
Seasonic S12 600HT
WC :
1A-SL2 CPU // 1A-SL2 GPU (home made fix)
Eheim 1048 + magicool 25
2 * Black Ice Pro 3 serial
Tygon 3603 + glycoshell
Well, ATITool has been very accurate for the last 3 cards I've used and found artifacts a bit before any game/benchmark I tested, so for me it's been the optimal test. Perhaps it's different with the GeForce 8 series, though, as ATITool might not stress the shader domain that heavily; that could be why Crysis shows glitches earlier for you, due to an unstable shader clock. That's my best guess; using both sounds like a good idea.
Last edited by RPGWiZaRD; 11-06-2007 at 03:38 PM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
I've experienced the reverse of adam's situation, i.e. a card stable in ATITool's artifact scanning but not in games, with an X800 Pro.
That was my first try of it, in fact, coupled with my first and last test of "find max gpu", which increased the GPU frequency so much that I didn't get any artifacts, just lines all over the screen like a sort of test card...
Orthos or OCCT are OK for me, as they test the CPU cores at their maximum and are reliable enough to ensure all other apps will run fine. But GPU-wise, I'm not sure many other games will stress your GPU more than Crysis does atm, so playing your current "heaviest GPU duty" game should be enough in my opinion.
Last edited by Pacha; 11-06-2007 at 03:50 PM.
ASUS P8P67
i7 2600K 3.4GHz @ 4.6GHz
Twintech 8800 GT 512Mo Samsung (vgpu modded)
Crucial Ballistix DDR3 C7 2 * 2Go
2 * WD VelociRaptor 150Go RAID 0
2 * Samsung Spinpoint F3 1To RAID 0
Creative Sound Blaster Audigy 2 ZS
Seasonic S12 600HT
WC :
1A-SL2 CPU // 1A-SL2 GPU (home made fix)
Eheim 1048 + magicool 25
2 * Black Ice Pro 3 serial
Tygon 3603 + glycoshell