Nice to see the 2900XT performing well.
NV, right? I remember folks renaming Bioshock's .exe to Oblivion and then using "override" a while back. I wonder if the rename's even necessary.
Also, I know override's used when there's no AA option and enhance's used when the setting's in the game, but I've noticed HL2 looks better with override than with enhance, and that game does have in-game AA. I suppose there's nothing better than trying it yourself, but it'd be nice if it worked the way it was meant to be playe...er... :shakes:
Yeah those are the drivers I'm using.
EDIT: Nevermind, it's working now.
Sorta blah'ish looking levels in the demo, as is Epic's norm. I'm just worried that UT3 will be a modder's nightmare compared to UT2k3/2k4, kinda like Oblivion was in the face of Morrowind. The community bonus packs are what made UT2004 worth owning.
LOE, what?
Epic developed the engine, Epic developed the game...
At any rate, the graphics aren't by any means disappointing me. The only slight thing I am disappointed about is that the demo is whoring bloom all over the place.
Talking about PhysX for a bit...
The engine part isn't being pushed very much in the demo, not beyond debris and perhaps some waterfalls. Nor can we see whether using the card gains you some FPS, as the engine is capped at 62 FPS right now, and I have been unable to unlock it yet, but I am working on it...
However, as with any UT game, the engine isn't really pushed until the community gets its hands on the editor and has a go at modding around.
So when the full game comes out, I can assure you we'll see some proper PhysX usage.
C:\Documents and Settings\*windows account*\My Documents\My Games\Unreal Tournament 3 Demo\UTGame\Config\UTEngine.INI
[Engine.GameEngine]
bSmoothFrameRate=TRUE
MinSmoothedFrameRate=22
MaxSmoothedFrameRate=200
Don't forget to disable Vsync for your test (although I'd keep it on during normal gaming for smoother gameplay).
AND:
[SystemSettings]
*.............................*
MaxAnisotropy=16
MaxMultisamples=4 (not sure if this works)
*............................*
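If you'd rather script the frame-rate cap change than edit the file by hand, here is a minimal sketch. The path and the MaxSmoothedFrameRate key come from the post above; the function name and the line-based approach are my own assumptions (UTEngine.INI isn't strict INI, so a plain text rewrite is safer than a config parser):

```python
# Hypothetical helper: raise UT3's frame-rate cap by rewriting the
# MaxSmoothedFrameRate line in UTEngine.INI in place.
from pathlib import Path

def raise_fps_cap(ini_path, new_cap=200):
    """Replace every MaxSmoothedFrameRate=<n> line with the requested cap."""
    text = Path(ini_path).read_text()
    out = []
    for line in text.splitlines():
        if line.strip().startswith("MaxSmoothedFrameRate="):
            line = f"MaxSmoothedFrameRate={new_cap}"
        out.append(line)
    Path(ini_path).write_text("\n".join(out) + "\n")
```

Back up the original INI first; the game will regenerate a default one if it gets mangled, but you'd lose your other tweaks.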
The game looks to be well optimized... a single GTX @ 2560x1600 averages 50 FPS.
very nice :)
Roger, will test asap. Thanks
UPDATE
I did some testing and ran Suspense a few times (it should be the map that strains the physics simulation the most, having a lot of vehicles and destroyable architecture).
First up a few screenshots with Hardware Physics Acceleration OFF:
http://www.imagehosting.com/out.php/...oPhysXOFF1.jpg
http://www.imagehosting.com/out.php/...oPhysXOFF2.jpg
Now, a few shots with Hardware Physics Acceleration ON:
http://www.imagehosting.com/out.php/...moPhysXON1.jpg
http://www.imagehosting.com/out.php/...moPhysXON2.jpg
In terms of visual quality, I can see very little difference. This comes as no shock, though, since it wouldn't make sense to add extra debris to the scene just because a card is slotted in.
However, I am pleased to report that when playing with the physics card installed, I was seeing a performance gain of up to 10 frames per second, something I was not expecting but was indeed hoping to see.
In raw numbers, it spells out like this:
When physics are going on (debris flying all around, explosions, etc.), the screenshots show 29-30 FPS with the PhysX card off.
With the card in, in similar situations, I'm seeing 39-40 FPS.
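To put those numbers in perspective, a quick sanity check on the relative gain (using the midpoints of the reported ranges as an assumption):

```python
# Midpoints of the reported FPS ranges from the Suspense test above.
fps_off = 29.5   # 29-30 FPS, hardware physics acceleration OFF
fps_on = 39.5    # 39-40 FPS, hardware physics acceleration ON

gain_pct = (fps_on - fps_off) / fps_off * 100
print(f"Gain: {gain_pct:.1f}%")  # roughly a one-third improvement
```

So the ~10 FPS uplift works out to about a 34% gain in those physics-heavy moments, which is substantial at sub-60 framerates.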
Thanks a lot for these tests, SafeFire, very useful indeed.
Now only one test remains: quad core vs. dual core. Unreal Engine 3 was promised to be seriously multithreaded; is that true?
THAT is one thing I am unable to test, as I am only in possession of a quad-core chip right now :P
If anyone feels like donating one, I'll be happy to give it a shot (although I'm sure quite a few review sites are working on it).
When I hear some news on this, I shall post it :)
Start->Run->msconfig->BOOT.INI->Advanced Options->/NUMPROC=2 (4 is the default for a quad core). I've tried running UT3 on 1 core by setting process affinity in Task Manager, but this screws up game performance even when changed back to 2 cores on my C2D. Multithreading in UT3 is very complex, so I recommend booting with the number of cores you want to test the game with.
Have fun :)
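For reference, the msconfig steps above end up writing a /NUMPROC switch into BOOT.INI. The resulting entry might look something like this (the disk/partition path and OS name vary per install, so treat this purely as an illustration):

```ini
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /NUMPROC=2
```

Removing the /NUMPROC switch (or setting it back via msconfig) restores all cores on the next boot.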
From what I have seen, Unreal Engine 3 supports dual cores, but I don't think it is able to scale beyond that yet. My CPU activity never seems to exceed 50% (on a quad core, 1 fully loaded core = 25%, so 50% means only 2 cores are busy).
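The core-count reasoning above is simple arithmetic, assuming Task Manager reports total utilization averaged across all cores:

```python
# On a quad core, each fully busy core contributes 25% to the total
# CPU usage figure, so 50% overall suggests two saturated threads.
total_cores = 4
per_core_share = 100 / total_cores     # each core's share of the total
observed_util = 50                     # reading from Task Manager
busy_cores = observed_util / per_core_share
print(busy_cores)                      # 2.0
```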
Looks like the 7900/1900 series will be the minimum you'll want to get away with if you're running widescreen resolutions.
Damn it, I need another 8800GTX now :( I won't be able to run any game @ 1920x1200 with max AA and max AF if I don't.
I forced AA in the NV panel without renaming the exe and it definitely worked, but it halved my framerate. Honestly, the game's too fast paced to be concerned about jaggies. :D
The midrange card review is crap.
First, they use UXGA as the lowest resolution (:rofl: ) and then wonder that it won't hit 60 FPS at max details :rolleyes:
But as if that weren't crazy enough, they even RAISE the resolution from there.
Who would be enough of a dumbass to want to play at a res higher than full-HD on a 100€ card?
Then they only compare the 8600GTS and the 2600XT, which makes no sense.
The 2600XT is much cheaper than the 8600GTS; where I live it's even cheaper than the 8600GT.
True... though the max a GT will clock to is what a GTS runs at stock, which says quite a bit for a card that is cheaper than the GT: at stock it beats what the fully overclocked Nvidia card can reach.
Again, this is all ATI's late-to-the-game performance that we are seeing.
Perkam
What res and settings is SafeFire playing at? It worries me that a PC of his caliber is dipping into the 20s and 30s.