Nice to see the 2900XT performing well.
"To exist in this vast universe for a speck of time is the great gift of life. Our tiny sliver of time is our gift of life. It is our only life. The universe will go on, indifferent to our brief existence, but while we are here we touch not just part of that vastness, but also the lives around us. Life is the gift each of us has been given. Each life is our own and no one else's. It is precious beyond all counting. It is the greatest value we have. Cherish it for what it truly is."
NV, right? I remember folks renaming BioShock's .exe to Oblivion.exe and then using "override" a while back. I wonder if the rename's even necessary.
Also, I know override's used when there's no AA option and enhance's used when the setting's in the game, but I've noticed HL2 looks better w/ override than enhance, and that game does have AA in game. I suppose there's nothing better than trying it yourself, but it'd be nice if it worked the way it was meant to be playe...er...
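For anyone who wants to try the rename trick without touching the original file, something as simple as this from a command prompt should do the job (purely illustrative; adjust the path to wherever your BioShock install actually lives):

cd "C:\Program Files\2K Games\BioShock\Builds\Release"
copy Bioshock.exe Oblivion.exe

Launch the Oblivion.exe copy, check whether the driver profile kicks in, and just delete the copy if it makes no difference.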
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
Yeah those are the drivers I'm using.
EDIT: Nevermind, it's working now.
Sorta blah-ish looking levels in the demo, as is the Epic norm. I'm just worried that UT3 will be a modder's nightmare compared to UT2k3/2k4, kinda like Oblivion was compared to Morrowind. The community bonus packs are what made UT2004 worth owning.
LOE, what?
Epic developed the engine, Epic developed the game...
At any rate, the graphics aren't disappointing me by any means. The only slight thing I am disappointed about is that the demo is whoring bloom all over the place.
Talking about PhysX for a bit...
The engine part isn't being pushed very much in the demo, not beyond debris and perhaps some waterfalls. Nor can we see whether using the card gains you any FPS, since the engine is capped at 62 FPS right now and I haven't been able to unlock it yet, but I'm working on it...
However, as with any UT game, the engine isn't really pushed until the community gets its hands on the editor and has a go at modding.
So when the full game comes out, I can assure you we'll see some proper PhysX usage.
CM Stacker 810 w/ Yate Loons | Enermax Infiniti 650W
ASUS P5K Deluxe | Corsair XMS2 PC2-6400 2x2GB
Intel Core 2 Quad Q6600 | Thermalright Ultra-120 eXtreme
BFG Geforce 8800GTS 640MB | ASUS Ageia PhysX P1 128MB
Logitech G15 | Logitech G5
C:\Documents and Settings\*windows account*\My Documents\My Games\Unreal Tournament 3 Demo\UTGame\Config\UTEngine.INI
[Engine.GameEngine]
bSmoothFrameRate=TRUE
MinSmoothedFrameRate=22
MaxSmoothedFrameRate=200
Don't forget to disable Vsync for your test (although I'd keep it on during normal gaming for smoother gameplay).
AND:
[SystemSettings]
*.............................*
MaxAnisotropy=16
MaxMultisamples=4 (not sure if this works)
*............................*
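A quick way to confirm the cap change actually took (assuming the demo's console behaves like other UE3 titles, which is an assumption on my part) is the engine's built-in FPS counter:

stat fps

Type that at the console, run around a bit, and the counter should climb past 62 once Vsync is off and MaxSmoothedFrameRate is raised.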
Blue Dolphin Reviews & Guides
Blue Reviews:
Gigabyte G-Power PRO CPU cooler
Vantec Nexstar 3.5" external HDD enclosure
Gigabyte Poseidon 310 case
Blue Guides:
Fixing a GFX BIOS checksum yourself
98% of the internet population has a Myspace. If you're part of the 2% that isn't an emo bastard, copy and paste this into your sig.
Roger, will test asap. Thanks
UPDATE
I did some testing and ran Suspense a few times (it should be the map that strains the physics simulation the most, with lots of vehicles and destructible architecture).
First up a few screenshots with Hardware Physics Acceleration OFF:
Now, a few shots with Hardware Physics Acceleration ON:
In terms of visual quality, I can see very little difference. That comes as no shock, though, since it wouldn't make sense to add extra debris to the scene just because a card is slotted in.
However, I'm pleased to report that with the physics card in, I was seeing a performance gain of up to 10 frames per second, something I wasn't expecting but was certainly hoping to see.
In raw numbers, it works out like this:
When physics are going on, meaning debris flying all around, explosions, etc., I'm seeing 29-30 FPS with the PhysX card off, judging from the screenshots.
With the card in, in similar situations, I'm seeing 39-40 FPS.
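Going by those numbers, that's roughly a 34% improvement in the heavy scenes (about 39.5 FPS vs about 29.5 FPS, and 39.5 / 29.5 ≈ 1.34), which is a bigger relative gain than a flat "+10 FPS" might suggest.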
Thanx a lot for these tests, SafeFire, very useful indeed.
Now, only one test remains: quad core vs. dual core. Unreal Engine 3 was promised to be seriously multithreaded; is that true?
Lian Li V2000A+ Silver | Abit IP35Pro | Core 2 Quad Q6600 G0@3.2GHz@1.35V | Sunbeam Tower 120 | Sapphire HD 3870 X2 @ default | 4x1GB Crucial Value 800 D9GKX @ 800 | OCZ XTC RAM Cooling | Silverstone Olympia OP750W | 2x150GB Western Digital Raptor-X@ RAID-0 (30GB + 270GB) / 3xLacie (Seagate 7200.10) 500GB USB 2.0 | NEC 3520A + Samsung SH-S183A S-ATA | Auzentech X-Fi Prelude | ΕΙΖΟ S2431W | AKG K701
THAT is one thing I am unable to test, as I am only in possession of a quad-core chip right now :P
If anyone feels like donating one, I'll be happy to give it a shot (although I'm sure quite a few review sites are working on it).
When I hear some news on this, I shall post it!
Start -> Run -> msconfig -> BOOT.INI -> Advanced Options -> /NUMPROC=2 (4 is the default for a quad core). I've tried running UT3 on 1 core by setting processor affinity in Task Manager, but that screws up game performance even after changing it back to 2 cores on my C2D. Multithreading in UT3 is very complex, so I recommend booting with the number of cores you want to test the game with.
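For reference, all msconfig really does is tack a /NUMPROC switch onto the relevant entry in boot.ini, so a two-core boot entry ends up looking roughly like this (the multi/disk/partition part will differ per system; this is just a sketch):

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /NUMPROC=2

Reboot, run your tests, then take the switch back out (or untick it in msconfig) to get all your cores back.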
Have fun!
From what I have seen, Unreal Engine 3 supports dual cores, but I don't think it scales beyond that yet. My CPU activity never seems to exceed 50%: on a quad core each core counts for 25%, so a steady 50% means only two cores are actually being loaded.
Looks like the 7900/X1900 series will be the minimum you can get away with if you're running widescreen resolutions.
System
ASUS Z170-Pro
Skylake i7-6700K @ 4600 Mhz
MSI GTX 1070 Armor OC
32 GB G.Skill Ripjaws V
Samsung 850 EVO (2)
EVGA SuperNOVA 650 G2
Corsair Hydro H90
NZXT S340
Core i3-550 Clarkdale @ 4.2GHz, 1.36v (Corsair A50 HS/F) LinX Stable
MSI H55-GD65 Motherboard
G.Skill 4GBRL DDR3-1600 @ 1755, CL9, 1.55v
Sapphire Radeon 5750 1GB
Samsung F4 320GB - WD Green 1TB
Xigmatek Utgard Case - Corsair VX550
The midrange card review is crap.
First, they use UXGA as the lowest resolution and then wonder why it won't run at 60 FPS at max details.
And as if that weren't crazy enough, they even RAISE the resolution from there.
Who would be dumb enough to want to play at a res higher than full HD with a 100€ card?
Then they only compare the 8600GTS and the 2600XT, which makes no sense.
The 2600XT is much cheaper than the 8600GTS; where I live it's even cheaper than the 8600GT.
System : E6600 @ 3150mhz, Gigabyte DS3, 4gb Infineon 667mhz, Amd-Ati X1900XT
Phenom 9950BE @ 3.24Ghz| ASUS M3A78-T | ASUS 4870 | 4gb G.SKILL DDR2-1000 |Silverstone Strider 600w ST60F| XFI Xtremegamer | Seagate 7200.10 320gb | Maxtor 200gb 7200rpm 16mb | Samsung 206BW | MCP655 | MCR320 | Apogee | MCW60 | MM U2-UFO |
A64 3800+ X2 AM2 @3.2Ghz| Biostar TF560 A2+ | 2gb Crucial Ballistix DDR2-800 | Sapphire 3870 512mb | Aircooled inside a White MM-UFO Horizon |
Current Phenom overclock
Max Phenom overclock
True... though the max a GT will clock to is about what a GTS does at stock, which says quite a bit for a card that is cheaper than the GT: at stock it beats what the NVIDIA card can only reach fully overclocked.
Again, this is all ATI's late-to-the-game performance that we are seeing.
Perkam
What res and settings is SafeFire playing at? It worries me that a PC of his caliber is dipping into the 20s and 30s.
DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis
I'm actually going to tone down the graphics when I play. The pretty graphics distract me too much; I can't concentrate on hitting my enemy!