Slower than the GPU.
Ehh... with 175.16 I don't get PhysX support, but 177.35 won't even start installing without a modded .inf .. ^^' (I have a 9800 GTX)
In GPU-bound scenarios you'll always see FPS decrease. Always. The "problem" is that the current UT3 PhysX-enabled maps are not GPU bound but CPU bound due to physics. Again, someone please test CellFactor with all the eye candy turned on and you'll see what a GPU doing rendering + physics is capable of.
177.35_geforce_winvista_32bit_international_whql from the official NV download site, without any different .inf (but is it really WHQL? Something seems off with this installer :D )
The 177.39 driver has finally installed now (but why does it say xxx.35..? I don't know :shrug: :shakes: )
What about a GTX 280 in the 16x slot and an 8800 GT in the 4x slot for PhysX on my P35? :hrhr:
I'm serious :shock:
It should work, but maybe it won't :shrug:
I had to change the .inf for 177.39 to install for my GT.
I found 177.39 to be better. I had BIG drops in GOW with it.
CellFactor definitely isn't GPU bound. I can fire up CellFactor and temps on my GPU only rise by like 3-5ºC above idle. It's so far CPU bound that it's just crazy (even at 1920x1080 with AA/AF), and that's on the maps that don't require a PPU... There's no way the frame-rate will drop with GPU physics on; if anything it'll triple or quadruple.
I don't think it currently supports GPU physics though. I'd test myself, but I'm still rocking my 8800GTX. I'd imagine that if it were done the same way it was for UT3, it could likely work with CellFactor as well. :D
As for how the frame-rate will be affected by using GPU physics when GPU bound, we'll probably see minimum rates increase and maximum rates decrease.
Did some quick tests :)
PhysX OFF
Physx ON
Stock clocks 8800GT, the games runs ok with Physx on, hope more games are supported :)
STaRGaZeR, in order to get UT3 PhysX working through the GPU, users have to replace the PhysXCore.dll file in the UT3 Binaries directory with the one from the v2.7.3 directory of AGEIA Technologies.
Do you know what file we should replace in order to have PhysX Hardware acceleration in Cellfactor? I'll install the game right now.
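For what it's worth, the DLL swap described above boils down to a backup-then-copy. A minimal sketch (the install paths in the comment are assumptions; adjust them to wherever UT3 and the AGEIA SDK actually live on your machine, and note the backup step so you can undo it):

```python
import shutil

def swap_physx_dll(src_dll, game_dll):
    """Back up the game's PhysXCore.dll, then drop the replacement in."""
    shutil.copy2(game_dll, game_dll + ".bak")  # keep the original around
    shutil.copy2(src_dll, game_dll)

# Hypothetical paths -- adjust to your actual install locations:
# swap_physx_dll(
#     r"C:\Program Files\AGEIA Technologies\SDK\v2.7.3\PhysXCore.dll",
#     r"C:\Program Files\Unreal Tournament 3\Binaries\PhysXCore.dll",
# )
```

If the driver swap doesn't pan out, restoring the .bak file puts you back where you started.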
CellFactor without cloth and fluid effects (the flag on top of the station and the fluid in the barrels) runs perfectly fine, 45 FPS with slowdowns during heavy combat on my modest 3870, because it's capped at that. You even have some unused GPU left to do more work at times. If you enable the effects it turns into a freaking heavy physics test. Point is, without them it's a GPU-bound game. With them, it's completely PhysX bound; it's not correct to say it's CPU bound, because you're running something designed for hardware mode through software mode. Put them together and show me the power of the GPU doing rendering + real, complete PhysX effects. There are zero possibilities you'll see any gains.
I love CellFactor and its complete PhysX set, but you need a dedicated GPU or PhysX card to run it smoothly. And spending my money on another card just for a few games (1 or 2 with all the PhysX eye candy, and those are techdemos?)... no. Put that engine in real games like Assassin's Creed, COD4, GRID, etc. and I'll buy 2 cheap cards if necessary :D
EDIT: Luka, I'm going to reinstall it now, but PhysXCore.dll should be in the game directory. Download CellFactor Combat Training, not CellFactor Revolution (total POS). In order to run it without an AGEIA card you have to add "EnablePhysX=false" to the game executable; I don't know if you'll be able to run it without this. It only bypasses the PhysX card detection, it doesn't disable anything in-game. In the shortcut you can also configure resolution and shadows, so turn everything on. The real deal is once you're in the game menu: with no PhysX card, untick "Enable Cloth Effects", otherwise the PhysX engine will run on the CPU and you'll see a complete slowdown.
I couldn't find any PhysXCore.dll, only PhysXLoader.dll in the /system directory.
Quote:
Originally Posted by STaRGaZeR
I downloaded cell factor revolution...
True, there is only a PhysXLoader.dll in there. So I assume you guys with NV cards can't run it?
Runs fine on Vista64, I've just reinstalled it. Just install, add "EnablePhysX=false", untick "Enable Cloth Effects" in the game menu and enjoy. Amazing engine, to say the least.
The thing is GPU physics power obliterates CPU physics power... :yepp:
Anyway, I think trying to make older games work with PhysX on the GPU is a waste of time. If the support comes, I'll be happy; if it doesn't, I'll just wait for the future ones that do.
Time to go destroy some walls :D
CellFactor was the reference techdemo when AGEIA launched PhysX. It would be stupid not to do it; so far CF is the only game that uses the PhysX engine in all its glory AFAIK.
Let's hope nVidia gives you a listen :p:
I'm keeping my expectations low. If the PhysX GPU support comes I'll be happy; if it doesn't, lowered expectations mean less pain :ROTF::ROTF::ROTF:
(it just came to my mind the "Lowered Expectations" jingle from MadTV :ROTF::ROTF::ROTF::ROTF:)
http://www.tvgasm.com/shows/loweredexpectations.jpg
If you want to be :eek: go here: http://www.cellfactorgame.com/cfct.html and download the real-time video (400MB). If NVIDIA has the power to do that at high FPS with only one card, or with one super-low-end card + a normal one, then I'll say this whole PhysX thing is worth the effort :D
There are plenty of sources out there that discuss how inefficient the CPU is at game physics... thus the whole reason AGEIA made their card in the first place... and furthermore why NVIDIA bought AGEIA, so they could use their tech on NVIDIA's GPUs.
http://en.wikipedia.org/wiki/Physics_engine
http://en.wikipedia.org/wiki/Physics_processing_unit
Abel, read the link you posted about Havok being free...
Havok isn't free if it's for a product to distribute... That press release is for:
Quote:
This initiative does not apply to license fees that may be payable to Havok for console versions of Havok Complete or to applications developed for other purposes such as game engines for redistribution, other middleware, movies, training, military or industrial simulation.
In other words, you still need a license to use it in an actual game, unlike NovodeX. With NovodeX you only have to pay a fee if you need the source code or are going to modify it. ;)
Quote:
Available for non-commercial use, Havok Complete for the PC will be freely downloadable in May 2008.
:up:
I read it :)
Quote:
Havok has entered into an agreement with Intel, Havok’s parent company, under which approved game developers on the PC platform can execute a commercial distribution license with Havok for free.
Free for PC, an Intel initiative :up:
Quote:
Havok’s overall focus remains cross-platform and Havok will continue commercial licensing of Havok Complete for other platforms and in other industries such as movies and serious gaming. This initiative does not apply to license fees that may be payable to Havok for console versions of Havok Complete or to applications developed for other purposes such as game engines for redistribution, other middleware, movies, training, military or industrial simulation.
Look at CPU Test 2: 11.64 steps per second.
http://www.xtremesystems.org/forums/...1&d=1213931681
Same rig, look at CPU Test 2: 164.87 steps per second.
http://www.xtremesystems.org/forums/...1&d=1213931661
:)
Ah, UT3 has been proven to work with NVIDIA PhysX.
You can also play "grid rally". With PhysX enabled it gives a boost from 50 FPS to 90 FPS (1920x1200).
Stelios means Codemasters' new game, "Race Driver:Grid"
http://img112.imageshack.us/img112/1...utorialvv5.jpg
Yes, I know. I just didn't know PhysX had effects on it. Seems it does, at least it feels a bit more light with these drivers...
Xfastest leaked 177.40, supposed to increase performance on the GT200 series. I'm downloading the Vista x64 version right now.
Yea, and there is some interesting stuff in the inf.
Code:
NVIDIA_GT200.DEV_05E0.1 = "NVIDIA GT200-400"
NVIDIA_GT200.DEV_05E1.1 = "NVIDIA GeForce GTX 280"
NVIDIA_GT200.DEV_05E2.1 = "NVIDIA GeForce GTX 260"
NVIDIA_GT200.DEV_05E3.1 = "NVIDIA GT200"
NVIDIA_GT200.DEV_05E4.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05E5.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05E6.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05E7.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05E8.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05E9.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05EA.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05EB.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05EC.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05ED.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05EE.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05EF.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F0.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F1.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F2.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F3.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F4.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F5.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F6.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F7.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F8.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05F9.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05FA.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05FB.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05FC.1 = "NVIDIA GT200 "
NVIDIA_GT200.DEV_05FD.1 = "NVIDIA GT200-875-GL"
NVIDIA_GT200.DEV_05FE.1 = "NVIDIA GT200-850-GL"
NVIDIA_GT200.DEV_05FF.1 = "NVIDIA GT200"
Yeah, noticed that too :D
PhysX works as well; going to test Vantage and UT3.
http://img175.imageshack.us/img175/1691/14292387dn9.png
I want that vista theme so bad. (offtopic)
Maybe a stupid question, but won't the 6xxx series support this feature?
Once I get this damn 780i board to boot into Vista, I'll run some multicard tests.
I didn't notice any performance difference in 3DMark & Crysis. The only difference is that 177.40 gives me a lot of BSODs! :p
So, has anyone managed to get the nVidia physX working in cellfactor?
Works nicely with the UT3 PhysX maps... more than a 2x boost compared to letting the CPU do the job.
With SLI.
http://i18.photobucket.com/albums/b1...me/Capture.jpg
I'll try disabling SLI to see how it compares.
EDIT: So I disabled SLI and, just like another member who tested dual cards, I have no option to select which GPU does the PhysX calculations, which is not what the first post says :shrug:. I am going to run the test anyway, but I'm going to assume the scores will be identical to a single-card solution. Are there any other programs besides Vantage and UT3 that take advantage of this yet? I don't have UT3 to test.
As expected, the scores were no different from a single GPU.
http://i18.photobucket.com/albums/b1.../Capture-1.jpg
Hopefully other apps will make use of it soon, to really try it out.
You should try the UT3 Lighthouse map and report your FPS, so we can see if one card doing PhysX + one card rendering gives much more FPS.
Don't try to select which card will do PhysX; according to the Vantage scores only one does it. If the same happens with UT3, FPS should be high :)
http://img505.imageshack.us/img505/6...sxresulql9.jpg
3dmark vantage on this E7200 @ 3.6 GHz, RAM @ DDR2-1000 5-5-5-15, 8800GT @ 720/1800/2100
177.39 w/PhysX enabled...
[H]ardOCP has a nice picture for you all...
http://www.hardocp.com/news.html?new...VzaWFzdCwsLDE=
Quote:
We don’t have much information on this aside from the screenshot below but, from the looks of this 3DMark Vantage screenshot, the Radeon HD 3850 is capable of PhysX GPU acceleration. Again, we don’t have any info on this other than what you see here but we thought we’d share anyway. Thanks to EBadit for the screenshot.
http://www.hardocp.com/image.html?im...pfMV8xX2wuanBn
http://www.hardocp.com/images/news/1...1qFZ_1_1_l.jpg
Photoshop?
THIS IS GREAT (if true)
It means every GPU will be able to accelerate PhysX!
Now I want to see the fanATIcs talk sh1t about nVidia.
If this is true, you can see that the control panel is ready for Radeon cards, and means nVidia is doing this free for all.
Talk about dark side now, huh?
NV drivers working with Radeon cards? I smell BS. Those drivers have PhysX ported to CUDA, right?
AFAIK, every 177.xx works with PhysX 8.06.12, and those 177.xx bring cuda.dll.
I think CUDA works with the Radeon architecture. Remember Editor's Day: NVIDIA's CEO told everybody that CUDA works with ATI GPUs, but please, don't tell them!!
makes me wish this is real :)
Let's hope so, I want Cellfactor with PhysX badly :D
http://techgage.com/article/nvidias_...status_report/
Techgage tested PhysX. In Vantage the 9800 is 4 times more powerful than a PPU, but in UT3 frames are lower. Since the 8600s have 1/4 of the 9800's SPs, one should be about as powerful as a PPU. I'm thinking of getting another CUDA-enabled card for physics processing when future drivers support them.
I hope they test a GPU as a separate PhysX card as well.
Hi guys, if I understand the whole PhysX thing right, I'll be able to purchase a GTX 280 etc. and leave my existing 8800 GTS (G92) in a second PCIe (4x) slot, set as a dedicated PhysX card. Even though my board doesn't support SLI it wouldn't matter, since I'm not going to use it in SLI. Is that correct (I have an Abit IP35 Pro)? And has anyone done any tests that show how useful that would be? What's the real gain going to be, FPS-wise, compared to a single GTX 280 with no additional card?
http://pclab.pl/art33645-8.html
It's in Polish, but you should understand the graphs ;)
In the graphs, was it a 9800 + a 9800 as a dedicated PhysX card, or 9800 + 9800 in SLI? I can read Russian, not Polish :) And what is "9800 + GPU PhysX" — did they just enable PhysX in the driver for a single 9800 card?
Yes, the 2nd 9800 GTX+ was used as a dedicated PhysX card. "9800 GTX+ + GPU PhysX" is one card doing both PhysX and graphics.
Do you guys think I'd need a second power supply if I buy a GTX 280? Right now I have a Corsair 620W. I know you can get a PSU from Thermaltake to put into a 5.25" bay; are there any other alternatives?