I would surely love to, but I lent my UT3 DVD to a friend of mine; I'll have to wait until he's online to test it :(
It won't work on my 8800GTX, PhysX won't enable. :(
Is there a way to test that thing on Windows XP?
The Demo is working and Geforce Physx is ticked
Windows XP 174.74 driver ...
Correct me if I'm wrong: can I use a 9800GTX as my main GPU and an 8800GT for PhysX?
cool. Will try out the following combos...
GTX 280 + 9800GTX
GTX 280 + 8800GT
9800GTX + 8800GT
Would I need an SLI bridge, or just plug them into the PCI-E slots?
Screenshot from PhysX CP's documentation.
Quote:
The Settings tab contains the following options (see Figure 3):
Hardware Device Selection box. Select “GeForce PhysX” to allow the PhysX engine to use your GeForce GPU. Select “AGEIA PhysX” to allow the PhysX engine to use your AGEIA PhysX processor.
If you have multiple GeForce GPUs that the PhysX engine can use, the “Select GPU for GeForce PhysX” button will appear. Click it to display the “GeForce PhysX Device Selection” dialog (see Figure 4). You may pick which GeForce GPU the PhysX engine will use, or allow the PhysX engine to choose. If you are running Windows XP™, the PhysX engine will attempt to find a GeForce GPU that is not driving a display. If you are running Windows Vista™, the PhysX engine will pick the first GeForce GPU that it finds.
If you are having problems with multiple GeForce GPUs in Windows Vista™: You must extend a desktop onto each GeForce GPU if you want the PhysX engine to use it in Windows Vista™.
If your GeForce GPUs are in an SLI configuration, they will appear to the PhysX engine and the PhysX Control Panel Settings as a single GPU.
For this release the PhysX engine is limited to using either an AGEIA PhysX Processor or a limited set of GeForce GPUs that have passed our rigorous test suite.
Reset AGEIA PhysX Card button. Select this option for a soft reset of the AGEIA PhysX processor.
Start Extended Diagnostics. Select this option to run extended diagnostics on the AGEIA PhysX processor and display the information in the tab. Click Save Report to File to save the information to a file for technical support. This is a very useful tool if the AGEIA PhysX processor does not seem to be working properly and you need to communicate with support staff.
Cool. :D Will try it out.
Do you think anyone will test if bandwidth will make a big difference? If it doesn't use the SLI bridge, will running it at 4x instead hurt performance a lot (or even 1x in some cases)?
If anyone has the 2 GPU combo, can you set the PCI-Express to 4x (and 8x if you have it) and see if it lowers your CPU score?
Any modded .inf for 177.39 Vista 32-bit? Looks like I'll need one to test this...
9800GTX main + 8800GT
@wyemarn
I can't see the select GPU for Geforce PhysX tab. :(
@warboy
Thanks. :D
Can't wait for the GTX 280 to arrive at my doorstep. :D will test it while the 9800GTX is still here. hehehe
reinstalling drivers. :D Hope this one works.
edit...
It's weird... I still can't get that "Select GPU for GeForce PhysX" tab. :(
noob alert...
"If you are having problems with multiple GeForce GPUs in Windows VistaTM: You must extend a desktop onto each GeForce GPU if you want the PhysX engine to use it in Windows VistaTM."
How do you do this?
Now getting a 3rd 8800GT doesn't seem so pointless :clap:
EDIT: Can someone with a working SLI setup confirm that the PhysX will work without the SLI bridge?
I wonder if the dual-GPU thing will work on a P35 or other non-SLI board, because that's not confirmed. If true, that would be epic.
If I'd need an NVIDIA board to run PhysX, then this is pointless. :(
Anyone out there running the physics driver AND also running the new F@H CUDA client? I have an 8800GTS G92, but I also want to fold when not gaming. Anyone know if this is a problem, or is there a way to make this work? (I won't install till I figure that out.)
I activated Ageia with the DFI X48 T2R
I'm using two cards... but I don't think the other one is working as a PhysX card. Any way to check if it's working?
http://img293.imageshack.us/img293/8...0sscagemt2.jpg
With driver 177.39 I gained performance.
175.16
P4078
GPU - 4907
CPU - 2707
http://i186.photobucket.com/albums/x...arkVantage.jpg
177.39 & PhysX driver
P6477
GPU - 5293
CPU - 19666
http://i186.photobucket.com/albums/x...ewithPhysX.jpg
:up:
---------------------------------->
Overclocked my 9800 GTX a bit. :)
P7022
CPU - 21066
GPU - 5746
http://i186.photobucket.com/albums/x...overclocke.jpg
I wish EA would allow NVIDIA PhysX support for SupCom: Forged Alliance. Nothing is more annoying than spending three hours building up a strike-force armada of tech 3 gunships just to see your FPS drop dramatically (into the 10-15 range) when you give the assault order (this is with a Q6600 at 3.6 GHz and a G92 8800GTS 512, btw). It's obvious that the game is CPU-limited.
EDIT: Can someone please test actual game performance? My beloved G92 is away on RMA.
I thought any game supporting PhysX would take advantage. Quote:
GPU Acceleration Of PhysX In Games
GPU acceleration of PhysX is currently supported only in 3DMark Vantage and Unreal Tournament 3. Future games that will support GPU acceleration of PhysX include:
* Mstar
* Mirror's Edge
* Empire: Total War
* Backbreaker
* Pwnage
There are other games that use the PhysX API but they do not specifically support GPU acceleration of PhysX.
:up:
There are quite a lot of PhysX games released, mostly because of UE3, but almost all of them are software-based.
For games with PhysX support, no performance increase should be expected.
Unless...
The game already supports hardware acceleration, the game is being patched to support hardware acceleration, or it is forced by the user.
The last option would clearly be the best, if it were easy to do.
No, not at all. Most PhysX games are done using software acceleration, so a PhysX card isn't needed and the option for hardware acceleration is left out. Only games that would see a gain with a PhysX card will see gains with a GPU, without any changes or patches to the game and its files.
At least until it can be forced to use hardware acceleration, which may not be so hard to do.
I want physX on my 8800GTX! GRR!
I am really wondering if running PhysX on the video card would warrant removing my AGEIA card from my system altogether, or whether it would still be best to keep a dedicated card, since the video card would be slowed down if it has to handle PhysX calculations as well.
OK, so I installed UT3, patch 1.2, and the PhysX mod by AGEIA.
I only have 2GB on Vista x64, and Unreal took more than 1GB of memory for itself :eek:
Gaming became unplayable as soon as memory ran out, but PhysX was working alright. Guess I'll have to get 4GB of memory ASAP :D
That is a very good question, and one I am puzzled about as well.
If a PhysX card + 8800 Ultra is slower than just using a single Ultra, then the answer is a clear one.
But with no support for G80 yet, AFAIK, comparisons can't really be made. I am an Ultra user as well, with a dedicated PhysX card. I am really puzzled whether I'd lose SLI benefits if I dedicated one card to PhysX, and how it all would compare to Ultra SLI + PhysX, or to using one as GPU and one for PhysX.
If there were a G80 PhysX driver that worked now on Vista Ultimate x64, I would install the OS and do the benchmarks myself. Unfortunately, that doesn't appear to be possible :(
Anybody have an XP x64 modded .inf for the 8800GT 512?
Thanks! And if you don't, maybe warboy can help?
I am not so much thinking about keeping my 8800 Ultra in place of the AGEIA card PLUS adding in another video card. I was thinking more along the lines of getting rid of my 8800 Ultra and AGEIA PhysX card and getting an overclocked GTX 280 to do the job of both. I am wondering if having a single GTX 280 doing both graphics and PhysX work would be a bad idea, and whether I should just stick with the GTX 280 as a replacement for my 8800 Ultra for graphics only, yet still keep the AGEIA card solely for PhysX. If the GTX 280 can do both with no real loss, then I would love to free up the slot in my PC that the AGEIA card is using now, as well as cut back on heat, noise and power. Something tells me I am expecting way too much, however.
@Raptor22
Maybe, I'll see what I can cook up in a bit, I'm playing Spore right now.
As you say, UE3 games use PhysX but not hardware-accelerated. Only UT3 has hardware support. And of course, if you use only one card you'll see a performance decrease, 100% sure. If a chip is doing two tasks at the same time it will be slower than if it's doing only one. You'll see that behaviour in your daily UT3 usage. The tests in the first post are like tech demos for physics, not real game situations.
People, don't be fooled by this. You'll only see a benefit by using a second card, with that second card only doing physics calculations.
I tried the PhysX drivers with my 8400GS, LOL, and it doesn't seem to work (grayed out), but I can run those PhysX demos :confused:
Well, something ain't right, my score is terrible.
My CPU score alone leads me to believe the AGEIA card isn't working properly.
Have you compared your results with others' runs?
The CPU score seems kinda high, dunno.
I can now confirm that for PhysX to work, the 177.xx drivers are a must, because they include nvcuda.dll, which is essential; without it, Vantage fails in the second CPU test.
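If you want to sanity-check which driver you're on, here's a rough Python sketch of my own (not an official tool; it assumes Windows and Python with ctypes) that just tries to load nvcuda.dll and initialize the CUDA driver. On a pre-177.xx driver the load should fail:
Code:
# Check that the CUDA driver DLL shipped with the 177.xx package is present
# and initializes; nothing PhysX-specific about it.
import ctypes

try:
    nvcuda = ctypes.WinDLL("nvcuda.dll")   # included with the 177.xx drivers
except OSError:
    print("nvcuda.dll not found -- probably a pre-177.xx driver")
else:
    status = nvcuda.cuInit(0)              # 0 == CUDA_SUCCESS
    print("cuInit returned", status, "(0 means the CUDA driver initialized)")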
No, not really, especially when the GPU is handling graphics at the same time.
Those CPU tests are not JUST CPU; they involve a lot of graphics too, which could never be rendered by today's CPUs the way they appear on screen.
Granted, the frame rates are pretty poor, which shows there is a lot of CPU usage, but I don't believe everything in those tests is all CPU.
Anyway, I installed the drivers with the modded .inf the other guy posted and they won't install... well, let me rephrase that: they install but they do not work. The driver installs OK, I reboot, and Vista acts like no driver was installed...
Has anyone with an 8800 GTX been able to install the drivers and make them work?
I have seen 8800 GTs work, so why not the GTX?
Thanks guys! I got the physics driver to work in Vista x64 with my 8800GTS G92 without any issues, and no modded .inf (maybe because I have a modded .inf for CUDA?).
I'm going to see if it changes my scores for 3DMark06 (no clue if it will or not). I'm a cheap bastard, so I didn't pay for Vantage.
http://img95.imageshack.us/my.php?image=35866533te0.png
I can do some UT3 tests in a few hours when I get back from class!
NVIDIA PhysX driver!!!
http://i27.photobucket.com/albums/c1...na:vantage.jpg
Devil May Cry 4 gains a lot with the PhysX driver, DX10 vs DX10.
1280x1024, 4x AA, 16x AF, Super High, NVIDIA PhysX driver 177.39, DX10, Vista Ultimate 32-bit
http://i27.photobucket.com/albums/c1...na:vantage.jpg
1280x1024, 4x AA, 16x AF, Devil May Cry 4 Super High, ForceWare 175.16, DX10, Vista Ultimate
http://i27.photobucket.com/albums/c1...08-06-0516.jpg
OK, now this is just sickness.
I was playing one of the UT3 PhysX maps, Tornado, I think. Decided to monitor FPS, CPU1 and CPU2 usage, physical memory usage and page file usage.
Any of the PhysX maps becomes unplayable after 30 seconds of play and some scenery destruction; the framerate drops hugely.
Here is the result:
http://img154.imageshack.us/my.php?i...phsxmemwi7.jpg
I know I shouldn't have a page file, but tell me...
HOW MUCH MEMORY WILL I NEED? 8GB? WTF?
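If anyone wants to log this rather than watching the monitors by hand, here's a rough Python sketch (the psutil package is an assumption here, just my illustration; any monitoring tool will do) that samples RAM and page-file usage every few seconds while UT3 runs:
Code:
# Log physical memory and page-file usage every 5 seconds; Ctrl+C to stop.
# Assumes Python with psutil installed (pip install psutil).
import time
import psutil

while True:
    ram = psutil.virtual_memory()
    swap = psutil.swap_memory()
    print(f"RAM used: {ram.used / 2**30:.1f} GiB ({ram.percent}%)  "
          f"page file used: {swap.used / 2**30:.1f} GiB")
    time.sleep(5)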
OK, Unreal Tournament Benchmark Tool Version 0.2.1.0 (didn't check to see if there's a newer version)
Hardware:
q6600 @ 3.6 ghz
8 gigs ddr2 @ 800
8800gts g92 @ 784/1960 1013mem
Vista x64
177.35 driver w/ modded .inf for folding (CUDA) & 177.39
Stock settings, 1920x1200, DX9 (forgot to do DX10), texture and level details 1, software settings: 12, 60 (no idea what this means)
Here's the results:
http://i30.tinypic.com/10ohduo.jpg
First is with the driver, 2nd is without.
About 7 FPS better. Is that what everyone else gets?
PhysX does seem to smooth out the look entirely.
I forgot to do DX10 and for now, I don't feel like uninstalling, running 10 tests, then reinstalling. I'll do this later if anyone is interested.
Here is the Vista 64-bit modded .inf.
Link for UT3 modpack: http://www.techpowerup.com/downloads/856/mirrors.php
About the plugins for RivaTuner: choose hardware monitoring, click Setup (lower right corner), and click on the "v" to make the *.dll plugins available on the hardware monitoring main page :)
Well, I only gain 2000 points when using my AGEIA card, so...
The AGEIA card is worthless for Vantage....
(this is with Vista 64)
I also found out why the driver won't install in Vista 32:
during the driver install it asks if you want to install the driver or not, and I accidentally clicked "do not install", so now it will never install.
How can I make Vista forget that I clicked "do not install", so it asks me again whether to OK the driver install?
I am not 100% sure, but I believe there is some sort of memory leak in the PhysX component of these drivers.
I actually managed to run out of RAM playing UT3!! And I have 4GB of RAM!
Without hardware physics enabled everything is fine.
Just thought you might want to know.
By the way, I am now getting P7000 or so in 3DMark Vantage and breaking 14K in 3DMark06 in Vista x64 with a stock G92 8800GTS.
John
Yes spot on :up:
Well, maybe not OK: with hardware physics turned off it is slow compared to having it turned on, but after a few minutes of gameplay with it turned on the PC starts to lag badly, and that's because it's paging (I checked this out with the Vista Resource Monitor too... it had used all of the physical memory)!!
John
Do the original PhysX games support this?
I'm talking about GRAW, etc. I wonder what kind of FPS improvements we can expect, if any.
Here is a free AGEIA PhysX game; let's hope NVIDIA got the coding right.
My G92 GTS was dropped off earlier today; I'll update and test later on.
Has anyone here tried the physics driver with 2 GPUs?
They aren't faster when it's not using the GPU, which is my point in asking whether anyone with an 8800 GTX is able to make it work.
With the AGEIA card disabled it's using the CPU because, well, as far as I know the 8800 GTX isn't working,
unless my CPU score of about 13,000 points is normal for my rig (specs are in my sig).
lestat,
I looked at other people's runs and your score is not normal;
that is your PhysX card contributing to the score.
Also, you will notice that the GPU does not take a hit when you see people posting before-and-after shots :)
Dinos, interesting, because I disabled the AGEIA card in Device Manager and lost 2000 points, then re-enabled it and gained the 2000 back...
It's obviously still using the card at some level.
Now, this was with Vista 64. I can't make Vista 32 install the drivers, because I chose to deny the install once, so now it won't ever install.
So we won't know the truth of AGEIA vs. 8800 GTX until NVIDIA releases working drivers for the 8800 cards (non-G92).
I have an 8800 GT in another rig, but it's XP so Vantage is a no-go (stupid Futuremark).
warboy - I know the GTX isn't officially supported by the drivers, but it wouldn't be the first driver made to work with something via a modded .inf... :)
I don't think NVIDIA planned to open PhysX to anything other than the new models, otherwise we wouldn't have to mod .inf files, you know what I mean?
I think the reason G92 and later work is probably because they may have ported it through a newer version of CUDA, but don't quote me on this, as I'm not too sure.
They did plan to have it work, but just like with drivers, these things take a lot of time to properly QA and pass certifications.
A lot of times when new cards are being released, they need to either QA just those new cards to get them out the door and then follow up with a normal driver for all cards, or delay the launch until they've QA'd everything.
So the point is they test the bare minimum of cards (the new ones) to get them out the door and follow up with more later.
Do you really think they're going to permanently leave whether a card can run PhysX up to an .inf entry?
Fair enough.
That's what they were saying during editors' day, but I'm just thinking: why not at least open up the driver to G92, as it clearly already works in this first release, you know what I mean :shrug:
I guess I should have asked about that in hindsight :shrug:
http://www.ocforums.com/showthread.php?p=5678432
Quote:
Quote:
Originally Posted by kellygtp
Does anyone have a copy of their modded .inf for an 8800GT G92?
See my post above. Extract the driver files [from the .exe driver file] into a folder and open up the .inf file in Notepad (for the XP drivers it's nv4_disp.inf). Search for all instances of 0612 [there are about three... use search in Notepad] and replace them with 0611. Save the file and install the driver. Works like a charm for the 8800GT.
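If you'd rather not do the search-and-replace by hand in Notepad, a rough Python sketch like the one below does the same 0612 -> 0611 swap (this is just my illustration, not part of the driver package; the folder path is only an example, so point it at wherever you extracted the driver; it also keeps a backup of the original .inf):
Code:
# Swap the 0612 device ID for 0611 in nv4_disp.inf, as described in the post above.
# The path below is an example only -- use your own extraction folder.
from pathlib import Path

inf_path = Path(r"C:\NVIDIA\17739\nv4_disp.inf")
text = inf_path.read_text(encoding="latin-1")

print("instances of 0612 found:", text.count("0612"))
inf_path.with_suffix(".inf.bak").write_text(text, encoding="latin-1")  # keep a backup
inf_path.write_text(text.replace("0612", "0611"), encoding="latin-1")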
I got 177.39 to install, then double-clicked the 8.06.12 NVIDIA PhysX installer file... the demos work from the NVIDIA PhysX control panel :shrug: :)
8800GT, XP
I guess for a GTX you would just change the 0612 to whatever your .inf shows the GTX to be.
This doesn't work on x64 systems, does it? I downloaded it and installed it... but there's no Start menu entry, nothing in Control Panel... what gives?
I just wanted to let you guys know. I decided to go and try the PhysX driver on my 9800GX2... I'm running the modded 177.34 NVIDIA drivers, and it works sweet without going to the 177.39 drivers.
Without Physx...
http://img.techpowerup.org/080621/Vantage 11837.jpg
With Physx
http://img.techpowerup.org/080621/Vantage 14086304.jpg
I hope this can help other users that have the Nvidia card! :up:
Actually, the entire 8800 series was planned from the start.
It was stated when they released those cards that they would eventually migrate the physics to the 8800 cards and eliminate the AGEIA cards.
I think the 8800 cards are just not as native to physics as the 9800 or G2xx series.
Please, someone, tell me how to mod these 177.39 drivers for the 9800GTX.
I have the 177.35s installed... do I really NEED the 177.39s for PhysX to work?
Wanna get high? :D
The only problem I had was with the Unreal Tournament 3 PhysX maps; after playing a bit, memory usage was insane.
Check it here ;)
Either way, I hope this gets solved with a future UE3 patch or a new PhysX driver :)
I am using Vista x64, and everything is OK except for that :)
Oh, kai, you'd better check your x64 driver source, you might have downloaded a corrupt self-extractor ;)
Adding a conspiracy theory to the mix... perhaps the G92 has some "magic" in the core which means its physics support is better than the G80's?! Perhaps even less of a performance hit.
John
I don't think so; I think they shipped G92 support first because the user base is larger.
I managed to install Infernal, which uses the AGEIA PhysX engine, and I think it is working :)
I will have to uninstall the PhysX driver to confirm, so stay tuned :)
Edit: I uninstalled NVIDIA's PhysX and installed the AGEIA PhysX included with the game, tested one, then the other, and I don't think either of them gives me the game experience I could have had with an AGEIA PPU.
I suppose this is just how PhysX will be from now on :)
Come on, brutha, you know what I mean, lol ;)
I simply meant that when the 8800 cards were released, they stated back then that they would support PhysX natively.
But I think they really expanded it with the 9000 series and the 2xx series, so those support it even further.
So that's why I said I don't think it's as deep a level of native support as on the other cards.
It supports it, but not as well? Or not as efficiently (there, that last one is better).
No, I'm not confused, you're just too high... :P Quote:
Probably he is confused.
PhysX runs on the GPU through CUDA (Compute Unified Device Architecture), and CUDA has been there since G80.
G80 can run PhysX because it can run CUDA, it's as simple as that.
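For what it's worth, here's a rough Python sketch of my own (assuming Windows with a 177.xx driver so nvcuda.dll is present; not an official tool) that lists what the CUDA driver sees. A G80 card should report compute capability 1.0 and a G92 should report 1.1, so the hardware side is there either way:
Code:
# Enumerate CUDA devices via the driver API in nvcuda.dll.
import ctypes

nvcuda = ctypes.WinDLL("nvcuda.dll")
nvcuda.cuInit(0)

count = ctypes.c_int()
nvcuda.cuDeviceGetCount(ctypes.byref(count))

for ordinal in range(count.value):
    dev = ctypes.c_int()
    nvcuda.cuDeviceGet(ctypes.byref(dev), ordinal)

    name = ctypes.create_string_buffer(100)
    nvcuda.cuDeviceGetName(name, 100, dev)

    major, minor = ctypes.c_int(), ctypes.c_int()
    nvcuda.cuDeviceComputeCapability(ctypes.byref(major), ctypes.byref(minor), dev)

    print(name.value.decode(), "-> compute capability", f"{major.value}.{minor.value}")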
Think back to when the 8xxx cards were released; they said back then that they would eventually make the 8xxx cards support PhysX on-die.
Please don't tell me you forgot that already, guys; that was one of the reasons I was happy to buy my GTX.