Quote:
Originally Posted by brinox
Shift installs the 9.09.072 drivers, which disable the Ageia PPU. You need 9.09.0814 and the modded .dll.
Also
THE LAST REMNANT and GOTHIC 3 WORK :D
That is not someone asking for PhysX on ATI cards; that is someone who has a PPU that stopped being supported and now wants to see if he can use a 9800 GT EE to do the job of the PPU.
Quote:
Originally Posted by brinox
I want to run my HD 4890 with my 9800 GT EE picking up the offloaded hardware physx abilities. Better yet, I want to use my BFG/AGEIA PPU that I purchased forever ago to work with my ATI card, as it was originally marketed.
The difference is that "on" = running PhysX on ATI cards, while "with" = running an ATI card with an NV/PPU card to do the PhysX.
Well, I'll have to pick one up :up:.
The thing about a standalone PPU PhysX card is that it only adds performance and never takes any away from the GPU directly.
NV should have given the option of selling their lower-end GPUs as PPUs: no display ports and a BIOS change, and of course the current options with standard GPUs.
Could not be more incorrect. There has not been one single ATi user in any thread asking for PhysX on ATi hardware.
The only thing ATi users want is for PhysX not to be disabled on their nV secondary GPU when their primary GPU is not an nV GPU.
The graphics drivers do not interact when one is performing PhysX operations and the other is performing graphics rendering work. nVidia is only throwing this in as a bone to themselves. I guess saying "using PhysX in a system with a non nVidia GPU may cause performance and/or stability issues" is too easy for nV.
I fail to understand NVIDIA's logic here. Every GeForce sold is better than none, right? If they expect existing Radeon users to switch from a great Radeon to a higher-range GeForce just because of PhysX, I don't know what planet they came up with this idea on, because I don't know anyone who would do that. But I know plenty of people who would buy an extra mid-range GeForce card just for PhysX to work alongside their existing Radeon.
Did you even click on the link? The link shows that a GTS 250 can achieve very playable frames with maxed out PhysX at 1050p and below. It's only when you start increasing the resolution/AA that it becomes unplayable.
This mirrors my own experience. When I had my 24 inch monitor (1920x1200), I could easily play Batman AA with maxed out PhysX and settings on a single overclocked GTX 285.
When I got my 30 inch monitor though, I had to get a dedicated PhysX card because the 3D load had increased substantially.
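The jump in 3D load from that monitor upgrade is easy to quantify; a quick sketch, assuming the 30-inch panel runs at the common 2560x1600 native resolution (the post above doesn't actually state it):

```python
# Sketch: how much the rendering load grows going from a 24" panel
# at 1920x1200 to a 30" panel at 2560x1600 (assumed resolution).
pixels_24 = 1920 * 1200
pixels_30 = 2560 * 1600

ratio = pixels_30 / pixels_24
print(f"{ratio:.2f}x more pixels to render")  # ~1.78x
```

Nearly 80% more pixels per frame, on top of the PhysX work, is roughly why a dedicated PhysX card starts to make sense at that point.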
Premature to say the least. Anyway, unless you believe CPUs and GPUs are equally capable at FPU work (which we all know isn't true), it's ridiculous to state that GPU-accelerated physics can't offer anything beyond what CPU physics already does.
Quote:
gpu physics is dead imo... gpu physics this and that... all the hype... for almost 10 years now... and still, today gpu physics can't offer anything worth the effort that cpu physics couldn't do as well...
This assertion has nothing to do with the intrinsic value of GPU-accelerated PhysX, and everything to do with GAME DEVELOPERS.
Quote:
sure the gpu CAN do physics, im sure there are a lot of things it can do better than a cpu... but do those things really matter and add value to a game in any reasonable ratio to the work it takes to implement? clearly not...
After all, it's game developers that decide how to implement physics in their games. If games have lackluster physics, do you blame the physics API or the game developers?
A reasonable person would place the blame squarely on the developer, much as they would if a game had :banana::banana::banana::banana:ty graphics, boring storyline and crashed to the desktop every 10 minutes.
Granted, some 3D engines or Physics APIs are inherently better than others, but ultimately, it's the talent, skill and creativity of the developer that makes or breaks a game.
Well at least you show your true colors now.. From one FanATIc to another :ROTF:
I provided evidence that PhysX can run on a single graphics card with decent performance.
Quote:
Yes, we'll want to run PhysX on a single card...if you still have a Tandy 13" dia. screen/640 res/16 colors and load in King's Quest on floppy. :idea: Oughta get some playable FPS then, yep.
Or...maybe not. :rotf: :rolleyes:
You on the other hand, haven't provided squat to back up your claims other than some nonsensical fanATIcal diatribe :rolleyes:
From the bitter tone of your comment, it sounds like you're the one who needs to get over it.. :shakes:
Quote:
I've used Nv and ATI and liked them both. But this Nv + PhysX is just another steaming pile from their marketing division, as others have pointed out.
Get over it.
I looked at your link. A 34fps average at 1920x1080 4xAA with PhysX on and a single 285 is hardly playable, and it drops to a 19fps average on the more intensive (Scarecrow) benchmark. You can get by with 30-40fps on some engines and in some single-player games, but honestly single-player games have a very short shelf life, and engines that drag your FPS down out of all proportion to their graphical prowess suffer from the Crysis effect: useful for benchmarks only.
The link was in reference to the single GTS 250 getting an average FPS of 38 in the Scarecrow level with 2xAA at 1680x1050, with maxed settings.
Disable AA and the frames will increase even more..
Same thing with the GTX 285. At 1920x1200 4xAA with all settings maxed, the framerate isn't playable because the GPU is over burdened with 3D rendering.
However, if you turn down the AA to 2xAA or disable it entirely, you will have playable frames.
But the point is that you guys are making "absolute" statements that you can't run PhysX and 3D on the same card, which is false. I did it before when I had my GTX 285, and there are many others who have done it and are doing it..
Might you have to turn down some 3D settings a bit? Perhaps.. But that doesn't disqualify the assertion that PhysX and 3D can be run on the same card.
no... i played BAA on my core i7 rig, i don't need an article to tell me what i experienced first hand... i was on dual 250s, had to set one as dedicated physx to get ok perf, and even then it wasn't great. swapped them for 2 260s and even with a dedicated 260 with physx set to max it STILL stuttered and min fps were way below what you'd expect from such a high-end system... ESPECIALLY since the effects were nothing special at all...
i didn't say that... i said nothing worthwhile...
im sure there is a lot you can do on a gpu that you can't do on a cpu... but is it worth it? i haven't seen anything so far that is...
so physx is awesome, but there isn't ONE game dev that can use it properly, and that's why physx implementations aren't really overwhelming and cause massive fps drops?
with physx what you need to look at is min fps... that's what tells you whether you need a dedicated card for physx or a more powerful card overall...
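That point is easy to see with numbers; a minimal sketch using a made-up frame-time log (not real benchmark data), showing how a healthy average can hide exactly the stutter described above:

```python
# Sketch: why min fps matters more than avg fps for PhysX stutter.
# Frame times in milliseconds from a hypothetical benchmark run --
# mostly smooth frames, with a few PhysX-heavy spikes.
frame_times_ms = [16, 17, 16, 18, 55, 16, 17, 60, 16, 17]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)  # worst single frame

print(f"avg: {avg_fps:.0f} fps, min: {min_fps:.0f} fps")  # avg: 40 fps, min: 17 fps
```

The average says "playable"; the minimum says "stutters", which is why review averages alone don't settle the dedicated-card question.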
i saw that FAQ / Q&A many months ago
do you know of any benchmarks that show the fps loss when turning on physx? i would like to see how much performance is required to handle those effects. and if you can, find one that shows how much of the cpu is being used. and finally, one with a hacked version that runs it entirely on the cpu.
what i expect to see is that you lose probably 30% of your average fps with physx on for any average card (meaning not spending $500 on the gpu setup)
and i bet you'll see that these games are dual-core optimized, and if you run physx on the cpu it will be about the same, except your gpu can get more fps, even if there are a few spikes here and there due to a cpu bottleneck.
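If such a benchmark turns up, the loss figure itself is trivial to compute from paired runs; a minimal sketch with made-up numbers (chosen to illustrate the 30% guess above, not measured):

```python
# Sketch: percent average-fps cost of enabling PhysX on one card.
# Hypothetical paired benchmark results (made-up numbers).
avg_fps_physx_off = 60.0
avg_fps_physx_on = 42.0

loss_pct = 100 * (avg_fps_physx_off - avg_fps_physx_on) / avg_fps_physx_off
print(f"PhysX on costs {loss_pct:.0f}% of average fps")  # 30%
```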
No, there is no reason why NVIDIA should let AMD use PhysX as well, but I see little reason to artificially block PhysX when a competitor's card is present, either. This is the same marketing BS as back in the days when NVIDIA said Intel's chipsets didn't have enough bandwidth for SLI, hence blocking SLI on those chipsets.
The only ones losing are the customers, that's why I find it hard to understand those defending NVIDIA's actions.
Also, the AGEIA PPU has a lot of life left in it; why won't they support it? I remember when they were bought, they said they would still support the PPU. LIARS. Carmack was right: they didn't care about the growth of the gaming industry, they just wanted some money.
If people were complaining about issues with PhysX not working properly because of an actual issue or glitch while using ATI graphics, it wouldn't be on Nvidia to fix it. I don't think many people would complain about it. However, this is not what the issue is. PhysX has and does work perfectly fine with ATI in the system and Nvidia is purposely disabling it and lying about the reasons.
Nvidia are a bunch of bald-faced liars who, in my legal opinion, have committed (in the US) illegal marketing practices. They advertise the cards' ability to use PhysX (it says so right on the box), yet when put in a system it does not work if a card from anyone other than Nvidia is detected.
I would wager that had Nvidia not done this, you would see wider adoption of PhysX by us, the customers, because regardless of whether we choose Nvidia or ATI for our main graphics, we could still buy an Nvidia card for PhysX support. But this isn't about PhysX, and it certainly isn't about the money they paid for it. It is about marketing, because PhysX is another marketing gimmick by Nvidia, not an actual physics platform.
34fps is unplayable, but 38fps isn't?
I agree turning down AA should boost performance, but unfortunately the benchmarks are incomplete and do not give performance for lesser AA levels.
Given the low minimum FPS numbers alongside low averages in the "intensive" (Scarecrow) benchmark, I'm going to sit with the "rendering and PhysX on the same card results in a poor gaming experience" crowd.
This. Its quite unfortunate for the PPU owners as well. Nvidia are turning into a true bunch of :banana::banana::banana:gots
Honestly, even the driver issue is bull$hit. Nvidia cards being used as a PPU should really be supported; I am sure an ATI driver update will not break that. Is there even any driver-level interaction between the two cards? No. So why would a secondary card being used for physics stop working?
I guess Nvidia cannot bear the fact that their card would run only as a physics processing card while the ATI card is the primary card. Yep :yepp: