Actually, PhysX on the GPU especially makes sense in games where there is not much graphics work to be done: CPU-bound games.
You don't need 300fps. Why not use a few cycles for something else?
There is MSAA built into the UT3 engine, but it's MSAA only and needs the compute shader (DX10 cards and the 360); it's just a brand-only restriction. And I'm betting that it will have GFWL and SecuROM with activation, since it has DLC right from the menu. The game looks clean and doesn't need AA, but it's BS that I can't enable it.
The demo is nice; it's got a free roam section for a little bit that the old console one didn't have, but it's 2GB for 20 minutes.
Even if there was that option (which there isn't), games that tweak the engine to be fully deferred or introduce other elements to the rendering loop make AA virtually impossible. What, did you think that UE3 game developers all conspired together to not include AA in any of their games? Or that it was coincidence?
The fact is, most of the time it can't be done without a considerable amount of hassle (if at all).
It's very likely that NVAPI is used, which is why it's not trivial to make it work the same on ATI. It's not "just a brand-only restriction".
Compute shader? *sigh* Please don't post more of this nonsense.
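To be fair to the underlying point about deferred rendering: here's a toy C++ sketch of why it's not a driver toggle. Everything in it is made up for illustration (the `shade` function, the sample normals; none of it is engine code). With forward rendering plus MSAA you shade each sample and average afterwards; a naive deferred path averages the G-buffer first and shades once, which gives a different answer at geometry edges. Doing it properly means lighting individual samples, which is exactly the extra-pass hassle being described.

Code:

#include <cstdio>
#include <cmath>

// Toy model of one pixel with two MSAA samples straddling a geometry edge.
struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}
// Simple Lambert term against a light pointing straight down (+Y is up).
static float shade(Vec3 n) {
    Vec3 light{ 0.0f, 1.0f, 0.0f };
    return std::fmax(dot(n, light), 0.0f);
}

int main() {
    Vec3 n0{ 0.0f, 1.0f, 0.0f }; // sample 0: surface facing the light
    Vec3 n1{ 1.0f, 0.0f, 0.0f }; // sample 1: surface facing sideways

    // Forward rendering + MSAA: shade every sample, THEN resolve (average).
    float forward = 0.5f * (shade(n0) + shade(n1)); // = 0.50, correct edge blend

    // Naive deferred + MSAA: resolve the G-buffer normal first, THEN shade once.
    Vec3 avg = normalize({ 0.5f * (n0.x + n1.x),
                           0.5f * (n0.y + n1.y),
                           0.5f * (n0.z + n1.z) });
    float deferred = shade(avg); // ~= 0.71, visibly wrong at the edge

    std::printf("forward resolve: %.2f, deferred resolve: %.2f\n", forward, deferred);
    return 0;
}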
I have no clue what devs were thinking when they designed ultra-realistic graphics engines without FSAA support. That's like designing a car without wheels...
But then again, I was able to use FSAA in BioShock on an HD4850, and I can use FSAA+HDR in TES4: Oblivion on an HD4870. Neither was supposed to be able to use FSAA...
But NV will still own the majority of studios and has no card, so other than tech demos I'm not expecting any DX11 for at least a year.
And it's not the in-game optimization that bothers me from NV; it's when something is the same and requires no new code but is locked out (e.g. the super high/ultra setting in Doom 3 that was locked to NV only, back when NV was on the GeForce FX and could barely play the game), or when you get a game like Crysis that was intended to be maxed out on only one architecture, so the game doesn't play well at all on any hardware.
Edit: the game is perfectly smooth. I have no idea of the exact frame rate, but it felt like over 60-75.
I like how when PhysX came out, everybody was posting "where are the games?"
Now we have some good ones released... :)
As far as running PhysX on the CPU, I don't think you could without a large performance hit...
Take Darkest of Days...
I think it may be an ideal PhysX benchmark...
These are not my official numbers, but let me tell you why I am excited about this PhysX benchmark. You get to select your level of PhysX!! I love that...
Resolution: 1920x1200
Ambient Occlusion: On
Graphics Details: Very High
PhysX Details: High
Antialiasing: 4x
Anisotropic: 4x
It gave me an average of 30 to 32 FPS.

Same settings except PhysX Details: Medium - I believe my average was 47 FPS.
Same settings except PhysX Details: Low - I believe my average was 84 FPS.
This may be the ideal test to shake out the performance difference between dedicated PhysX GPUs.
FYI - I am doing my numbers from memory right now, but I know we will see some good PhysX numbers from this test. The level of PhysX selected directly impacts the result. Systems running a dedicated PhysX GPU should shine. :up:
I also set my Ambient Occlusion to ON in the Nvidia control panel.
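One way to read those numbers (remembering they're from memory) is as per-frame cost rather than FPS; a quick arithmetic sketch:

Code:

#include <cstdio>

int main() {
    // Average FPS reported above (from memory, per the post).
    const double fpsHigh = 31.0, fpsMed = 47.0, fpsLow = 84.0;
    // Convert to per-frame cost in milliseconds.
    double msHigh = 1000.0 / fpsHigh;  // ~32.3 ms
    double msMed  = 1000.0 / fpsMed;   // ~21.3 ms
    double msLow  = 1000.0 / fpsLow;   // ~11.9 ms
    // Extra PhysX cost per frame relative to Low.
    std::printf("High adds %.1f ms/frame, Medium adds %.1f ms/frame over Low\n",
                msHigh - msLow, msMed - msLow);   // ~20.4 and ~9.4 ms
    return 0;
}

By that rough math, High PhysX costs about 20 ms per frame over Low on this setup - roughly two-thirds of a 30 FPS frame budget.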
Thanks for the heads-up Talonman, trying the DoD demo now. Btw, I don't think the cape in Batman is affected by PhysX - which is really weird since that would be the most obvious and visible use of cloth effects. Who knows what these devs are really thinking :shrug:
When did that become a problem though? Everything our GPUs do is eye candy. I never got why "physics" eye candy was somehow second rate :shrug: And it's kinda hard to fault devs for using hardware PhysX, it's not like there's a competing solution out there. What I find fault with is how they're using it.
The only ironic thing is that all this physics crap could be done on the CPU. I tried Mirror's Edge with full PhysX run on the CPU. Back then I was using an overclocked E5200. It worked like crap. Today I'm equipped with an i7 920 and the game runs just as badly. Don't tell me that's even remotely normal. Red Faction and Max Payne 2 looked just as impressive half a decade ago. NVIDIA, you can lick my ballz. I'm not buying the "this can only be done on GPU" crap. Either the CPU code is intentionally full of garbage, or they intentionally use some killer uber-precision (that no one really needs) just to kill performance on the CPU. I mean, give me a break, a few cans and papers can fly around on a freakin' Athlon XP 2400+ using Havok. You don't need an extra 200 EUR gfx card to do that.
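For what it's worth, the kind of multithreading people wanted from the CPU path isn't exotic. A minimal sketch, assuming nothing from the real PhysX SDK (toy particle state, plain std::thread), of splitting an integration step across cores:

Code:

#include <cstdio>
#include <vector>
#include <thread>
#include <functional>
#include <algorithm>

struct Particle { float y, vy; }; // toy state: height and vertical velocity

// Integrate one contiguous slice of the particle array.
static void integrate(std::vector<Particle>& p, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt;   // gravity
        p[i].y  += p[i].vy * dt; // Euler step
    }
}

int main() {
    std::vector<Particle> particles(100000, Particle{ 10.0f, 0.0f });
    const float dt = 1.0f / 60.0f;
    unsigned n = std::max(1u, std::thread::hardware_concurrency());

    // Partition the array; each core integrates its own slice independently.
    std::vector<std::thread> workers;
    size_t chunk = particles.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        size_t begin = t * chunk;
        size_t end = (t + 1 == n) ? particles.size() : begin + chunk;
        workers.emplace_back(integrate, std::ref(particles), begin, end, dt);
    }
    for (auto& w : workers) w.join();

    std::printf("%u threads, first particle y=%.3f\n", n, particles[0].y);
    return 0;
}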
The rest of your post doesn't disqualify your categorizing it as a problem so I didn't need to quote the rest in order to raise my question. Which you still haven't answered by the way :)
Rezjor, the proof is in the pudding. If all these things are doable on the CPU, why aren't they being done by other people? The easiest way to disprove the need for PhysX is to point to alternative CPU implementations of the same things.
And who would "the other people" be? There is just NVIDIA with PhysX, and Havok. And you can't just walk onto the scene and expect everyone to know your physics engine, or even use it in shipped games. The games that were using Havok just weren't developed with extensive physics in mind; they always used it in the most minimal way possible, just so they could run even on the crappiest dual core.
Just take Painkiller for example. All the physics-affected debris and objects just fade away after 5 seconds. But when all those same objects were flying around, I couldn't notice any slowdowns. So why not leave them there for 10 minutes, so they are affected by any later rocket/grenade launcher or corpses? Or Doom 3 back then. All monsters just evaporated after a second. But when I used a no-evaporation mod, the monsters were flying around corridors, hanging on railings, sliding off slopes, etc. Half-Life 2: stuff faded away much later and I couldn't notice any slowdowns. Ever.
I wonder why they don't implement Havok with different levels of physics detail. So if you have a crappy CPU, you can set it to Low. But if you have a high-end quad, you could set it to High and see larger numbers of objects affected by physics.
But now it's just always "ON" with no adjustable levels. So they just bog it down to the very minimum so it works on all systems, starting from the crappiest dual cores.
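For what it's worth, that kind of slider is cheap to express. A hypothetical sketch (the preset names and budgets are invented here, not from Havok or any shipping game) of what a physics detail setting could map to:

Code:

#include <cstdio>

// Hypothetical physics detail presets: cap the object count and how long
// debris persists, instead of a single one-size-fits-all "ON".
enum class PhysicsDetail { Low, Medium, High };

struct PhysicsBudget {
    int   maxDebris;      // simultaneously simulated debris pieces
    float debrisLifetime; // seconds before a piece is allowed to fade
};

static PhysicsBudget budgetFor(PhysicsDetail d) {
    switch (d) {
        case PhysicsDetail::Low:    return { 100,    5.0f };
        case PhysicsDetail::Medium: return { 500,   30.0f };
        case PhysicsDetail::High:   return { 2000, 600.0f }; // the "10 minutes" case
    }
    return { 100, 5.0f };
}

int main() {
    PhysicsBudget b = budgetFor(PhysicsDetail::High);
    std::printf("debris cap: %d, lifetime: %.0f s\n", b.maxDebris, b.debrisLifetime);
    return 0;
}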
You mean exactly like what Havok and APEX Destruction do...
Give it more time... These developers aren't sitting still. I am sure they are trying to make their physics engines as interactive as possible. I think they are making progress.
I also love eye candy, so I don't hold that against them. :)
Actually, running PhysX does require more than just a CPU if you want it to run fast, have a decent amount of PhysX work implemented, and plan on selling to more than just the 2% of the market who are i7 boys...
Looking at my numbers posted above in Darkest of Days, I am convinced that a CPU would NOT do a good job of calculating the extra PhysX work too. My dedicated PhysX 280 is faster than my Q6600 in both single- and double-precision calculations. Even with that, you can see PhysX is taxing to my system, depending on the level of PhysX I select.
I am unconvinced that my Q6600 could also do all the PhysX work that my dedicated 280 is doing now and not have a major performance issue...
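Rough peak throughput supports that, if my spec numbers are right (from memory: ~933 GFLOPS single / ~78 GFLOPS double commonly quoted for the GTX 280); a quick sketch:

Code:

#include <cstdio>

int main() {
    // Q6600: 2.4 GHz x 4 cores x 8 single-precision FLOPs/cycle
    // (one 4-wide SSE add + one 4-wide SSE mul per cycle), half that for double.
    double q6600_sp = 2.4e9 * 4 * 8 / 1e9;   // ~76.8 GFLOPS
    double q6600_dp = q6600_sp / 2.0;        // ~38.4 GFLOPS
    // GTX 280 peak figures as commonly quoted (approximate).
    double gtx280_sp = 933.0, gtx280_dp = 78.0;
    std::printf("single: %.0fx, double: %.1fx in the GPU's favor\n",
                gtx280_sp / q6600_sp, gtx280_dp / q6600_dp);
    return 0;
}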
It was the same story with Cryostasis. These new games are getting expensive on resources I think...
http://www.evga.com/forums/tm.asp?m=...1&key=
Bottom line is the extra eye candy would have to go. I don't want that.
I am glad the developers are cutting their teeth on all physics engines, and think their method of incorporating them into the game will only improve. Sorry if there aren't enough effects for you in the games yet...
Designing physics so it can run on the crappiest dual core is exactly what I hope developers don't do. GPU physics is going to take us places that dual-core CPU physics will only dream of. We are in for a good ride!! :up:
Are we talking about this game?
http://www.youtube.com/watch?v=Smgz66Q9bfU
I think if we left all destructed objects and corpses on the screen, for the entire game, it would be a resource hog. I am not against the idea, but for the tech options we have available to us right now, performance wins over keeping objects on the screen infinitely. Call it a trade off.
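One middle ground between "fade after 5 seconds" and "keep everything forever" is a bounded pool that recycles the oldest piece, so simulation cost stays capped. A hypothetical sketch (no real engine API here; names and the cap are made up):

Code:

#include <cstdio>
#include <deque>

// Toy bounded debris pool: instead of fading everything on a timer or
// keeping it infinitely, cap the live count and recycle the oldest first.
struct Debris { int id; };

class DebrisPool {
    std::deque<Debris> live_;
    size_t cap_;
public:
    explicit DebrisPool(size_t cap) : cap_(cap) {}
    void spawn(Debris d) {
        if (live_.size() == cap_) live_.pop_front(); // evict the oldest piece
        live_.push_back(d);
    }
    size_t count() const { return live_.size(); }
};

int main() {
    DebrisPool pool(1000);           // simulation cost is bounded by the cap
    for (int i = 0; i < 50000; ++i)  // a long firefight's worth of debris
        pool.spawn(Debris{ i });
    std::printf("live debris: %zu\n", pool.count()); // never exceeds 1000
    return 0;
}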
Nvidia has already adjusted the length of time particles stay on your screen once, from the way it was back in the Ageia days. With more powerful GPUs being made, they may bump it back up some day down the line.
They should have various PhysX effect levels in games to select from. I agree, but they probably figure the game will also run fine without PhysX, so the others can just run it that way? Not sure myself...
Well, you make it sound like anybody can do it. In any case, Havok has been around forever, but have you taken a look at their toolset lately? Ever wonder why they don't have support for SPH or physically simulated particle systems? If it were doable on a CPU, it would've been done long ago!
Quote:
Just take Painkiller for example. All the physics affected debris and objects just fade away after 5 seconds.
That's because those "objects" aren't physically modelled at all; they're just the usual graphics tricks that fool so many people into thinking the CPU is actually doing something.

Quote:
I wonder why they don't implement Havok with different levels of physics detail.
Because even the lowest level of detail of some effects will kill even the mightiest CPU. This stuff is all very well documented; no reason for guesswork.
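A back-of-the-envelope count shows the scale of the problem for the fluid/particle effects mentioned above: a naive implementation tests every pair of particles, so cost grows with the square of the count. A quick sketch (the particle count is illustrative):

Code:

#include <cstdio>

int main() {
    // Naive all-pairs interaction (what you get without a clever
    // neighbor grid): N * (N - 1) / 2 pair tests per simulation step.
    const double n = 30000.0;      // a modest SPH-style particle count
    const double steps = 60.0;     // simulation steps per second
    double pairsPerStep = n * (n - 1.0) / 2.0;   // ~4.5e8
    double pairsPerSec  = pairsPerStep * steps;  // ~2.7e10
    std::printf("%.1e pair tests/step, %.1e/sec\n", pairsPerStep, pairsPerSec);
    // Even at a handful of FLOPs per test, that's hundreds of GFLOPS --
    // past any 2009-era CPU, but a natural fit for thousands of GPU threads.
    return 0;
}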
Another one who thinks PhysX, AA, and any other form of eye candy is a waste in games. :confused:
I wonder what developers should be working on for you? The Nintendo Wii?
It ENHANCES game INTERACTIVITY and the EXPERIENCE, with crap graphics... Right up your alley.
Meh, I watched the comparison vids on YouTube and frankly the differences were not that striking. Yay, some cheesy fog effects that will affect you but not the AI. Some :banana::banana::banana::banana:ty sparks, and whoop-de-do for a cape and some banners. Oh yeah, you can scuff up the floor...
Waste of time.
I will never jump on the PhysX crap; it still doesn't look or feel anything remotely close to RL. What I would pay for is an AI accelerator.
It is rather odd that in any thread related to CUDA or PhysX, we have to re-hash the debate over whether PhysX is a good thing. The game, or main subject of the thread, always seems to get bogged down by posts about what bad guys Nvidia are for giving us PhysX for free.
I think both of these games look good, and at least Batman is fun to play. Darkest of Days will probably be too.
That should trump all.
Hmmmm, somebody go tell Nvidia, AMD, Sony, Microsoft, id, Epic, Crytek, Insomniac, Infinity Ward, Naughty Dog, Codemasters and Co. that they're wasting their time and millions of dollars on all those non-essential technologies that make games look better.
It's the images that communicate what's happening in the game to the user. What exactly is "gameplay" that I can't see? If I have AI that properly reacts to and makes use of shadows in the environment, it really helps believability if those shadows are actually there on the screen. If I want to get a sense of chaos on the battlefield, it really helps to have smoke, debris, and just crap in general flying everywhere... etc. Don't really understand how you can separate what you experience from what you see. It's not like you're reading a book.
+1 :)