There is MSAA built into the UE3 engine, but it's MSAA only and needs the commute shader (DX10 cards and the 360), so it's just a brand-only restriction. And I'm betting it will have GFWL and SecuROM with activation, since it has DLC right from the menu. The game looks clean and doesn't really need AA, but it's BS that I can't enable it.
The demo is nice; it's got free roam for a little bit that the old console one didn't have, but it's a 2GB download for 20 minutes of play.
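For what it's worth, the "resolve" step a deferred renderer has to do by hand is conceptually simple: average each pixel's sub-samples into one output color. A minimal, purely illustrative sketch in Python (not real shader code; the function name is invented):

```python
def resolve_msaa(samples_per_pixel):
    """Box-resolve one pixel: average its MSAA sub-samples.

    samples_per_pixel: list of (r, g, b) sub-sample colors for one pixel.
    Returns the single resolved (r, g, b) color.
    """
    n = len(samples_per_pixel)
    # zip(*...) groups all r's, all g's, all b's together, then each
    # channel is averaged over the n sub-samples.
    return tuple(sum(channel) / n for channel in zip(*samples_per_pixel))

# 4x MSAA along a geometry edge: two white and two black sub-samples
# blend to a midtone, which is exactly the edge smoothing AA provides.
pixel = resolve_msaa([(1.0, 1.0, 1.0), (1.0, 1.0, 1.0),
                      (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)])
# pixel == (0.5, 0.5, 0.5)
```

The point of the compute-shader (or per-sample shader) path is that a deferred engine can run this resolve itself, after lighting, instead of relying on the fixed-function resolve that breaks with deferred G-buffers.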
Last edited by zanzabar; 08-11-2009 at 10:57 PM.
5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
Even if there was that option (which there isn't), games that tweak the engine to be fully deferred or introduce other elements to the rendering loop make AA virtually impossible. What, did you think that UE3 game developers all conspired together to not include AA in any of their games? Or that it was coincidence?
The fact is, most of the time it can't be done without a considerable amount of hassle (if at all).
It's very likely that NVAPI is used which is why it's not trivial to make it work the same on ATI. It's not "just a brand only restriction".
Commute shader? *sigh* please don't post more of this nonsense.
Last edited by Sr7; 08-11-2009 at 11:02 PM.
I have no clue what devs were thinking when they designed ultra-realistic graphics engines without FSAA support. That's like designing a car without wheels...
But then again, I was able to use FSAA in BioShock on an HD4850, and I can use FSAA+HDR in TES4: Oblivion on an HD4870. Neither was supposed to be able to use FSAA...
Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
Super silent cooling powered by (((Noiseblocker)))
But NV will still own the majority of studios and have no card, so other than tech demos I'm not expecting any DX11 for at least a year.
And it's not the in-game optimization from NV that bothers me. It's when something is the same and requires no new code but is locked out (i.e. the super high/ultra setting in Doom 3 that was locked to NV only, back when NV was on the GeForce FX and it could barely play the game), or when you get a game like Crysis that was intended to be maxed out on only one architecture, so the game doesn't play well at all on any hardware.
Edit: the game is perfectly smooth. I have no idea of the frame rate, but it felt like over 60-75.
Last edited by zanzabar; 08-11-2009 at 11:51 PM.
I like how when PhysX came out, everybody was posting "where are the games?"
Now we have some good ones released...
As far as running PhysX on the CPU, I don't think you could without a large performance hit...
Take Darkest of Days...
I think it may be an ideal PhysX benchmark...
These are not my official numbers, but let me tell you why I am excited about this PhysX benchmark. You get to select your level of PhysX!! I love that...
Resolution: 1920x1200
Ambient Occlusion: On
Graphic Details: Very High
PhysX Details: High
Antialiasing: 4x
Anisotropic: 4x
It gave me an average of 30 to 32 FPS.
Same settings except PhysX Details on Med: I believe my average was 47 FPS.
Same settings except PhysX Details on Low: I believe my average was 84 FPS.
This may be the ideal test to shake out the performance difference between dedicated PhysX GPUs.
FYI - I am doing my numbers from memory right now, but I know we will see some good PhysX numbers from this test. The level of PhysX selected directly impacts the test. Systems running a dedicated PhysX GPU should shine.
I also set my Ambient Occlusion to ON in the Nvidia control panel.
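Taking the averages above at face value (an assumption, since they are quoted from memory), the cost of each PhysX level is easier to compare as frame time in milliseconds than as FPS:

```python
# Rough frame-time math on the Darkest of Days averages quoted above
# (assumed: High ~31 FPS, Med ~47 FPS, Low ~84 FPS).
fps = {"High": 31, "Med": 47, "Low": 84}

# Frame time in milliseconds = 1000 / fps.
frame_ms = {level: 1000.0 / rate for level, rate in fps.items()}

# Extra per-frame cost of each PhysX level relative to Low.
extra_ms = {level: frame_ms[level] - frame_ms["Low"] for level in fps}

for level in ("Low", "Med", "High"):
    print(f"{level}: {frame_ms[level]:.1f} ms/frame "
          f"(+{extra_ms[level]:.1f} ms vs Low)")
```

On these numbers, High PhysX adds roughly 20 ms to every frame over Low, which is why offloading that work to a second GPU makes such a visible difference.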
Last edited by Talonman; 08-12-2009 at 12:33 AM.
Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)
Thanks for the heads-up Talonman, trying the DoD demo now. Btw, I don't think the cape in Batman is affected by PhysX - which is really weird since that would be the most obvious and visible use of cloth effects. Who knows what these devs are really thinking!
When did that become a problem though? Everything our GPUs do is eye candy. I never got why "physics" eye candy was somehow second rate. And it's kinda hard to fault devs for using hardware PhysX; it's not like there's a competing solution out there. What I find fault with is how they're using it.
The only ironic thing is that all this physics crap could be done on the CPU. I tried Mirror's Edge with full PhysX run on the CPU. Back then I was using an overclocked E5200; it worked like crap. Today I'm equipped with an i7 920 and the game runs just as bad. Don't tell me that's even remotely normal. Red Faction and Max Payne 2 looked just as impressive half a decade ago. NVIDIA, you can lick my ballz. I'm not buying the "this can only be done on GPU" crap. Either the CPU code is intentionally full of garbage, or they intentionally use some killer uber precision (that no one really needs) just to kill performance on the CPU. I mean, give me a break, a few cans and papers can fly around on a freakin AthlonXP 2400+ using Havok. You don't need an extra 200 eur gfx card to do that.
The rest of your post doesn't disqualify your categorizing it as a problem, so I didn't need to quote the rest in order to raise my question. Which you still haven't answered, by the way.
Rezjor, the proof is in the pudding. If all these things are doable on the CPU, why aren't they being done by other people? The easiest way to disprove the need for PhysX is to point to alternative CPU implementations of the same things.
Intel i7 920 C0 @ 3.67GHz
ASUS 6T Deluxe
Powercolor 7970 @ 1050/1475
12GB GSkill Ripjaws
Antec 850W TruePower Quattro
50" Full HD PDP
Red Cosmos 1000
And who would "the other people" be? There is just NVIDIA with PhysX, and Havok. You can't just walk onto the scene and expect everyone to know your physics engine, or even use it in shipped games. The games that used Havok just weren't developed with extensive physics in mind; they always used it in the most minimal way possible, just so they could run on even the crappiest dual core.
Just take Painkiller for example. All the physics affected debris and objects just fade away after 5 seconds. But when all those same objects were flying around, I couldn't notice any slowdowns. So why not leave them there for 10 minutes, so they are affected by any later rocket/grenade launcher or corpses? Or Doom 3 back then: all monsters just evaporated after a second, but when I used a no-evaporation mod, the monsters were flying around corridors, hanging on railings, sliding off slopes, etc. Half-Life 2: stuff faded away much later and I couldn't notice any slowdowns. Ever.
I wonder why they don't implement Havok with different levels of physics detail. If you have a crappy CPU, you set it to Low; if you have a high-end quad, you could set it to High and have larger numbers of objects affected by physics.
But now it's just always "on" with no adjustable levels, so they bog it down to the bare minimum so it works on all systems, starting from the crappiest dual cores.
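A scalable physics-detail setting like that could be as simple as a per-level budget on live debris, retiring the oldest pieces once the budget is exceeded. A hypothetical sketch; all names and budget numbers here are invented for illustration:

```python
from collections import deque

# Invented per-quality budgets: how many debris objects stay simulated.
DEBRIS_BUDGET = {"low": 50, "medium": 250, "high": 1000}

class DebrisManager:
    """Keeps at most `budget` live debris pieces, oldest fade out first."""

    def __init__(self, detail="medium"):
        self.budget = DEBRIS_BUDGET[detail]
        self.live = deque()  # oldest piece at the left end

    def spawn(self, piece):
        self.live.append(piece)
        # Over budget: fade out the oldest pieces instead of slowing down.
        while len(self.live) > self.budget:
            self.live.popleft()

mgr = DebrisManager("low")
for i in range(75):
    mgr.spawn(f"chunk-{i}")
# On "low", only the newest 50 pieces survive; on "high" all 75 would.
```

The same loop runs on any CPU; only the budget changes, which is exactly the Low/High knob being asked for above.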
E8400 @ 4.0 | ASUS P5Q-E P45 | 4GB Mushkin Redline DDR2-1000 | WD SE16 640GB | HD4870 ASUS Top | Antec 300 | Noctua & Thermalright Cool
Windows 7 Professional x64
Vista & Seven Tweaks, Tips, and Tutorials: http://www.vistax64.com/
Game's running choppy? See: http://www.tweakguides.com/
You mean exactly like what Havok and APEX Destruction do...
Give it more time... These developers aren't sitting still. I am sure they are trying to make their physics engines as interactive as possible. I think they are making progress.
I also love eye candy, so I don't hold that against them.
Actually, running PhysX does require more than just a CPU if you want it to run fast, have a decent amount of PhysX work implemented, and plan on selling to more than just the 2% of the market that the i7 boys make up...
Looking at my numbers posted above in Darkest of Days, I am convinced that a CPU would NOT do a good job for you calculating the extra PhysX work too. My dedicated PhysX 280 is faster than my Q6600 in both single and double precision calculations. Even with that, you can see PhysX is taxing to my system, depending on the level of PhysX I select.
I am unconvinced that my Q6600 could also do all the PhysX work that my dedicated 280 is doing now and not have a major performance issue...
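A back-of-envelope comparison of theoretical peak single-precision throughput supports that hunch. The figures below are commonly cited theoretical peaks, not measurements; they assume the 1512 MHz shader clock from the sig above and the Q6600 overclocked to 3.8 GHz:

```python
# GTX 280: 240 shader ALUs x 1.512 GHz x 3 flops/clock (MAD + MUL).
gtx280_gflops = 240 * 1.512 * 3

# Q6600 @ 3.8 GHz: 4 cores x 3.8 GHz x 8 SSE flops/clock
# (4-wide single-precision add + 4-wide multiply per cycle).
q6600_gflops = 4 * 3.8 * 8

print(f"GTX 280: ~{gtx280_gflops:.0f} GFLOPS peak")
print(f"Q6600:   ~{q6600_gflops:.0f} GFLOPS peak")
print(f"ratio:   ~{gtx280_gflops / q6600_gflops:.1f}x")
```

Peak numbers never translate directly into game performance, but an order-of-magnitude gap like this is why highly parallel particle and cloth work lands better on the GPU.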
It was the same story with Cryostasis. These new games are getting expensive on resources I think...
http://www.evga.com/forums/tm.asp?m=...1&key=
Bottom line is the extra eye candy would have to go. I don't want that.
I am glad the developers are cutting their teeth on all the physics engines, and I think their methods of incorporating them into games will only improve. Sorry if there aren't enough effects for you in the games yet...
Using physics to be able to run on the crappiest dual core is exactly what I hope developers don't do. GPU physics is going to take us places that dual core CPU physics will only dream of. We are in for a good ride!!
Are we talking about this game?
http://www.youtube.com/watch?v=Smgz66Q9bfU
I think if we left all destroyed objects and corpses on the screen for the entire game, it would be a resource hog. I am not against the idea, but with the tech options available to us right now, performance wins over keeping objects on the screen indefinitely. Call it a trade-off.
Nvidia has already adjusted the length of time particles stay on your screen once, from the way it was back in the Ageia days. With more powerful GPUs getting made, they may bump it back up some day down the line.
They should have various PhysX effect levels in games to select from, I agree, but they probably figure the game will also run fine without PhysX, so the others can just run that way? Not sure myself...
Last edited by Talonman; 08-12-2009 at 05:23 AM.
Well you make it sound like anybody can do it. In any case Havok has been around forever but have you taken a look at their toolset lately? Ever wonder why they don't have support for SPH or physically simulated particle systems? If it was doable on a CPU it would've been done long ago!
Quote: "Just take Painkiller for example. All the physics affected debris and objects just fade away after 5 seconds."
That's because those "objects" aren't physically modelled at all; they're just the usual graphics tricks that fool so many people into thinking the CPU is actually doing something.
Quote: "I wonder why they don't implement Havok with different levels of physics detail"
Because even the lowest level of detail of some effects will kill even the mightiest CPU. This stuff is all very well documented; no reason for guesswork.
Another one who thinks PhysX, AA, and any other form of eye candy is a waste in games.
I wonder what developers should be working on for you? The Nintendo Wii?
It ENHANCES game INTERACTIVITY and EXPERIENCE with crap graphics... Right up your alley.
Meh, I watched the comparison vids on YouTube and frankly the differences were not that striking. Yay, some cheesy fog effects that will affect you but not the AI. Some sparks, and whoop-de-do for a cape and some banners. Oh yeah, you can scuff up the floor...
Waste of time.
I will never jump on the PhysX crap; it still doesn't look or feel anything remotely close to real life. What I would pay for is an AI accelerator.
It is rather odd that in any CUDA or PhysX related thread, we have to re-hash the debate over whether PhysX is a good thing. The game, or main subject of the thread, always seems to get bogged down by posts about what bad guys NVIDIA are for giving us PhysX for free.
I think both of these games look good, and at least Batman is fun to play. Darkest of Days will probably be too.
That should trump all.
Last edited by Talonman; 08-12-2009 at 05:48 AM.
Hmmmm somebody go tell Nvidia, AMD, Sony, Microsoft, iD, Epic, Crytek, Insomniac, InfinityWard, Naughty Dog, Codemasters and Co that they're wasting their time and millions of dollars on all those non-essential technologies that make games look better.
It's the images that communicate what's happening in the game to the user. What exactly is "gameplay" that I can't see? If I have AI that properly reacts to and makes use of shadows in the environment, it really helps believability if those shadows are actually there on the screen. If I want to get a sense of chaos on the battlefield it really helps to have smoke, debris and just crap in general flying everywhere....etc, etc. Don't really understand how you can separate what you experience from what you see. It's not like you're reading a book.
+1
None of which has anything to do with PhysX...
You simply want games fun to play. I do too.
Don't hold eye candy responsible for that.
I love UT3 by the way...
I am a firm believer that eye candy and fun games can work well together.
FYI - The boys want to have a contest to show the performance variance between the dedicated PhysX processors we select...
http://www.evga.com/forums/tm.asp?m=...5&key=
Stay tuned there for some numbers.
Last edited by Talonman; 08-12-2009 at 06:56 AM.
You make it sound like stunning graphics leads to moronic AI. The point is that supersmart AI is a lot more effective at improving the gaming experience if that AI operates in a dynamic, engaging environment.
Quote: "games 10 years ago, even 15 years ago had MUCH BETTER gaming experience."
That's because our expectations and standards 10 years ago were much lower. Don't let nostalgia get the best of you. I remember being blown away the first time I played Mortal Kombat II. I saw it again a few years ago and it looked like vomit to me; I couldn't enjoy it because it looked so bad. It's not the game's fault, it's just that I've updated my expectations over time. This isn't unique to graphics; our expectations for gameplay have changed as well.
We don't have any fewer great games now than we had back then. I remember a lot of crap that came out for Nintendo and Atari back in the day. But people have a tendency to falsely think that every single game back in the 90's was a blockbuster.
Last edited by Talonman; 08-12-2009 at 07:30 AM.
I think he was referring to the Nvidia only AA. Though his proclamation that he won't play the game as a result is strange. Adding AA for Nvidia didn't take anything away from AMD users, so missing out on the game just because of that is just sour grapes or some sort of silent protest I guess. Though most UE3 games don't have AA anyway so just rename to Bioshock.exe as you're used to doing and call it a day.
Oops... Good call.
But I am glad that me using a 280 for PhysX isn't a total waste for game performance like some think.
PhysX is demanding, and a 280 does serve me well.