Read: determinism
It uses randomized bases for the calculations, or set values. If you have set values it will always play out the same (as in BC2 with Havok, or PhysX in multiplayer games). The problem comes when you have randomized values: it ends up different on each machine. If you look at something like Source, where it does that, debris flies randomly on everyone's client because there are no set values. If you don't handle it, you get Ghost Recon with PhysX, where things move differently on each player's client and it turns into a huge mess, ending with no physics being used since it had to be removed.
So if you want random or non-stock explosions/physics that players interact with in multiplayer, you have to have set values generated by a central client and then distributed, but that would add a lot of latency. The only other way would be to avoid floating point entirely, but that would be a mess with a lot of overhead.
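The float-free route mentioned here is usually done with fixed-point arithmetic. A minimal sketch, assuming a simple engine storing positions as integers in 1/1000-unit steps (the SCALE value and helper names are illustrative, not from any real engine); integer math gives bit-identical results on every client regardless of FPU or driver quirks:

```python
SCALE = 1000  # 3 decimal digits of precision; an arbitrary choice

def to_fixed(x: float) -> int:
    # Convert once at the boundary; all simulation math stays integer.
    return round(x * SCALE)

def fixed_mul(a: int, b: int) -> int:
    # Integer multiply then rescale; integer ops are deterministic everywhere.
    return (a * b) // SCALE

# Example: advance a position by velocity * dt
pos = to_fixed(1.5)        # 1500
vel = to_fixed(0.25)       # 250
dt = to_fixed(0.016)       # 16
pos += fixed_mul(vel, dt)  # 1500 + (250 * 16) // 1000 = 1504
```

The "overhead and mess" complaint is real: every multiply needs a rescale, and range/precision trade-offs have to be managed by hand.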
It's not really so hard. The clients can sync with the server, or agree with each other on a seed value at map/match start. Subsequent operations based on the seed will look random but are deterministic.
People don't understand chaos theory very well. It's not that you can't determine the outcome - it's that the outcome varies wildly based on small differences in starting values. But if the starting values are exactly alike (binary) then the outcome will be the same.
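The shared-seed idea above can be sketched in a few lines. Two "clients" that seed a private generator identically produce identical "random" debris directions (the function name and values here are made up for illustration):

```python
import random

def debris_vectors(seed: int, n: int):
    # A private generator, unaffected by any other random() calls in the game.
    rng = random.Random(seed)
    return [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(n)]

client_a = debris_vectors(seed=42, n=5)
client_b = debris_vectors(seed=42, n=5)
assert client_a == client_b  # same seed, same sequence, on every client
```

The caveat raised later in the thread still applies: this only holds if the inputs fed into the physics (positions, timings) are also identical, not just the seed.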
But the values are generated from rounded floating-point numbers and from positional data that can vary from client to client. I'm not saying it can't be done; it just seems really impractical based on previous implementations.
The Physics has to be done at the server level, job done.
The initial seed values are exact binary exchanges. Any subsequent calculations based on the seed should come out the same - assuming everyone is running processors that conform to established standards.
I like the PhysX Nurien demo, anyone got some more examples?:ROTF::ROTF::rofl::rofl:
I've pointed out that this is an obvious, but unfortunately, unfeasible solution. A rube's solution.
Okay, so you do the physics calculations server-side. And THEN what?
Good thing we have established standards, then. Such as x86.
The no-brainer fix is to generate them from something that's not so prone to desynchronization. A random seed can be any arbitrary variable. You could even make the server spit one out once every N seconds.
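A cheaper variant of the server-issued seed, sketched here as an assumption rather than any shipped design: derive a per-event seed from one shared match seed plus an event counter both sides already agree on, so nothing extra crosses the wire per explosion (function and parameter names are hypothetical):

```python
import hashlib

def event_seed(match_seed: int, event_id: int) -> int:
    # Hash the match seed with the event id; every client that agrees on
    # both numbers derives the identical 64-bit seed locally.
    data = f"{match_seed}:{event_id}".encode()
    return int.from_bytes(hashlib.sha256(data).digest()[:8], "big")

# Explosion #17 gets the same seed on every client:
assert event_seed(123456, 17) == event_seed(123456, 17)
```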
How is one thread full of so much fail? And more importantly, why is it so hard for some to understand that we don't want something for nothing from Nvidia? We only want what we paid for! We're not asking for our ATI cards to run PhysX; we want our Nvidia cards to do what we bought them for. How is that getting anything for free? As a paying Nvidia customer I expect the product I paid for to do what it says it will do and, in fact, is supposed to do.
Then we have some people asking why only Nvidia should pay for development to support ATi users. We don't want them to pay for anything! In fact, we are trying to save them money: they wouldn't have to keep paying their developers to purposely cripple their own cards, and obviously they are paying their devs to do this. They could have just left it alone and none of this would be an issue; hey, they'd have a few more bucks in the bank. More importantly, they wouldn't have generated all this negative press and inadvertently shot themselves in the foot.
Why would ATI/AMD need to pay Nvidia to use it when the ATI card is not using it? Customers are paying for it when they buy an NVIDIA card. Nvidia is just a bunch of :banana::banana::banana::banana::banana:s that say "if you don't use us as primary, then you don't get to use the product you purchased".
The hilarious thing is an actual Ageia PPU will refuse to work with an ATi card as well.
It seems you have some assumptions in your theory which aren't really accurate. Order of operations can have the varying effect you're referring to on calculations. Things can be different from driver to driver.
Server-side seems the way to go to me in order to ensure the same output across multiple computing architectures (think some physics on CPU some on GPU), at least for multiplayer games.
I already said this; it was ruled unfeasible. :eek:
it's reasonable to keep the position of ... let's say, 64 player characters synchronized.
It's entirely unreasonable to keep coordinates of 200,000 physics objects synchronized.
Read: unfeasible solution.
(a lot of posters in this thread seem to have trouble reading)
By quoting this post in its entirety, I affirm that I can neither read nor write, and am, in fact a big dumb babby.
It sounds perfectly feasible to me.
Because we already have it.
I play pool online with 12 others watching. Each game is random and so is each hit, yet everyone sees the ball fly around and interact with the other balls on the table exactly the same as everyone else, because the server is sending out the same results to everyone, even though the balls get hit differently every time I take a shot.
Actually, I do understand.
It just needs to be established that it's a matter of computational power, and not that everyone would get results random relative to each other, which is what would happen if all advanced physics were computed on everyone's client side.
This just in: Source engine games support player-interactive physics and have since release.
off topic much?
If the computers are all in sync then they should be performing the same operations in the same order on the same data. Client-side prediction throws a wrench in this by letting the local computer work with data the others don't have yet. That's when the game state becomes a chaotic system. Until the other clients are updated with the new data (and vice versa), the state of the system diverges more and more wildly with time. That's why I advocate an honest dedicated server, and why real peer-to-peer game networking died shortly after Doom: even if the clients' calculations are diverging for whatever reason (lag, etc.), they will get synced with the server periodically, hopefully at short enough intervals that the divergence doesn't harm gameplay.
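The periodic-sync idea can be sketched as a toy loop: the client predicts locally each tick, and whenever a server update arrives it snaps to the authoritative state, which bounds how far the two simulations can drift. All numbers and the hard-snap policy are illustrative assumptions (real games interpolate rather than snap):

```python
def step(pos: float, vel: float, dt: float = 0.1) -> float:
    # One integration tick of a trivially simple 1-D "physics" sim.
    return pos + vel * dt

client_pos = server_pos = 0.0
for tick in range(1, 11):
    client_pos = step(client_pos, 1.0)   # local prediction
    server_pos = step(server_pos, 0.95)  # authoritative sim disagrees slightly
    if tick % 5 == 0:                    # a sync packet arrives every 5 ticks
        client_pos = server_pos          # hard snap back to server state
```

Between sync packets the divergence grows linearly here; the interval is the knob trading bandwidth against visible correction.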
So what if debris flies around differently on different clients? It isn't critical to the gameplay anyway; it's just visual. Just like the entire PhysX thing.
They are castrating their own cards, screwing their clients, losing some extra sales and gaining an even worse reputation and more haters in the process...
I can't understand how these guys at Nvidia do business with such "brilliant" ideas :shrug:
Oh right! I didn't notice that I was hitting the ball in the same place & with the same power & spin every time, & that every shot was exactly the same & every game had the same outcome.
Dice have a less predictable outcome because you initially have less control over them. But once things are set in motion, it all gets calculated the same.
And yeah HL2: deathmatch... is great fun.
I have an Asus M3N78 with a GF8200 IGP, which is currently disabled in favour of an overclocked HD3870 512MB. Anyone interested in the results if I enable the IGP and install the drivers and hack?
If I remember correctly the BIOS has an option for what it looks for first when the IGP is enabled, but it doesn't disable any add-in graphics card. Wouldn't take much effort if people want any specific details.
I have a 3870 512MB GDDR4 that I can plug in, so we can compare the Ageia PPU vs the onboard GeForce 8200. :up:
You're on :D I'll get to it later or tomorrow. I'm going to meet a hot girl in 3 months or so, so I'm dedicating lots of time to buffing up again :up:
Am I the only one who realises the Hydra chip solution by MSI will not work?
As has been pointed out, this phrasing is a strawman.
No one wants Nvidia to offer Physx to ATI for free.
What people want, is the product that they purchased to work.
This isn't about ATI; it is about paying Nvidia customers who bought an NV card for PhysX. Again, nowhere on the package does it state "PhysX will not run if you have an ATI component in your system".
In one sense, NV could be considered guilty of false advertising.
Still, you can't expect a FEATURE of a specific product from a certain manufacturer to be compatible with its direct competitor's product as well, even though they share the same "functioning" environment. It makes no sense.
I am pretty sure no court would blame nVidia for not guaranteeing proper functioning of a feature of its own hardware when a competitor's hardware is present on the same platform.
They are not obligated to make a feature of their own compatible with their competitor's products; therefore, they can avoid exposing their own product/feature to a supposedly non-enjoyable/non-compatible/random experience by simply disabling that feature under specific conditions.
It's their feature; they decide how you can use it.
Don't agree? Don't buy.
Problem solved.
Except PhysX will run even if an ATI card is present. The catch is, PhysX and rendering must be processed exclusively on the very same nVidia GPU, while the ATI card does nothing.
I repeat: they (nVidia) are not obligated to make a feature of their own compatible with their competitor's products; therefore, they can avoid exposing their own product/feature to a supposedly non-enjoyable/non-compatible/random experience by simply disabling that feature under specific conditions.
And do you know what you can do about it? Hack it, like it has been done before, or moan about it on some internet forum.
Omg not again. How many more times do you people need it spelled out? Allow me to try; I'll do it bullet-point style to keep the needed comprehension to a minimum.
1. People are not complaining about PhysX not working on ATi hardware.
2. People ARE complaining about features not working on nvidia's own hardware, which they paid good money for, just because an ATi card is detected.
By your logic, you're saying that if you bought a car and the accelerator/brakes didn't work together properly, you wouldn't care. They're designed to work within the same "operational environment", just like graphics cards. Win 7 makes it possible for two graphics drivers to work together on the same system harmlessly and synchronously, just as you would expect the components of a car engine to work together. nvidia have disabled their own hardware in a system simply because an ATi product is detected, something you can compare to a child throwing a tantrum because he/she has to share their toys. Only in the case of nvidia, they are disabling a feature which in no part communicates with the ATi card, just because they don't want a consumer to run a card from the competition alongside their card. Again, I must stress that the nvidia card in NO way, shape, or form communicates with the ATi card. These practices from nvidia are nothing short of anti-competitive and intended only to try to monopolise. If AMD/ATi chose to pursue legal action, they would have another Intel scenario on their hands, namely, getting a :banana::banana::banana::banana:load of cash in compensation.
It's not about not agreeing, because you're simply wrong about the facts, just as much as 4 + 4 = 6.
PhysX GPU acceleration is not compatible with anything but an NV GPU because it only runs on the NV GPU.
If PhysX were running on the ATI card, then it would be called compatible. So unless the hack makes PhysX run on the ATI card, or people are asking for PhysX to run on the ATI card, your comment is totally null & void & makes no sense at all, 4 + 4 = 6.
PhysX is just a program that doesn't even need to be graphics-aware, just like the thousands of other programs running on Windows that don't care, or even need to know, what the output GPU is.
As I read it, he's saying that legally Nvidia doesn't have to give accelerated PhysX to Nvidia card owners who want to run it beside an ATI card doing the rendering, because that's an untested environment. I agree, but personally find that to be about the weakest excuse for corporate laziness ever. Yeah, ATI and Intel don't legally have to test their chipsets with competitors' hardware either, but you can bet people would be complaining if either or both decided that their chipset drivers shouldn't work with Nvidia cards because they didn't feel like testing in that environment.
And we all know that it isn't about not being a tested environment, that's just an excuse. It used to work fine with older versions and works fine with the hack.
False dichotomy. I'll take option 3: don't agree, buy (if the product otherwise compares favorably), and hack. Or maybe I should just take your second option and not buy. :rolleyes:
What happens if you have an Intel IGP? Will PhysX work?
Also, if you bought an NV card in the EU, can you get a partial refund every time NV takes features away like this? People who bought fat PS3s got around 100 euros back when Linux support was removed, so since they removed the use as a PPU, you should get money back from the retailer, who would then get some back from the manufacturer.
I agree that it's a weak excuse, but companies do it all the time. I don't know if it would stand up in court, but that's a different issue.
As for imagining a software and hardware industry where EULAs are filled with such BS, I don't have to: we live in that world right now.
It may or may not be legal. I suspect the courts will have to decide that very matter in the near future.
But regardless of the legality - I already paid for the card and intend on using it to my satisfaction. I may very well take Luca's suggestion and not buy an Nvidia card next time if they haven't changed their policy and all else is equal. But in the mean time I'm going to keep using Physx on the card I already have, regardless of what the law or Nvidia says about it. I know what I believe is right and Nvidia is powerless to stop me from acting on it.
I'm not saying that's how the game you are talking about works... but you would agree with me that pool is not random. Is there physics in pool? Sure there is... but not randomness.
Let's see. Imagine I bought an Ageia PPU some time ago. Why am I not allowed to use it now? The problem is not that NVIDIA deactivates its hardware acceleration for PhysX if it detects some non-NVIDIA card; the problem is that they didn't do that at the beginning and, as such, many people bought an NVIDIA card for exactly this reason. And now what, they can change the rules of the game for no reason? This can't be legal at all. Imagine they did the same with the 3D option: you buy the screen, the glasses and the NVIDIA card (to match your ATI one) to run 3D. Then, all of a sudden, they restrict 3D to NVIDIA-only rigs... and you get screwed. This is NOT normal, no matter your morals.
By its intended purpose it's not meant to be, but by the very nature of the objects, the outcome can be very random compared to what was intended.
Dice have to be thrown; if pool were played the same way, with the cue ball thrown at the balls on the table the way you have to throw a die, then it would be just as unpredictable.
If I placed a die on the table & hit it with the cue so that it did not roll & tumble, then the outcome would be much more predictable as to what number came up.
It's all about the initial control over an object & the rules of play that determine how random the possible outcome will be in these environments.
Anyway, physics is governed by rules; it only seems random to us because we don't have the mental capacity to simply work it all out, & the ability to control everything precisely enough.
Simple, it's because the whole thing is just a tempest in a teacup. Just because there are a few guys on forums royally pissed off about something the mainstream/OEM/casual market won't necessarily reflect that. Most people just don't know and/or care.
The resentment stems from people feeling that they have the right to use their Nvidia cards for PhysX even if they have an AMD card installed as primary. That might be a reasonable stance, but really, the vast majority of people don't get that emotional about their computer hardware. Maybe if Dell or some big-time developer makes it an issue, it will have an impact on the bottom line.
Nvidia's inability to execute on their bread-and-butter business will hurt them a lot more than anything related to PhysX ever will.
The reason the majority don't get upset about such things in the computer world is ignorance of the platform; many can't see the potential harm of something, or even that it is happening, until it's far too late.
Even Hitler was non-threatening & cute once upon a time.
The majority of people only run a single GPU as well, and to them PhysX is technically useless, given the immense slowdowns it causes when run on the same GPU that is also doing the graphics.
Really, PhysX is only for multi-GPU people and tech demos.
Technically it could be for anyone, but nV claims the devs are the ones not allowing multithreading... so...
Basic business sense: the more products you sell, the more profit you make.
If Nvidia allowed their cards to work for PhysX alongside ATI cards, more people would have bought them and Nvidia would make more money.
I wouldn't like to be in your, or Nvidia's, business or economics classes.
There was a film where his DNA had been saved & decades later he was brought back as a newborn baby, & people were trying to kill him while he was still a child as a preventative measure.
You don't leave a cancer to grow, no matter how small it started out, once you become aware of it.
Anyway, you get my point.
If your point is that aggressive business tactics by a graphics card company is comparable to cancer, war and genocide then I certainly don't get it.
As usual, you have a habit of totally missing the point. The comparison is not about literal severity; it's about prevention of wrongdoing or harm. Just because something does not currently represent any great problem does not mean it will not become one if left unchecked.
Just because someone goes to prison for one crime does not mean anyone is saying all people who go to prison have committed crimes of the same severity.
What's there to get? You're trying to add gravity to a trivial issue by making ridiculous comparisons to widespread death and murder. The only reason you feel compelled to do so is that you realize the issue itself isn't worthy of the intense disdain you feel and therefore need to reach for something more provocative.
Tell me, what's the "great problem" that will befall us if Nvidia isn't stopped?
And I will have to repeat the point again, because, as I have just said, the comparisons are not to be taken literally, & yet you fail instantly.
The point is that small problems can become bigger ones if they go unchecked, & no comparisons are needed to understand that.
And if Nvidia is not stopped, I would rather not have to find out what happens, because the possibilities are endless & I cannot see any good coming from it.
Yes, of course but that alone isn't of any practical concern unless the "bigger ones" are actually serious. My point is that the melodrama is overblown in context of the issue we're discussing. What's the worst that could happen? Some people don't see some effects in games? Nvidia goes out of business? The world is surely coming to an end.
Indeed, blown out of proportion.
And whether it's of no practical concern unless the "bigger ones" are actually serious is a matter of opinion.
Having fun is a serious concern for many, & being a PC enthusiast & PC gamer is part of my fun. We have already seen the effects consoles are having on PC gaming, & we don't need any help from PC hardware companies fragmenting gaming further.
PhysX is dead as far as I'm concerned, but what will NV do next if the message is that it's OK every time it pulls such a stunt?
And let me remind you that I have said in the past that if ATI were doing the same, my resolve would be just the same.
I'm not sure why they're trying to keep it a closed API. If anything, more people would use PhysX if both ATi and nVidia supported it.
Despite all Nvidia's PR BS, I think they are pushing in the right direction. Geometry, physics and IQ are where real-time rendering lacks now. Hardware makers have put all of their effort into shading/texturing for like 10 years; we've reached a point where it's time to move on to something else. That doesn't mean there are no more boundaries to push on shaders, lighting, etc., but the balance of evolution should be different now.
PhysX is a really good intention, with a bad PR strategy.