ATI can do 3d as well http://3dvision-blog.com/clearing-up...ic-3d-support/ :shocked:
You can have stereoscopic 3D output on ATI as well.
But the tech is too expensive to bother with atm if you ask me...
I bolded some other part of your text to show where the problem lies. If ATI paid to license PhysX, what's to say Nvidia wouldn't change the way it worked to make it perform worse on ATI cards? Bullet physics uses OpenCL, which is open to both ATI and Nvidia - a much better solution. There was a thread about Nvidia porting PhysX to OpenCL while still not letting it run on ATI hardware - that's just crazy.
Also, I believe many people are upset because they bought an Nvidia GPU to run PhysX but can't anymore because of the newest drivers (when combined with an ATI GPU/IGP), even though the games worked flawlessly before.
Final point: the reason a lot of people are angry at Nvidia is their shady business tactics. They rename GPUs more than twice, disabled AA support for ATI hardware in games even though it works (Batman: AA), and disabled DX10.1 in Stalker (the latter two were done by the game devs, but the reasoning behind them was less than convincing).
ati has their own 3d-tech in the pipeline and i couldn't care less about any 3d-technology for my pc.
the only good thing about 3d is that now a few 120hz tfts show up on the market. all i want are the 120hz (instead of 2x60hz for 3d) though ;)
btw, i also think that eyefinity is useless - at least for me. 3d, physx, eyefinity... these are all marketing gimmicks. only a small number of people actually care about them or actually use them.
regarding physx:
i think everything was said already.
imo, some physx demos are really nice and look great, but this whole "nvidia has it and nobody else" is a vicious circle. the customers don't benefit from it, because without physx being available to everyone it can't be used to its full potential in games (fundamental game elements would only be available to nvidia customers), so all that's left are useless eyecandy effects which don't justify the price you pay.
plus, we have enough unused cpu resources that could be used for physics calculations instead of wasting gpu cycles on physx.
i don't care for nvidia's bs
if i want physx i'll just pop in an 8800gt and the physx hack
that was proven to be a load of :spam: the individual who wrote the original article was using an interview with an nvidia rep from almost a year ago, wherein the rep stated that "physx in opencl is possible." from there the individual hypothesized that if they had started work on it at the time of the interview, it would almost be ready for distribution. nvidia is not moving physx to opencl... not yet.
and as for physx as a physics engine... a lot of fanbois are about to crap their pants. physx will work on any pc, because it runs on the cpu. physx eye-candy... only runs on nvidia gpus, and i can't see why anyone cares. if you own an nvidia gpu, use it... if not, whatever.
also marten, nvidia did not disable AA for amd gpus in batman-aa. period. nvidia spent the time and money to make AA work in the unreal engine for their own gpus (any expectation that they should have done so for their competitor's product is absurd). if you want AA in batman-aa, go bang down the front door at amd and demand you some AA. additionally, the stalker engine favored amd products to begin with, so nvidia asking them to disable dx10.1 would have been pointless.
You may not care for 3D Vision, but my point is that the basic concept is practically the same. In other words, both 3D Vision and PhysX are proprietary technologies that require Nvidia hardware to use and enjoy.
I just find it odd that so much ire is directed towards PhysX because it's restricted to Nvidia hardware, whilst 3D Vision is also restricted to Nvidia hardware and yet no one cares.
I suppose it must mean that more people care about game physics than 3D gaming.. :shrug:
I'm going to have to disagree with you here. First off, hardware physics has always been proprietary, from its inception. Hardware PhysX once required an Ageia PPU, but after Nvidia bought them out and ported their software to CUDA, we now require an Nvidia GPU instead of an Ageia PPU to run it.
Quote:
regarding physx:
i think everything was said already.
imo, some physx demos are really nice and look great, but this whole "nvidia has it and nobody else" is a vicious circle. the customers don't benefit from it, because without physx being available to everyone it can't be used to its full potential in games (fundamental game elements would only be available to nvidia customers), so all that's left are useless eyecandy effects which don't justify the price you pay.
This was years and years ago, and in all this time, no open standard has been developed. Whose fault is that? Nvidia's?
Of course not. If Nvidia hadn't bought Ageia, it's likely the company would have failed (the PPU never was popular) and hardware accelerated physics along with it. Even Havok once tried the hardware acceleration route (Havok FX), but it failed for some reason that I can't remember.
Anyway, if we didn't have PhysX, we wouldn't have ANY hardware accelerated physics at all, so I'm glad that Nvidia took the initiative rather than wait for an open standard to come along.
It depends on what sort of physics you're talking about.
Quote:
plus, we have enough unused cpu resources that could be used for physics calculations instead of wasting gpu cycles on physx.
It's a fact that a GPU can process physics code much faster than any CPU, so the more complex and advanced the physics, the greater the necessity for a GPU.
If you've seen some of the PhysX demos (I posted some in this thread), you'd know what I'm talking about. You won't find any software physics demos that are even remotely comparable to what you'll find in a PhysX demo (especially the new ones), because even the most powerful hexcore CPUs are incapable of achieving the level of raw processing power found on even low end GPUs.
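To illustrate the CPU-vs-GPU point above, here's a minimal sketch (my own toy example, not code from any PhysX demo) of why particle physics maps so well to GPUs: every particle's update is independent of every other particle's, so the same instruction can be applied to millions of data elements at once. The gravity-only explicit-Euler integrator below is an assumption for simplicity; numpy's array operations stand in for the per-thread work a GPU kernel would do.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])

def step_particles(positions, velocities, dt):
    """Advance every particle one timestep in a single data-parallel pass."""
    velocities = velocities + GRAVITY * dt   # same instruction for every particle
    positions = positions + velocities * dt  # no particle depends on another
    return positions, velocities

# One million particles: one array operation here, but a GPU can hand each
# particle to its own thread, which is where the huge speedups come from.
n = 1_000_000
pos = np.zeros((n, 3))
vel = np.zeros((n, 3))
pos, vel = step_particles(pos, vel, 0.016)
```

The moment particles interact (collisions, fluids), the work per particle grows but the independence largely survives, which is why the particle counts in GPU demos dwarf what a few idle CPU cores can manage.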
wow you really hate ati huh?
then why are you so excited about this :confused:
you said it yourself, it's just patched on and it WILL NOT PUT IT TO GOOD USE... your words... yet you're all excited and claim it's the best thing since sliced bread :stick:
oh and about nvidia customers should be happy to get something extra for free...
totally disagree... i'm actually on 260 sli and i really hate physx... rather than those ridiculous effects they should push game devs to offer an option for higher res textures and better image quality... THAT'S what i am willing to buy a second vga for... not some silly physx effects that kick my min fps down a big notch for something i wouldn't even know was special and computationally demanding unless somebody pointed it out to me...
Didn't know that, or rather, I must have forgotten that part. The point is still valid though: PhysX is proprietary and closed, Bullet is open. The other points are more open for discussion; in my opinion those acts were at least fishy, but in the case of Batman and Stalker the blame lies more with the game devs' strange reasons for not implementing/improving things (or at least testing them, as in the Batman case).
Wow, all these completely closed-minded people refusing to play a game because of PhysX, are you kidding me? What... because Nvidia's name is on it? This is going to be a good game; I couldn't care less if it has Eyefinity, PhysX, 3D Vision, Havok, whatever. Get over yourselves people, you're not cooler for hating a company. Turn PhysX off if you're that against it, wow. Don't blame the developers for accepting money to make a profit. PhysX is not even a game-changing technology (or it is, depending on the way you think about it). It's still the same game with or without it. It's not like it's :banana::banana::banana::banana:ing required.
don't get me wrong, i'm fully down for some open-source physics, thus giving the choice to the gamer; i'm just a bit flustered by the amount of misinformation surrounding the physx engine... is it ideal? no. is it open? no. but it does work (eye-candy aside) on everything, and it works really well. i would love to see bullet in a few high-dollar games, but i feel they're going to have to do something a bit more exciting than just open source code to draw the attention of devs away from the usual havok or physx.
:2cents:
some people said they won't play it cause of physx? :confused:
i guess i missed those posts... but i agree, that's a weird reason not to play a game...
Probably because it has a negligible impact on anything. It would be like MS adding DX11 to some game when only engineering samples for DX11 hardware existed. It's hard to kick up a stink about something that is used by such a niche market (for both ATI and NVIDIA users) that it is irrelevant for all intents and purposes.
PhysX on the other hand locks a large segment of users out of what would otherwise be a universally usable feature: physics. ATI hardware can do physics. NVIDIA hardware can do physics. General purpose CPUs can do physics. Yet some developers choose to only allow one type of hardware to do what should be doable on all machines. I am an NVIDIA user at the moment, but I still don't approve of this lock-in idea because it slows down progress. Should 3D gaming take off, I would also disapprove of 3D Vision being used to lock users to a platform to play certain games, but for now it really doesn't make much difference, although I am still against it on principle.
Hardware-locking the API is wrong, but so is killing off the API. Now would be a great time to leave like Origin did: start selling hardware to accelerate a new, better physics API that everybody can use. :up:
Sorry, that's not stereo 3D the way NVIDIA supports it. NVIDIA leverages its position as the driver writer to make stereoscopic 3D work in games that never had any work done to support 3D; normal games work in 3D with NVIDIA 3D Vision.
This functionality is not as robust with 3rd-party solutions.
Also, the "ATI stereo solution" you're referencing isn't anything like NVIDIA 3D Vision. Titles can't be profiled to work in stereo 3D; the game has to be designed to explicitly take advantage of quad-buffering for stereo (i.e. make explicit API calls to their libraries). If the developer opts not to do any work to support stereo (i.e. most devs), then you'll get no stereo on ATI.
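For context on what either approach (driver-injected or quad-buffered) actually has to compute per eye, here is a small sketch of the standard off-axis stereo math: shift the camera sideways by half the eye separation, then shear the projection frustum so both eyes converge at the screen plane. The function name and all default values (eye separation, convergence distance, FOV) are illustrative assumptions of mine, not anything from NVIDIA's or AMD's drivers; a quad-buffer app would feed the results into its own left/right render passes.

```python
import numpy as np

def stereo_frustum(eye, separation=0.06, convergence=2.0,
                   fov_y=60.0, aspect=16 / 9, near=0.1):
    """Return (camera x-offset, (left, right) near-plane extents) for one eye."""
    top = near * np.tan(np.radians(fov_y) / 2)   # symmetric frustum half-height
    half_w = top * aspect                        # symmetric frustum half-width
    sign = -1.0 if eye == "left" else 1.0
    cam_offset = sign * separation / 2           # translate the camera sideways
    skew = cam_offset * near / convergence       # shear frustum back toward screen plane
    return cam_offset, (-half_w - skew, half_w - skew)

# Each eye gets a mirrored, asymmetric frustum; the scene is rendered twice.
off_l, (l_left, l_right) = stereo_frustum("left")
off_r, (r_left, r_right) = stereo_frustum("right")
```

The point of the debate above is who runs this math: with 3D Vision the driver does it for every draw call behind the game's back, while with a quad-buffer API the developer has to do it explicitly for both eyes, which most don't bother with.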
The monitor makers will write drivers for all hardware powerful enough to use the feature. Sounds better than 3D Vision :ROTF:
Can you link us to your source for such comments?
Most, if not all, PhysX demos use $700 worth of video cards to run... but you are suggesting the 3 idle cores on a new CPU can't handle physical environments?
Secondly, why are you discussing "advanced physics" when there hasn't been one game from Nvidia, that offers what we see in Battlefield..?
Let alone something more advanced..lol
LOL @ gpu physics. In a couple of years nvidia's tour in physics will be over; there won't be any reason to run it on the gpu, because cpus from both vendors will have enough power to do all the physics calculations much quicker and cheaper (performance-loss-wise and monetarily) than nvidia's gpus.
Add in the fact that havok has nothing to fear from physx and you see in the foreseeable future the death of physx, unless of course it becomes an open standard with the freedom to run on any cpu.
Ciao gpu physx :woot:
What makes you think I hate ATI? :confused:
I've had ATI hardware in my rigs before, specifically HD 4870s in CF. ATI (or AMD) is a worthy competitor to Nvidia that makes quality products.
I do have a preference for Nvidia however, and this stems mostly from Nvidia's strong driver team and the extras (such as PhysX) you get for having Nvidia hardware.
I'm not excited at all. That post was mostly tongue in cheek because I knew that certain people on this forum would be very upset by the news, seeing as they hate PhysX with a passion :D
Quote:
then why are you so excited about this :confused:
you said it yourself, it's just patched on and it WILL NOT PUT IT TO GOOD USE... your words... yet you're all excited and claim it's the best thing since sliced bread :stick:
so you basically admit to trolling then? :D
the info, and the way you posted it, is clearly meant to provoke people and cause a heated ati vs nvidia debate ;)
i don't think anybody really hates physx; people are just tired of the marketing, really don't care about those gimmick effects, and don't want to get bothered about it all the time...
what makes me think you hate ati?
read your posts again heh :D
you seem really excited and happy that something happened that ati and ati fanboys don't like, not because it's useful or cool in your opinion, but just because some people might not like to hear the news.
i really don't get you... you're waving an nvidia physx banner and singing their hymn, yet you admit that it's actually nothing special and just marketing :confused:
Like I said to RaZZ, this is incorrect.
Whilst SOFTWARE physics has been a universal feature in many PC games, HARDWARE physics has always been proprietary and restricted to those that had the proper hardware.
This debate isn't over software physics. Software PhysX is found in hundreds of games, and runs perfectly fine on the CPU.
Hardware PhysX on the other hand requires specific hardware in the form of Nvidia GPUs, much like it once required the Ageia PPU.
Hmm, that's quite an assumption. I myself have downloaded the rocket sled demo and run it using just one of my 470s with SLI disabled (USD 359.00), and it ran fine at 2560x1600 with 4xAA. There was a slight bit of lag, but that was mostly due to the high resolution and AA.
If you click on the link, the Nvidia engineer ran the demo using a single 480 and it was able to handle millions of particles.
Show me one CPU demo that can handle that many particles.. Also, the last time we had this discussion, I showed that the majority of gamers are still running dual core based processors.
Quadcore CPUs probably won't become fully mainstream until another year or more from now.
I haven't played BBC2 yet so bear with me, but what exactly do you find so impressive about the physics in the game?
Quote:
Secondly, why are you discussing "advanced physics" when there hasn't been one game from Nvidia, that offers what we see in Battlefield..?
Let alone something more advanced..lol
Doing a quick look on youtube, it seems like BBC2 utilizes scripted destruction animations with a small number of rendered objects (in the hundreds or low thousands).
That sort of thing has been in games for years and years. When Mafia 2 is released, we'll see how the explosions compare.
Now if you want to see what I find impressive, check this out.
Big explosions are yesterday's news. Games with realistic hair and cloth physics on the other hand have never been possible......until now.
yes, but i meant that hardware physics will never get mainstream if one or more manufacturers can't use the respective api.
you can do awesome stuff with physx, yes, but until everyone is able to make use of it, physx will always be eyecandy only. if you really want to take full advantage of physx, you need to build a game from the ground up with physx in mind. but this will never happen while physx is restricted to nvidia, as no dev in the world would lock out half of the market (which he would do if he used physx for basic elements of the gameplay).
but what's this thread about anyways? there were a lot of discussions about the pros and cons of physx in the past here at xs - even in the news section. so there's absolutely no need to start the same discussion all over again. especially not if the thread was intended to provoke certain individuals anyways.