Meh, I don't understand NVIDIA. If they want to make sure PhysX lives on, they should make sure it runs on both NVIDIA and ATI cards; otherwise it's just a question of when another physics standard overtakes PhysX.
Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people shared opinions in an objective manner, the world would be a friendlier place
5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
Why is nvidia shooting itself in the face?
MSI P67 GD65 B3
2500k 4.8
GTX480 835/1650/2000
8gb ram
Win7
Because for PhysX to be adopted as an integral game component and not some BS marketing tool, it needs a high adoption rate. By de-optimizing the PC path and actively stopping their cards from being used as a PPU, they are stopping adoption, so when developers need physics for a game they go for Havok since it has proper CPU support.
Think of it like movies: you can use a DVD (no physics, or Ageia-style PhysX on the CPU), an enhanced-video DVD (NV PhysX with a GPU), or a Blu-ray (Havok; you need a decent CPU, but it has good adoption).
Last edited by zanzabar; 04-26-2010 at 12:42 AM.
5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
This is just pathetic. NVIDIA, burn in hell with your infinite idiocy. One more reason to hate them, fu**in' morons. This actually confirms that their PR is being run by chimpanzees and orangutans. No, wait, that's discriminatory to the apes. Their PR is led by brainless amoebas. Yeah, that's more like it.
This is like cutting your leg off to prevent infection instead of curing it with antibiotics. They would rather cut off the small margin of income created by existing Radeon users than swallow their fat pride and let those few users use PhysX. Because no one is stupid enough to sell their HD5850 and wait for the (still) non-existent GTX 460 just to get dumb PhysX, lmao.
It just doesn't compute.
Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
Super silent cooling powered by (((Noiseblocker)))
With any luck, games will start using the OpenCL-supporting Bullet physics engine. Then it will run on all CUDA/OpenCL-capable GPUs, and we can all be happy. I think Havok is working on OpenCL stuff too?
Essentially, I expect these open platforms to eventually win over PhysX, however much money nV throws at developers to use it.
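For anyone curious what that looks like from the developer side, here's a minimal CPU-side Bullet rigid-body sketch (plain C++ against the stock Bullet API, not the OpenCL backend mentioned above; the shapes, masses and step count are just placeholder values):

Code:
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Standard Bullet plumbing: collision config, dispatcher, broadphase, solver, world.
    btDefaultCollisionConfiguration collisionConfig;
    btCollisionDispatcher dispatcher(&collisionConfig);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &collisionConfig);
    world.setGravity(btVector3(0, -9.81f, 0));

    // Static ground plane (mass 0 = static body).
    btStaticPlaneShape groundShape(btVector3(0, 1, 0), 0);
    btDefaultMotionState groundState;
    btRigidBody ground(btRigidBody::btRigidBodyConstructionInfo(0, &groundState, &groundShape));
    world.addRigidBody(&ground);

    // One dynamic 1x1x1 box dropped from 10 units up.
    btBoxShape boxShape(btVector3(0.5f, 0.5f, 0.5f));
    btVector3 inertia(0, 0, 0);
    boxShape.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState boxState(btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody box(btRigidBody::btRigidBodyConstructionInfo(1.0f, &boxState, &boxShape, inertia));
    world.addRigidBody(&box);

    // Step the simulation at 60 Hz for two seconds; the box falls and settles on the plane.
    for (int i = 0; i < 120; ++i)
        world.stepSimulation(1.0f / 60.0f);

    btTransform t;
    box.getMotionState()->getWorldTransform(t);
    std::printf("box height after 2s: %.3f\n", t.getOrigin().getY());

    world.removeRigidBody(&box);
    world.removeRigidBody(&ground);
    return 0;
}

The point being that the same scene setup can, in principle, be stepped by whatever backend the engine targets, which is why an engine with CPU, CUDA and OpenCL paths is attractive regardless of whose GPU is in the box.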
Bah!! It looks like the bomb already exploded in their own hand!
They opened a can of Whoop-ass!
So if I understand correctly, Nvidia doesn't want you to run PhysX if you buy any Nvidia card and combine it with a 785G mobo (whose integrated graphics are ATI)? If this is true, it is quite sad...
CPU: Intel 2500k (4.8ghz)
Mobo: Asus P8P67 PRO
GPU: HIS 6950 flashed to Asus 6970 (1000/1400) under water
Sound: Corsair SP2500 with X-Fi
Storage: Intel X-25M g2 160GB + 1x1TB f1
Case: Silverstone Raven RV02
PSU: Corsair HX850
Cooling: Custom loop: EK Supreme HF, EK 6970
Screens: BenQ XL2410T 120hz
Help for Heroes
OK, I don't know where to start. First of all, ATI was offered PhysX support if they paid licensing fees to nVidia; had they done so, nVidia would have helped ATI run PhysX on their Radeons. ATI, however, chose not to. Now how is it "theft of their technology" to BUY an nVidia product, which says it supports PhysX on the BOX, and actually use it? It's not like the hacks allow you to run PhysX on the Radeon cards. You are paying nVidia for a card which supports PhysX. It sounds fair to me that you actually get to use PhysX, no matter what other hardware you might be running in your PC.
They already work together splendidly. This hack shows that, and it was possible with older drivers without any hack. Them not working together is 100% an artificial lock made by nVidia.
You're stating something different here. You are saying in your example "to use with my nV card". That's exactly the point. Of course it's only fair to pay nVidia for using PhysX, but people are already doing that by buying their cards. It would be totally unfair, and theft of technology, if the hack allowed you to use PhysX on a Radeon card, but it does not.
Add in the fact that people who bought the original Ageia PhysX card can't even use theirs anymore, even though it's still 100% possible with this hack, and it's very clear nVidia is at fault here. Just think about it.
Last edited by Musho; 04-26-2010 at 03:55 AM.
Why do ATI need to pay for PhysX when they aren't putting it on their graphics cards? Common sense fail?
If people buy an Nvidia card to use PhysX, they have paid in full to use a graphics card with PhysX. ATI don't have to pay Nvidia just because their customers also want to purchase Nvidia products for PhysX.
It's funny, though: Nvidia only lose sales and damage their reputation with this. People using ATI setups aren't exactly going to ditch their ATI cards just to use PhysX, which is what Nvidia are trying to make them do.
I can live without it. The only game I play that supports PhysX (Sacred 2) is bugged and, according to other users, crashes like crazy with it enabled.
IMO Intel and AMD should just stop Nvidia cards from working on their chipsets altogether, and not give them the rights to make their own chipsets, as Intel already do. That's pretty much the same thing Nvidia want to do with PhysX.
Last edited by Mungri; 04-26-2010 at 04:21 AM.
Agree 100%. See how their own medicine tastes. Intel basically threw Nvidia a bone by allowing SLI to work on X58, right? Now make it not work, or make it so slow it's worthless. I think Intel can manage a way to make that happen.
I don't know why the FTC is allowing stuff like this when it was obviously against the law for Intel to do the same things to AMD.
Aaron: Intel i7 2600K | stock cooling | ASRock Extreme 4 | 4GB 1600 DDR3 | MSI 4890 1GB 950/999 | Corsair HX620 | NZXT 410 Gunmetal | X2gen 22" WS
Wife: AMD 5000+ BE @ 3GHz | CM Vortex P912 | GB GA-MA78GM-S2H 780G | OCZ Plat 2GB PC6400 | Tt Toughpower 600W | In Win Allure case | Acer 23" 1080p
HTPC: AMD 4850+ BE @ 2.5GHz | Foxconn A7GM-S 780G | OCZ SLI 2GB PC6400 | Avermedia A180 HDTV tuner | Sapphire 4830 | Coolmax 480W | LiteOn BD player | nMediaPC 1000B case
^^ Isn't it illegal under fair trading laws to sell an Nvidia GPU clearly advertised as supporting PhysX when it doesn't work if you have hardware from either ATI or Intel in the system?
I would think it is. AMD should definitely give Nvidia such a shove and disable support for their cards on AMD motherboards. Let them see exactly what it's like.
First off, is PissX worth the money?
When I game, my eyes are focused on my next move or action, not on how good a dust cloud looks or the way cans fall off a table. Google or photochop is a lot cheaper if all you want to see is pretty pictures.
Don't get me wrong, PissX looks great in a cartoon game like SpongeBob or MapleStory, if you're into those games.
I would better understand nvidia's move if they were still making chipsets, but for the price of PissX these days they should add a pack of lube with each card and call them the FTF editions (F*#% the Fan).
Gigabyte Z68X-UD3P
i7 2600k@4.6/1.35v
GSkill Rippers 2133
MSI 7870@1275/1450
Antec Kuhler 920
I think they should send an eye-gouging ninja over to anyone's house that wants to use PhysX on any brand of card.
But that's just me
_________________
Talk about completely missing the point.
People want to buy *NVIDIA* cards to use PhysX, not a card by any brand to run it.
Can anyone who doesn't get this care to explain what is wrong with adding an Nvidia card to a PC to use as a PhysX accelerator? People would be using Nvidia hardware for PhysX, not just 'any brand'.
I did not miss the point at all.
It seems many miss the point by trying to run NV PhysX with an ATI card installed, regardless of whether they have an NV card to do it.
_________________
Honestly, after seeing what CPU accelerated Havok physics could do in Just Cause 2, I'm totally unconvinced that GPU accelerated physics should even be on the table anymore. That goes for PhysX, OpenCL apps, etc, etc.
Why would I want the rendering power of my GPU compromised by physics calculations while my 4-8 core processor sits there twiddling its thumbs on most of its cores? I don't give two hoots if the GPU can calculate fluid dynamics, rigid bodies, AI, and so on more efficiently, when my processor can do it less efficiently and actually USE the threads it isn't otherwise utilizing.
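Just to sketch that idea (a hypothetical example, not taken from any real engine; the Body struct, the numbers and the leave-one-core-free split are made up for illustration), the per-object work parallelizes across spare cores easily:

Code:
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

// Toy "physics object": position and velocity along one axis.
struct Body { float pos, vel; };

// Integrate a contiguous range of bodies for one timestep.
void integrateRange(std::vector<Body>& bodies, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        bodies[i].vel += -9.81f * dt;          // gravity
        bodies[i].pos += bodies[i].vel * dt;   // explicit Euler step
    }
}

// Split the physics step across all cores except one left free for the render thread.
void stepPhysicsParallel(std::vector<Body>& bodies, float dt) {
    unsigned hc = std::thread::hardware_concurrency();
    unsigned workers = hc > 1 ? hc - 1 : 1;
    std::size_t chunk = (bodies.size() + workers - 1) / workers;

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end = std::min(bodies.size(), begin + chunk);
        if (begin < end)
            pool.emplace_back(integrateRange, std::ref(bodies), begin, end, dt);
    }
    for (auto& t : pool) t.join();
}

int main() {
    std::vector<Body> bodies(10000, Body{100.0f, 0.0f});
    for (int frame = 0; frame < 600; ++frame)          // ten seconds at 60 fps
        stepPhysicsParallel(bodies, 1.0f / 60.0f);     // a real engine would overlap this with render submission
    std::printf("body 0 position after 10s: %.2f\n", bodies[0].pos);
    return 0;
}

Real collision and constraint solving is obviously harder to split than this free-fall loop, but the broad point stands: the otherwise-idle cores can carry a lot of the per-object work.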
I wonder if Charlie is going to write anything about this one? I get the impression there is more to it than we're being told.
Well, potentially a GPU (even an older one) could handle way more simultaneous physics objects than a CPU. So if you have a random 8800GT lying around, it would make a far more efficient physics calculator than your new top-of-the-line CPU.
But on the other hand, current implementations of physics features don't really involve enough physics objects to make that necessary. I'm guessing it's got something to do with most of the market not having any kind of consistent hardware physics solution, so most developers will design games with the CPU still being able to handle the physics. Unless, of course, they don't like money or something.
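For a very rough sense of scale (ballpark peak-throughput figures, not measured physics performance): an 8800 GT has 112 shader cores at around 1.5 GHz, which works out to a few hundred GFLOPS of single precision, while a quad-core CPU at 3 GHz with 4-wide SSE tops out somewhere near 4 cores x 3 GHz x 4 lanes x 2 ops ≈ 96 GFLOPS, and only if every core and vector lane is kept busy. Since per-object updates are largely independent, the GPU's advantage grows with object count, which is why even an older card could in principle push far more simultaneous objects.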
Last edited by iddqd; 04-26-2010 at 06:06 AM.
Sigs are obnoxious.
NVIDIA are just idiots, though this kind of behaviour does not surprise me, considering the past stunts they have pulled.
AMD Threadripper 12 core 1920x CPU OC at 4Ghz | ASUS ROG Zenith Extreme X399 motherboard | 32GB G.Skill Trident RGB 3200Mhz DDR4 RAM | Gigabyte 11GB GTX 1080 Ti Aorus Xtreme GPU | SilverStone Strider Platinum 1000W Power Supply | Crucial 1050GB MX300 SSD | 4TB Western Digital HDD | 60" Samsung JU7000 4K UHD TV at 3840x2160