Page 2 of 7 (Results 26 to 50 of 172)

Thread: Nvidia implements 'time bomb' for ATI users running PhysX

  1. #26
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Meh, I don't understand NVIDIA. If they want to make sure PhysX survives, they should make sure it runs on both NVIDIA and ATI cards; otherwise it's just a question of when another physics standard overtakes PhysX.
    Intel? Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  2. #27
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by RPGWiZaRD View Post
    Meh, I don't understand NVIDIA. If they want to make sure PhysX survives, they should make sure it runs on both NVIDIA and ATI cards; otherwise it's just a question of when another physics standard overtakes PhysX.
    I could see licensing mattering if this ran on ATI hardware, but here it's running on an NV card. Should Intel stop their NICs from working on AMD systems? Or how about AMD stopping NV GPUs from working on AMD boards? Or, better yet, MS banning all non-Microsoft peripherals from Windows?
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  3. #28
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Posts
    800
    Quote Originally Posted by blindbox View Post
    Timebomb?

    Nobody even stopped to think that this is a bomb that slows down time? It's probably part of the SDK tools that the developers are given to play with.

    You guys totally missed this part of my post! Everyone is thinking too deeply.

  4. #29
    Xtreme Member
    Join Date
    Jun 2007
    Posts
    226
    Why is nvidia shooting itself in the face?
    MSI P67 GD65 B3
    2500k 4.8
    GTX480 835/1650/2000
    8gb ram
    Win7

  5. #30
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by nsegative View Post
    Why is nvidia shooting itself in the face?
    Because for PhysX to be adopted as an integral game component, and not just some BS marketing tool, it needs a high adoption rate. By de-optimizing the CPU path and actively stopping their cards from being used as a PPU, they are choking adoption, so when people need physics for a game they go for Havok, since it has proper CPU support.

    Think of it like movies: you can use a DVD (no physics, or Ageia-style CPU PhysX), an enhanced-video DVD (NV PhysX on the GPU), or a Blu-ray (Havok: you need a decent CPU, but it has good adoption).
    Last edited by zanzabar; 04-26-2010 at 12:42 AM.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  6. #31
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    This is just pathetic. NVIDIA, burn in hell with your infinite idiocy. One more reason to hate them. This practically confirms that their PR is being run by chimpanzees and orangutans. No, wait. That's discriminatory to the apes. Their PR is led by brain-dead amoebas. Yeah, that's more like it.
    This is like cutting your leg off to prevent infection instead of curing it with antibiotics. They would rather cut off the small margin of income created by existing Radeon users than swallow their pride and let those few users use PhysX. Because no one is stupid enough to sell their HD 5850 and wait for the (still) non-existent GTX 460 just to get dumb PhysX, lmao.
    It just doesn't compute.
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))

  7. #32
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Andrew LB View Post
    If ATi would simply pay for the rights to use PhysX (along with the costs of driver development to allow both architectures to work properly together), none of this would even be an issue. It's odd that so many nVidia haters here complain about how crappy their cards and company are... yet they still want to cherry-pick technologies from them for their own use, going as far as hacking their drivers. Then they complain again once they find out nVidia is trying to stop such theft of their technology.

    The lack of the ability to use PhysX cards in conjunction with ATi main graphics cards is 100% the fault of ATi, who refuse to pay to use the tech. Why don't any of the complainers put blame where it belongs? If this situation were reversed, and ATi had a similar technology that I wanted to use with my nV card... I'd be calling on nV to either create their own solution or pay usage rights to ATi. It's quite obvious though that ATi has nothing nV users currently want... so that theoretical is a moot point.
    LOL! Gullible to the Xtreme.

  8. #33
    Xtreme Cruncher
    Join Date
    Mar 2006
    Posts
    613
    With any luck, games will start using the OpenCL-supporting Bullet physics engine. Then it will run on all CUDA/OpenCL-capable GPUs, and we can all be happy. I think Havok is working on OpenCL stuff too?

    Essentially, I expect these open platforms to eventually win out over PhysX, however much money nV throws at developers to use it.

  9. #34
    Xtreme Member
    Join Date
    Feb 2008
    Location
    Jakarta, Indonesia
    Posts
    244
    Bah!! It looks like the bomb has already exploded in their own hand.

  10. #35
    Xtreme Member
    Join Date
    Apr 2008
    Location
    Stockholm, Sweden
    Posts
    324
    They opened a can of Whoop-ass!

  11. #36
    Xtreme Member
    Join Date
    Apr 2010
    Posts
    137
    So if I understand correctly, Nvidia doesn't want you to run PhysX if you buy any Nvidia card and combine it with a 785G mobo? If this is true, it's quite sad...

  12. #37
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Texas
    Posts
    1,663
    Quote Originally Posted by BoredByLife View Post
    So if I understand correctly, Nvidia doesn't want you to run PhysX if you buy any Nvidia card and combine it with a 785G mobo? If this is true, it's quite sad...
    You bring up a good point! Does this time bomb kick in if you have integrated ATI graphics on your motherboard? Somebody has to have tested this.
    Core i7 2600K@4.6Ghz| 16GB G.Skill@2133Mhz 9-11-10-28-38 1.65v| ASUS P8Z77-V PRO | Corsair 750i PSU | ASUS GTX 980 OC | Xonar DSX | Samsung 840 Pro 128GB |A bunch of HDDs and terabytes | Oculus Rift w/ touch | ASUS 24" 144Hz G-sync monitor

    Quote Originally Posted by phelan1777 View Post
    Hail fellow warrior albeit a surat Mercenary. I Hail to you from the Clans, Ghost Bear that is (Yes freebirth we still do and shall always view mercenaries with great disdain!) I have long been an honorable warrior of the mighty Warden Clan Ghost Bear the honorable Bekker surname. I salute your tenacity to show your freebirth sibkin their ignorance!

  13. #38
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Lancaster, UK
    Posts
    473
    Quote Originally Posted by Final8ty View Post
    LOL! Gullible to the Xtreme.
    This. Can't stop laughing at the naivety.
    CPU: Intel 2500k (4.8ghz)
    Mobo: Asus P8P67 PRO
    GPU: HIS 6950 flashed to Asus 6970 (1000/1400) under water
    Sound: Corsair SP2500 with X-Fi
    Storage: Intel X-25M g2 160GB + 1x1TB f1
    Case: Silverstone Raven RV02
    PSU: Corsair HX850
    Cooling: Custom loop: EK Supreme HF, EK 6970
    Screens: BenQ XL2410T 120hz


    Help for Heroes

  14. #39
    Xtreme Cruncher
    Join Date
    Dec 2008
    Location
    The Netherlands
    Posts
    896
    Quote Originally Posted by Andrew LB View Post
    If ATi would simply pay for the rights to use PhysX, none of this would even be an issue. It's odd that so many nVidia haters here complain about how crappy their cards and company are.... yet they still want to cherry pick technologies from them for their own use, going as far as hacking their drivers. Then they complain again once they find out nVidia is trying to stop such theft of their technology.
    OK, I don't know where to start. First of all, ATI was offered PhysX support if they paid licensing fees to nVidia; if they had, nVidia would have helped ATI run PhysX on Radeons. ATI, however, chose not to. Now, how is it "theft of their technology" to BUY an nVidia product, which says it supports PhysX on the BOX, and actually use it? It's not like the hacks allow you to run PhysX on the Radeon cards. You are paying nVidia for a card which supports PhysX. It sounds fair to me that you actually get to use PhysX, no matter what other hardware you might be running in your PC.

    Quote Originally Posted by Andrew LB View Post
    (along with the costs of driver development to allow both architectures to work properly together)
    They already work together splendidly; this hack shows that. And it was possible with older drivers without any hack. Them not working together is 100% an artificial lock made by nVidia.

    Quote Originally Posted by Andrew LB View Post
    If this situation was reversed, and ATi had a similar technology that I wanted to use with my nV card....
    You're stating something different here. In your example you say "to use with my nV card". That's exactly the point. Of course it's only fair to pay nVidia for using PhysX, but people are already doing that by buying their cards. It would be totally unfair, and theft of technology, if the hack allowed you to use PhysX on a Radeon card, but it does not.

    Add in the fact that people who bought the original Ageia PhysX card can't even use theirs anymore, even though it's 100% possible to keep using it via this hack, and it's very clear nVidia is at fault here. Just think about it.
    Last edited by Musho; 04-26-2010 at 03:55 AM.

  15. #40
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Quote Originally Posted by Andrew LB View Post
    If ATi would simply pay for the rights to use PhysX (along with the costs of driver development to allow both architectures to work properly together), none of this would even be an issue. It's odd that so many nVidia haters here complain about how crappy their cards and company are... yet they still want to cherry-pick technologies from them for their own use, going as far as hacking their drivers. Then they complain again once they find out nVidia is trying to stop such theft of their technology.

    The lack of the ability to use PhysX cards in conjunction with ATi main graphics cards is 100% the fault of ATi, who refuse to pay to use the tech. Why don't any of the complainers put blame where it belongs? If this situation were reversed, and ATi had a similar technology that I wanted to use with my nV card... I'd be calling on nV to either create their own solution or pay usage rights to ATi. It's quite obvious though that ATi has nothing nV users currently want... so that theoretical is a moot point.
    Why would ATI need to pay for PhysX when they aren't putting it on their graphics cards? Common sense fail?

    If people buy an Nvidia card to use PhysX, they have paid in full to use a graphics card with PhysX. ATI don't have to pay Nvidia just because their customers also want to purchase Nvidia products for PhysX.

    It's funny, though: Nvidia only lose sales and damage their reputation with this. People using ATI setups aren't exactly going to ditch their ATI cards just to use PhysX, which is what Nvidia are trying to make them do.

    I can live without it. The only game I play that supports Physx is bugged and crashes like crazy with it enabled according to other users (Sacred 2).

    IMO Intel and AMD should just stop Nvidia cards from working on their chipsets altogether, and not grant them the rights to make their own chipsets, as Intel already does. That's pretty much doing the same thing that Nvidia want to do with PhysX.
    Last edited by Mungri; 04-26-2010 at 04:21 AM.

  16. #41
    Xtreme Enthusiast
    Join Date
    Nov 2007
    Location
    Cincinnati, Ohio
    Posts
    614
    Quote Originally Posted by bhavv View Post
    IMO Intel and AMD should just stop Nvidia cards from working on their chipsets altogether, and not grant them the rights to make their own chipsets, as Intel already does. That's pretty much doing the same thing that Nvidia want to do with PhysX.
    Agree 100%. Let them see how their own medicine tastes. Intel basically threw Nvidia a bone by allowing SLI to work on X58, right? Now make it not work. Or make it so slow it's worthless. I think Intel can manage a way to make that happen.

    I don't know why the FTC is allowing stuff like this when it was obviously against the law for Intel to do the same things to AMD.
    Aaron___________________________Wife________________________ HTPC
    intel i7 2600k_____________________AMD5000+ BE @ 3ghz___________AMD4850+ BE @ 2.5ghz
    stock cooling______________________CM Vortex P912_______________ Foxconn A7GM-S 780G
    AsRock Extreme 4_________________ GB GA-MA78GM-S2H 780G_______OCZ SLI 2gb PC6400
    4gb 1600 DDR3___________________ OCZ Plat 2GB PC6400___________Avermedia A180 HDTV tuner
    MSI 48901gb 950/999______________Tt Toughpower 600w___________ Saphire 4830
    Corsair HX620____________________ inwin allure case___________ ___ Coolmax 480w
    NZXT 410 Gunmetal________________Acer 23" 1080p________________ LiteOn BD player
    X2gen 22" WS
    ________________ ________________________________ nMediaPC 1000B case

  17. #42
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    ^^ Isn't it illegal under fair-trading law to sell an Nvidia GPU clearly advertised as supporting PhysX that then doesn't work if you have hardware from either ATI or Intel?

    I would think it is. AMD should definitely give Nvidia such a shove and disable support for their cards on AMD motherboards. Let them see exactly what it's like.

  18. #43
    Xtreme Member
    Join Date
    Oct 2009
    Location
    South Mississippi
    Posts
    161
    First off, is PissX even worth the money?
    When I game, my eyes are focused on my next move or action, not staring at how good a dust cloud looks or the way cans fall off a table; Google or photochop is a lot cheaper if you just want to see pretty pictures.
    Don't get me wrong, PissX looks great on a cartoon game like SpongeBob or MapleStory, if you're into those games.
    I would better understand nvidia's move if they were still making chipsets, but at the price of PissX these days they should include a pack of lube with each card and call them FTF* editions *(F*#% the Fan).
    Gigabyte Z68X-UD3P
    i7 2600k@4.6/1.35v
    GSkill Rippers 2133
    MSI 7870@1275/1450
    Antec Kuhler 920

  19. #44
    Xtreme Addict
    Join Date
    Oct 2006
    Location
    new jersey
    Posts
    1,100
    I think they should send an eye-gougin' ninja over to the house of anyone who wants to use PhysX on any brand of card.
    But that's just me.
    _________________

  20. #45
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Quote Originally Posted by cowie View Post
    I think they should send an eye-gougin' ninja over to the house of anyone who wants to use PhysX on any brand of card.
    But that's just me.
    Talk about completely missing the point?

    People want to buy *NVIDIA* cards to use PhysX, not a card from any brand to run it.

    Would anyone who doesn't get this care to explain what is wrong with adding an Nvidia card to a PC to use it as a PhysX accelerator? People would be using Nvidia hardware for PhysX, not just 'any brand'.

  21. #46
    Xtreme Addict
    Join Date
    Oct 2006
    Location
    new jersey
    Posts
    1,100
    I did not miss the point at all.
    It seems many miss the point by trying to run NV PhysX with an ATI card installed, even if they have an NV card to do it.
    _________________

  22. #47
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Honestly, after seeing what CPU accelerated Havok physics could do in Just Cause 2, I'm totally unconvinced that GPU accelerated physics should even be on the table anymore. That goes for PhysX, OpenCL apps, etc, etc.

    Why would I want the rendering power of my GPU compromised by doing physics calculations while my 4-8 core processor sits there twiddling its thumbs on most of its cores? I don't give two hoots if the GPU can calculate fluid dynamics, rigid body, AI, and so on more efficiently when my processor can do it inefficiently and actually USE the threads it isn't utilizing.

  23. #48
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    I wonder if Charlie is going to write anything about this one? I get the impression there is more to it than what we're being told.

  24. #49
    Xtreme Addict
    Join Date
    Dec 2004
    Location
    Flying through Space, with armoire, Armoire of INVINCIBILATAAAAY!
    Posts
    1,939
    Quote Originally Posted by SKYMTL View Post
    Honestly, after seeing what CPU accelerated Havok physics could do in Just Cause 2, I'm totally unconvinced that GPU accelerated physics should even be on the table anymore. That goes for PhysX, OpenCL apps, etc, etc.

    Why would I want the rendering power of my GPU compromised by doing physics calculations while my 4-8 core processor sits there twiddling its thumbs on most of its cores? I don't give two hoots if the GPU can calculate fluid dynamics, rigid body, AI, and so on more efficiently when my processor can do it inefficiently and actually USE the threads it isn't utilizing.
    Well, potentially a GPU (even an older one) could handle way more simultaneous physics objects than a CPU. So if you have a random 8800 GT lying around, it would make a far more efficient physics calculator than your new top-of-the-line CPU.

    But on the other hand, current implementations of physics features don't really use enough physics objects to make that necessary. I'm guessing it has something to do with most of the market not having any kind of consistent hardware physics solution, so most studios will design games with the CPU still being able to do the physics. Unless, of course, they don't like money or something.
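    To make the parallelism point concrete, here's a hypothetical toy sketch (plain NumPy, nothing to do with the actual PhysX or Havok APIs): each object's integration step is independent of every other object's, which is exactly the shape of work a GPU's thousands of lanes chew through, while a CPU has to grind the same objects through a handful of cores.

    ```python
    # Toy illustration only: per-object rigid-body integration is
    # embarrassingly parallel, which is why a GPU can step far more
    # objects per frame than a few CPU cores.
    import numpy as np

    def euler_step(pos, vel, dt=0.01, g=-9.81):
        """Advance every object one timestep. Each row is independent
        of the others, so the update maps onto SIMD lanes or GPU threads."""
        vel = vel.copy()
        vel[:, 1] += g * dt          # gravity acts on the y component
        return pos + vel * dt, vel

    # 10,000 objects updated in a single vectorized call
    pos = np.zeros((10_000, 3))
    vel = np.ones((10_000, 3))
    pos, vel = euler_step(pos, vel)
    ```

    The names and numbers here are made up for illustration; the only point is that no object's update reads another object's state, so the work scales with available parallel hardware.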
    Last edited by iddqd; 04-26-2010 at 06:06 AM.
    Sigs are obnoxious.

  25. #50
    Xtreme Addict Chrono Detector's Avatar
    Join Date
    May 2009
    Posts
    1,142
    NVIDIA are just idiots, though this kind of behaviour doesn't surprise me, considering the past stunts they've pulled.
    AMD Threadripper 12 core 1920x CPU OC at 4Ghz | ASUS ROG Zenith Extreme X399 motherboard | 32GB G.Skill Trident RGB 3200Mhz DDR4 RAM | Gigabyte 11GB GTX 1080 Ti Aorus Xtreme GPU | SilverStone Strider Platinum 1000W Power Supply | Crucial 1050GB MX300 SSD | 4TB Western Digital HDD | 60" Samsung JU7000 4K UHD TV at 3840x2160

