Page 4 of 5 FirstFirst 12345 LastLast
Results 76 to 100 of 122

Thread: Nvidia GPU required for PhysX in Win 7

  1. #76
    Xtreme Addict
    Join Date
    Jun 2006
    Location
    Florida
    Posts
    1,005
Can't they just install earlier NVIDIA drivers and keep their PhysX working with the NVIDIA GPU? I mean, you don't need to get a PhysX driver every month like you do with video drivers. Hell, the PhysX drivers that are being packed with the video drivers haven't been updated in months as is. So who cares if they have to use an older PhysX driver? At least that way they won't have to throw out the NVIDIA card they bought for PhysX.
    Core i7 3770K
    EVGA GTX780 + Surround
    EVGA GTX670
    EVGA Z77 FTW
8GB (2x4GB) G.Skill 1600MHz DDR3
Ultra X3 1000W PSU
    Windows 7 Pro 64bit
    Thermaltake Xaser VI

  2. #77
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by nr4 View Post
So from my point of view, ATI or others can start developing.
And from what alfaunits said in his post, I really wouldn't like to start making drivers or other stuff work with ATI drivers.
I have had an 1800XT, a 1900XT + 1900 CF, and 3870 Quad-fire through 6 different motherboards & have had every ATI driver update for those cards, which is about 60 of them, & have only ever had a problem with 2 sets ever.
NV are talking complete crap as regards using the gfx card as a physics processor with other vendors' cards, as the NV card is just another computation device, just like the CPU when doing physics, only quicker at it.

You don't see Intel saying that its CPUs do not work with ATI/AMD gfx cards when running Havok, nor did you see Ageia saying anything of the sort.
    Last edited by Final8ty; 08-08-2009 at 09:29 PM.

  3. #78
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by Final8ty View Post
Nope, this was going back years ago, in the 5 & 6 NV era.
Well, the FX series sucked hardware-wise too, but you don't see people frowning on NVIDIA now because of it. What's the point of looking that far back? Both companies are guilty of driver cheating, but most of that happened many years ago.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  4. #79
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    464
    Quote Originally Posted by nr4 View Post
So from my point of view, ATI or others can start developing. And from what alfaunits said in his post, I really wouldn't like to start making drivers or other stuff work with ATI drivers.
And be shut down by NV at any time.
It's not open or a standard.

  5. #80
    Registered User
    Join Date
    Jan 2009
    Posts
    67
    "Hello JC,
    [...]
    Physx is an open software standard any company can freely develop hardware or software that supports it.
    [...]
    Best Regards,
    Troy
    NVIDIA Customer Care"
bill_d, is this quote ringing any bells for you?

  6. #81
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Cybercat View Post
Well, the FX series sucked hardware-wise too, but you don't see people frowning on NVIDIA now because of it. What's the point of looking that far back? Both companies are guilty of driver cheating, but most of that happened many years ago.
A track record is a track record, & many years ago makes no difference when the same company is still up to its tricks with no sign of change.
It's not as if I have not mentioned more recent events as well, or as if I'm purely holding them to that one.

You're not going to hire someone with a history of stealing from the till, even one time 40 years ago, over someone who never has, if that person has not shown signs of changing their ways.
    Last edited by Final8ty; 08-08-2009 at 09:52 PM.

  7. #82
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    464
    Quote Originally Posted by nr4 View Post
bill_d, is this quote ringing any bells for you?
They still own the rights to it.
Saying it's a standard and open doesn't make it so.
You don't think NVIDIA Customer Care can lie?

  8. #83
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Physx is an open software standard any company can freely develop hardware or software that supports it.

Can someone explain that statement to me? Can't ATI hardware support PhysX? I don't understand that statement at all; it sounds like marketing bull and doublespeak to me.

  9. #84
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by Final8ty View Post
A track record is a track record, & many years ago makes no difference when the same company is still up to its tricks with no sign of change.
It's not as if I have not mentioned more recent events as well, or as if I'm purely holding them to that one.

You're not going to hire someone with a history of stealing from the till, even one time 40 years ago, over someone who never has, if that person has not shown signs of changing their ways.
Um, you're not "purely" holding them to recent events if you mention something that happened several years ago as one of three reasons why NVIDIA isn't trustworthy. By the way, anyone could just as easily make a scintillating list of ATI's wrongdoings if they wanted; it's no different.

Your employment analogy is a bit flawed, considering you're not hiring NVIDIA over a long period of time if it's a one-time transaction to buy a graphics card that you can get rid of any time you don't want it. And I guess ATI is supposed to be that person that's never stolen before, huh? Give me a break.

I don't know about you, but I buy computer hardware based on its technology, not based on the technology and actions of the company back in 2003. Geez, look at all these people with Intel processors; you think they care that the company that made them is one of the worst offenders of shady business in the industry?
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  10. #85
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
I couldn't care less. PhysX in a closed form like this is a massive fail. And NVIDIA knows how to fail massively, as the last couple of months have shown. So let them fail even more.
The sooner PhysX becomes an open standard (not likely) or the sooner it dies, the better for us. This whole mumbo-jumbo physics war is just holding back the evolution of physics.
If there were a unified standard for physics at the moment, I'm sure there would already be games popping out with advanced hardware-accelerated physics.
But now there is only the camp of NVIDIA "bribed" studios and those that are holding back because of the still unknown future of physics APIs and hardware support.
And we all know proprietary stuff just doesn't work in the long run. It never did. DirectX/Direct3D is another creature. It's proprietary, but all vendors use it, so that's called good adoption. I can't say the same for CUDA or PhysX, since they are NVIDIA-only things. It's funny that they still haven't learned what happened to 3dfx and their Glide library... The same thing will eventually happen to them if they don't get their dumb heads out of the sand.
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))

  11. #86
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Cybercat View Post
Um, you're not "purely" holding them to recent events if you mention something that happened several years ago as one of three reasons why NVIDIA isn't trustworthy. By the way, anyone could just as easily make a scintillating list of ATI's wrongdoings if they wanted; it's no different.

Your employment analogy is a bit flawed, considering you're not hiring NVIDIA over a long period of time if it's a one-time transaction to buy a graphics card that you can get rid of any time you don't want it. And I guess ATI is supposed to be that person that's never stolen before, huh? Give me a break.

I don't know about you, but I buy computer hardware based on its technology, not based on the technology and actions of the company back in 2003. Geez, look at all these people with Intel processors; you think they care that the company that made them is one of the worst offenders of shady business in the industry?
It is simple: me not mentioning that one incident changes nothing, because of the multitude of other things that they have done & are doing to this very day.
Past incidents are mentioned, & there is no rule on how far back an individual can go, including incidents from then till now.
I don't personally care about that incident, but when asked about previous incidents I will mention all of them that I can remember at the time of reply.

ATI had their day with the "HDCP ready" label on some gfx cards that did not have the licensed HDCP code in them.

But it's not required to mention every other company's wrongdoing every time one company's wrongdoing hits the forums.

It's not about a company never doing any wrong, & it's plain silly to suggest that anyone is trying to claim that any company is totally devoid of wrongdoing.
It's all about severity & frequency when it comes to wrongdoing in this world of ours.

NV are the topic & I'm discussing them.
When Intel is the topic of discussion in relation to wrongdoing, they get their day as well, just like the recent EU ruling.
    Last edited by Final8ty; 08-08-2009 at 11:46 PM.

  12. #87
    I am Xtreme
    Join Date
    Jan 2006
    Location
    Australia! :)
    Posts
    6,096
I bet NV will change its mind; it's just too dumb. Just look at the reaction in this thread alone, then there's everyone else on other forums etc. Nah, I just can't see them being that dumb as to carry through with this. Then again...
    DNA = Design Not Accident
    DNA = Darwin Not Accurate

    heatware / ebay
    HARDWARE I only own Xeons, Extreme Editions & Lian Li's
    https://prism-break.org/

  13. #88
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
They didn't care when they pulled SLI from the 975 boards that had SLI support from the company that made the ATI SB on the 3200 chipset (I forgot the name of it). When NV bought them, they killed the SLI license that the boards had and removed the drivers from the 975 before the Core 2 came out. There was a huge push-back to get it put back, but NV didn't care and said they could only use it with the old drivers.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  14. #89
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,034
    Quote Originally Posted by tiro_uspsss View Post
I bet NV will change its mind; it's just too dumb. Just look at the reaction in this thread alone, then there's everyone else on other forums etc. Nah, I just can't see them being that dumb as to carry through with this. Then again...
Think they care about forums? They didn't care about all the backlash over the stupid renaming schemes, so why would they listen now?
    SweClockers.com

    CPU: Phenom II X4 955BE
    Clock: 4200MHz 1.4375v
    Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
    Motherboard: ASUS Crosshair IV Formula
    GPU: HD 5770

  15. #90
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by BrowncoatGR View Post
    Closed standards have almost always failed when there was an open alternative.
That's why it needs to be handled with care & in the right situations where it can be favourable to the company and the consumer.

  16. #91
    Xtreme Member
    Join Date
    May 2008
    Posts
    107
    Once again, great marketing by nvidia

    Why can't they do something useful, like fix the stutter in Left4Dead.

  17. #92
    Xtreme Cruncher
    Join Date
    Apr 2005
    Location
    TX, USA
    Posts
    898
    Quote Originally Posted by zanzabar View Post
They didn't care when they pulled SLI from the 975 boards that had SLI support from the company that made the ATI SB on the 3200 chipset (I forgot the name of it). When NV bought them, they killed the SLI license that the boards had and removed the drivers from the 975 before the Core 2 came out. There was a huge push-back to get it put back, but NV didn't care and said they could only use it with the old drivers.
ULi was the company's name, and they didn't make the sole SB for the 3200 (the SB450/SB460 was around, and the SB600 later), but the ULi M1575 was indeed a better alternative at the time.
Interesting; I never realized that as a result NVIDIA technically makes an x86 processor, albeit just a 386SX embedded microcontroller, so that doesn't really mean much to us.

    OT:
So, you won't be able to run PhysX when rendering with an AMD/ATI GPU or <insert future other company's GPU here>, yet you'll still be able to run a full CUDA app at the same time [in the background]? If that's the case, then there shouldn't be much weight to a potential issue of drivers conflicting...



  18. #93
    Xtreme Mentor
    Join Date
    Aug 2006
    Location
    HD0
    Posts
    2,646
    Quote Originally Posted by perkam View Post
    Those that chose an ATI card but still bought an nvidia card for Physx engine are now screwed.

    Congrats, Nvidia. You're losing money already, and now you'll lose a little bit more.

    Can you imagine a company punishing the consumer for buying another vendor's cards ??

    Perkam
Not fully screwed. They'll just sell their cards used to someone, and the person who buys the used card no longer needs to buy a new NVIDIA card.

The only entity really screwed would be NVIDIA.

  19. #94
    Xtreme Addict
    Join Date
    Feb 2008
    Location
    America's Finest City
    Posts
    2,078
    Quote Originally Posted by Pontos View Post
    With that, and the fact that they screwed early adopters of the Physx PPU, Nvidia is just giving me more reasons to not buy their stuff...
Wait a sec... how in the hell did NVIDIA have any hand in the first instances of the Ageia PPU? You make it seem like NVIDIA had planned obsolescence in a case where they didn't even own the company when that product was released. They merely took the PhysX API and integrated it into their graphics cards as a functionality of CUDA. I have no idea how what you said makes any logical sense.

If you're talking about the old PhysX cards in general being phased out... those things are extremely outdated to begin with. It doesn't make any sense for NVIDIA to keep making entirely separate PhysX cards at all.
    Quote Originally Posted by FUGGER View Post
    I am magical.

  20. #95
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
NV made it so that you need an NV GPU to use the PPU a long time ago.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  21. #96
    Xtreme Addict
    Join Date
    May 2003
    Location
    Peoples Republic of Kalifornia
    Posts
    1,541
    Quote Originally Posted by B.E.E.F. View Post
    Not necessarily. The smart companies actually give a crap because in the long run a squeaky clean image makes far more money. That and not pissing off your loyal customers.
Keeping a good image is important, but only when it comes to the very small portion of video card consumers who actually read this kind of news chatter. The vast majority of video card buyers don't keep up on this stuff, nor do they know what G300 and NV30 actually mean. Heck, I'll bet that over 50% of laptop owners don't even know what graphics chip they have.


    Personally, I'm tired of hearing people complain about how their current favorite GPU or CPU manufacturer should somehow be "given" technology that the other company spent untold time and money to develop.

The free market system is designed to encourage private companies to take risks and break new ground. And when those risks pay off and they field a new product, the system rewards them with more market share and, most importantly... profits!

What incentive does a company have to spend hard-earned capital, only to hand your competitor a free copy of what you just spent all that money to develop?

ATI should have to license the physics technology from nVidia if they want to utilize it. It's the only fair way to move on.

    "If the representatives of the people betray their constituents, there is then no resource left but in the exertion of that original right of self-defense which is paramount to all positive forms of government"
    -- Alexander Hamilton

  22. #97
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
How is it fair to make ATI pay to use a product that has nothing to do with them? The PhysX card isn't doing anything that the CPU can't do; you're just using an add-in card. Although I don't see the big deal, since PhysX as an API is bad; even if I could use it with my GPU, I don't think I would want to, even with free parts.
    Last edited by zanzabar; 08-09-2009 at 02:29 AM.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  23. #98
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Canada
    Posts
    1,397
    Quote Originally Posted by Andrew LB View Post
    Personally, I'm tired of hearing people complain about how their current favorite GPU or CPU manufacturer should somehow be "given" technology that the other company spent untold time and money to develop.

The free market system is designed to encourage private companies to take risks and break new ground. And when those risks pay off and they field a new product, the system rewards them with more market share and, most importantly... profits!

What incentive does a company have to spend hard-earned capital, only to hand your competitor a free copy of what you just spent all that money to develop?

ATI should have to license the physics technology from nVidia if they want to utilize it. It's the only fair way to move on.
I quite agree, but that's a slightly different scenario. In this case, the PhysX is being run on NVIDIA's own GPUs, unless of course someone else is doing the rendering. While there are likely some valid arguments in the programming/compatibility arena for this decision, it does smack a little of childishness: "You won't use our stuff 100%? Then take THIS - no accelerated PhysX for you."

Honestly, given the right titles, I wouldn't be entirely averse to throwing down ~$100 for a secondary card to get some sweet eye-candy at playable framerates. But dictating what the primary card has to be... meh.
    i7 2600K | ASUS Maximus IV GENE-Z | GTX Titan | Corsair DDR3-2133

  24. #99
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    I'm not ready to say that I won't buy a Nvidia GPU, as I still think they are leading the GPU revolution.

    That is of course just my opinion...

I am, however, willing to acknowledge that locking out PhysX on systems with an ATI card plugged in was a dirtbag move. (Also just my opinion.)
    Last edited by Talonman; 08-09-2009 at 02:46 AM.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  25. #100
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by MpG View Post
I quite agree, but that's a slightly different scenario. In this case, the PhysX is being run on NVIDIA's own GPUs, unless of course someone else is doing the rendering. While there are likely some valid arguments in the programming/compatibility arena for this decision, it does smack a little of childishness: "You won't use our stuff 100%? Then take THIS - no accelerated PhysX for you."

Honestly, given the right titles, I wouldn't be entirely averse to throwing down ~$100 for a secondary card to get some sweet eye-candy at playable framerates. But dictating what the primary card has to be... meh.
I can confirm that with any driver after 181.72, PhysX does not work on Windows 7 with an AMD card present.
From that, the ATI gfx card just has to be present, no matter whether it's primary or not.
