Page 5 of 5
Results 101 to 122 of 122

Thread: Nvidia GPU required for PhysX in Win 7

  1. #101
    I am Xtreme
    Join Date
    Jan 2006
    Location
    Australia! :)
    Posts
    6,096
    Quote Originally Posted by Talonman View Post
    I am however willing to acknowledge that locking out PhysX on systems having an ATI card plugged in, was a Dirt Bag move. (Also just my opinion.)
    'dirt bag move' is putting it mildly - what if I have a lowly PCI ATi card (say for extra monitors)... pathetic
    DNA = Design Not Accident
    DNA = Darwin Not Accurate

    heatware / ebay
    HARDWARE I only own Xeons, Extreme Editions & Lian Li's
    https://prism-break.org/

  2. #102
    Xtreme Addict
    Join Date
    May 2003
    Location
    Peoples Republic of Kalifornia
    Posts
    1,541
    I quite agree, but that's a slightly different scenario. In this case, the Physx is being run on Nvidia's own GPU's - unless of course, someone else is doing the rendering. While there's likely some valid arguments in the programming/compatibility arena for this decision, it does smack a little of childishness - "You won't use our stuff 100%? Then take THIS - no accelerated Physx for you."
I don't think it's childish. It's more like a smart (yet tough to make) business move.

I don't recall seeing Intel going out on a limb with their wallet to fix the issues nVidia had with the 680i and 780i chipsets. Why pay good money to let consumers use the PhysX technology that you paid a lot of money to design, on a cheap piggy-backed card that improves your competitor's product?

And again, it's not that nVidia is preventing anyone from using PhysX. They're just telling people that, due to design compatibility issues, they don't think it's worth the investment to help their competitor use a feature designed for nVidia cards, just so people can go the cheap way out and buy an ATI card plus an older, much cheaper nVidia card for PhysX.


Next thing you'll see is ATI users complaining because nVidia won't "give" ATI cGPU CUDA and SLI support.

    Like most businesses, the goal is to MAKE money by innovating, bringing this new technology to consumers.

I'm not sure if it's because of the high number of European members here, but for some reason these very simple concepts regarding the free market system just seem to be overlooked.

Or perhaps it's because most Western European languages lack a word in their vocabulary which is essential to understanding capitalism. That word is "Earn", and last I checked, states like Germany, France, and Spain do not have a word with the exact same meaning as our word "Earn". Most words that people tend to associate with "Earn" in Western Europe actually have a definition such as "to receive" or "to be given". The actual definition of "Earn" is "to gain or get in return compensation through merit for one's labor or service: to earn one's living".


    Quote Originally Posted by perkam View Post
    Those that chose an ATI card but still bought an nvidia card for Physx engine are now screwed.
    Nah. You'll more than likely see mainstream PC users just go with nVidia from the beginning. And I doubt you'll get many regular nVidia users to sell their cards off just so they can buy ATI because they're mad.

    The only upset people I've seen are the ATI users who wouldn't buy an nVidia card for their main graphics.

    Did you honestly think ATI owners were going to see nVidia either let ATI run PhysX or let them piggy back an $80 card and later convert because "nVidia is just such a nice company..."?

I personally don't even like the idea of running an ATI main card with a cheap nVidia card for PhysX. Getting two different model nVidia cards would have been problematic... and you expect nVidia to pay development costs out of their own pocket so that ATI users can go cheap, buying their underpriced ATI main card and a second cheap nVidia card for PhysX? Would have been more of a problem than it was worth IMO.

    Congrats, Nvidia. You're losing money already, and now you'll lose a little bit more.
    Nah. You're going to see plenty of the mainstream "uninformed" public just go with nVidia because they get to run PhysX. I'm sure they'll draw far more new customers over this than they'll actually lose.

I haven't owned an ATI card since the 9700pro and by the looks of things, it's going to stay that way.

You guys should vent your anger at ATI for not bringing PhysX and their own CUDA-type technology to their cards. Or if ATI is down to PAY nVidia to have them work out the compatibility bugs between the two platforms... I'm sure nVidia would accept.

    Can you imagine a company punishing the consumer for buying another vendor's cards ??

    Perkam
Could you imagine a company PAYING out of their own pocket to develop the driver adjustments necessary to allow a consumer to use a competitor's product instead of buying their own?

ATI is the only one here to blame, since they haven't gotten PhysX and their own CUDA-type technology running well on their cards.

    "If the representatives of the people betray their constituents, there is then no resource left but in the exertion of that original right of self-defense which is paramount to all positive forms of government"
    -- Alexander Hamilton

  3. #103
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,034
    Quote Originally Posted by Andrew LB View Post
    Could you imagine a company PAYING out of their own pocket to develop the necessary driver adjustments to allow a consumer to use a competitors product instead of buying their own?

    ATI is the only one here to blame since they haven't gotten physX and their CUDA type technology to run on their own cards well.
They aren't paying anything out of their own pocket if one bought one of their cards dedicated to PhysX.

That last part is bogus; go read up on it before making such a ridiculous statement.
    SweClockers.com

    CPU: Phenom II X4 955BE
    Clock: 4200MHz 1.4375v
    Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
    Motherboard: ASUS Crosshair IV Formula
    GPU: HD 5770

  4. #104
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    - On my system in nvidia drivers 181.71 the Nvidia Control Panel will crash on launch if any ATI monitors are enabled
    - On newer drivers (185.81 and up) the control panel works fine, but if the ATI card is not disabled in the Device Manager the PhysX option will not show at all.
    - Enabling the ATI card while the Nvidia Control Panel is open causes it to refresh, once again removing the PhysX option
    Well that's enough for me.
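Those symptoms add up to a simple vendor check: the PhysX option is exposed only while every enabled display adapter is NVIDIA's. A minimal sketch of that lockout logic, assuming only the real PCI vendor IDs (0x10DE = NVIDIA, 0x1002 = ATI/AMD); the function and adapter lists are hypothetical, not the actual driver code:

```python
# Hypothetical sketch of the lockout described above: hide GPU PhysX as
# soon as any enabled display adapter is not NVIDIA's. Only the PCI
# vendor IDs are real; everything else is illustrative.

NVIDIA = 0x10DE   # real PCI vendor ID for NVIDIA
ATI = 0x1002      # real PCI vendor ID for ATI/AMD

def physx_available(active_adapters):
    """True only if an NVIDIA adapter is present and no foreign adapter is enabled."""
    has_nvidia = any(v == NVIDIA for v in active_adapters)
    all_nvidia = all(v == NVIDIA for v in active_adapters)
    return has_nvidia and all_nvidia

# GeForce alone: the PhysX option shows up.
print(physx_available([NVIDIA]))        # True
# Enable the ATI card and the option disappears, as reported above.
print(physx_available([NVIDIA, ATI]))   # False
```

Disabling the ATI card in Device Manager would remove it from the active adapter list, which matches the observation that the PhysX option then reappears.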

  5. #105
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    Quote Originally Posted by Andrew LB View Post
I'm not sure if it's because of the high number of European members here, but for some reason these very simple concepts regarding the free market system just seem to be overlooked.

That word is "Earn", and last I checked, states like Germany, France, and Spain do not have a word with the exact same meaning as our word "Earn". Most words that people tend to associate with "Earn" in Western Europe actually have a definition such as "to receive" or "to be given". The actual definition of "Earn" is "to gain or get in return compensation through merit for one's labor or service: to earn one's living".
Check again.
receive - bekommen
earn - verdienen

Additionally, IIRC doing things just in order to hurt competitors isn't allowed in the US either. Correct me if I'm wrong though.
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  6. #106
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
How can people defend Nvidia? They're wrong. We'll buy their cards to use PhysX, but that's not good enough for them. Can't wait for free and better OpenCL + DX11 - bye-bye, PhysX.
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  7. #107
    Xtreme Enthusiast
    Join Date
    Apr 2006
    Posts
    939
    Quote Originally Posted by Andrew LB View Post
    I don't think it's childish. But more like a smart (yet tough to make) business move.
    Wow, the people here are so polite to you. On the other forums I visit you would have been beaten in the hopes of retarding you beyond the point of being able to read and write.

  8. #108
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
    Quote Originally Posted by Iconyu View Post
    Wow, the people here are so polite to you. On the other forums I visit you would have been beaten in the hopes of retarding you beyond the point of being able to read and write.
He's just a fanboy, ignore him. He probably has shares in them lol
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  9. #109
    Xtreme Addict
    Join Date
    Jun 2006
    Posts
    1,820
    Quote Originally Posted by Final8ty View Post
I have had 1800xt, 1900xt + 1900CF, and 3870 Quad-fire through 6 different motherboards & have had every ATI driver update for those cards, which is about 60 of them, & have only ever had a problem with 2 sets ever.
I did say in MY experience (and that of a few friends), ATI drivers sucked way more. I doubt they yet work well with Srv03 and 4GB of RAM. Before Cat 9.2 (or 9.3, not sure) they would hang the OS before the logon screen; later, overlay didn't work with 4GB; then playing two videos at the same time caused issues... and it wasn't just my machine - it would happen on any Srv03 box with 4GB of RAM (maybe with more as well). I know because I tried several machines.
    nVidia had no issues in this config for ages before ATI.
    ATI drivers just hate PAE -> sucky drivers.
    Last edited by alfaunits; 08-09-2009 at 06:42 AM.
    P5E64_Evo/QX9650, 4x X25-E SSD - gimme speed..
    Quote Originally Posted by MR_SmartAss View Post
    Lately there has been a lot of BS(Dave_Graham where are you?)

  10. #110
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
Quote Originally Posted by alfaunits View Post
Quote Originally Posted by Final8ty View Post
I have had 1800xt 1900xt+1900CF 3870-Quad fire through 6 different motherboards & have had every ATI driver update for those cards & which is about 60 of them & have only ever had a problem with 2 sets ever.

I did say in MY experience (and that of a few friends), ATI drivers sucked way more. I doubt they yet work well with Srv03 and 4GB of RAM. Before Cat 9.2 (or 9.3, not sure) they would hang the OS before the logon screen; later, overlay didn't work with 4GB; then playing two videos at the same time caused issues... and it wasn't just my machine - it would happen on any Srv03 box with 4GB of RAM (maybe with more as well). I know because I tried several machines.
    nVidia had no issues in this config for ages before ATI.
    ATI drivers just hate PAE -> sucky drivers.
The problem is that you're asking others to make their decisions based on your experience.
It's about getting the right combination of hardware that works together at the driver level to get the best success rate, and I have pulled it off by researching which motherboards work best with my choice of graphics card and RAM, and whether there are any possible BIOS issues that can affect both.

    I have never used less than 4GB of ram from day one.

  11. #111
    Xtreme Addict
    Join Date
    Nov 2006
    Location
    Red Maple Leaf
    Posts
    1,556
    Quote Originally Posted by Andrew LB View Post
    Keeping a good image is important, but only when it comes to the very small portion of video card consumers who actually read this kind of news chatter.
    We're not talking about consumers. We're talking about CUSTOMERS.

Most customers are very well educated about graphics chips. The majority of nVidia's sales are to OEMs and to the enthusiast market. Look at the recent nVidia BS: the company sold faulty GPUs (G80 and G92) to OEMs and recently had to pay two huge fines to them to partly correct the issues consumers have been having. You can bet that in the future the OEMs will be less willing to buy from nVidia, and will only do so at a lower price.
    E8400 @ 4.0 | ASUS P5Q-E P45 | 4GB Mushkin Redline DDR2-1000 | WD SE16 640GB | HD4870 ASUS Top | Antec 300 | Noctua & Thermalright Cool
    Windows 7 Professional x64


    Vista & Seven Tweaks, Tips, and Tutorials: http://www.vistax64.com/

    Game's running choppy? See: http://www.tweakguides.com/

  12. #112
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    Quote Originally Posted by alfaunits View Post
    [...]
    So, you're running a Server OS, experience problems and then you go on claiming that AMD/ATI's drivers suck in general? Way to go...

    I'm happy if it stays that way, as long as the drivers keep on being as stable as they've proven to me many times when using a regular desktop OS.

Just because you know some who have problems with AMD/ATI's drivers proves nothing. I know some who had or still have problems with NVIDIA's drivers as well (one even sold his GTX280-SLI due to driver related problems), but still I don't go on and claim NVIDIA's drivers suck as hell. I never had any serious problems with either NVIDIA's or AMD/ATI's drivers, period.
    Last edited by FischOderAal; 08-09-2009 at 06:59 AM.
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  13. #113
    Xtreme Addict
    Join Date
    Nov 2006
    Location
    Red Maple Leaf
    Posts
    1,556
    Quote Originally Posted by Andrew LB View Post
    I don't think it's childish. But more like a smart (yet tough to make) business move.
You would not do well in business. This is not a move that nVidia can afford to make, because there are other alternatives to PhysX, like the physics engines built into most games, which are perfectly fine on their own.

    They're just telling people that due to design compatibility issues, they don't think it's worth the investment to help their competitor use a feature that's designed for nVidia cards, so they can go the cheap way out and buy an ATI card with an older and much cheaper nVidia card for PhysX.
    Which would be perfectly fine for PhysX. You can do that with a couple of cheap, old nVidia cards. This is locking out the competitor's product plain and simple.

    You're going to see plenty of the mainstream "uninformed" public just go with nVidia because they get to run PhysX. I'm sure they'll draw far more new customers over this than they'll actually lose.
    Only problem is that PhysX is an enthusiast product and most people that buy it are well informed.

    I haven't owned an ATI card since the 9700pro and by the looks of things, It's going to stay that way.
    Nobody cares.

    ATI is the only one here to blame since they haven't gotten physX and their CUDA type technology to run on their own cards well.
    Look into the PhysX and CUDA licensing issue before you run your mouth.
    Last edited by sierra_bound; 08-10-2009 at 12:18 AM.
    E8400 @ 4.0 | ASUS P5Q-E P45 | 4GB Mushkin Redline DDR2-1000 | WD SE16 640GB | HD4870 ASUS Top | Antec 300 | Noctua & Thermalright Cool
    Windows 7 Professional x64


    Vista & Seven Tweaks, Tips, and Tutorials: http://www.vistax64.com/

    Game's running choppy? See: http://www.tweakguides.com/

  14. #114
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
There's no need to call someone names, B.E.E.F. I think we are all grown-ups who are able to argue on a reasonable basis.
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  15. #115
    Xtreme Member
    Join Date
    Jan 2005
    Location
    No(r)way
    Posts
    452
    Quote Originally Posted by Andrew LB View Post
    And again, it's not that nVidia is preventing anyone from using PhysX.
Yes they are: they are preventing me from using my 8800GT for running PhysX. They advertise it as a feature of their GPUs, Windows 7 has a feature that lets us run GPUs from multiple vendors, and then they come out and retract support for that feature unless we run nVidia GPUs only.
This flawed premise undermines the entire rest of your post.
    As for nvidia, bananas and bankruptcy to you.
    Obsolescence be thy name

  16. #116
    Xtreme Enthusiast
    Join Date
    Apr 2006
    Posts
    939
    Quote Originally Posted by FischOderAal View Post
    There's no need to call someone names B.E.E.F.. I think we are all grown-ups that are able to argue on a reasonable basis.
Nah, I think BEEF summed him up pretty well - the guy implied that Europe doesn't even understand the concept of earning over receiving. Which is why I responded as I did.

  17. #117
    Xtreme Addict
    Join Date
    Dec 2005
    Location
    UK
    Posts
    1,713
For a company that's losing money and desperately trying to push PhysX into the mainstream, they sure have a FUBAR way of doing it. It's this kind of anti-competitive practice that makes the IT world so... FUBAR.
    TAMGc5: PhII X4 945, Gigabyte GA-MA790X-UD3P, 2x Kingston PC2-6400 HyperX CL4 2GB, 2x ASUS HD 5770 CUcore Xfire, Razer Barracuda AC1, Win8 Pro x64 (Current)

    TAMGc6: AMD FX, Gigabyte GA-xxxx-UDx, 8GB/16GB DDR3, Nvidia 680 GTX, ASUS Xonar, 2x 120/160GB SSD, 1x WD Caviar Black 1TB SATA 6Gb/s, Win8 Pro x64 (Planned)

  18. #118
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by zanzabar View Post
What sources? Look at Red Faction Guerrilla for how good Havok is, and it was designed for the CPU
Don't be so easily fooled. Red Faction looks good but it's mostly simple rigid body physics covered up with a bunch of particle effects. Try breaking a wall: you'll see some big blocks break off and a lot of tiny bits. All those tiny bits then disappear - i.e. graphical particle effects, not physics.
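That distinction - simulated rigid bodies that collide versus cosmetic debris that merely expires - can be sketched in a few lines. This is a toy illustration, not Havok's or Red Faction's actual code; every name and constant here is made up:

```python
# Toy sketch of the distinction above: a "real" physics object collides
# and comes to rest, while a cosmetic particle just flies on a fixed
# lifetime and then vanishes, even in mid-air.

GRAVITY = -9.8
DT = 0.1

def step_rigid_body(y, vy):
    """Integrate one step and collide with the ground plane at y = 0."""
    vy += GRAVITY * DT
    y += vy * DT
    if y <= 0.0:          # collision response: clamp and kill velocity
        y, vy = 0.0, 0.0
    return y, vy

def step_particle(y, vy, ttl):
    """Same integration, but no collision - the particle just expires."""
    vy += GRAVITY * DT
    y += vy * DT
    return y, vy, ttl - DT  # despawned once ttl <= 0

# A rigid chunk settles on the floor...
y, vy = 5.0, 0.0
for _ in range(100):
    y, vy = step_rigid_body(y, vy)
print(y)      # 0.0 - resting on the ground

# ...while a particle falls straight through it and despawns.
y, vy, ttl = 5.0, 0.0, 1.0
while ttl > 0:
    y, vy, ttl = step_particle(y, vy, ttl)
print(y < 0)  # True - it was never really "there"
```

The tell described above (tiny bits that disappear) is exactly the `ttl` path: no collision test is ever run for them.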

Intel has said that it will support OpenCL on Havok
    Hopefully, that remains to be seen.

it's not new that DX11 will have physics.
You should tell Microsoft then, because they don't seem to know that somebody stuck physics middleware into DirectX when they weren't looking.
    Last edited by trinibwoy; 08-09-2009 at 08:08 AM.

  19. #119
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Jamesrt2004 View Post
    hes just a fanboy ignore him. he probably has shares in them lol
I have shares of Nvidia. Do you want me to flush out the board of directors? Quit trolling, it's just PhysX - it's in like 3 good games.

  20. #120
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by Andrew LB View Post
Or perhaps it's because most Western European languages lack a word in their vocabulary which is essential to understanding capitalism. That word is "Earn", and last I checked, states like Germany, France, and Spain do not have a word with the exact same meaning as our word "Earn". Most words that people tend to associate with "Earn" in Western Europe actually have a definition such as "to receive" or "to be given". The actual definition of "Earn" is "to gain or get in return compensation through merit for one's labor or service: to earn one's living".
It seems you don't know what you're talking about. Way, way before you were even colonized, we had the verb "ganar" and the expression "ganarse la vida", which means exactly "to earn one's living". And I can say it in 5-10 different ways. Every European here will tell you the same about their respective languages. And last time I checked, you were speaking a European language.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  21. #121
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
Quote Originally Posted by trinibwoy View Post
Don't be so easily fooled. Red Faction looks good but it's mostly simple rigid body physics covered up with a bunch of particle effects. Try breaking a wall, you'll see some big blocks break off and a lot of tiny bits. All those tiny bits then disappear - i.e. graphical particle effects, not physics.
Don't know about the game...

Quote Originally Posted by trinibwoy View Post
Hopefully, that remains to be seen.
Ah, tbh I think they will, but not for a while (not until they're finalizing Larrabee etc., so they can get it to work nicely on their hardware and do a co-launch kind of thing).

Quote Originally Posted by trinibwoy View Post
You should tell Microsoft then, because they don't seem to know that somebody stuck physics middleware into DirectX when they weren't looking.
IIRC DirectX comes with some open standard of physics? I've heard that for a long time too, but I'm not sure.




    Quote Originally Posted by STaRGaZeR View Post
It seems you don't know what you're talking about. Way, way before you were even colonized, we had the verb "ganar" and the expression "ganarse la vida", which means exactly "to earn one's living". And I can say it in 5-10 different ways. Every European here will tell you the same about their respective languages.
Ignore him, he's totally ignorant, and a fanboy anyway.
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  22. #122
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Jamesrt2004 View Post
IIRC DirectX comes with some open standard of physics? I've heard that for a long time too, but I'm not sure.
The "physics in DX" nonsense is dreamed up by people on forums. If there is to be a CS- or OpenCL-based competitor to PhysX, somebody has to write it. Think of it this way: does DX give you graphics? No, it doesn't - game developers have to code their games. In any case, why don't people just read Microsoft's own documentation?
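The argument above can be made concrete: a compute API only executes whatever kernels you hand it; the physics solver itself is middleware someone has to write. A hedged toy sketch - every class and function name here is hypothetical, not a real DirectX or PhysX API:

```python
# Illustration of the point above: the compute layer runs code, nothing
# more. The "physics" only exists because the developer wrote it.
# All names here are made up for the sketch.

class ComputeAPI:
    """Stand-in for a GPU compute API (DirectCompute/OpenCL-style)."""
    def dispatch(self, kernel, data):
        # The API has no idea this kernel is "physics" - it just runs it.
        return kernel(data)

def integrate_velocities(bodies):
    """Developer-written kernel: apply one step of gravity to each body."""
    dt, g = 1.0 / 60.0, -9.8
    return [(x, y, vx, vy + g * dt) for (x, y, vx, vy) in bodies]

api = ComputeAPI()
bodies = [(0.0, 10.0, 1.0, 0.0)]          # (x, y, vx, vy)
bodies = api.dispatch(integrate_velocities, bodies)
print(bodies[0][3] < 0)   # True: gravity applied - but only because we wrote it
```

Swap in a real compute API and the shape stays the same: the solver kernels are the middleware, and they don't come with the graphics API.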

