'dirt bag move' is putting it mildly - what if I have a lowly PCI ATi card (say for extra monitors)... pathetic :down:
I don't think it's childish, but more like a smart (yet tough to make) business move.
Quote:
I quite agree, but that's a slightly different scenario. In this case, PhysX is being run on nVidia's own GPUs - unless of course, someone else is doing the rendering. While there are likely some valid arguments in the programming/compatibility arena for this decision, it does smack a little of childishness - "You won't use our stuff 100%? Then take THIS - no accelerated PhysX for you."
I don't recall seeing Intel going out on a limb with their wallet to fix the issues nVidia had with the 680i and 780i chipsets. Why pay good money to let consumers use the PhysX technology that you paid a lot of money to design on a cheap piggy-backed card that's improving your competitor's product?
And again, it's not that nVidia is preventing anyone from using PhysX. They're just telling people that due to design compatibility issues, they don't think it's worth the investment to help their competitor use a feature that's designed for nVidia cards, so they can go the cheap way out and buy an ATI card with an older and much cheaper nVidia card for PhysX.
Next thing you'll see is ATI users complaining because nVidia won't "give" ATI cGPU CUDA and SLI support.
Like most businesses, the goal is to MAKE money by innovating, bringing this new technology to consumers.
I'm not sure if it's because of the high number of European members here, but for some reason these very simple concepts regarding the free market system just seem to be overlooked.
Or perhaps it's because most Western European languages lack a word in their vocabulary which is essential to understanding Capitalism. That word is "Earn", and last I checked, states like Germany, France, and Spain do not have a word which has the exact same meaning as our word "Earn". Most words that people tend to associate with "Earn" in Western Europe actually have a definition such as "To receive" or "To be Given". The actual definition of "Earn" is "to gain or get in return compensation through merit for one's labor or service: to earn one's living."
Nah. You'll more than likely see mainstream PC users just go with nVidia from the beginning. And I doubt you'll get many regular nVidia users to sell their cards off just so they can buy ATI because they're mad.
The only upset people I've seen are the ATI users who wouldn't buy an nVidia card for their main graphics.
Did you honestly think ATI owners were going to see nVidia either let ATI run PhysX or let them piggy back an $80 card and later convert because "nVidia is just such a nice company..."?
I personally don't even like the idea of running an ATI main card with a cheap nVidia card for PhysX. Getting two different model nVidia cards would have been problematic... and you expect nVidia to pay development costs out of their own pocket so that ATI users can go cheap and buy their underpriced ATI main card plus a second cheap nVidia card for PhysX? Would have been more of a problem than it was worth IMO.
Nah. You're going to see plenty of the mainstream "uninformed" public just go with nVidia because they get to run PhysX. I'm sure they'll draw far more new customers over this than they'll actually lose.
Quote:
Congrats, Nvidia. You're losing money already, and now you'll lose a little bit more.
I haven't owned an ATI card since the 9700 Pro and by the looks of things, it's going to stay that way.
You guys should vent your anger at ATI for not bringing PhysX and their own CUDA-type technology to their cards. Or if ATI is willing to PAY nVidia to have them work out the compatibility bugs between the two platforms... I'm sure nVidia would accept.
Could you imagine a company PAYING out of their own pocket to develop the necessary driver adjustments to allow a consumer to use a competitor's product instead of buying their own?
Quote:
Can you imagine a company punishing the consumer for buying another vendor's cards? :stick:
Perkam
ATI is the only one here to blame, since they haven't gotten PhysX and their own CUDA-type technology to run well on their cards.
Well that's enough for me.
Quote:
- On my system in nvidia drivers 181.71 the Nvidia Control Panel will crash on launch if any ATI monitors are enabled
- On newer drivers (185.81 and up) the control panel works fine, but if the ATI card is not disabled in the Device Manager the PhysX option will not show at all.
- Enabling the ATI card while the Nvidia Control Panel is open causes it to refresh, once again removing the PhysX option
How can people defend nVidia? They're in the wrong. We would buy their cards to use PhysX, but that's not good enough for them. Can't wait for free and better OpenCL + DX11. Bye-bye, PhysX.
I did say in MY experience (and that of a few friends), ATI drivers sucked way more. I doubt they work well yet on Server 2003 with 4GB of RAM. Before Cat 9.2 (or 9.3, not sure) they would hang the OS before the logon screen, later the overlay didn't work with 4GB, then two videos at the same time caused issues... and that's not just my case - it would happen on any Server 2003 box with 4GB of RAM (maybe with more as well). I know because I tried several machines.
nVidia had no issues in this config for ages before ATI.
ATI drivers just hate PAE -> sucky drivers.
Quote: Originally Posted by alfaunits
The problem is that you're asking others to make their decisions based on your experience.
It's about getting the right combination of hardware that works together at the driver level to get the best success rate, and I have pulled it off by researching which motherboards work best with my choice of gfx card and RAM, and whether there are any possible BIOS issues that can affect both.
I have never used less than 4GB of RAM from day one.
We're not talking about consumers. We're talking about CUSTOMERS.
Most customers are very well educated about graphics chips. The majority of nVidia's sales are to OEMs and to the enthusiast market. Look at the recent nVidia BS: the company sold faulty GPUs (G80 and G92) to OEMs and recently had to pay two huge fines to them to partly correct the issues consumers have been having. You can bet that in the future the OEMs will be less willing to buy from nVidia, and will only do so at a lower price.
So, you're running a Server OS, experience problems and then you go on claiming that AMD/ATI's drivers suck in general? Way to go... :down:
I'm happy if it stays that way, as long as the drivers keep on being as stable as they've proven to me many times when using a regular desktop OS.
The fact that you know some people who have problems with AMD/ATI's drivers proves nothing. I know some who had or still have problems with NVIDIA's drivers as well (one even sold his GTX 280 SLI setup due to driver-related problems), but I still don't go around claiming NVIDIA's drivers suck as hell. I never had any serious problems with either NVIDIA's or AMD/ATI's drivers, period.
You would not do well in business. This is not a move that nVidia can afford to make, because there are alternatives to PhysX, like the built-in physics engines in most games, which are perfectly fine on their own.
Which would be perfectly fine for PhysX. You can do that with a couple of cheap, old nVidia cards. This is locking out the competitor's product, plain and simple.
Quote:
They're just telling people that due to design compatibility issues, they don't think it's worth the investment to help their competitor use a feature that's designed for nVidia cards, so they can go the cheap way out and buy an ATI card with an older and much cheaper nVidia card for PhysX.
The only problem is that PhysX is an enthusiast product, and most people who buy it are well informed.
Quote:
You're going to see plenty of the mainstream "uninformed" public just go with nVidia because they get to run PhysX. I'm sure they'll draw far more new customers over this than they'll actually lose.
Nobody cares.
Quote:
I haven't owned an ATI card since the 9700 Pro and by the looks of things, it's going to stay that way.
Look into the PhysX and CUDA licensing issue before you run your mouth.
Quote:
ATI is the only one here to blame, since they haven't gotten PhysX and their own CUDA-type technology to run well on their cards.
There's no need to call someone names B.E.E.F.. I think we are all grown-ups that are able to argue on a reasonable basis.
Yes they are - they are preventing me from using my 8800 GT card for running PhysX. They advertise it as a feature of their GPUs, Windows 7 has a feature that allows us to run GPUs from multiple vendors, and then they come out and retract support for that feature unless we run nVidia GPUs only.
This premise makes the entire rest of your post flawed.
As for nvidia, bananas and bankruptcy to you.
For a company that's losing money and desperately trying to push PhysX into the mainstream, nVidia sure has a FUBAR way of doing it. It's this kind of anti-competitive practice that makes the IT world so... FUBAR.
Don't be so easily fooled. Red Faction looks good, but it's mostly simple rigid-body physics covered up with a bunch of particle effects. Try breaking a wall: you'll see some big blocks break off and a lot of tiny bits. All those tiny bits then disappear - i.e. graphical particle effects, not physics.
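The distinction above can be sketched in a few lines. This is a hypothetical toy example (not Red Faction's actual engine): rigid-body chunks persist in the simulation, while "dust" particles carry a fixed lifetime and simply vanish once it expires.

```python
class Debris:
    """A rigid-body chunk: stays in the world and keeps simulating."""
    def __init__(self, pos):
        self.pos = pos
        self.alive = True  # persists until explicitly removed

class Particle:
    """A visual-only bit: fades out after a fixed lifetime."""
    def __init__(self, pos, lifetime=0.5):
        self.pos = pos
        self.lifetime = lifetime

    def update(self, dt):
        self.lifetime -= dt

    @property
    def alive(self):
        return self.lifetime > 0

def break_wall():
    # Breaking a wall spawns a few big rigid chunks and many tiny particles.
    chunks = [Debris((x, 0.0)) for x in range(3)]
    dust = [Particle((x * 0.1, 0.0)) for x in range(50)]
    return chunks, dust

chunks, dust = break_wall()
# Simulate two seconds in 0.1 s steps: the dust "disappears", the chunks don't.
for _ in range(20):
    for p in dust:
        p.update(0.1)
dust = [p for p in dust if p.alive]
print(len(chunks), len(dust))  # prints "3 0": chunks persist, dust is gone
```

The giveaway the poster describes is exactly this: the expensive-looking debris is cheap, expiring particles, while only a handful of big chunks are real physics objects.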
Hopefully, that remains to be seen.
Quote:
Intel has said that it will support OpenCL on Havok
You should tell Microsoft then, because they don't know that somebody stuck physics middleware into DirectX when they weren't looking.
Quote:
It's not new that DX11 will have physics.
It seems you don't know what you're talking about. Way, way before you were even colonized we had the verb "ganar", and the expression "ganarse la vida", which means exactly "to earn one's living". And I can say it in 5-10 different ways. Every European here will tell you the same about their respective languages. And last time I checked, you were speaking a European language.
Don't know about the game...
Ah, tbh I think they will, but not for a while (not until they're finalizing Larrabee etc., so they can get it to work nicely on their hardware and do a co-launch kind of thing).
IIRC DirectX comes with some open standard for physics? Have heard that for a long time too, but not too sure. :)
Ignore him, he's totally ignorant, and a fanboy anyway.
The "physics in DX" nonsense is dreamed up by people on forums. If there is to be a Compute Shader or OpenCL-based competitor to PhysX, somebody has to write it. Think of it this way: does DX give you graphics? No, it doesn't - game developers have to code their games. In any case, why don't people just read Microsoft's own documentation? :shrug: