Page 5 of 5
Results 101 to 120 of 120

Thread: Nvidia: Hybrid PhysX Is Technically Impossible

  1. #101
    Xtreme Cruncher
    Join Date
    Jul 2006
    Location
    Overland Park, KS
    Posts
    468
    Quote Originally Posted by Final8ty View Post
    I would like to see quotes that users here said that they want physX on ATI cards.

    And as for the blaming: what's new when it comes to pointing fingers at why someone's setup is playing up? That's part of the platform, and all involved get blamed daily for things that are not of their doing. If NV can't take it then they should get off the open PC platform and make their own closed one, like Apple.
    Quote Originally Posted by brinox
    I want to run my HD 4890 with my 9800 GT EE picking up the offloaded hardware physx abilities. Better yet, I want to use my BFG/AGEIA PPU that I purchased forever ago to work with my ATI card, as it was originally marketed.
    quoted. 10 chars
    for the glory of bardob!



  2. #102
    Xtreme Addict
    Join Date
    Oct 2007
    Location
    Chicago,Illinois
    Posts
    1,182
    Shift installs the 9.09.072 drivers, which disable the Ageia PPU. You need 9.09.0814 and a modded .dll.

    Also

    THE LAST REMNANT and GOTHIC 3 WORK
    Last edited by Hell Hound; 05-19-2010 at 01:52 PM.



  3. #103
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by brinox View Post
    quoted. 10 chars
    Originally Posted by brinox
    I want to run my HD 4890 with my 9800 GT EE picking up the offloaded hardware physx abilities. Better yet, I want to use my BFG/AGEIA PPU that I purchased forever ago to work with my ATI card, as it was originally marketed.
    That is not someone asking for PhysX on ATI cards; that is someone who has a PPU that stopped being supported and now wants to see if he can use a 9800 GT EE to do the job of the PPU.

    The difference is that "on" = PhysX running on ATI cards, while "with" = running an ATI card with an NV/PPU card to do the PhysX.
    Last edited by Final8ty; 05-06-2010 at 06:32 PM.

  4. #104
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Hell Hound View Post
    Just checked Legendary; it uses the AGEIA PPU and also my AMBX kit.
    Well, I'll have to pick one up.
    The thing about a standalone PPU PhysX card is that it only adds & can never take away performance directly from the GPU.

    NV should have given the option of selling their lower-end GPUs as PPUs, by having no display ports and a BIOS change, and of course keeping the current options with standard GPUs.

  5. #105
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by Andrew LB View Post
    Really? Quite a few ATi users on this forum have commented on how they want to use PhysX on their ATi cards, as recently as the post just prior to yours. And while the majority simply want to be able to use an nVidia card as a dedicated PhysX card, plenty of members here continue to say that nV should allow everyone to use their API.
    Could not be more incorrect. There has not been one single ATi user in any thread asking for PhysX on ATi hardware.

    The only thing ATi users want is for PhysX not to be disabled on their nV secondary GPU when their primary GPU is not an nV GPU.

    The graphics drivers do not interact when one is performing PhysX operations and the other is performing graphics rendering work. nVidia is only throwing this in as a bone to themselves. I guess saying "using PhysX in a system with a non nVidia GPU may cause performance and/or stability issues" is too easy for nV.

    All along the watchtower the watchmen watch the eternal return.

  6. #106
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    I fail to understand NVIDIA's logic with this. Every GeForce sold is better than none, right? If they expect existing Radeon users to switch from a great Radeon to a higher-range GeForce just because of PhysX, I don't know what planet they came up with this idea on, because I don't know anyone who would do that. But I know plenty of people who would buy an extra mid-range GeForce card just so PhysX works alongside their existing Radeon.
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))

  7. #107
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Quote Originally Posted by saaya View Post
    bs... if you set physix to max even a dedicated 250 isnt enough... even with a dedicated 260 the fps drop below 30 in physix heavy scenes... if those scenes would look mouthwatering, who would care... but they dont... not at all...
    Did you even click on the link? The link shows that a GTS 250 can achieve very playable frames with maxed out PhysX at 1050p and below. It's only when you start increasing the resolution/AA that it becomes unplayable.

    This mirrors my own experience. When I had my 24 inch monitor (1920x1200), I could easily play Batman AA with maxed out PhysX and settings on a single overclocked GTX 285.

    When I got my 30 inch monitor though, I had to get a dedicated PhysX card because the 3D load had increased substantially.

    gpu physix are dead imo... gpu physix this and that... all the hype... for almost 10 years now... and still, today gpu physics cant offer anything worth the effort that cpu physix couldnt do as well...
    Premature to say the least. Anyway, unless you believe CPUs and GPUs are equally capable in FPU work (which we all know isn't true), then it's ridiculous to state that GPU accelerated physics can't offer anything that CPU physics can already do.

    sure the gpu CAN do physix as well, im sure there are a lot of things it can do better than a cpu... but do those things really matter and add value to a game that is in any reasonable ratio to the work it requires to implement? clearly not...
    This assertion has nothing to do with the intrinsic value of GPU accelerated PhysX, and everything to do with GAME DEVELOPERS.

    After all, it's game developers that decide how to implement physics in their games. If games have lackluster physics, do you blame the physics API or the game developers?

    A reasonable person would place the blame squarely on the developer, much as they would if a game had lousy graphics, a boring storyline and crashed to the desktop every 10 minutes.

    Granted, some 3D engines or Physics APIs are inherently better than others, but ultimately, it's the talent, skill and creativity of the developer that makes or breaks a game.
    Last edited by Carfax; 05-06-2010 at 10:36 PM.
    Intel Core i7 6900K
    Noctua NH-D15
    Asus X99A II
    32 GB G.Skill TridentZ @ 3400 CL15 CR1
    NVidia Titan Xp
    Creative Sound BlasterX AE-5
    Sennheiser HD-598
    Samsung 960 Pro 1TB
    Western Digital Raptor 600GB
    Asus 12x Blu-Ray Burner
    Sony Optiarc 24x DVD Burner with NEC chipset
    Antec HCP-1200w Power Supply
    Viewsonic XG2703-GS
    Thermaltake Level 10 GT Snow Edition
    Logitech G502 gaming mouse w/Razer Exact Mat
    Logitech G910 mechanical gaming keyboard
    Windows 8 x64 Pro

  8. #108
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Quote Originally Posted by aztec View Post
    The only spin-meisters here are you and some of the other Nv crackheads.
    Well at least you show your true colors now.. From one FanATIc to another

    Yes, we'll want to run PhysX on a single card...if you still have a Tandy 13" dia. screen/640 res/16 colors and load in King's Quest on floppy. Oughta get some playable FPS then, yep.

    Or...maybe not.
    I provided evidence that PhysX can run on a single graphics card with decent performance.

    You on the other hand, haven't provided squat to back up your claims other than some nonsensical fanATIcal diatribe

    I've used Nv and ATI and liked them both. But this Nv + PhysX is just another steaming pile from their marketing division, as others have pointed out.

    Get over it.
    From the bitter tone of your comment, it sounds like you're the one that needs to get over it..

  9. #109
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    I looked at your link. 34fps avg at 1920x1080 4xAA PhysX On with a single 285 is hardly playable... and it drops to a 19fps average on a more intensive (scarecrow) benchmark. You can get by with 30-40fps on some engines and in some single player games, but honestly single player games have a very short shelf life, and engines which pull your FPS down unrealistically for their graphical display prowess suffer from the Crysis effect: useless benchmarks only.


  10. #110
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Quote Originally Posted by STEvil View Post
    I looked at your link. 34fps avg at 1920x1080 4xAA PhysX On with a single 285 is hardly playable... and it drops to a 19fps average on a more intensive (scarecrow) benchmark. You can get by with 30-40fps on some engines and in some single player games, but honestly single player games have a very short shelf life, and engines which pull your FPS down unrealistically for their graphical display prowess suffer from the Crysis effect: useless benchmarks only.
    The link was in reference to the single GTS 250 getting an average FPS of 38 in the Scarecrow level with 2xAA at 1680x1050, with maxed settings.

    Disable AA and the frames will increase even more..

    Same thing with the GTX 285. At 1920x1200 4xAA with all settings maxed, the framerate isn't playable because the GPU is over burdened with 3D rendering.

    However, if you turn down the AA to 2xAA or disable it entirely, you will have playable frames.

    But the point is that you guys are making "absolute" statements that you can't run PhysX and 3D on the same card, which is false. I did it before when I had my GTX 285, and there are many others that have done it and are doing it.

    Might you have to turn down some 3D settings a bit? Perhaps. But that doesn't disqualify the assertion that PhysX and 3D can be run on the same card.
    Last edited by Carfax; 05-07-2010 at 12:21 AM.

  11. #111
    Xtreme Member
    Join Date
    Apr 2006
    Location
    United Corporate States of Neo-Feudal Amurika, Inc.
    Posts
    464
    Quote Originally Posted by Carfax View Post
    Well at least you show your true colors now.. From one FanATIc to another



    I provided evidence that PhysX can run on a single graphics card with decent performance.

    You on the other hand, haven't provided squat to back up your claims other than some nonsensical fanATIcal diatribe

    That would depend on what one calls "decent" performance. lol Keep trying, CF. :yawn:

    As I've said, I've used and like both brands but I know BS when I see it. You OTOH...

    So...who is really the fanatic here??

    AUDIO-ASUS Xonar DX SPKR-audioengine 5 CASE-Cooler Master Stacker RC-810-SSN1 CPU-E8400 - Q815 @ 4 GHz @ 1.23V FANS-Noctua GPU-EVGA GTX 660/2GB HDD-Raptor 150 ADFD + WD1600YS HSF-Noctua NH-U12 LCD-NEC 20WMGX² @ 1680x1050 MOBO-abit IP35 Pro - BIOS 16 + bolt mod OS-XP Pro x64 PSU-XFX 750W Black Edition RAM-G.Skill PC2-8800 Pi 2x2GB @ 1,128 @ 1.92v TIM-Arctic Cooling MX-2 UPS-TRIPPLITE SU1000XLa + Noctua fan mod

  12. #112
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Carfax View Post
    Did you even click on the link?
    no... i played BAA on my core i7 rig, i dont need an article to tell me what i experienced first hand... i was on dual 250s, had to set one as dedicated physix to get ok perf, even then it wasnt great, swapped them for 2 260s and even with a dedicated 260 with physix set to max it STILL stuttered and min fps were way below what youd expect from such a highend system... ESPECIALLY since the effects were nothing special at all...

    Quote Originally Posted by Carfax View Post
    it's ridiculous to state that GPU accelerated physics can't offer anything that CPU physics can already do.
    i didnt say that... i said nothing worthwhile...
    im sure there is a lot you can do on a gpu you cant do on a cpu... but is it worth it? i havent seen anything so far that is...

    Quote Originally Posted by Carfax View Post
    A reasonable person would place the blame squarely on the developer, much as they would if a game had lousy graphics, a boring storyline and crashed to the desktop every 10 minutes.

    Granted, some 3D engines or Physics APIs are inherently better than others, but ultimately, it's the talent, skill and creativity of the developer that makes or breaks a game.
    so physix is awesome but there isnt ONE game dev that can use it properly, thats why physix implementations arent really overwhelming and cause massive fps drops?

    Quote Originally Posted by Carfax View Post
    The link was in reference to the single GTS 250 getting an average FPS of 38 in the Scarecrow level with 2xAA at 1680x1050, with maxed settings.
    with physix what you need to look at is minfps... thats what tells you if you need a dedicated card for physix or a more powerful card for it...
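    The min-fps point above is easy to make concrete. This sketch (not from the thread; all frame times are hypothetical milliseconds) shows how a couple of PhysX-heavy spikes barely dent the average FPS while the minimum collapses:

    ```python
    # Hypothetical frame times (ms): mostly ~60fps frames, with two PhysX spikes.
    frame_times_ms = [16, 17, 16, 18, 90, 16, 17, 85, 16, 17]

    # Average FPS: total frames divided by total elapsed time.
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

    # Minimum FPS: determined by the single slowest frame.
    min_fps = 1000 / max(frame_times_ms)

    print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")
    # -> average: 32 fps, minimum: 11 fps
    ```

    The average still looks borderline playable, but the 11fps dips are exactly the stutter being described.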

  13. #113
    Xtreme Mentor
    Join Date
    Feb 2009
    Location
    Bangkok,Thailand (DamHot)
    Posts
    2,693
    i saw that FAQ / Q&A many months ago
    Intel Core i5 6600K + ASRock Z170 OC Formula + Galax HOF 4000 (8GBx2) + Antec 1200W OC Version
    EK SupremeHF + BlackIce GTX360 + Swiftech 655 + XSPC ResTop
    Macbook Pro 15" Late 2011 (i7 2760QM + HD 6770M)
    Samsung Galaxy Note 10.1 (2014) , Huawei Nexus 6P
    [history system]80286 80386 80486 Cyrix K5 Pentium133 Pentium II Duron1G Athlon1G E2180 E3300 E5300 E7200 E8200 E8400 E8500 E8600 Q9550 QX6800 X3-720BE i7-920 i3-530 i5-750 Semp140@x2 955BE X4-B55 Q6600 i5-2500K i7-2600K X4-B60 X6-1055T FX-8120 i7-4790K

  14. #114
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by Carfax View Post
    The link was in reference to the single GTS 250 getting an average FPS of 38 in the Scarecrow level with 2xAA at 1680x1050, with maxed settings.

    Disable AA and the frames will increase even more..

    Same thing with the GTX 285. At 1920x1200 4xAA with all settings maxed, the framerate isn't playable because the GPU is over burdened with 3D rendering.

    However, if you turn down the AA to 2xAA or disable it entirely, you will have playable frames.

    But the point is that you guys are making "absolute" statements that you can't run PhysX and 3D on the same card, which is false. I did it before when I had my GTX 285, and there are many others that have done it and are doing it.

    Might you have to turn down some 3D settings a bit? Perhaps. But that doesn't disqualify the assertion that PhysX and 3D can be run on the same card.
    do you know of any benchmarks that show the fps loss when turning on physx? i would like to see how much performance is required to handle those things. and if you can, find one that shows how much of the cpu is being used. and finally, one with a hacked version that runs it on the cpu entirely.

    what i expect to see is that you lose probably 30% of your average fps with physx on for any average card (meaning not spending $500 on the gpu setup)
    and i bet you'll see that these games are dual-core optimized and if you run physx on the cpu, it will be about the same, except your gpu can get more fps, even if there are a few spikes here and there due to cpu bottleneck.
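    The "~30% of your average fps" guess above is just arithmetic. A trivial sketch with hypothetical before/after numbers (not measured results):

    ```python
    # Hypothetical average FPS for the same scene, PhysX off vs on.
    fps_physx_off = 60.0
    fps_physx_on = 42.0

    # Relative cost of enabling PhysX, as a percentage of the baseline fps.
    loss_pct = 100 * (fps_physx_off - fps_physx_on) / fps_physx_off
    print(f"PhysX cost: {loss_pct:.0f}% of average fps")
    # -> PhysX cost: 30% of average fps
    ```

    A real test would repeat this per card and per resolution, since the relative cost shrinks as the GPU gets faster or the 3D load gets lighter.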

  15. #115
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    No, there is no reason why NVIDIA should allow AMD to use PhysX as well, but I see little reason to artificially block PhysX when a competitor's card is present, too. This is the same marketing BS as back in the days when NVIDIA said Intel's chipsets didn't have enough bandwidth for SLI, hence blocking SLI on those chipsets.

    The only ones losing are the customers, that's why I find it hard to understand those defending NVIDIA's actions.
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  16. #116
    Xtreme Addict
    Join Date
    Oct 2007
    Location
    Chicago,Illinois
    Posts
    1,182
    Also, the AGEIA PPU has a lot of life left in it, so why won't they support it? I remember when they were bought they said they would still support the PPU. LIARS. Carmack was right: they didn't care about the growth of the gaming industry, they just wanted some money.



  17. #117
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    Quote Originally Posted by Katanai View Post
    Stop the Nvidia hate already. You are reading that statement wrong:

    "Can I use an NVIDIA GPU as a PhysX processor and a non-NVIDIA GPU for regular display graphics? No. There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, NVIDIA PhysX technology has been fully verified and enabled using only NVIDIA GPUs for graphics."

    I've bolded the important part. That's their reason behind it. Just to make you understand: physx is just as much about hardware as it is about software. For example, physx may work now with the current ATI cards and drivers. But what if, let's say, a new ATI driver or card breaks this compatibility? What should Nvidia do then? Invest money and time to fix something their main competitor broke? Why should they do that? And why should they help their competitor in the first place? Physx is a selling point for Nvidia hardware. Why should they share that with ATI? If Nvidia allowed physx to work with ATI cards right now, through a new driver, they would instantly lose sales as people would upgrade to ATI cards and not Nvidia. I don't know how many people are held back on the green team by physx, but even if it's only one guy, in Korea somewhere, who would buy a 5870 instead of a GTX470 and keep his 8800GT for physx, they would lose a couple of hundred dollars to ATI. Why would they do that? Aren't you asking much of them here?
    If people were complaining about issues with PhysX not working properly because of an actual issue or glitch while using ATI graphics, it wouldn't be on Nvidia to fix it. I don't think many people would complain about it. However, this is not what the issue is. PhysX has and does work perfectly fine with ATI in the system and Nvidia is purposely disabling it and lying about the reasons.

    Nvidia is a bunch of bald-faced liars who, in my opinion, have committed (in the US) illegal marketing practices. They advertise the ability for cards to use PhysX (it says so right on the box), and then when put in the system it does not work if a card other than Nvidia's is detected.

    I would wager to say that had Nvidia not done this you would see wider adoption of PhysX by us, the customers. Because regardless if we choose Nvidia or ATI for our main graphics we could still buy an Nvidia card for PhysX support. But this isn't about PhysX and it certainly isn't about the money they paid for it. It is about marketing, because PhysX is another marketing gimmick by Nvidia not an actual physics platform.
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  18. #118
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Stukov View Post
    If people were complaining about issues with PhysX not working properly because of an actual issue or glitch while using ATI graphics, it wouldn't be on Nvidia to fix it. I don't think many people would complain about it. However, this is not what the issue is. PhysX has and does work perfectly fine with ATI in the system and Nvidia is purposely disabling it and lying about the reasons.

    Nvidia is a bunch of bald-faced liars who, in my opinion, have committed (in the US) illegal marketing practices. They advertise the ability for cards to use PhysX (it says so right on the box), and then when put in the system it does not work if a card other than Nvidia's is detected.

    I would wager to say that had Nvidia not done this you would see wider adoption of PhysX by us, the customers. Because regardless if we choose Nvidia or ATI for our main graphics we could still buy an Nvidia card for PhysX support. But this isn't about PhysX and it certainly isn't about the money they paid for it. It is about marketing, because PhysX is another marketing gimmick by Nvidia not an actual physics platform.
    Indeed. Havok is used for its physics because everyone can run it, while PhysX is used to add superficial gfx.

  19. #119
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by Carfax View Post
    The link was in reference to the single GTS 250 getting an average FPS of 38 in the Scarecrow level with 2xAA at 1680x1050, with maxed settings.

    Disable AA and the frames will increase even more..

    Same thing with the GTX 285. At 1920x1200 4xAA with all settings maxed, the framerate isn't playable because the GPU is over burdened with 3D rendering.

    However, if you turn down the AA to 2xAA or disable it entirely, you will have playable frames.

    But the point is that you guys are making "absolute" statements that you can't run PhysX and 3D on the same card, which is false. I did it before when I had my GTX 285, and there are many others that have done it and are doing it.

    Might you have to turn down some 3D settings a bit? Perhaps. But that doesn't disqualify the assertion that PhysX and 3D can be run on the same card.
    34fps is unplayable, but 38fps isnt?

    I agree turning down AA should boost performance, but unfortunately the benchmarks are incomplete and do not give performance for lesser AA levels.

    Given the minimum FPS numbers displayed with low average numbers and the "intensive" (scarecrow) benchmarks i'm going to sit with the "rendering and PhysX on the same card results in a poor gaming experience" crowd.


  20. #120
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Location
    London
    Posts
    577
    Quote Originally Posted by Hell Hound View Post
    Also, the AGEIA PPU has a lot of life left in it, so why won't they support it? I remember when they were bought they said they would still support the PPU. LIARS. Carmack was right: they didn't care about the growth of the gaming industry, they just wanted some money.
    This. It's quite unfortunate for the PPU owners as well. Nvidia are turning into a true bunch of :banana::banana::banana:gots

    Honestly, even the driver issue is bull$hit. Nvidia cards being used as a PPU should really be supported; i am sure an ATI driver update will not break that. Is there even any driver-level interaction between the two cards? No. So why would a secondary card being used for physics stop working?

    I guess Nvidia cannot bear the fact that their card would only run as a physics processing card while the ATI card would be the primary card. Yep
    i7 920@4.34 | Rampage II GENE | 6GB OCZ Reaper 1866 | 8800GT (zzz) | Corsair AX750 | Xonar Essence ST w/ 3x LME49720 | HiFiMAN EF2 Amplifier | Shure SRH840 | EK Supreme HF | Thermochill PA 120.3 | MCP355 | XSPC Reservoir | 3/8" ID Tubing

    Phenom 9950BE @ 3400/2000 (CPU/NB) | Gigabyte MA790GP-DS4H | HD4850 | 4GB Corsair DHX @850 | Corsair TX650W | T.R.U.E Push-Pull

    E2160 @3.06 | ASUS P5K-Pro | BFG 8800GT | 4GB G.Skill @ 1040 | 600W Tt PP

    A64 3000+ @2.87 | DFI-NF4 | 7800 GTX | Patriot 1GB DDR @610 | 550W FSP
