Page 2 of 4
Results 26 to 50 of 88

Thread: Futuremark: NVIDIA GPU PhysX not allowed

  1. #26
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Quote Originally Posted by Marios View Post
    Why is it OK for a CPU to do the job of a PPU and not a GPU?
    Unless we get zero points from a CPU doing the job of a PPU we should allow a GPU doing the job of a PPU. Are we serious now?
    Are we going to define the PPU GPU and CPU terms now?
    Who cares about terms as soon as something out there does the same job?
    You've got it backwards, though. The test is for CPU physics, which the PPU accelerates. It just so happens that Futuremark increased the number of threads in that test when a PPU is present, but they have not made the equivalent adjustment for a GPU, hence the huge score increase.

    In other words, on the GPU that test does what NVIDIA thinks it should do, not what Futuremark designed, and as such Futuremark cannot guarantee the validity of the scores. As someone said, it's about Futuremark adding the functionality to the app, not NVIDIA.

    There are already GPU physics tests incorporated into the other tests. That specific test merely inflates CPU scores, not GPU scores. If NVIDIA had used this to accelerate the GPU cloth physics in the Jane Nash test, there'd be no issue, as that test measures GPU performance and is coded for it, but the CPU test is not coded to run properly on a GPU.
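    The calibration point above can be sketched with a toy model (all numbers and names here are purely illustrative assumptions, not anything from Futuremark's actual code): a physics workload sized for a CPU gets extra work added when a PPU is detected, but is left untouched when a far faster GPU runs it, so the GPU score balloons.

    ```python
    # Toy model of an uncalibrated benchmark: score is proportional to
    # how many physics frames per second a device can push through.

    def physics_score(ops_per_frame, device_ops_per_sec):
        """Frames simulated per second on the given device."""
        return device_ops_per_sec / ops_per_frame

    # Hypothetical throughputs (operations/second), for illustration only.
    CPU_OPS = 1e9    # quad-core CPU
    PPU_OPS = 2e9    # dedicated PPU: the test was rescaled for this case
    GPU_OPS = 2e10   # GPU offload, much faster, but test never rescaled

    WORKLOAD = 1e7   # ops per simulated frame, sized for a CPU

    cpu_score = physics_score(WORKLOAD, CPU_OPS)
    ppu_score = physics_score(WORKLOAD * 2, PPU_OPS)  # workload doubled for a PPU
    gpu_score = physics_score(WORKLOAD, GPU_OPS)      # no adjustment for a GPU

    print(cpu_score, ppu_score, gpu_score)  # GPU score is 20x the CPU score
    ```

    With the PPU the added work keeps the score comparable; with the GPU the unchanged workload makes the "CPU" score look twenty times higher, which is the calibration gap being described.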
    Last edited by cadaveca; 07-18-2008 at 09:42 AM.

  2. #27
    Xtreme Member
    Join Date
    Dec 2005
    Posts
    427
    I thought FM included a physics test to accelerate physics innovation and wider physics support from games.
    Test calibration is not an excuse to allow a CPU to do physics while excluding a GPU.
    Physics acceleration is not a cheat. It is real and we expect better gaming in the future from this.
    I cannot see your point about calibration, since all three rivals are able to include physics hardware acceleration.
    The 3DMark Vantage X score (1920x1200) is not greatly affected by the CPU or PPU score.
    This is a graphics card benchmark and should reflect real world gaming.
    It is not a CPU benchmark anyway.

  3. #28
    Tell your Futuremark friend that they need to do a better job of "banning" the scores, sampsa. FM made the decision to remove the scores a while back but still can't seem to get it right.
    The whole point was to make the scores more comparable across platforms, and now the scores aren't even comparable within the same platform lol.
    Last edited by k|ngp|n; 07-18-2008 at 10:18 AM.

  4. #29
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    494
    Quote Originally Posted by Yukon Trooper View Post
    You'd definitely have an argument if PhysX was used in all games. Considering it's not, it shouldn't be used in benchmarks.
    Well, going by what you're saying, if it's not in all games, they should just remove the PhysX test altogether....
    Ryba's Ver 4 DryIce/LN2 pot
    Cryo-Z Phase-R-507A
    AMD 955
    GD70FX
    Crucial2GB kit (1GBx2),DDR3 PC3-16000(2Ghz)
    2 x 4890 MSi OC Cards
    Maze5, Maze4, 120 Black Ice EX, one 240 Black Ice EX
    Fixed speed pump powered by a 17v meanwell



  5. #30
    Xtreme Mentor
    Join Date
    Sep 2006
    Posts
    2,834
    Quote Originally Posted by DaMulta View Post
    Well from going by what your saying, if it's not in all games. They should just remove the PhysX test all together from the test....
    No, no. If you want to run a separate PhysX test, that's fine. However, in a "competitive" environment where benchmarks are compared and documented, PhysX should be disallowed. I agree with their decision 100%.
    Last edited by YukonTrooper; 07-18-2008 at 10:40 AM.

    For my part I know nothing with any certainty, but the sight of the stars makes me dream.


  6. #31
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    494
    Well, AGEIA has been bought, and soon you will not be able to buy those cards anymore. The test will still use AGEIA PhysX cards, so the people who still have them down the road (a year or two from now), when the cards become almost impossible to find, will have an unfair advantage as well, because for them the whole test will not be done on just the CPU.

    The only way to really keep the AGEIA PhysX test in the program fairly (given that AGEIA was bought out) would be to allow it to run on the GPU. AMD could implement this on their cards as well if they wanted to.



  7. #32
    Xtreme Legend
    Join Date
    Sep 2002
    Location
    Finland
    Posts
    1,707
    Quote Originally Posted by k|ngp|n View Post
    Tell your futuremark friend that they need to do a better job of "banning" the scores sampsa. FM made the decision to remove the scores awhile back but still can't seem to get it right.
    The whole point was to make the scores more comparable across platforms and now the scores aren't even comparable across the same platform lol.
    A new, updated filter should be up and running in a couple of days, which should recognize PPU PhysX and GPU PhysX. Right now there seem to be some results that shouldn't be there. Let's hope they get all the issues fixed soon so we can concentrate on benchmarking.
    Favourite game: 3DMark
    Work: Muropaketti.com - Finnish hardware site
    Views and opinions about IT industry: Twitter: sampsa_kurri

  8. #33
    Xtreme Legend
    Join Date
    Sep 2002
    Location
    Finland
    Posts
    1,707
    Quote Originally Posted by DaMulta View Post
    The only way to really keep the AGEIA PhysX test in the program fairly(because AGEIA PhysX was bought out) would to allow it to be done on the GPU. AMD could implement this into their cards as well if they want to.
    That's not how things work. Designing a 3DMark benchmark is a huge, long process, and the "problem" they had with Vantage was that NVIDIA bought AGEIA when the benchmark was already in its finishing stages.
    Favourite game: 3DMark
    Work: Muropaketti.com - Finnish hardware site
    Views and opinions about IT industry: Twitter: sampsa_kurri

  9. #34
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    494
    Very true, and I'm not disputing that Vantage took them a long time to produce (I know it did). What I'm saying is that you're not going to be able to buy AGEIA PhysX cards for much longer. Then the people who have them will be the only ones to see the advantage, if GPUs aren't allowed to take the place of those cards.
    Last edited by DaMulta; 07-18-2008 at 03:52 PM.



  10. #35
    Xtreme Addict
    Join Date
    Jun 2004
    Location
    near Boston, MA, USA
    Posts
    1,955
    They shouldn't be called Futuremark; they should call themselves Ancienthistorymark, because clearly they have absolutely nothing to do with advancing future abilities in hardware. I have a good many reasons to have no love for NVIDIA right now, but this decision is ridiculous. Replacing Vantage files? That's a foul, and something no one should be allowed to do. But beyond that, if you have hardware to accelerate a process, any process, it should be fine to use it. How else do you advance the industry? You don't want to know what five-year-old cards could do! You want to know what a card will do someday when the programming becomes common. This should have been pressure on ATI to get some code that could keep up, or shut up and deal.

    You want comparables? Filter the test out of the final score, plain and easy. But no, not Futuremark. They don't care in the slightest about the end product; they only care that getting to the end product is done in some ridiculously outdated way, as if the future of gaming were demonstrated by such a test.

    NVIDIA are idiots and can dig their own pit (proof is happening right now, even). But if there is hardware acceleration available to complete a test faster, that's not a foul. Replacing files, that's bad. But the acceleration itself is fine. If you want to keep it fair, or judge "world records" on a fairness basis, just exclude the test and take a score without it.

    Heck, they are both such smart companies; I'm thinking NVIDIA and Futuremark are siblings who just fight all the time and are both dumb as posts.

  11. #36
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    It would be better if they just released a patch that ran the physics test twice: once on the CPU, once on the GPU.

    All along the watchtower the watchmen watch the eternal return.

  12. #37
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Philippines
    Posts
    793
    What about those using dual GPUs, where one is used as a PhysX card?


    Rig Specs
    Intel Core 2 Extreme QX9650 4.0ghz 1.37v - DFI Lanparty UT P35 TR2 - 4x1GB Team Xtreem DDR2-1066 - Palit 8800GT Sonic 512MB GDDR3 256-bit
    160GB Seagate Barracuda 7200RPM SATA II 8MB Cache - 320GB Western Digital Caviar 7200RPM SATA II 16MB Cache - Liteon 18X DVD-Writer /w LS
    640GB Western Digital SE16 7200RPM SATA II 16MB Cache - Corsair HX 620W Modular PSU - Cooler Master Stacker 832
    Auzen 7.1 X-Plosion - Zalman ZM-DS4F - Sennheiser HD212 Pro - Edifier M2600



    Custom Water Cooling
    Dtek Fusion Extreme CPU Block - Swiftech MCR-220 - Swiftech MCP655-B - Swiftech MCRES-MICRO Reservior - 7/16" ID x 5/8" OD Tubings
    Dual Thermaltake A2018s 120mm Blue LED Smart fans.


    www.mni-photography.site88.net

  13. #38
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by fireice2 View Post
    what about those using dual GPU's where one is used as a physx card?
    Currently it's not supported, but wouldn't it apply too?

    SLI boost in game test + PhysX scores in CPU test = same double standard going nowhere.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  14. #39
    no sleep, always tired TheGoat Eater's Avatar
    Join Date
    Oct 2006
    Location
    Iowa, USA
    Posts
    1,832
    This whole debacle leaves a bad taste in my mouth, and I am happy to continue benching the older versions and AQ3. I am a '01 addict... I think I am starting to enjoy it more than AQ3.

  15. #40
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by STEvil View Post
    Would be better if they just released a patch that ran the physics test twice. Once on CPU once on GPU.
    I so agree with this; it could finally put an end to this drama -_- On the other hand, I think it's lame, because NVIDIA invested a lot of money in this technology. It's not their fault AMD/ATI don't have it, so it's unfair to call it "unfair" or "cheating" because NVIDIA gets a boost from their own technology, in my opinion...

  16. #41
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Philippines
    Posts
    793
    Quote Originally Posted by Macadamia View Post
    Currently it's not supported, but wouldn't it apply too?

    SLI boost in game test + PhysX scores in CPU test = same double standard going nowhere.
    Yes, not supported for now, but what if NVIDIA fully implements such a solution? It would be unfair to those using NVIDIA-based PhysX solutions.

    The situation is different on an Intel chipset, where SLI is not normally possible.



  17. #42
    Xtreme 3D Mark Team Staff
    Join Date
    Nov 2002
    Location
    Juneau Alaska
    Posts
    7,607
    Just another example of how little 3DMark actually relates to gaming.

    If a video card can be used as a PPU, and games can take advantage of it, and the benchmark has a specific area for PPU testing...

    where's the problem?




    "The command and conquer model," said the EA CEO, "doesn't work. If you think you're going to buy a developer and put your name on the label... you're making a profound mistake."

  18. #43
    Xtreme Addict
    Join Date
    Jun 2004
    Location
    near Boston, MA, USA
    Posts
    1,955
    if a videocard can be used as a PPU, and games can take advantage of it, and the benchmark has a specific area for PPU testing....
    wheres the problem?
    QFT

    In my mind this would be like disabling CF or SLI from the scores and saying that really, since 99% of the market only has a single GPU, you shouldn't be able to run two or four to boost scores. At what point does a legitimate hardware ability become something you don't want to allow? ATI could do this. They don't want to spend the resources on it. And frankly, with the bugs in NVIDIA drivers sometimes, I think I can see that point of view, hehe. But if hardware can do a thing better, then that is OK, as long as it's not circumventing the "proper" code paths.

    Replacing files in the Futuremark software was a no-no, and that still stands as the thing "done wrong" in this issue. Having a suddenly higher score is like SLI before there was Crossfire. If you don't have something comparable, tough cookies.

  19. #44
    Xtreme Member
    Join Date
    Mar 2005
    Posts
    276
    Quote Originally Posted by CraptacularOne View Post
    About time. Artificially inflating their scores is all Nvidia was doing.
    How is it artificial? They are doing physics work with the driver and hardware implementation in their new GPUs, right? Maybe I'm not understanding this whole PhysX situation properly.
    Ultra Aluminus/Silverstone SUGO 05/Thermaltake SPEDO
    Asus P6TD Deluxe/Zotac Mini-ITX GeForce 9300
    Core I3 3ghz/Core i7 920 D0
    TRUE Black/CM Hyper 212 Plus
    G.Skill 4GB PC8500/3GB Corsair DDR3-1866/6GB G.Skill DDR3-2000
    ATI HD4850 1GB/ATI HD4890 1GB/ATI HD5850
    Samsung 305T Plus
    Asus Xonar HDAV 1.3/Creative X-Fi PCI-E
    4 Intel X-25M 80GB SSD RAID-0/Areca 1231ML
    LG 8X Blu-Ray Burner/LG 10X Blu-Ray Burner/Sony Slimline Blu-Ray Optical
    Ultra X3 1600W/Ultra X3 1000W

  20. #45
    Xtreme Enthusiast Shocker003's Avatar
    Join Date
    Jul 2007
    Location
    Germany
    Posts
    725
    Futuremark's 3DMark benching is not a true test of system performance as long as PhysX is not allowed. While gaming, UT3 makes use of PhysX, for example (I thought we run these benches to know how our rigs will fare while gaming or running certain programs).

    I presume Intel is flexing its powers again... or maybe I am wrong. I will bet that Larrabee will be accepted by Futuremark for physics rendering (Havok), but NVIDIA's PhysX isn't good enough for them. Something fishy is going on. If AMD/ATI is willing to support PhysX, they should only ban the use of GPU PhysX for a while until ATI GPUs are fully supported (fair play). I hate the fact that we have to upgrade our CPU + motherboard for physics, instead of just installing a free NVIDIA PhysX program. I don't think NVIDIA is cheating, as they planned on using the GPU to run physics after taking over AGEIA; or should they simply throw their invested money in a drawer until Intel is through with Larrabee (Havok), so that we can say fair play?
    Last edited by Shocker003; 07-23-2008 at 03:42 AM.


    MAIN RIG--:
    ASUS ROG Strix XG32VQ---:AMD Ryzen 7 5800X--Aquacomputer Cuplex Kryos NEXT--:ASUS Crosshair VIII HERO---
    32GB G-Skill AEGIS F4-3000C16S-8GISB --:MSI RADEON RX 6900 XT---:X-Fi Titanium HD modded
    Inter-Tech Coba Nitrox Nobility CN-800 NS 800W 80+ Silver--:Cyborg RAT 8--:Creative Sound BlasterX Vanguard K08

  21. #46
    Xtreme Member
    Join Date
    Jun 2005
    Location
    S.W. Desert USA
    Posts
    195
    Quote Originally Posted by Kunaak View Post
    just another example of how little 3dmark actually relates to gaming.
    if a videocard can be used as a PPU, and games can take advantage of it, and the benchmark has a specific area for PPU testing....
    wheres the problem?
    I wholeheartedly agree! By FM's thinking, the next thing FM should ban will be the first 8-core CPUs, since I'm sure one company will have them before the other.
    i7/930 - Noctua NH-C12P
    Rampage III Extreme - (2) evga 9800GTX
    Corsair AX-1200 - 12GB PGS312G1600ELK
    (2) VelociRaptor 150GB - (2) Raptor 160GB - WD Caviar 2TB
    LG GGW-H20L Blu-Ray - Plextor PX-880SA - X-Fi Titanium Fatal1ty
    Silverstone TJ07 (mostly) Aluminum Case

  22. #47
    Muslim Overclocker
    Join Date
    May 2005
    Location
    Canada
    Posts
    2,786
    Quote Originally Posted by Kunaak View Post
    just another example of how little 3dmark actually relates to gaming.

    if a videocard can be used as a PPU, and games can take advantage of it, and the benchmark has a specific area for PPU testing....

    wheres the problem?
    But isn't there a difference when you run ONLY PhysX on your GPU, versus intense 3D + PhysX?

    My watercooling experience

    Water
    Scythe Gentle Typhoons 120mm 1850RPM
    Thermochill PA120.3 Radiator
    Enzotech Sapphire Rev.A CPU Block
    Laing DDC 3.2
    XSPC Dual Pump Reservoir
    Primochill Pro LRT Red 1/2"
    Bitspower fittings + water temp sensor

    Rig
    E8400 | 4GB HyperX PC8500 | Corsair HX620W | ATI HD4870 512MB


    I see what I see, and you see what you see. I can't make you see what I see, but I can tell you what I see is not what you see. Truth is, we see what we want to see, and what we want to see is what those around us see. And what we don't see is... well, conspiracies.



  23. #48
    Registered User
    Join Date
    Jul 2008
    Posts
    88
    I've understood it like this:

    NVIDIA GPU PhysX is not allowed because it runs a CPU test on a GPU, not because of an unfair advantage, etc.

    If I am correct, there is just no other way to approach this than to disallow it.


    Edit: And I had understood this wrong. The test uses AGEIA PhysX, so it should allow GPU PhysX too.
    Last edited by MoF; 07-23-2008 at 03:42 AM.

  24. #49
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Potosi, Missouri
    Posts
    2,296
    Quote Originally Posted by ahmad View Post
    But isn't there a difference when you run ONLY physx on your GPU, and not intense 3D + physx?
    Exactly. During the CPU test, if the card had to render game-like graphics, the score would be nowhere close to what we are seeing. As such, the score is completely artificial, as the card would not be capable of the same performance in a gaming environment.

  25. #50
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    Where was all the crying when ATI cards produced a higher score in 3DMark06 but were slower in actual gameplay? Why didn't the 3DMark makers fix that...

    That violated everything that was just in the universe...

    Now NVIDIA spends millions to get our GPUs to do PhysX calculations, to aid game speed, and the 3DMark boys cry unfair?

    To add insult to injury, if your PC can do PhysX calculations on another card, it counts and is considered fair....

    LOL

    We seem to be always waiting for ATI to catch up in speed or features...

    If your CPU is allowed to farm PhysX calculations off to another card, don't call it cheating when NVIDIA does it as a feature of the GPU.
    We don't care if the card starts with an A or an N!!

    Looks like Vantage is for benching tomorrow's games, as long as ATI can do it too.
    Last edited by Talonman; 07-22-2008 at 06:56 PM.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)
