
Thread: Futuremark: NVIDIA GPU PhysX not allowed

  1. #76 c[_]
    Join Date: Nov 2002 · Location: Alberta, Canada · Posts: 18,728
    A lawsuit would be for modifying their code without permission.

    The catch, though, is that the file NVIDIA replaces isn't made by FM anyway, so it's really a moot point there. They could make the case that the DLL is acting like the "MMO Glider" bots that Blizzard dislikes, modifying the output of the program, which is partly true. But really, if FM wanted to take legal action they'd have done it a while ago.
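
    For the curious, a wrapper DLL like that generally works by exporting the same entry points as the file it replaces, loading the renamed original, and forwarding or redirecting each call as it sees fit. A rough sketch of the idea in C++; the export PurePhysicsStep and the file physx_orig.dll are made-up names for illustration, not the actual file NVIDIA touches:

    Code:
    // Sketch of a drop-in wrapper DLL: export the same entry point as the
    // file being replaced, load the renamed original, and forward the call.
    // PurePhysicsStep and physx_orig.dll are hypothetical names.
    #include <windows.h>

    using StepFn = int (__cdecl *)(float dt);

    static StepFn g_realStep = nullptr;

    static void LoadRealDll()
    {
        // Load the renamed original and look up the real entry point.
        HMODULE real = LoadLibraryA("physx_orig.dll");
        if (real)
            g_realStep = reinterpret_cast<StepFn>(
                GetProcAddress(real, "PurePhysicsStep"));
    }

    // Exported under the same name as the original, so the host program
    // loads this wrapper instead and the call can be redirected.
    extern "C" __declspec(dllexport) int __cdecl PurePhysicsStep(float dt)
    {
        if (!g_realStep)
            LoadRealDll();
        // A real shim would dispatch to its own implementation (say, on
        // the GPU) here; this sketch just passes the call through.
        return g_realStep ? g_realStep(dt) : -1;
    }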

    All along the watchtower the watchmen watch the eternal return.

  2. #77 Xtreme Mentor
    Join Date: Sep 2007 · Location: Ohio · Posts: 2,977
    They may not lose in a court of law for ruling that Nvidia's PhysX calculations shouldn't count while Ageia's PhysX card should... but in the heart of every just person in the world who holds truth and honor sacred, the shame of it all will ring true.

    Thanks, 3DMark, for selling us Nvidia guys a benchmark that would measure our systems' PhysX calculations accurately, then changing it so they aren't reported, putting the air of cheating into the mix. Maybe they will also fix those bogus ATI-slanted 3DMark06 scores while they're at it? Naaaa, just kidding... We both know that won't happen! Their heart just wouldn't be in it.

    Good for Ageia!! Their cards still count and will be reported as valid numbers.

    Maybe after ATI gets its cards in the game, PhysX calculations will be fair game to measure again, even on an Nvidia-made card?
    We can only hope...
    Last edited by Talonman; 07-23-2008 at 05:06 PM.

  3. #78 c[_]
    Join Date: Nov 2002 · Location: Alberta, Canada · Posts: 18,728
    I can't disagree that the situation seems to have been handled poorly; however, when you've been around a while you learn to sit back and let things play out.

    If I ran Futuremark, I'd release a patch that ran the physics benchmark portion once for each accelerator in the system (1x CPU, 1x physics card, 1x GPU) and gave separate scores for each; something like the sketch below. Maybe they're doing that and just not talking (making them look bad), or maybe they're just clamming up (also making them look bad). There's a win situation there, but unfortunately Futuremark (and most game devs) are poorly connected to the enthusiast community, so all we can do is sit back and race our PC chairs while whining to each other about how they don't do this or that or the other thing...
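
    Something like this rough sketch is all I mean; the Accelerator enum and RunPhysicsBench are made-up stand-ins for whatever Futuremark's internals actually look like, with dummy scores filled in:

    Code:
    // Rough sketch of per-accelerator physics scores, as suggested above.
    // Accelerator and RunPhysicsBench are hypothetical stand-ins, not
    // Futuremark's actual benchmark API.
    #include <cstdio>
    #include <string>
    #include <utility>
    #include <vector>

    enum class Accelerator { CPU, PhysicsCard, GPU };

    // Placeholder workload: a real version would run the simulation and
    // time it; here each device type just returns a fixed dummy score.
    double RunPhysicsBench(Accelerator dev)
    {
        switch (dev) {
            case Accelerator::CPU:         return 12000;
            case Accelerator::PhysicsCard: return 15000;
            case Accelerator::GPU:         return 16000;
        }
        return 0;
    }

    int main()
    {
        // Detection elided: assume the system reported all three devices.
        // Run the same physics workload once per accelerator and report
        // a separate score for each, instead of one folded-in number.
        const std::vector<std::pair<std::string, Accelerator>> devices = {
            {"CPU",          Accelerator::CPU},
            {"Physics card", Accelerator::PhysicsCard},
            {"GPU",          Accelerator::GPU},
        };
        for (const auto& [name, dev] : devices)
            std::printf("Physics score (%s): %.0f\n", name.c_str(),
                        RunPhysicsBench(dev));
        return 0;
    }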

    To test combined physics + gaming performance, you would also have to publish more numbers.


    Realistically, Vantage is a synthetic performance measurement tool, because you can't measure the effect on performance of enabling options (other than AA/AF, that is).

    All along the watchtower the watchmen watch the eternal return.

  4. #79 I am Xtreme
    Join Date: Feb 2005 · Location: SiliCORN Valley · Posts: 5,543
    I get over 16k with my 8800 GTX and my extra special Q9650.
    There's no Nvidia PhysX support for the 8800 GTX yet, so...
    If they saw that score, would they immediately throw it out?

    I had a PhysX card installed with my Q6600 and was only getting 15k points...
    "These are the rules. Everybody fights, nobody quits. If you don't do your job I'll kill you myself.
    Welcome to the Roughnecks"

    "Anytime you think I'm being too rough, anytime you think I'm being too tough, anytime you miss-your-mommy, QUIT!
    You sign your 1248, you get your gear, and you take a stroll down washout lane. Do you get me?"

    Heat Ebay Feedback

  5. #80 Xtreme Addict
    Join Date: Jun 2006 · Location: Florida · Posts: 1,005
    Don't forget your Ageia PhysX card helping the score.
    Core i7 3770K
    EVGA GTX780 + Surround
    EVGA GTX670
    EVGA Z77 FTW
    8GB (2x4GB) G.Skill 1600Mhz DDR3
    Ultra X3 1000W PSU
    Windows 7 Pro 64bit
    Thermaltake Xaser VI

  6. #81 Xtreme Mentor
    Join Date: Sep 2007 · Location: Ohio · Posts: 2,977
    Is this going to add more performance to Nvidia GPUs that 3DMark Vantage is going to miss out on measuring too?

    http://www.nordichardware.com/news,7979.html

    "NVIDIA has something big coming, according to its own estimates something about the same size as SLI when it launched. It's not the introduction of the rumored Radeon HD 4870X2 killer, the 55nm G200b core. That would be a bit too weak. There is a new PhysX driver that will enable PhysX on all CUDA-capable GPUs that will launch soon, but it should arrive sooner than that. Instead, the bang comes from updated CUDA support and improved drivers.

    SLI will be updated, connectivity will be updated, CUDA looks like it will go commercial, and better quality and performance overall. The last part may also include improved scaling in various SLI configurations. Whether it will have the same impact as SLI had remains to be seen. More news should follow as we approach the next Big Bang".


    Can't ATI cards run CUDA too?

    Hoping for SLI on Intel mobos too...
    I would jump on a second 280 for sure.
    Last edited by Talonman; 07-24-2008 at 05:02 PM.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  7. #82 Xtreme Addict
    Join Date: Jun 2006 · Location: Florida · Posts: 1,005
    The new X58 chipsets that support Nehalem will be able to use SLI similarly to how the Skull Trail board did. You will likely never see SLI support on any current gen board, so if you want an Intel board and SLI, you will have to swap motherboards anyway.
    Core i7 3770K
    EVGA GTX780 + Surround
    EVGA GTX670
    EVGA Z77 FTW
    8GB (2x4GB) G.Skill 1600Mhz DDR3
    Ultra X3 1000W PSU
    Windows 7 Pro 64bit
    Thermaltake Xaser VI

  8. #83 Xtreme Mentor
    Join Date: Sep 2007 · Location: Ohio · Posts: 2,977
    Maybe, but there won't be enough X58 boards out to call this a Bang...

    They came so close with the GX2 by putting an SLI chip on the GPU.

    I wonder if a magic driver might do the rest? One 280 with SLI chip on board, and my current 280 with Big Bang driver installed.

    Or better yet, a new SLI bridge with SLI chip on board. Keep the 280's as is.

    A guy can dream...

    I still love my Maximus...

    It's probably just driver-related:
    http://www.tweaktown.com/news/9858/n...soonindex.html

    I bet it's CUDA, about to flip the graphics-processing world upside down.

    I wonder how much performance a new CUDA driver could add to a single 280...
    There might be an entire new high gear in my GPU yet to be found? Shift, baby!!

    It would be fun to have it measured, if it will indeed make games run faster.
    Last edited by Talonman; 07-25-2008 at 01:38 AM.

  9. #84 Xtreme Mentor
    Join Date: Sep 2007 · Location: Ohio · Posts: 2,977
    What are the odds that the 3DMark boys got the word about CUDA being released later this year, knew it would be a Big Bang of a performance increase, and wanted to make sure not to measure it...

    I bet not measuring PhysX calculations on Nvidia cards was just the tip of the iceberg...
    CUDA on Nvidia cards is probably out too?

    Will Ageia's PhysX cards be able to run CUDA? I'm sure if they can, Vantage will count their calculations as valid.

    Is CUDA a PhysX feature, or a totally separate animal?

    UPDATE: I think ATI cards can run CUDA:
    http://www.tomshardware.com/reviews/...pu,1954-6.html
    "But, Brook’s critical success was enough to attract the attention of ATI and Nvidiahttp://en.wikipedia.org/wiki/Nvidia , since the two giants saw the incipient interest in this type of initiative as an opportunity to broaden their market even more by reaching a new sector that had so far been indifferent to their graphics achievements".

    I think ATI might be in the game with CUDA?

    Looks like CUDA does not support SLI if this is still accurate.
    "If you want to use several GPUs for a CUDA application, you’ll have to disable SLI mode first, or only a single GPU will be visible to CUDA".
    Last edited by Talonman; 07-25-2008 at 06:40 AM.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  10. #85 Xtreme Member
    Join Date: Dec 2005 · Posts: 427
    http://www.fudzilla.com/index.php?op...8600&Itemid=35
    How about Havendales? Do they count as CPU, GPU or what?

  11. #86 Xtreme Mentor
    Join Date: Sep 2007 · Location: Ohio · Posts: 2,977
    A truly excellent question!!

    Using FM logic, if CPU calculations being done on a GPU are cheating, then GPU calculations being done on a CPU must surely be a foul too.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  12. #87 Xtreme Enthusiast
    Join Date: Jul 2008 · Location: Portugal · Posts: 811
    I don't really care about 3DMark, but if the calculations can be made by the GPU (and they are in real life), then by all means, why not? This is a step backwards if you ask me.
    ASUS Sabertooth P67B3· nVidia GTX580 1536MB PhysX · Intel Core i7 2600K 4.5GHz · Corsair TX850W · Creative X-Fi Titanium Fatal1ty
    8GB GSKill Sniper PC3-16000 7-8-7 · OCZ Agility3 SSD 240GB + Intel 320 SSD 160GB + Samsung F3 2TB + WD 640AAKS 640GB · Corsair 650D · DELL U2711 27"

  13. #88 Banned
    Join Date: Jul 2008 · Posts: 165
    Quote Originally Posted by youngpro View Post
    I guess it comes down to personal preference.

    I've never really cared too much about ORB, and I will just bench whatever gives me the best scores; in this case the PhysX driver is very innovative for that GPU test and I will be using it.

    Too bad it's a CPU test.

