Thread: Linux vs. Windows on HCC - a study

    I originally posted this in artemm's thread, but since he actually did something different (HCC-only vs. all projects on Linux), I thought this deserved its own thread.

    What I'd like to do here is find out which OS is best for crunching. This is done by comparing data from one or multiple machines. I started off by calculating some numbers for you: Windows 7 x64 vs. Ubuntu 10.04 x64, with no changes to the hardware (clock speed, etc.).

    All numbers are from the "result statistics" page on the WCG website. I just picked the first 15 (or, in most cases, 30) valid HCC WUs, added up their runtimes and credits, and divided by the number of WUs taken into account to generate an approximate average.
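
    In case anyone wants to reproduce or double-check the averaging, this is essentially the arithmetic, sketched in Python. The WU values in the list are just made-up placeholders, not my actual results; the real ones come from the result statistics page.

[CODE]
# Sketch of the averaging used in this thread.
# NOTE: the (runtime_hours, granted_credit) pairs are made-up placeholders;
# the real values come from WCG's "result statistics" page.
wus = [
    (1.98, 43.9),
    (1.95, 43.5),
    (2.00, 44.1),
    # ... first 15 (or 30) valid HCC results for one machine/OS combination
]

threads = 12  # both CPUs in this comparison run 12 threads

avg_runtime = sum(r for r, _ in wus) / len(wus)  # hours per WU
avg_credit = sum(c for _, c in wus) / len(wus)   # granted credit per WU

per_thread = avg_credit / avg_runtime  # credits per hour, per thread
cpu_total = per_thread * threads       # credits per hour, whole CPU

print(f"Runtime per WU:            {avg_runtime:.3f} h")
print(f"Granted credit per WU:     {avg_credit:.2f}")
print(f"Credits/hour (per thread): {per_thread:.2f}")
print(f"Credits/hour (CPU total):  {cpu_total:.2f}")
[/CODE]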

    -----------------------------------------------------------

    Xeon L5640 (Hexa-core; Frequency: 3.4 GHz; Cache: 12 MB L3; Threads: 12)

    Runtime per WU on Win7 (15-WU average): 1.976 h
    Granted credit per WU on Win7 (15-WU average): 43.83

    Credits per hour (per thread): 22.18
    Credits per hour (CPU total): 266.17
    -----------------------------------------------------------

    Runtime per WU on Ubuntu 10.04 (30-WU average): 1.227 h
    Granted credit per WU on Ubuntu 10.04 (30-WU average): 22.82

    Credits per hour (per thread): 18.6
    Credits per hour (CPU total): 223.18

    ----------------------------------------------------------

    Core i7 980X (Hexa-core; Frequency: 3.68 GHz; Cache: 12 MB L3; Threads: 12)

    Runtime per WU on Win7 (30-WU average): 1.876 h
    Claimed credit per WU on Win7 (30-WU average): 52.61
    Granted credit per WU on Win7 (30-WU average): 42.82
    Claimed vs. granted ratio on Win7 (30-WU average): 81.4%

    Granted credits per hour (per thread): 22.83
    Granted credits per hour (CPU total): 273.9
    -----------------------------------------------------------

    Runtime per WU on Ubuntu 10.04 (30-WU average): 1.072 h
    Claimed credit per WU on Ubuntu 10.04 (30-WU average): 30.71
    Granted credit per WU on Ubuntu 10.04 (30-WU average): 22.55
    Claimed vs. granted ratio on Ubuntu 10.04 (30-WU average): 73.4%

    Credits per hour (per thread): 21.04
    Credits per hour (CPU total): 252.43
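
    For reference, the derived lines are simple arithmetic: granted / claimed gives the ratio (on Win7: 42.82 / 52.61 ≈ 81.4%), granted credit / runtime gives credits per hour per thread (42.82 / 1.876 ≈ 22.83), and multiplying by the 12 threads gives the CPU total (≈ 273.9).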

    So... what have we here?
    From the looks of it, the Linux client is definitely more efficient: it completes the WUs a lot faster. However, the Windows client gets higher granted credit.
    Now the real question is whether the Linux and Windows clients are getting the exact same work units.

    Either way, I'm a little disappointed. I had hoped to present to everyone the "better" OS for crunching, yet it seems people have to choose between getting more science done (if the WUs are the same) and getting higher PPD.

    --------------------------------------------------------------

    To better illustrate my findings, I attached a graph. Higher is better, except for runtime (which I multiplied by 20 to better show the difference in the chart).
    [Attached image: comparison chart]
    Last edited by jcool; 05-27-2010 at 03:04 AM.
    World Community Grid - come join a great team and help us fight for a better tomorrow!

