Linux vs. Windows on HCC - a study
I originally posted this in artemm's thread, but since he actually did something different (HCC-only vs. all projects on Linux) I thought this deserved its own thread.
What I'd like to do here is find out which OS is best for crunching. This is done by comparing data from one or multiple machines. I started off by calculating some numbers for you - Windows 7 x64 vs. Ubuntu 10.04 x64 with no changes to the hardware (clockspeed etc).
All numbers are from the "result statistics" page on the WCG website. I just picked the first 15 (or in most cases 30) valid HCC WUs, added them up, and divided by the number of WUs taken into account to get an approximate average.
-----------------------------------------------------------
L5640: (Hexacore; Frequency: 3.4 GHz; Cache: 12 MB L3; Threads: 12)
Runtime per WU on Win7 (15-WU average): 1.976 h
Granted credit per WU on Win7 (15-WU average): 43.83
Credits per hour (per thread): 22.18
Credits per hour (CPU total): 266.17
-----------------------------------------------------------
Runtime per WU on Ubuntu 10.04 (30-WU average): 1.227 h
Granted credit per WU on Ubuntu 10.04 (30-WU average): 22.82
Credits per hour (per thread): 18.60
Credits per hour (CPU total): 223.18
-----------------------------------------------------------
Core i7 980X: (Hexacore; Frequency: 3.68 GHz; Cache: 12 MB L3; Threads: 12)
Runtime per WU on Win7 (30-WU average): 1.876 h
Claimed credit per WU on Win7 (30-WU average): 52.61
Granted credit per WU on Win7 (30-WU average): 42.82
Claimed vs. granted ratio on Win7 (30-WU average): 81.4%
Granted credits per hour (per thread): 22.83
Granted credits per hour (CPU total): 273.90
-----------------------------------------------------------
Runtime per WU on Ubuntu 10.04 (30-WU average): 1.072 h
Claimed credit per WU on Ubuntu 10.04 (30-WU average): 30.71
Granted credit per WU on Ubuntu 10.04 (30-WU average): 22.55
Claimed vs. granted ratio on Ubuntu 10.04 (30-WU average): 73.4%
Credits per hour (per thread): 21.04
Credits per hour (CPU total): 252.43
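All of the derived figures above come from the same simple calculation: average granted credit divided by average runtime gives credits per hour per thread, and multiplying by the thread count gives the whole-CPU rate. A minimal sketch of that arithmetic, using the i7 980X numbers from the table (the function name and THREADS constant are my own, not anything from WCG):

```python
# Sketch of the per-thread and per-CPU credit math used in the tables above.
THREADS = 12  # both CPUs are hexacores with Hyper-Threading

def credits_per_hour(avg_credit, avg_runtime_h, threads=THREADS):
    """Return (per-thread, whole-CPU) credits per hour."""
    per_thread = avg_credit / avg_runtime_h
    return per_thread, per_thread * threads

# i7 980X, Windows 7 (30-WU averages: 42.82 credits, 1.976... no, 1.876 h)
win_thread, win_cpu = credits_per_hour(42.82, 1.876)
# i7 980X, Ubuntu 10.04 (30-WU averages: 22.55 credits, 1.072 h)
lin_thread, lin_cpu = credits_per_hour(22.55, 1.072)

print(f"Win7:   {win_thread:.2f}/thread, {win_cpu:.1f}/CPU")   # ~22.83 / 273.9
print(f"Ubuntu: {lin_thread:.2f}/thread, {lin_cpu:.1f}/CPU")   # ~21.04 / 252.4
```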
So... what have we here?
From the looks of it, the Linux client is definitely more efficient. It completes the WUs a lot faster. However, the Windows client gets higher granted credit.
Now the real question is whether Linux and Windows client are getting the exact same work units.
Either way, I'm a little disappointed. I had hoped to present to everyone the "better" OS for crunching, yet it seems people have to choose between getting more science done (assuming the WUs are the same) and getting higher PPD.
--------------------------------------------------------------
To better illustrate my findings, I attached a graph. Higher is better, except for runtime (which I multiplied by 20 to make the difference more visible in the chart).
I came to the same conclusion...
I'm not at home right now, but I did a 14-day test between 2 identical computers with the exact same BIOS settings, running Windows 7 x64 Ultimate and Ubuntu 10.04. I came up with numbers pretty close to the other posters'. That is, Linux does more work but benchmarks at a lower value and therefore receives fewer points, based on the way WCG calculates points. It's complete BS that Linux can do more work but get less PPD. Their system is broken.
Here's my devil's advocate comment: We're all spending money on computer parts and electricity to do this work. We have proven that their system for calculating points isn't entirely accurate, but we're supposed to trust that the results that we are all providing and paying for (via electricity, heat, and computer parts) actually provide accurate scientific data?
If you go back through my previous posts, you'll see I asked for any evidence that the data WCG is using is actually scientifically valid and being used around the world for meaningful studies. I can read lots of posts from people linking to WCG's pages and a study or two using WCG data, but I am beginning to have serious doubts as to how useful this project REALLY is.
I switched from Folding@home to WCG about 2 months ago, and I'm starting to have second thoughts about my decision, which is quite disappointing. I'm starting to question the validity of this project and how useful it really is to mankind. Stay tuned, because I've decided I want to do some deep research on the net and figure out WTH is really going on and whether these work units really are as useful as we've all been told.
Are we sure Microsoft isn't involved in this?
Forgive me for being a conspiracy theorist, but I've been pondering this all afternoon. Believe me, I am not generally a conspiracy theorist. I just don't see how Linux could be as fast as it appears, yet still earn less PPD than Windows.
Is it at all possible that Microsoft is somehow involved in why Windows gives more points than Linux, despite all of the information provided? I did look at the WCG "partners" and Microsoft is not directly listed, but Microsoft has been known to get involved via subsidiaries. There are 426 partners listed on WCG, so I'm not sure we'd be able to tell whether Microsoft is a partner at all. If Microsoft were involved in this, I wouldn't be too surprised. It definitely wouldn't be the first time MS has been caught getting involved in projects solely to make its products look better than the competition.
Truth be told, if people were building lots of machines just to crunch (and gee, nobody does that here, right?), a lot of people would be buying extra copies of Windows, wouldn't they? That's a LOT of money if you think about it, so Microsoft does stand to lose some easy money from those who build a farm of crunching machines if they ran Linux instead of Windows...
I want to reiterate that this is ALL open for conjecture, I have ZERO proof of this, just a hypothesis.