As long as you are not getting "computational errors" on work units, you should be fine. On the drop-down menu, there is a section for errors.
Just joined with my OC'd 920 system. I will look into putting some systems from work on here. Got ~30 8-16 core machines that could help.
http://files.me.com/dougmeyer/zegz1u
Thanks for helping out. The more machines, the merrier. :)
Welcome to the team!
I changed some of the text in the guide so it is more like the original.
Angmaar, one tweak I forgot in my post before. The very first pic in the guide is a duplicate of one of the pics below. We also have two sets of "intro" text. The pics should start after my intro text. You can delete my intro text and use yours, or use chunks of both. Whatever you like better. Hope this makes sense...
Thanks much for re-constructing this. It's probably one of the most important posts we have right now with all the new folks coming on board.
Regards,
Bob
Yeah, got some errors too, because I've been messing around. ;) But I managed to get a few points too :) It seems to take some time to update after the first attempt, but it looks to be working fine now. Please advise if something seems to be wrong. :)
http://img.techpowerup.org/100415/Capture030609.jpg
After a few days, you should be averaging around 45,000 WCG points per day at your CPU speed, maybe a little more. I know because I also have an X5650.:D
I just joined the team after seeing the announcement for the big May 1st-8th crunching challenge. I'm bringing my i7-920 to the fight.
Almost forgot about WCG until I got an e-mail from MM. As of yesterday, BOINC is running again on my 24/7 home server. :)
Hey Sierra, congrats on the promotion!! You're all red now! :up::toast:
Bob
Lucky you, too :) There are not many X5650s around, and we should try to share experiences, especially on VTT/BCLK limitations.
Have you posted your OC settings/results somewhere I could take a look?
We must get an OC-able dual-socket MB; only then will the X5650 show its real muscles :D
My set-up is similar to yours. Same cooler (Noctua NH-D14), RAM is Corsair 7-8-7-20, slightly different board (Gigabyte GA-X58A-UD3R), and different PSU (SeaSonic 650W). The graphics card is something I've had for five years.
To be honest, I have not done a lot of tweaking. The CPU is running right now at 170 x 22 which is a lower overclock than what you have. RAM timings are all default. I assembled the entire system in about 15 minutes. I just wanted to get it crunching as soon as possible. VTT is at 1.32. I can't even remember what VCore is set at. I think it's below 1.27v but I would have to check.
I know that there are problems getting BCLK above 180 and that running VTT above 1.35 is very risky. I think after it's been crunching for another month, I may play around with the BIOS settings a little more.
Yeah, I have almost the same: 3.8GHz (173x22) at 1.21v, VTT = 1.32v. I've spent a lot of time (and still haven't given up :eek:) trying to get around the locked x20 Uncore and to get over BCLK 180, but that needs at least VTT = 1.36v :p:.
According to Intel these chips should take 1.4v max, and maybe I'll dare to raise it later, but not before I've tried this on a dual-socket MB :D . I'm staying at 1.32v for now. Not bad that you got it pretty much spot on in 15 minutes :)
Okay guys we need to get this video out. I just did a video on how to sign up for WCG and install and setup BOINC. Uploading it to youtube right now will post it in roughly 30 mins. Hope this makes it easier for all the new teammates :up:
Well, it seems the screen capture program messed up some of the video and audio. Does anyone have a suggestion for a good video capture program?
Here's a buggy version. Wow, my voice sounds so nasally, wtf?
What do you guys say about crunching 6c/12T @ 6.26GHz?
I would say :D :D :D
http://img.techpowerup.org/100415/Capture031758.jpg
*Hints at sig*
:wasntme: :rofl:
I did run one of the X5650's on a single Gigabyte board though - for a few weeks. It did 4,2 but I had to use a 1,395V BIOS VTT setting. I say BIOS because all Gigabyte boards are said to undervolt VTT - since they have no readout (:shakes:) we can only guess, though.
In my experience with these CPUs, the 1,36 or 1,355V VTT setting (changes depending on the CPU) is definitely safe. I am using 1,375V on my X5680, that equals a measured ~1,36V real on my EX58-Extreme.
One thing that is consistent with B0 CPUs is the rather weak IMC. Thankfully, WCG doesn't care.
Always use the 8x dram divider for B0 CPUs. 10x Dram at >170 BCLK would require >1,4V VTT to be stable.
Also, even mults are still worse than odd ones. My X5680 doesn't like 26x, and the X5650 isn't too fond of the 22x mult either. Unfortunately, you still have to use it for decent clocks, due to the locked uncore multiplier.
My X5660's were perfect for air - 23x mult is a good one, they did 4,3 with safe voltages easily.
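If it helps anyone following along, here is a quick sketch of what the 8x vs 10x divider means for memory speed (assuming the usual X58 rule that DDR3 effective speed is simply BCLK times the memory multiplier - treat it as a back-of-envelope illustration, not gospel):

```python
# Rough illustration: DDR3 effective speed on X58 = BCLK x memory multiplier.
# The BCLK values are just examples pulled from this thread.
for bclk in (160, 170, 180, 191):
    for mult in (8, 10):
        print(f"BCLK {bclk} x {mult} -> DDR3-{bclk * mult}")
```

At >170 BCLK the 10x divider already puts you at DDR3-1700+, which is what strains the weak B0 IMC (and pushes the VTT it needs), while 8x keeps you in the comfortable DDR3-1360-1530 range.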
Yeah, I miss the x23; it would shake things up dramatically, especially now that I've found the limitation with the even mults. The X5650 doesn't have an x21 option, so the next alternative down is x19 and that's too low. The only option on the 5650 is x22, for sure! I'm using the x8 RAM divider too, and it's pretty stable, with good performance, more than I need.
I really like this CPU, and the core is OCing really easily. I did some testing at 4.2GHz (191x22), 1.26v, VTT = 1.44v :p:. It was running pretty cool and stable too, but I don't dare to use that much VTT for 24/7 yet. Not before I've tried this chip on a dual socket.
Who knows, maybe these can take more VTT than Intel's recommendation, just like 45nm. We need to exchange experiences and get to know these chips better.
What is Intel's recommended VTT? 1.35V?
LOL :rofl: believe me, I tried my best to formulate it in a moderate way, because I knew you would say this. :)
I'm keeping VTT = 1.32v for now, :yepp: just throwing out some ideas because all the X56xx owners seem to be in this thread.
1.4v
Run 1.4 VTT and 1.4 Vcore and be prepared to go and purchase a coffin for that CPU.. one or the other, but not both..
My X5680's are at 1.3675vcore and 1.3VTT
They will live to a ripe old age...like me!:p:
According to Intel's data sheets, the maximum safe limit for VTT is 1.35V. This is for the i7-900 series CPUs. I think I read that the VTT max is much lower for LGA1156 processors, something like 1.21V.
http://download.intel.com/design/pro...hts/320834.pdf
The one I saw in the first days after the 980X release (and I saved it to my local drive) says 1.4v for max VTT. Has Intel changed this since?
http://img.techpowerup.org/100415/Capture032961.jpg
Weird. Check this document, page 23:
http://download.intel.com/design/pro...hts/323252.pdf
It clearly says 1,4V max Vcc - and I am 100% sure of that. 1,55V will kill most 32nm chips on standard cooling.
This also states 1,4V for max VTT...
I searched around for data sheets on Gulftown and Westmere chips but couldn't find any. Sometimes Intel releases the data sheet after it launches the product.
This is the main page on Intel's site for the X5650.
http://ark.intel.com/Product.aspx?id...ec-codes=SLBV3
On the right it says "No Datasheet Available".
You are right, but these are almost identical to the 980X. Probably with a better and more resilient Uncore that can take more torture and even higher VTT too? Hopefully yes.
But I believe MM has a good point too: it is a good idea to be careful until we get to know these chips better. On the other hand, VTT is the holy grail of high OC on these chips, and we should try to figure out what they can really take in real life.
They sure seem to change numbers a lot. I distinctly remember seeing 1.3125V as max. VTT somewhere :shrug:
Worth noting is the fact that these are MAXIMUM voltages, Intel does NOT sanction the sustained use of said peak voltages. They are talking about voltage spikes with the CPU at stock clocks (and probably idle) - the high currents our overclocked and fully loaded Hexas are pulling make for a whole different environment.
Let's agree that, right now, Intel says max VTT = 1.4v for the 980X.
Intel has been conservative enough to reduce the vCore spec to 1.4v (from 1.55v),
so does this VTT increase mean Intel has improved the Uncore dramatically?
If true, consider that Intel said max VTT = 1.35v for the 45nm i7, and it has proven to take even 1.5v, easily, for 24/7 use.
It is too early to say anything for sure about 24/7 VTT, but I believe there is good hope that these chips can take a lot of VTT torture. On the other hand, I agree with MM that we should be careful until we get to know these chips.
I guess those constant changes are due to the different CPU steppings and the advancements made in the manufacturing process.
I killed a Q3QR A0 hexa after ~10 minutes on SS (-30C), and I was only running Cinebench at ~1,65V Vcore. So it stands to reason that the same CPU would have died running WCG @ 1,55V Vcore on water.
As I said, I would not recommend going over 1,4/1,35V Vcore/VTT for long term crunching unless you are prepared to face the consequences.
If vCore killed your A0, does that mean VTT didn't kill it?
I'm not recommending using too much VTT, just suggesting that Intel's increased VTT recommendation for the 980X, plus the fact that these chips probably have a better and more resilient Uncore than the 980X, plus the 45nm i7's ability to take 1.5v VTT for 24/7, all together suggest that these chips may be able to take a good deal of VTT for 24/7. But nobody knows how much torture they can take yet, and it is a good idea to be careful.
Not sure what killed it, but I didn't run high VTT so I am guessing it must have been Vcore.
Don't forget the 5650's are still B0, and thus potentially more vulnerable to high voltage. If you read FUGGER's OC report on Gulftowns, he said the same things really... A0 die pretty easily, B0 are a lot tougher but still not perfect, and B1 he found "to be unkillable under controlled conditions" (-100C and colder) so that gives at least some security.
I used 1,52V vcore and 1,43V Bios VTT for short-term benching on the 5680, but I turned down the chiller beforehand so the CPU never exceeded 30 or 35C.
Plus I only ran short tests like 3DM Vantage and Cinebench, crunching is a whole other story.
I see what you mean. My X5650 is a B1 :cool:.
I want to thank you all for your input. It has been a great help in getting a better idea about these chips. :)
Sorry guys if this got OT for others; I couldn't find the X56xx gang all together in another thread, so when I found them I couldn't resist pushing (and rubbing) it in. ;)
Well I've got this up and running and will let it contribute for the next month on my i7
http://img191.imageshack.us/img191/8...unitygridy.jpg
It is a B1, according to EVEREST. Model C, Stepping 2, according to CPU-Z too :) Both mean it is a B1 ES :clap:
EDIT:
http://img.techpowerup.org/100416/Capture034.jpg
Well, I just finally joined (I used SETI years ago), with the laptop in my sig :D I know it's not much, I have to run it at 70% or the fans are at 100% all the time, and it'll take ~8 hours per task, but I'm glad to help.
I also can't guarantee 24h, but I'll be there for the May week. Let's break some records!
One question:
I'm thinking of using this to test the stability of my new OC; it seems to load the CPU as well as Prime at 100%, but then it may BSOD frequently.
Would a BSOD harm running tasks dramatically, or just those last minutes?
A BSOD may damage the WU and cause it to "Compute Error" and upload for no credit. You can also get "Compute Errors" if your OC is not quite stable enough to process properly, yet not bad enough to BSOD.
If you want to use WCG to test stability then I would suggest you follow the steps below carefully, because if you piss the WCG server off it will shut you off from getting new WUs to crunch temporarily (kind of like a quick 1-day ban here on XS, except it is automated so you can't try and talk your way out of it :rofl:).
From BOINC Manager ...
1. Suspend Network communication
2. Suspend BOINC
3. Turn BOINC off completely. When exiting say "Yes to unload when closing", if you do not get this question then go into Options, Advanced and recheck the exit question box.
4. Copy the BOINC data directory
5. Apply your OC
6. Open BOINC Manager and turn activity to Run Always (leave network communication off)
If you get lots of errors then you can stop BOINC completely (like in step 3) and replace the data dir with the copy you made in step 4 (make sure to create another copy).
If you are good to go (no errors) then turn Network communications back on.
Rinse and repeat for each round of testing.
The reason for this method is that you are limited in the number of WUs per core you can download in a day, and if you are OC testing you can throw compute errors very quickly (even without a BSOD). You will then be shut off until 12:00 AM UTC, at which time WCG will give you another chance ... BUT ... with that chance you have to earn back the number of WUs it will download, based on your ability to complete and upload valid WUs.
Compare that with this: I have never had a WU error if my system can run Prime Blend for 2 hours with custom settings that use all free RAM.
Your choice ... let us know how you do :up:
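If anyone wants to script the backup/restore part of this, here is a minimal sketch of step 4 and the restore step, assuming BOINC has already been fully exited ("unload when closing") and that the data directory is in the default location - both paths are assumptions, so point them at your actual folders:

```python
# Minimal backup/restore of the BOINC data directory for OC testing.
# BOINC must be fully shut down before running either function.
import shutil
from pathlib import Path

BOINC_DATA = Path(r"C:\ProgramData\BOINC")   # assumed default location - check yours
BACKUP_DIR = Path(r"C:\boinc_data_backup")   # anywhere with enough free space

def backup():
    """Step 4: copy the whole data directory before applying the OC."""
    if BACKUP_DIR.exists():
        shutil.rmtree(BACKUP_DIR)
    shutil.copytree(BOINC_DATA, BACKUP_DIR)

def restore():
    """If WUs start erroring: put the clean copy back, then run backup() again."""
    shutil.rmtree(BOINC_DATA)
    shutil.copytree(BACKUP_DIR, BOINC_DATA)

if __name__ == "__main__":
    backup()
```

Everything else (suspending network and BOINC, setting Run Always) is still done by hand in BOINC Manager, exactly as in the steps above.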
^^ Oops, I didn't know all this. Thx for good explanation, good to know. :)
I'm glad I asked before trying it. ;)
Hi everyone,
Add me to the list as well.
My system should add a few results as well
Core i7 at 4400 MHz :)
Inimicuss ... nice rig ... thanks for helping out :welcome:
Welcome to the team!
Is it possible to run some tasks now and let the points hit the server after May 1? Any tips on how I could accumulate points now and let them loose after May 1?
Just joined! Looking forward to the May 1 through May 8 "showdown"
I think you can build a cache, disconnect it from the net and let them run. I don't know how to do that, but I'm pretty sure you can. Just make sure your WUs get turned in before their expiration date and it should be fine. Maybe someone with more intel can chime in here and tell me if I'm off my rocker :D
Open BOINC Manager and look at the "Report Deadline" for the WU and you'll see the answer is no to this..
WUs are sent with a 10-day deadline on average, and stockpiling them also defeats the purpose of the work we do.
We want the finished work back to them ASAP so they can use it.
I do admit to stockpiling work myself, back in October 2007 I think, to try and send in one million points in a day. God that was tough, and I only made it to 700,000, but it was a mistake to do.
Sort of like putting the cart before the horse, if you know that expression.
OK, was just checking. I'm a noob :stick: at this stuff, you know. :) I'll try to run them normally.
I'll do my best to hit that round number, at least for that week. :)
I didn't run anything these past days; we have the volcano ash falling all over Europe. I saw some strange fine particles on my fans yesterday, which have always been clean and shiny. I don't think it will do any harm, but I'm trying to use the laptop until it clears up, hopefully in a few days.
MM, quick question: do you think it would be better to add another overclocked 920 rig or just upgrade my current one to an overclocked 980? The cost is almost the same, but the power draw would be double for a second rig.
You're talking about 1500-2000 extra pts if you pick up a 920 rig over the 980X upgrade. I would personally pick the 980X though, due to power savings and the fact that you'll get more usage out of your system at a later time; the 980X also typically yields a higher OC.
I was thinking the same thing, I could also make it my media server for all pc's and the ps3.
ps nice sig:)
http://bfbc2.statsverse.com/sig/deta...252eHunt3r.png
The answer is what do you want to achieve?
How high is the elec rate where you live? That's a big factor.
Going from your 920 to a 980 is approximately an $800 net cost ($1,000 minus the $200 that you could sell the 920 for) and gets you an additional 15K per day (40K to 55K),
BUT with the electricity of only one machine.
Adding a 920 generally gets you 30-35K per day for the same money, but that adds electricity..
The elec cost is the key; then add in the additional heat generated, so where you live (hot or cold climate) also factors into the equation.
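To put rough numbers on it (the wattage and electricity rate here are just assumptions - plug in your own):

```python
# Back-of-envelope comparison of the two options described above.
# The $/kWh rate and the second rig's wall draw are assumed values.
KWH_RATE = 0.12               # assumed electricity rate in $/kWh
HOURS_PER_MONTH = 24 * 30

def monthly_elec_cost(watts):
    """Monthly electricity cost in $ for a box running 24/7 at the given draw."""
    return watts / 1000 * HOURS_PER_MONTH * KWH_RATE

SECOND_RIG_WATTS = 250        # assumed wall draw of an extra OC'd 920 box under load

print("980X upgrade : +15K PPD,    ~$0 extra electricity per month")
print(f"Second 920   : +30-35K PPD, ~${monthly_elec_cost(SECOND_RIG_WATTS):.0f} extra electricity per month")
```

The second rig buys roughly double the extra PPD, but it keeps adding to the bill every month (about $22/month at the assumed numbers), plus the extra heat.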
I found the data sheets for the 5600 series Xeons.:)
http://www.intel.com/Assets/en_US/PD...eet/323369.pdf
http://www.intel.com/Assets/en_US/PD...eet/323370.pdf
Voltage limits are the same as the i7-980X which is probably no big surprise.
Great find S_B thanks! :)
Just signed up, hope this helps. i7 1156 ;)
Yes, every machine is important. Thanks for joining.:)
Hello. Joined in prep for your week in May... and will continue after that. I'm a total noob at this but wanted to help.... that said, I have 2 questions I'm wondering if someone could help me with.
Does BOINC have to start up with Windows, and is it bad to close it out sometimes? (Vista 64). I have a 24/7 machine I plan to dedicate to this, but would also like to use my gaming system when I'm not busy on it, and startup/shutdown of BOINC is a concern.
Also, to use the 2nd computer, do I just need to download the client to that and log in? Anything else to do?
I apologize if these have been answered before. I'm new to this and didn't know where else to ask.
Thank you. :up:
Don't worry about startup/shutdown; BOINC runs at low priority, so whatever application you run will push BOINC back, and there's no point in closing it. On the install question: yes, just install it and enter your login information. Don't forget to change the preferences to 100% :up:
The search function towards the top right of the forum page is very helpful, but so are the members. ;)
Once you have your account created you just need to log in additional machines with that same username and password. Simple as that.
As for shutting it down, it does no harm. You can also configure it to stop on its own when you start up nominated applications, though you have to make manual entries to a file to do that (see the example below), or with the latest BOINC clients you can set it to suspend if other CPU use goes over a certain percentage (the default is 25% I think), so generally it shouldn't interfere with your games.
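Here is roughly what those manual entries look like - a minimal cc_config.xml sketch using BOINC's exclusive_app option (the exe name is only a placeholder, use whatever game you actually run; the file goes in the BOINC data directory and is picked up after a restart or "Read config file"):

```xml
<!-- Minimal cc_config.xml example: BOINC suspends computing while the
     named application is running. "YourGame.exe" is just a placeholder. -->
<cc_config>
  <options>
    <exclusive_app>YourGame.exe</exclusive_app>
  </options>
</cc_config>
```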
The points are confusing me. When you say 50,000, movieman, how is that calculated? I have 8 Phenom cores running: 4 @ 3.5GHz and 4 @ 4.0GHz. How would that compute in points per day?
Mainly they are estimated from what people have already gotten on those setups. What kind of Phenoms are you running, Phenom I X4 or Phenom II X4? There are also 2 different point standards, the first being WCG and the second being BOINC.
Dave will be referring to WCG points, not BOINC points.
Basically, WCG didn't always use the BOINC client. Originally WCG used an off-shoot of the software produced by a mob known as United Devices. They were the first lot to use distributed computing technology for bio-medical research. UD eventually became known as Grid.org and proceeded to ... do bad things to those volunteering their machines before stopping development, or at least free development, of the client software. WCG carried on using that software for a time but eventually moved to the BOINC server and client package instead. From there the old UD client was phased out, but people still had the work credit that had been built up using that client and its own scoring system. In order not to screw over its volunteers (as Grid.org discovered, that's a VERY BAD IDEA :mad: ) WCG decided to keep all the accrued credit and just multiply the normal BOINC numbers by 7 to give returns roughly equivalent to what people were getting on their machines with the old client. Hence WCG's internal stats are often different to those you see when you check the various BOINC stats lists.
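If you want to translate between the two yourself, the conversion is literally just that factor of 7 - a trivial sketch:

```python
# WCG points are simply BOINC credits (cobblestones) multiplied by 7.
def wcg_points(boinc_credit: float) -> float:
    return boinc_credit * 7

# e.g. ~6,430 BOINC credits/day works out to ~45,000 WCG points/day
print(wcg_points(6430))
```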
Alright count me in!!!
My mother passed away from cancer back in 1992, so anything I can do to help with the research I certainly will do.
This is a great use of all of our supercomputers that we normally game on or do other tasks with, in which our rigs are not remotely challenged!
:up:
I am still not done properly overclocking my rig, so I'm still getting some BSODs here and there - still trying to find the minimum voltages to run at 4GHz 24/7. It should be running fine by the 1st, I hope (long work days - little time to play at home). BOINC makes for a nice stress test, and it's actually useful for something!
I have a quick question: I see some units quickly getting an "error" status (usually in the first few minutes of running). Is that normal for WCG units, or is it due to CPU errors (as in, not enough voltage somewhere)? At first I thought it was the OC, but I'm also getting errors on a completely stock rig (E8400 running at stock clocks).
It's not normal for them to error out at all. On your stock rig first make sure your system is clean. Virus infections can cause issues. Next run Memtest86+ (download here) and make sure your RAM isn't misbehaving. Yes it does happen even if everything else seems fine. I don't know what OS/version you're running so that's as much as I can suggest for now.
The stock rig is my computer at work; I can tell it's clean of any viruses, and it runs 32-bit XP. Maybe the memory is crap.. Thanks for the answer, so time for more voltage!
More voltage won't correct a memory fault. Only replacing the RAM will fix that, and if you're running Norton on that work machine you're probably pwnd. Just coz someone paid for an AV solution doesn't mean it's worth a pinch of proverbial.
As for the OC'd system, it could still be an unstable 'clock. Not every chip will clock as well as every other. Some just won't go over n no matter what voltages or dividers you mess with. For that matter it could be the motherboard that's causing it. If the VREGS (or whatever parts) aren't up to it, they're not up to it and that's all there is to it.
The guys who win overclocking competitions aren't "script kiddies" that just change settings to a formula, they're artists who know their kit inside out and go out of their way to find the right steppings of the right chips, the right boards and everything else. Even after all that, the best guys still sometimes get a chip that just won't clock like it should.
Haha, no worries, my main rig is totally watercooled and the memory is running at stock speeds (since I run an EX 975 I basically just play with multis). So nothing is overheating - the VRMs are under water too. Like I said, I'm not done stabilizing that OC yet, trying to figure out the proper minimal voltages... I could slap on 1.45V core, but that would produce a lot of heat and waste electricity for nothing if it can run at 1.26V. Currently fiddling with the QPI voltages and such... It would be surprising if I couldn't run that watercooled i975 at just 4GHz ;) (especially as it's only at 60C at full load...)
(edit) Ha, found the cause of those errors. Not hardware related (running Win7 / 64-bit) :)
I'm the admin of my own computer at work, so I can run pretty much anything I want (that's why it's running BOINC too). Using proper anti-spyware and antivirus ;) I've had just one error on one unit in 4 days of continuous operation, but if you say it's hardware related, that's annoying (I cannot replace the parts so easily since the computer is not mine).
My WCG account has been idle since early 2005. I restarted today and joined your team, but don't expect too many WUs; I hate paying the elec bill.
An extra WU here and there is better than nothing, I guess. ;)
I had a scare the other day on my 3-yr-old's Q6600: it returned an inconclusive, but today when I checked it was marked as valid :up:
Don't stress even if you get a couple of errors, sometimes you just get a dud unit. If you're getting a bunch of them, or a couple a day, look into it.
So what should I do, use CPU or GPU to fold? Which would return better results?
So your team is not on GPUGrid also?
Yes we are on GPUgrid we are actually the #1 ranked team after last week :D
So do I do both or just one? And if just one, which one yields more power?
You can easily do both. Many do, including myself. If you have any questions about GPUGrid, be sure to ask, we're more than willing to help!
And for WCG - well, you're already here. :up:
joined a lil while ago too!
May I ask a question? How do I check my stats? Kinda lost when it comes to this folding stuff.
Ok, your stats are available in the My Grid pages on the WCG website ... and this is crunching, not folding. Yes there is a difference, no it's not just in the project name.
The only dumb question is.....
I am thinking of helping.. looked at the directions... got to the 'BOINC' info, and you show a Language Selection of 'Finnish'.. correct for you, but... I am in the US. I can join if I want, can't I? English.. I choose English... it still makes me part of the Xtreme group.. because I am on your Team..
Download this software:
http://boinc.berkeley.edu/download_all.php
Sign up for an account:
https://secure.worldcommunitygrid.or...iewRegister.do
Join this group:
http://www.worldcommunitygrid.org/te...mId=69B8364XP1
It's all in English!
This is probably the most generic question out here, but if I wanted to get this up and running in as little time as possible, with the least nonsense, would I want Windows or Linux? I'm totally comfortable with either. I plan to slap together a build when I get home from work. It'll be:
an i7 and three GTX 275s.
I'm trying to get in on the May event, and I have been waiting on part after part. Today should be the day though.
But yeah, I'm trying to get this up in record time (for me) and I hope someone will know which is best. I can use GPUGrid too, right?
Joined!
PPD should be roughly equal between Windows and Linux, though running a 64-bit OS gives a ~10% boost in PPD over its 32-bit counterpart. Just go with whatever is easier for you. You cannot use those GPUs on World Community Grid, though. You should probably look into GPUgrid for that.
What will my score be with dual E5520s running 24/7?