I'm grateful for that, movieman.
I still remember when the Linpack binaries first appeared. I think it was in the early Core 2 days, and there was a note saying these binaries were used for *internal* CPU testing. Then someone or something requested that they be released, and soon came the "new stress tool released: now find out if your CPU is truly stable! more accurate than Prime" posts, etc...

While I do understand Agentgod's point that this was meant to be a time-saving mechanism, I also fully understand the other person's post where he said "there is no substitute for time."

I guess a question I can ask Movieman AND AgentGod is:
What is more harmful to a CPU running at 5GHz and 1.45V:
5 passes of Linpack/LinX/IBT, or 8 hours of Prime95?
Which will cause more degradation?

The problem, of course, is that even Prime95 can degrade CPUs if they are running far out of specification. I've already degraded two 2600Ks with Prime, and neither of them even saw a Linpack run except a quick 5-loop test at 4GHz on stock voltage, which was safe, and that was just a GFLOPS check. I'm lucky that I can still do 5GHz at 1.45V. And since I'm a gamer, after all, I think some good old Black Ops and Battlefield 3 work better for me than burning my CPU up with hours of Prime.

Of course, for folders who crunch data that needs to be accurate, Prime can be a good test. But I STILL remember a post on the [H] where a person who folded 24/7 at 5GHz and 1.5V had to reduce his overclock to 4.6GHz after three months due to degradation.

Degradation has happened to enough people now that no one questions whether it's "real" anymore (unlike that famous GNDS thread, where many of us were called incompetent overclockers until the problem became so widespread that no one could deny it).

I guess this is all old hat now. The BIG question is how Ivy Bridge will overclock, and how well the 22nm process tolerates voltage over the long term. And of course, no one will know without guinea pigs testing their processors....