There are two surefire ways of killing a CPU* (or any electrical component, for that matter): heat and voltage. Well, three if you want to count hitting it with a hammer.
As for the AMD specs, I am sure they have headroom for the cheapo motherboards and PSUs that don't hold the voltage very stable and send loads of ripple and odd-order harmonics down the lines. So there is almost no doubt that a small overvoltage won't cause any problems, as long as the running temps are not pushed up. In fact, I am undervolting mine (it's been hot, and it's been running very long jobs without me around) and still getting ~2.4 GHz. I normally run at 1.38 to 1.40 ish (depending on what does the reporting), and that is happy to keep my 4200 at 2.6 GHz video encoding for weeks on end. I know of people who are happy to run above 1.5 V on air (most with Optys), but TBH all they do is play games and the like, so it's not a big deal to them if there is a big spike in the supply and the chips get toasted. That said, I think it's worth getting a good PSU to help smooth the line. The poster above, 'd1b1', looks to be using a no-name PSU, and I am surprised it has done so well so far. The power supply is a very underrated piece of kit; don't skimp on it.
I have seen some people put 1.7 V through, but I am not going over 1.5 with my chips because I use my computers for a long time. There is a myth that chips wear out with use; it does not happen. I have 30-plus years with computers, and my dad, who started working with them in the late 1950s, has never heard of a chip that just wore out. They don't; there will always be something else that caused the chip to fail (see above*).
If the process goes quite a few orders of magnitude smaller, the life might come down from 10,000 years to 9,999. But I still think most people will want to upgrade before they have used 1% of their CPU's lifespan.
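To put a rough number on the heat side of this, here is a small sketch of the Arrhenius-style acceleration factor that semiconductor reliability models commonly use to relate lifetime at two temperatures. The activation energy and temperatures are assumed illustrative values, not figures for any specific chip:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_cool_c, t_hot_c, ea_ev=0.7):
    """Rough ratio of expected lifetime at t_cool_c vs t_hot_c (Celsius).

    Assumes an Arrhenius temperature dependence with activation
    energy ea_ev (0.7 eV is just a commonly quoted ballpark figure).
    """
    t_cool = t_cool_c + 273.15  # convert to kelvin
    t_hot = t_hot_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_cool - 1.0 / t_hot))

# With these assumed numbers, running ~10 C hotter roughly halves
# the estimated lifetime (factor of about 2.1 here).
print(acceleration_factor(50, 60))
```

The point of the sketch is only that lifetime scales exponentially with temperature, which is why keeping temps down matters far more than the odd small voltage bump. Even a factor of two off a 10,000-year figure still leaves you with more lifespan than anyone will use.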



