Here are my calculations, based on an error rate per bit-hour. Lemme know if I messed up again.
Assume I average 8 GB of RAM usage:
Probability of not encountering an error in one hour:
(1 - 10^-13)^(8 bits/byte * 8 * 2^30 bytes * 1 hour) = 0.9931516102
Probability of not encountering an error in one day:
(1 - 10^-13)^(8 bits/byte * 8 * 2^30 bytes * 24 hours) = 0.8479558195
Probability of not encountering an error in one month:
(1 - 10^-13)^(8 bits/byte * 8 * 2^30 bytes * 24 hours * 30 days) = 0.0070989931
Probability of not encountering an error in 3 months:
(1 - 10^-13)^(8 bits/byte * 8 * 2^30 bytes * 24 hours * 30 days * 3 months) = 3.5775874478 * 10^-7
or about one in 2.795 * 10^6.
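The numbers above can be reproduced with a quick Python sketch. The 10^-13 per bit-hour rate and the 8 GB figure are the assumptions from above; `math.log1p` is used so the tiny rate isn't lost to floating-point rounding:

```python
import math

BITS = 8 * 8 * 2**30   # 8 GB of RAM expressed in bits (bits/byte * bytes)
RATE = 1e-13           # assumed error rate per bit-hour

def p_no_error(hours, rate=RATE, bits=BITS):
    """Probability that zero bits flip over `hours` of sustained use."""
    # exp(n * log1p(-rate)) instead of (1 - rate)**n: 1 - 1e-13 rounds
    # badly in a double, log1p(-1e-13) does not
    return math.exp(bits * hours * math.log1p(-rate))

for label, hours in [("one hour", 1), ("one day", 24),
                     ("one month", 24 * 30), ("3 months", 24 * 90)]:
    print(f"{label}: {p_no_error(hours):.10g}")
```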
OK, it makes more sense now. One in 2.795 * 10^6 is still hard to believe, but that's with a 10^-13 error rate.
If 10^-13 is what my non-ECC RAM is rated for, then it's perfectly reasonable to believe the modules were intentionally overbuilt to have a lower error rate than that.
Using 10^-14 as the error rate, the probability of not encountering an error in 3 months of sustained 8 GB usage is:
(1 - 10^-14)^(8 bits/byte * 8 * 2^30 bytes * 24 hours * 30 days * 3 months) = 0.2266517235
or about 1 in 4.4.
Now we're talking.
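A short Python sketch makes the two-rate comparison direct. The 10^-14 "overbuilt" rate is the guess from above, not a datasheet figure:

```python
import math

BITS = 8 * 8 * 2**30   # 8 GB of RAM in bits
HOURS = 24 * 30 * 3    # three 30-day months of sustained usage

def p_no_error(rate):
    # log1p keeps the tiny per-bit-hour rate from being rounded away
    return math.exp(BITS * HOURS * math.log1p(-rate))

p13 = p_no_error(1e-13)  # the rated error rate
p14 = p_no_error(1e-14)  # the guessed "overbuilt" rate

print(f"1e-13: P(no error) = {p13:.4g}, about 1 in {1/p13:,.0f}")
print(f"1e-14: P(no error) = {p14:.4g}, about 1 in {1/p14:.1f}")
```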
I guess I'll try to get back on topic?