The following R code is sufficient to replicate the results in table 1:

```{r}
# minimum reported memory failure rate is 0.2 errors per GB per day
rate_GBday_wikipedia <- 0.2

# assumed quantity of new data that remains resident in memory for 1 second
newdata_Gb_sec <- 1E9

# probability p for randomized response from Wang et al. 2016
randp <- function(eps) {
  exp(eps) / (1 + exp(eps))
}

# predicted bit errors per day
dayerr <- function(eps, flow = newdata_Gb_sec) {
  res <- (1 - randp(eps)) * flow * 60 * 60 * 24
  res
}

# find epsilon with a 5% chance of an observed error in an hour
optim(par = 8, fn = function(x) { abs(.05 - dayerr(eps = x) / 24) },
      method = "Brent", lower = .1, upper = 42)

# find epsilon for a given bits/day error rate
optim(par = 8, fn = function(x) { abs(rate_GBday_wikipedia - dayerr(eps = x)) },
      method = "Brent", lower = .1, upper = 42)
```

The following code was used to conduct the experiment reported in section 7:

```{r}
blockSize <- 200000000
simmem <- integer(length = blockSize)
simmem[1:length(simmem)] <- as.integer(0)
if (as.numeric(object.size(simmem)) / 1000000 < 50) {
  warning("test vector may be smaller than processor cache")
}

startTime <- Sys.time()
simsum <- 0
while (simsum == 0) {
  # alter the calculation each iteration so the optimizer cannot
  # convert the entire loop to while(TRUE)
  oneout <- trunc(runif(1, 1, length(simmem)))
  simsum <- sum(simmem[-oneout])
}
endTime <- Sys.time()
elapsed <- endTime - startTime
print(paste("Time", endTime, "sum", simsum, "elapsed", elapsed))
```
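
As a sanity check on the two `optim` calls in the first block, the same epsilon values can be obtained in closed form by inverting `dayerr`: since `1 - randp(eps) = 1/(1 + exp(eps))`, setting the predicted error count equal to a target and solving gives `eps = log(flow * seconds / target - 1)`. The sketch below is an illustrative cross-check, not part of the replication code; the helper `eps_for_target` is introduced here and reuses the variables defined above.

```{r}
# closed-form inversion of dayerr(): solve (1 / (1 + exp(eps))) * flow * seconds = target
# for eps. Purely a cross-check of the optim-based estimates above.
eps_for_target <- function(target, flow = newdata_Gb_sec, seconds = 60 * 60 * 24) {
  log(flow * seconds / target - 1)
}

# epsilon giving a 5% chance of an observed error in an hour (cf. first optim call)
eps_for_target(0.05, seconds = 60 * 60)

# epsilon matching the 0.2 errors/GB/day reported rate (cf. second optim call)
eps_for_target(rate_GBday_wikipedia)
```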