Is it possible to monitor the amount of memory that is in use, or has been used, by R during a function call? For example, I have an arbitrary function, e.g.:
smallest.sv <- function() {
  A <- matrix(rnorm(1e6), 1e3)
  mysvd <- svd(A)
  return(tail(mysvd$d, 1))
}
Running the function simply returns a scalar, but a lot of memory is used while computing it. Now I need to do performance benchmarking. Measuring processing time is easy:
system.time(x <- smallest.sv())
However, I would also like to know how much memory was needed for this call, without modifying the function (it should work for arbitrary functions). Is there any way to do this?
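One possibility worth noting is base R's Rprofmem(), which logs individual allocations made during a call. A caveat: it requires an R build configured with --enable-memory-profiling, and it records the total of all allocations rather than the peak in use. A minimal sketch, with the function from the question repeated for self-containment:

Rprofmem_sketch <- function() {
  smallest.sv <- function() {
    A <- matrix(rnorm(1e6), 1e3)
    mysvd <- svd(A)
    return(tail(mysvd$d, 1))
  }

  f <- tempfile()
  Rprofmem(f)          # start logging allocations to f
  x <- smallest.sv()
  Rprofmem(NULL)       # stop logging

  # Each log line starts with an allocation size in bytes,
  # followed by the call stack that triggered it.
  head(readLines(f))
}

Summing the sizes from the log over-counts relative to peak usage, because memory freed by the garbage collector mid-call is still included.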
Edit: to clarify a bit. I am mostly interested in the upper bound on memory in use during the call of the function, i.e. how much physical memory is required to be able to process the function call. I suspect that in many cases this is significantly less than the total amount of memory allocated over the call.
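For that upper bound specifically, one approach (a sketch, using only base R) is gc() with reset = TRUE: resetting clears R's "max used" counters, so a gc() call after the function reports the high-water mark of memory in use since the reset, without modifying the function itself:

smallest.sv <- function() {
  A <- matrix(rnorm(1e6), 1e3)
  mysvd <- svd(A)
  return(tail(mysvd$d, 1))
}

gc(reset = TRUE)          # reset the "max used" statistics
x <- smallest.sv()
usage <- gc()             # matrix of memory statistics
usage[, "max used"]       # peak Ncells/Vcells since the reset

Note this measures R's own heap high-water mark, not the process's total footprint as the operating system sees it, and other code running in the session between the two gc() calls would inflate the numbers.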