Here is a phenomenon that happens all too often. I try to manipulate some sort of big data, for example:

    a <- matrix( rnorm( 1e4 * 200 ), ncol = 1e4 )
    gr <- factor( rep( 1:2, each = 100 ) )
    l <- lm( a ~ gr )
    covs <- estVar( l )
    cors <- cov2cor( covs )

Quite often, the following error is reported:

    Error: cannot allocate vector of size 509.5 Mb
Fine. I remove some variables I don't need any more and call the garbage collector:
    rm( a, l )
    gc( TRUE )

However, the error persists. So I save my workspace, quit R and start it again. And -- a miracle happens: the memory is now available. Why? If there was not enough memory for R to allocate before, but there is enough now, what changed? Can I somehow force R to clean up without saving the data to disk and waiting until it loads them again? I don't get it.
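For reference, here is a minimal sketch of how one can check what R itself still holds after such a cleanup; it uses nothing beyond object.size() and gc(), and the object names are just the ones from the example above:

    rm( a, l )
    gc( verbose = TRUE )   # verbose gc: prints how much memory R currently has in use and reserved

    # sizes of whatever is still in the workspace, largest first, in Mb
    sizes <- sapply( ls(), function( x ) object.size( get( x, envir = .GlobalEnv ) ) )
    round( sort( sizes, decreasing = TRUE ) / 2^20, 1 )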
My sessionInfo():
    > sessionInfo()
    R version 3.0.1 (2013-05-16)
    Platform: i486-pc-linux-gnu (32-bit)

    locale:
     [1] LC_CTYPE=en_US.utf8       LC_NUMERIC=C              LC_TIME=en_US.utf8        LC_COLLATE=en_US.utf8     LC_MONETARY=en_US.utf8
     [6] LC_MESSAGES=en_US.utf8    LC_PAPER=C                LC_NAME=C                 LC_ADDRESS=C              LC_TELEPHONE=C
    [11] LC_MEASUREMENT=en_US.utf8 LC_IDENTIFICATION=C

    attached base packages:
    [1] graphics  utils     datasets  grDevices stats     methods   base

P.S.: The system appears to have plenty of unused memory left, as reported by free. top reports that my R process (before the error) is using ~ 2 GB out of my 8 GB, and there is still plenty more left.
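In case it is useful, here is a small sketch of how I compare, from inside R, what R thinks it has allocated with what the operating system sees (this assumes a Linux box with free available, as in the sessionInfo above):

    gc()                    # the "(Mb)" columns show what R has in use and what it has reserved
    system( "free -m" )     # system-wide memory, for comparison with the output of top / free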
Try

    rm( list = ls() ); gc()

and run the code again to see if it works. Calling gc() a few times in a row helps as well -- just wrap a for ( i in 1:n ) loop around it. But in essence: "get more RAM" or "use smaller objects" is where it is at.
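A minimal sketch of that suggestion; the choice of 10 iterations is arbitrary:

    # clear the whole workspace, then call gc() repeatedly as described above
    rm( list = ls() )
    for ( i in 1:10 ) gc()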