Tested on R versions 3.0.X through 3.3.1
Last update: 15 August 2016
Simply put, R is not very efficient in its use of memory. Although this inefficiency occurs on both 32-bit and 64-bit systems, it is really only a concern on older 32-bit machines with <4GB RAM. Newer 64-bit systems with larger RAM rarely encounter analyses that exceed the memory available on a personal computer. Nonetheless, you will, on occasion, be running some sort of R code and suddenly the error message …
Error: cannot allocate vector of size …
appears, indicating a possible memory shortage issue.
In short, R is telling you it cannot find sufficient memory to complete the analysis.
RAM is capped at ~3.5GB on 32-bit Windows systems, and at the amount of installed RAM on 64-bit Windows (W7/W8/W10), Mac OS, and Linux machines. Two calls, memory.limit() and memory.size(), return the amount of RAM available to R and how much is being used by your current R session, respectively (note that both are Windows-only functions). memory.size() will grow as your R session progresses.
# memory management fxns
memory.limit() # how much do you have?
## [1] 65460
memory.size() # how much is being used?
## [1] 32.58
Both values are reported in Mb: the first indicates the ~64GB available on my office computer, while the second shows usage on the order of ~30 Mb (yours will differ a bit). Note that not all RAM is truly available to R; your operating system uses a substantial portion of RAM for basic background operations, as do any other programs you may be running.
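To see how much memory an individual object occupies, rather than the session as a whole, base R's object.size() is useful; a minimal sketch:

```r
# check the memory footprint of a single object
v <- rnorm(1e+06)                    # one million random numbers
object.size(v)                       # size in bytes (~8 bytes per double)
print(object.size(v), units = "Mb")  # more readable units: ~7.6 Mb
```

This is a handy first diagnostic when memory.size() climbs and you want to know which object is responsible.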
Your options for dealing with RAM revolve around two basic calls that clear your workspace, rm() and gc(). rm(NameofObject) deletes objects - permanently. Once they are removed they cannot be recovered except by re-running the code that created the object in the first place. You forget this piece of advice at your own peril. Multiple objects can be removed by separating their names with commas.
# cleaning up workspace; start by creating some simple objects
x <- 1 # object x; x is assigned value of 1
y <- 2 # object y; y is assigned value of 2
z <- 3 # object z; z is assigned value of 3
ls() # what's in workspace?
## [1] "x" "y" "z"
rm(x) # rm object x
ls() # what's in workspace now?
## [1] "y" "z"
rm(y, z) # rm objects y,z
ls() # what's in workspace now? nothing ...
## character(0)
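If the goal is to wipe the entire workspace rather than name objects one by one, rm() also accepts a character vector of names through its list argument; a common idiom (use with care - it deletes everything, permanently):

```r
# recreate a few objects, then clear the whole workspace at once
x <- 1
y <- 2
z <- 3
rm(list = ls())  # ls() returns all object names; rm() removes them all
ls()             # workspace is now empty: character(0)
```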
gc() releases memory, freeing it up to be used once again. This is in contrast to rm(), which merely removes the object from the workspace but does not immediately release the memory the object occupied; R reclaims that memory at its next automatic garbage collection, and calling gc() simply forces the collection to happen now (and reports memory usage). These commands are just as applicable on machines with larger available RAM.
# "free up" previously used memory
gc() # garbage collection - frees unused memory
## used (Mb) gc trigger (Mb) max used (Mb)
## Ncells 375705 20.1 592000 31.7 460000 24.6
## Vcells 570587 4.4 1308461 10.0 850637 6.5
Systematic memory issues (i.e., the 32-bit RAM cap) are more difficult to resolve. The bigmemory package, and those it references, can be useful for memory management on 32-bit machines.
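As a sketch of the kind of thing bigmemory offers (this assumes the package has been installed via install.packages("bigmemory"); filebacked.big.matrix() is one of its constructors), a file-backed matrix keeps its data on disk rather than in RAM, so it can exceed the 32-bit memory cap:

```r
# NOT base R: requires install.packages("bigmemory")
library(bigmemory)

# a file-backed big.matrix lives on disk, not in RAM
bm <- filebacked.big.matrix(nrow = 1e+06, ncol = 3, type = "double",
                            backingfile = "bm.bin",
                            descriptorfile = "bm.desc")
bm[1, ] <- c(1, 2, 3)  # indexed like an ordinary matrix
bm[1, ]
```

Only the portions of the matrix you touch are pulled into RAM, which is the essence of how the package sidesteps the allocation errors described above.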
# observe how memory changes
memory.size() # current memory occupied
a1 <- c(1:1e+08) # take lots of memory w/object
a2 <- a1 * 2 # take even more ...
memory.size() # how much memory used now?
rm(a1, a2) # rm objects a1,a2
memory.size() # how much memory used now?
gc() # clean up memory
memory.size() # how much memory now?