R on macOS Error: vector memory exhausted (limit reached?)

Anjan Bharadwaj · Jul 12, 2018 · Viewed 92k times

I'm trying to run an R script (in particular, I am using the "getLineages" function from the Bioconductor package Slingshot).

I'm wondering why the error "vector memory exhausted (limit reached?)" appears when I use this function, since, for the data I am analyzing, it doesn't seem to be the most memory-intensive function in this package.
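For context, the call that triggers the error is roughly of this shape. This is only a toy sketch: the objects rd and cl below are stand-ins for my own reduced-dimension coordinates and cluster labels, not my actual data.

library(slingshot)

rd <- matrix(rnorm(200), ncol = 2)          # toy reduced-dimension coordinates
cl <- kmeans(rd, centers = 4)$cluster       # toy cluster labels
lin <- getLineages(rd, clusterLabels = cl)  # the error appears at this step with my real data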

I do understand that there are other questions like this on Stack Overflow, but they all suggest switching to the 64-bit version of R. However, I am already using that version. There seem to be no other answers to this issue so far; does anyone know what might be going on?

The data is only ~120 MB in size, which is far less than my computer's 8 GB of RAM.

(Screenshot: R, 64-bit version)

Answer

Graeme Frost · Oct 2, 2018

For those using RStudio, I've found that setting Sys.setenv('R_MAX_VSIZE'=32000000000), as suggested in multiple Stack Overflow posts, only works on the command line, and that setting the parameter this way from within RStudio does not prevent this error:

Error: vector memory exhausted (limit reached?)
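For reference, this is the in-session approach I mean (a sketch only; the value is the one suggested in those posts, and on my machine it only took effect when R was launched from a shell, not from RStudio):

Sys.setenv('R_MAX_VSIZE' = 32000000000)  # request a larger vector heap (~32 GB)
Sys.getenv("R_MAX_VSIZE")                # echo what the current session sees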

After doing some more reading, I found this thread, which clarifies the problem with RStudio and identifies a solution, shown below:

Step 1: Open Terminal.

Step 2:

cd ~               # go to your home directory
touch .Renviron    # create .Renviron if it does not already exist
open .Renviron     # open it in the default editor (TextEdit on macOS)

Step 3: Save the following as the first line of .Renviron:

R_MAX_VSIZE=100Gb 

Note: This limit includes both physical and virtual memory, so setting R_MAX_VSIZE=16Gb on a machine with 16 GB of physical memory may not prevent this error. You may have to play with this parameter, depending on the specs of your machine.
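As a quick sanity check after restarting R/RStudio, you can confirm the .Renviron entry was picked up (this only echoes the variable; it does not guarantee the allocator will honor the value):

Sys.getenv("R_MAX_VSIZE")  # should print "100Gb" if the .Renviron line above was read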