[R] memory increase for large r simulation

Ernesto Jardim ernesto at ipimar.pt
Tue Sep 28 11:22:57 CEST 2004

On Tue, 2004-09-28 at 07:09, shane stanton wrote:
> Hi,
> I am running R from windows 95. I have a large
> simulation, which R does not get very far through
> before telling me:
> "error  cannot allocate vector of size 871875 Kb"
> and warning message "reached total allocation of 127
> Mb"
> I have been trying to increase the memory allocation
> to R from my computer, using various commands at the R
> prompt, such as memory.limit(size=......) to which R
> responds "NULL" or "cannot decrease memory limit" (no
> matter how large I try to make the argument of
> memory.limit. 
> Does anybody have any ideas re how I can get this
> simulation to run?
> Many thanks,
> Shane Stanton

Hi Shane,

I had a similar problem, and the way to work around it is to save
simulation results to disk and clean the workspace.

Imagine you're running 1000 simulations and you store each result in a
list. That object will grow and eat all your memory. One way around
this is to write each simulation result to your hard drive with "save",
then run the next simulation reusing the same results object. That way
you keep memory requirements under control. Of course, you must be able
to run at least one simulation; if not, then you should check your code
and try to improve it. Something that might also help is to remove the
results object after it has been saved to the hard drive.
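A minimal sketch of that pattern, assuming run_simulation() stands in
for your own simulation function:

```r
n_sims <- 1000
for (i in 1:n_sims) {
    # reuse the same object name each iteration instead of growing a list
    result <- run_simulation(i)   # run_simulation() is a placeholder
    save(result, file = paste("sim_", i, ".RData", sep = ""))
    rm(result)                    # drop the object once it is on disk
    gc()                          # ask R to release the freed memory
}
# later, reload any single result with:
# load("sim_1.RData")
```

Each file then holds one result, so peak memory use stays at the size of
a single simulation rather than all 1000.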
