[R] R crashes with memory errors on a 256GB machine (and system shows only 60GB usage)
Ben Bolker
bbolker at gmail.com
Thu Jan 2 21:35:50 CET 2014
Xebar Saram <zeltakc <at> gmail.com> writes:
>
> Hi All,
>
> I have a terrible issue I can't seem to debug which is halting my work
> completely. I have R 3.0.2 installed on a Linux machine (Arch Linux, latest)
> which I built specifically for running high-memory-use models. The system
> is a 16-core, 256 GB RAM machine. It worked well at the start, but in
> recent days I keep getting errors and crashes regarding memory use, such as
> "cannot create vector size of XXX, not enough memory" etc.
>
> When looking at top (the Linux system monitor) I see I barely scrape 60 GB
> of RAM (out of 256 GB).
>
> I really don't know how to debug this, and my whole work is halted because
> of it, so any help would be greatly appreciated.
I'm very sympathetic, but it will be almost impossible to debug
this sort of problem remotely without a reproducible example.
The only guess that I can make, if you *really* are running *exactly*
the same code as you previously ran successfully, is that you might
have some very large objects hidden away in a saved workspace in a
.RData file that's being loaded automatically ...
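For example, something along these lines would show whether a stray .RData
or a large leftover object is to blame (a rough sketch only; 'big_object'
below is just a placeholder for whatever the listing turns up):

  file.exists(".RData")   # is a saved workspace sitting in the startup directory?

  ## approximate size (in bytes) of each object in the global environment,
  ## largest first
  obj_sizes <- sapply(ls(envir = .GlobalEnv),
                      function(x) object.size(get(x, envir = .GlobalEnv)))
  sort(obj_sizes, decreasing = TRUE)

  ## remove anything huge you no longer need, then reclaim the memory
  # rm(big_object); gc()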
I would check whether gc(), memory.profile(), etc. give sensible results
in a clean R session (R --vanilla).
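For example, start a session that skips the saved workspace and profiles
(R --vanilla from the shell) and compare the output of

  gc()               # current memory use and garbage-collection statistics
  memory.profile()   # counts of allocated R objects by internal type

with what you see in the session that is crashing.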
Ben Bolker