[R] R crashes with memory errors on a 256GB machine (and system shows only 60GB usage)
Milan Bouchet-Valat
nalimilan at club.fr
Sat Jan 4 15:42:23 CET 2014
On Friday, 03 January 2014 at 22:40 +0200, Xebar Saram wrote:
> Hi again and thank you all for the answers
>
> I need to add that I'm a relative R newbie, so I apologize in advance.
>
> I started R with the --vanilla option and ran gc().
>
> This is the output I get:
>
> > gc()
>          used (Mb) gc trigger (Mb) max used (Mb)
> Ncells 182236  9.8     407500 21.8   350000 18.7
> Vcells 277896  2.2     786432  6.0   785897  6.0
>
> Also, this is the memory.profile() output:
>
> > memory.profile()
>        NULL      symbol    pairlist     closure environment     promise
>           1        5611       86695        2277         314        4175
>    language     special     builtin        char     logical     integer
>       21636          44         637        6361        4574       11089
>      double     complex   character         ...         any        list
>         782           1       20934           0           0        8023
>  expression    bytecode externalptr     weakref         raw          S4
>           1        6271        1272         364         365         831
> >
>
> I'm running on Linux (Arch Linux) and 'free' shows this:
>
>
> zeltak at zuni ~ ↳ free -h
>              total       used       free     shared    buffers     cached
> Mem:          251G        99G       152G        66G       249M        84G
> -/+ buffers/cache:        14G       237G
> Swap:           0B         0B         0B
>
> I'm not running any parallel code at all.
>
> Milan: how does one know if the memory is fragmented?
AFAIK you cannot. One thing you can do is compare the report from gc()
to what the Linux top command says about the actual memory used by the R
process (see the RES column; hit '<' three times to sort on it).
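For instance, a rough sketch of that comparison (assuming a Linux system,
where top's RES value corresponds to VmRSS in /proc/self/status; adapt as
needed):

## Compare R's own accounting of used memory with what the OS reports.
gc_mb  <- sum(gc()[, 2])                   # Mb used by Ncells + Vcells
status <- readLines("/proc/self/status")   # Linux-only process information
rss_kb <- as.numeric(gsub("[^0-9]", "",
                          grep("^VmRSS:", status, value = TRUE)))
cat("gc() reports", round(gc_mb, 1), "Mb;",
    "the OS reports", round(rss_kb / 1024, 1), "Mb resident (RES in top)\n")
## A much larger OS figure can point to memory R has freed internally but
## not returned to the system, or to allocations made outside R's heap.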
But are you saying that with the situation you describe above (i.e. 152G
of free RAM), you get errors about memory allocation?
Regards
> thank you all again i really appreciate the help
>
> best
>
> Z
>
>
>
> On Thu, Jan 2, 2014 at 10:35 PM, Ben Bolker <bbolker at gmail.com> wrote:
>
> > Xebar Saram <zeltakc <at> gmail.com> writes:
> >
> > >
> > > Hi All,
> > >
> > > I have a terrible issue I can't seem to debug, which is halting my work
> > > completely. I have R 3.0.2 installed on a Linux machine (Arch Linux,
> > > latest) which I built specifically for running high-memory-use models.
> > > The system is a 16-core, 256 GB RAM machine. It worked well at the start,
> > > but in recent days I keep getting errors and crashes regarding memory
> > > use, such as "cannot create vector size of XXX, not enough memory", etc.
> > >
> > > When looking at top (the Linux system monitor) I see I barely scrape
> > > 60 GB of RAM (out of 256 GB).
> > >
> > > I really don't know how to debug this, and my whole work is halted
> > > because of it, so any help would be greatly appreciated.
> >
> > I'm very sympathetic, but it will be almost impossible to debug
> > this sort of problem remotely without a reproducible example.
> > The only guess that I can make, if you *really* are running *exactly*
> > the same code as you previously ran successfully, is that you might
> > have some very large objects hidden away in a saved workspace in a
> > .RData file that's being loaded automatically ...
> >
> > I would check whether gc(), memory.profile(), etc. give sensible results
> > in a clean R session (R --vanilla).
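
As an illustration, a minimal sketch of that check (assuming the session is
started in the directory that might hold a stale .RData; adapt as needed):

## From a shell, start a session that skips any saved workspace and profile:
##   R --vanilla
## In the problem session, look for unexpectedly large restored objects:
file.exists(".RData")                            # is a saved workspace on disk?
sizes <- sapply(mget(ls(), envir = globalenv()), object.size)
print(head(sort(sizes, decreasing = TRUE), 10))  # largest objects, in bytes
print(gc())                                      # what R itself thinks it uses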
> >
> > Ben Bolker
> >
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.