[R] R process taking over memory

Ramiro Barrantes ramiro at precisionbioassay.com
Mon Apr 2 19:50:14 CEST 2012


Oops, yes,

I am using CentOS Linux 6.0, R 2.14.1 and nlme 3.1-103

I looked at the problem more carefully. For some datasets, nlme gets "stuck" in one of its iterations and the memory usage just grows and grows.

nlme works by alternating between two conditional optimization steps, and the number of iterations and the convergence parameters for each step can be modified. I can reduce the number of iterations, but even then difficult cases eventually appear in my simulation; a sketch of what I mean is below.
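
The iteration limits I mean are the ones exposed through nlmeControl(). A minimal sketch of tightening them, using the Loblolly example from ?nlme as a stand-in for my models (the limits below are purely illustrative, not the values from my simulation):

library(nlme)

## Illustrative only: cap the alternating algorithm's iteration counts so a
## pathological dataset fails quickly instead of running away.
ctrl <- nlmeControl(maxIter     = 20,  # outer alternating (PNLS <-> LME) iterations
                    pnlsMaxIter = 5,   # iterations of the penalized NLS step
                    msMaxIter   = 20)  # iterations of the inner optimization step

fit <- try(nlme(height ~ SSasymp(age, Asym, R0, lrc),
                data    = Loblolly,
                fixed   = Asym + R0 + lrc ~ 1,
                random  = Asym ~ 1,
                start   = c(Asym = 103, R0 = -8.5, lrc = -3.3),
                control = ctrl))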

I am trying to stop the program after it exceeds a certain memory limit. However, that might mean going into the nlme code myself and making some modifications.  I will also try making the change in the shell so that no process can go beyond a certain amount of memory.  Does anybody have experience with nlme getting out of control in terms of memory usage?
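
In case it is useful to others, the plan (following Prof Ripley's suggestion below) is roughly the following. The ulimit value, the script name and createDataset() are placeholders for my actual setup:

## Launch R under an address-space limit set in the shell (bash shown), e.g.
##   ulimit -v 4194304 && Rscript run_simulation.R    # ~4GB; value is a placeholder
## Inside R, wrap each fit so that an allocation failure is recorded and the
## loop moves on instead of the whole process being killed.
results <- vector("list", N)
for (i in 1:N) {
  dataset <- createDataset(i)                        # hypothetical data generator
  results[[i]] <- tryCatch(
    nlme(model, data = dataset, control = ctrl),     # placeholder model call
    error = function(e) {
      message("dataset ", i, " failed: ", conditionMessage(e))
      e   # keep the condition so the failure can be inspected afterwards
    }
  )
}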

Thanks,
Ramiro

________________________________________
From: Prof Brian Ripley [ripley at stats.ox.ac.uk]
Sent: Sunday, April 01, 2012 4:04 PM
To: Ramiro Barrantes
Cc: r-help at r-project.org
Subject: Re: [R] R process taking over memory

You haven't even told us your OS (see the posting guide).

But the usual way is to get your OS to set a memory limit for a
process (usually via your shell), and to run things under
try/tryCatch.  Then the OS will stop R allocating more than the limit,
the current task in R will fail, and the loop can move on to the next.

I would just caution that these OS facilities do not always work as
advertised.  E.g. the current man pages on Fedora 16 are not actually
up-to-date.


On Sun, 1 Apr 2012, Ramiro Barrantes wrote:

> Hello,
>
> I have a general question on the possibility of how to "catch and stop" a function when it uses too much memory.
>
> The problem is that some datasets, when applied to nlme (a relatively older version), cause the nlme function to just hang forever, start taking over memory (this afternoon one of those calls was about 40GB!), and never return an answer. Other datasets work fine.
>
> I am trying to debug nlme by varying its parameters but I have a general question in the interim. I have the following situation:
>
> for (i in 1:N) {
>    dataset <- createDataset(i)
>    try(nlme(dataset, otherParameters))
> }
>
> If one of those datasets starts using, say, more than 2GB of memory, I would like to just stop nlme, get an error, record it, and move on to the next dataset.  Right now with some datasets nlme takes over the computer memory and the system ends up killing the entire process.
>
> Any suggestions appreciated.
>
> Thank you,
>
> Ramiro
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>

--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595


