[R] error loading huge .RData

Liaw, Andy andy_liaw at merck.com
Wed Apr 24 14:53:10 CEST 2002

> Hmm. You could be running into some sort of situation where data
> temporarily take up more space in memory than they need to. It does
> sound like a bit of a bug if R can write images that are bigger than
> it can read. Not sure how to proceed though. Does anyone on R-core
> have a similarly big system and a spare gigabyte of disk? Is it
> possible to create a mock-up of similarly organized data that displays
> the same effect, but takes less than three days?
>  BTW: Did we ever hear what system this is happening on?

Yes, in the original post:
R-1.4.1/Mandrake Linux 7.1 (kernel 2.4.3)
Dual P3-866 Xeon with 2GB RAM and 2GB swap.
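
As for the mock-up idea above: something along these lines might
reproduce the general shape of the problem (the object sizes below are
placeholders, not the real data; adjust them until the saved image is
comparably large):

  ## roughly 800MB of numeric matrices (10 x 80MB), saved to an image
  set.seed(1)
  for (i in 1:10) {
      assign(paste("m", i, sep = ""), matrix(rnorm(1e7), ncol = 100))
  }
  save.image(file = "mockup.RData")
  q("no")

  ## then, in a fresh R session, try to read it back:
  load("mockup.RData")
  gc()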

Prof. Tierney has been trying to help me off-list.  I monitored the R.bin
process through ktop, as Prof. Tierney suggested.  The strange thing is
that its memory usage would reach nearly 1000MB and then R would simply
quit with the "vector heap exhausted" error.
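
One thing that may help narrow it down: checking just before the load()
whether R itself is imposing a vector heap cap, as opposed to malloc
failing at the OS level.  Roughly (the file name here is a placeholder):

  gc()            # current usage and triggers for Ncells / Vcells
  mem.limits()    # NA means no R-imposed limit on nsize / vsize
  load("huge.RData")

If mem.limits() reports a finite vsize, that cap can be raised with the
--max-vsize startup option; if it is NA, the failure is more likely on
the allocator/OS side.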

I ran gdb on R, also as Prof. Tierney suggested, and he said that
malloc was not able to get more memory.  I checked ulimit and it said
"unlimited".  Prof. Tierney also suggested running strace, but I haven't
gotten around to that.
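
In the meantime, one quick sanity check that can be done from inside R
(in case the interactive shell's ulimit settings differ from those of
the shell that actually launched R): a child shell inherits the R
process's resource limits, so

  system("ulimit -a")    # limits inherited from the running R process
  system("ulimit -v")    # virtual memory cap in kB; "unlimited" if none

should show what malloc inside R is really bound by.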

Will keep you folks posted...



