[R] R --nsize 2M runs havoc (under linux)

Peter Dalgaard BSA p.dalgaard at biostat.ku.dk
Wed Oct 6 15:05:43 CEST 1999

Joerg Kindermann <Joerg.Kindermann at gmd.de> writes:

> Dear All,
> I am running R version 0.65.0 under
> a) SuSE Linux 6.1 and SuSE Linux 6.2, compiler gcc-2.95, CPUs Pentium Pro
> 200, 128MB, and Pentium II 450, 128MB
> b) Solaris 5.7, compiler gcc-2.95, CPU Sun SPARC, 4000MB
> When I set --nsize to more than 1M, R's internal storage management runs
> havoc. gc() indicates the requested sizes, but the overall process size is
> much too big: running R with --vsize 10M --nsize 3M will, for example, result
> in a process size of 63.276 MB! Using such an R process will lead to a
> segmentation fault sooner or later, usually inside the storage allocation
> routine of R. I cannot reproduce the strange behavior under Solaris,
> however.

Er, I get 63 thousand and something too, but that is in *Kilo*bytes... 
Are you sure you didn't misread the output of 'top'??
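For context, a back-of-envelope sketch of why a figure around 63,000 KB is plausible rather than runaway. The per-cell size below is an assumption, not a figure from the thread: on 32-bit builds of that era, each cons cell ("node") reserved by --nsize took on the order of 16 to 20 bytes, and --vsize reserved the vector heap on top of that.

```python
# Rough estimate of R's reserved heap for --vsize 10M --nsize 3M.
# ASSUMPTION: 16-20 bytes per cons cell on a 32-bit build; the "M"
# suffix is taken as 10^6 here (the 2^20 reading changes little).

NSIZE = 3 * 10**6        # --nsize 3M: number of cons cells
VSIZE = 10 * 10**6       # --vsize 10M: vector heap, in bytes

for bytes_per_cell in (16, 20):
    total_mb = (NSIZE * bytes_per_cell + VSIZE) / 1e6
    print(f"{bytes_per_cell} bytes/cell -> ~{total_mb:.0f} MB "
          "before interpreter overhead")
```

Either assumption brackets the reported size: 58 to 70 MB of reserved heap, consistent with top showing roughly 63,000 KB, i.e. about 63 MB, for the whole process.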

   O__  ---- Peter Dalgaard             Blegdamsvej 3  
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N   
 (*) \(*) -- University of Copenhagen   Denmark      Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
