[R] about memory
jon butchar
butchar.2 at osu.edu
Wed Mar 30 14:34:13 CEST 2005
How much memory is free when R fails (e.g., what does "top" show while you try to run your clustering)? If there's still a sizeable amount of free memory, you may have to look into the system limits, the maximum data segment size in particular. Many Linux distros have it set to "unlimited", but a default Debian install may not. If this turns out to be the problem, please do not, _do not_ raise it to "unlimited"; raise it only to what R actually needs.
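To check or raise the limit, something along these lines should work, assuming a bash shell (the 1 GB figure below is only an example, not a recommendation):

  $ ulimit -a | grep "data seg"    # show the current data segment limit (in kB)
  $ ulimit -S -d 1048576           # example: allow up to ~1 GB of data in this shell
  $ R                              # start R from that same shell

The new limit only applies to programs started from that shell; /etc/security/limits.conf is the place to make it permanent.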
hth,
jon b
On Wed, 30 Mar 2005 18:36:37 +0800
ronggui <0034058 at fudan.edu.cn> wrote:
> here is my system memory:
> ronggui at 0[ronggui]$ free
>                     total       used       free     shared    buffers     cached
> Mem:               256728      79440     177288          0       2296      36136
> -/+ buffers/cache:             41008     215720
> Swap:              481908      60524     421384
>
> and I want to cluster my data using hclust. My data has 3 variables and 10000 cases, but it fails, saying there is not enough memory for the vector size. I read the help doc and used $ R --max-vsize=800M to start R 2.1.0beta under Debian Linux, but it still cannot produce a solution. So is my PC's memory not enough to carry out this analysis, or did I make a mistake in setting the memory?
>
> Thank you.
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
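A rough back-of-the-envelope check of the memory the hclust() run above needs, assuming it is fed the usual dist() object (dist() stores n*(n-1)/2 double-precision values):

  $ R -q
  > n <- 10000
  > n * (n - 1) / 2 * 8 / 2^20     # size of one dist object, in MB
  [1] 381.4316

So a single copy of the dissimilarity matrix is already around 380 MB, more than the 256 MB of physical RAM shown by free above, before hclust makes any working copies.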