[R] cannot allocate vector of size

jim holtman jholtman at gmail.com
Mon Nov 23 00:53:10 CET 2015

My general rule of thumb is that you should have 3-4 times as much RAM as the
largest object you are working with, so hopefully you have at least 4 GB of
RAM on your system.  Also, exactly what processing (packages, functions,
algorithms, etc.) are you doing?  Some functions may create multiple copies
of an object, or they may create temporary objects bigger than the original,
so help us out and provide more information.  You might be able to add
virtual memory, but the paging may slow down your process quite a bit.  If
you do go in that direction, learn how to use the performance monitoring
tools on your system to see what is happening.
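To illustrate why working memory can be several times the size of the data itself, here is a small sketch in base R; the 1e6-element vector is just a stand-in for a large dataset, and `tracemem()` requires an R build with memory profiling enabled (the default for CRAN binaries):

```r
x <- numeric(1e6)                 # ~8 MB of doubles, a stand-in for a large object
print(object.size(x), units = "MB")

y <- x                            # no copy yet: x and y share the same memory
tracemem(y)                       # report when R duplicates y
y[1] <- 0                         # modifying the shared vector forces a full copy
z <- x + 1                        # arithmetic allocates another full-size vector

gc()                              # report memory usage and collect garbage
rm(x, y, z); invisible(gc())      # drop the objects and reclaim the memory
```

So a single expression on a 1 GB object can easily require 2-3 GB while it evaluates, which is where the 3-4x rule of thumb comes from.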

Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.

On Sun, Nov 22, 2015 at 10:08 AM, Tamsila Parveen via R-help <
r-help at r-project.org> wrote:

> Hello,           Is there anyone who can help me resolve a memory issue in
> R?  When I try to analyze data from a 1 GB file, R returns "Error: cannot
> allocate vector of size 1.8 Gb".  I tried on Linux as well as on Windows
> with a 64-bit system, using the 64-bit R-3.2.2 version.  If anyone knows,
> please guide me in resolving this issue.
>         [[alternative HTML version deleted]]
> ______________________________________________
> R-help at r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

