[R] memory problem in exporting data frame

Thomas W Blackwell tblackw at umich.edu
Tue Sep 9 01:27:31 CEST 2003

Simplest is to save your workspace using  save.image(),
then delete a bunch of large objects other than the data
frame that you want to export, and run  write.table()
again, now that you've made space for it.  A quick calc
shows  17000 x 400 x 8 bytes = approx 54 Mb, which matches the
size of the single allocation that chokes R below.
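
In code, that workflow is roughly the following sketch (the object
names are placeholders; in the real session `xxx` and the expendable
objects already exist and are large):

```r
## Toy stand-ins so the sketch is runnable on its own.
xxx     <- data.frame(matrix(rnorm(20), nrow = 5))  # the frame to export
big.obj <- rnorm(1e6)                               # something expendable

save.image()          # snapshot the whole workspace to .RData first
rm(big.obj)           # drop large objects other than xxx
gc()                  # let R return the freed memory
write.table(xxx, "xxx.txt", sep = "\t",
            row.names = FALSE, col.names = FALSE, quote = FALSE)
```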

-  tom blackwell  -  u michigan medical school  -  ann arbor  -

On Mon, 8 Sep 2003, array chip wrote:

> I am having trouble exporting a large data frame
> from R for use elsewhere. The data frame is
> numeric, with size 17000 x 400. It also takes quite
> some time to start R. My computer has 1GB RAM. I
> used the following command to write the data frame to
> a text file and got the error message below:
> > write.table(xxx, "C:\\xxx", sep="\t",
> row.names=FALSE,col.names=FALSE,quote=FALSE)
> Error: cannot allocate vector of size 55750 Kb
> In addition: Warning message:
> Reached total allocation of 1023Mb: see
> help(memory.size)
> I tried to increase the memory limit with
> memory.size(size=), but then running the above
> command seems to take forever.
> What can I do about this error message to get the
> data out?
> Thanks
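
An alternative not raised in the thread, offered here only as a
sketch: write the frame out in blocks of rows through an open
connection, so write.table() never has to build the character
representation of all 17000 x 400 cells in one allocation. The
file name and chunk size below are assumptions to tune.

```r
## Chunked export; `xxx` stands in for the large data frame.
xxx <- data.frame(matrix(rnorm(50), nrow = 10))  # small stand-in here

con   <- file("xxx.txt", open = "w")
chunk <- 1000                     # rows per write; tune to available memory
for (i in seq(1, nrow(xxx), by = chunk)) {
  rows <- i:min(i + chunk - 1, nrow(xxx))
  write.table(xxx[rows, , drop = FALSE], con, sep = "\t",
              row.names = FALSE, col.names = FALSE, quote = FALSE)
}
close(con)
```

Each pass only formats `chunk` rows, so peak memory use is bounded
by the chunk size rather than the whole table.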
