[R] sqldf for Very Large Tab Delimited Files
HC
hcatbr at yahoo.co.in
Fri Feb 3 21:22:33 CET 2012
Bad news!
The readLines command works fine up to a certain limit. Once a few files have
been written, the R program crashes.
I used the following code:
*************************
iFile <- "Test.txt"
con <- file(iFile, "r")          # one read connection kept open for the whole file
N <- 1250000                     # lines per output chunk
iLoop <- 1
while (length(Lines <- readLines(con, n = N)) > 0 && iLoop < 41) {
  oFile <- paste("Split_", iLoop, ".txt", sep = "")
  # Lines is a plain character vector, written out as a single column
  write.table(Lines, oFile, sep = "\t", quote = FALSE,
              col.names = FALSE, row.names = FALSE)
  iLoop <- iLoop + 1
}
close(con)
********************
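Would switching to writeLines() make any difference? Since each chunk is
already a character vector, something like the sketch below (which I have not
yet run on the full file) should avoid the write.table() formatting step and
release each chunk before the next read:
*************************
# Same chunked split, but with writeLines() instead of write.table();
# the chunk is already a character vector, so no table formatting is needed.
con <- file("Test.txt", "r")
N <- 1250000
iLoop <- 1
while (length(Lines <- readLines(con, n = N)) > 0 && iLoop < 41) {
  out <- file(paste("Split_", iLoop, ".txt", sep = ""), "w")
  writeLines(Lines, out)
  close(out)
  rm(Lines); gc()   # drop the chunk and collect before reading the next one
  iLoop <- iLoop + 1
}
close(con)
*************************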
With N = 1.25 million as above, it wrote 28 files of about 57 MB each, i.e., a
total of about 1.6 GB, and then crashed.
I tried other values of N, and it crashes at about the same point in terms of
total output size, i.e., about 1.6 GB.
Is this due to some limitation of Windows 7, such as the file pointer no
longer being maintained beyond this size?
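Or, in case it is memory rather than the file pointer, would a quick check of
the memory ceiling before and after running the loop tell me anything? For
example (memory.size() and memory.limit() are Windows-only):
*************************
# Rough check of the Windows memory ceiling (Windows-only functions)
cat("Currently used:", round(memory.size(), 1), "MB\n")
cat("Reported limit:", memory.limit(), "MB\n")
*************************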
Your insight would be very helpful.
Thank you.
HC