[R] How to more efficiently read in a big matrix

Gabor Grothendieck ggrothendieck at gmail.com
Sat Nov 10 05:47:21 CET 2007

1. You might be able to speed it up somewhat by specifying colClasses
in your read.table() call, so R does not have to guess the type of
each column.
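
As a sketch (the file name is an assumption; the dimensions come from
the original question -- row names plus 487 numeric columns):

```r
# Pre-declare column types so read.table() skips type inference;
# nrows and comment.char = "" also avoid unnecessary work.
cc  <- c("character", rep("numeric", 487))  # row-name column + 487 data columns
dat <- read.table("bigtable.txt", header = TRUE, row.names = 1,
                  colClasses = cc, nrows = 238305,
                  comment.char = "")
```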

2. Another possibility is the devel version of the sqldf package, which
provides an interface that simplifies reading a data file into SQLite
and from there into R.  This is particularly useful if you
don't want to read it all in.  See example 6 on the sqldf home page.
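
A rough sketch of that approach, assuming the devel sqldf is installed
and the data is in a delimited file (names are illustrative):

```r
library(sqldf)

# read.csv.sql() loads the file into a temporary SQLite database
# and returns the result of an SQL query as a data frame; the file
# is referred to as "file" in the query.
dat <- read.csv.sql("bigtable.csv")

# Or pull in only a subset, without ever reading the whole file into R:
sub <- read.csv.sql("bigtable.csv",
                    sql = "select * from file limit 1000")
```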

3. If the file doesn't change and it's OK to read it in slowly once, then
read it in slowly and save() the result.  You can then load()
it on subsequent runs, which should be fast.
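
That pattern might look like this (file names are illustrative):

```r
# Pay the slow read.table() cost once; later runs load the
# binary .RData file, which is much faster to restore.
if (!file.exists("bigtable.RData")) {
  dat <- read.table("bigtable.txt", header = TRUE, row.names = 1)
  save(dat, file = "bigtable.RData")
} else {
  load("bigtable.RData")   # restores 'dat' into the workspace
}
```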

On Nov 9, 2007 11:39 PM, affy snp <affysnp at gmail.com> wrote:
> Dear list,
> I need to read in a big table with 487 columns and 238,305 rows (row names
> and column names are supplied). Is there a way to read in the table
> quickly? I tried read.table() but it seems to take forever :(
> Thanks a lot!
> Best,
>    Allen
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
