[R] problem
Philipp Pagel
p.pagel at wzw.tum.de
Wed Mar 5 13:06:37 CET 2008
On Wed, Mar 05, 2008 at 12:32:19PM +0100, Erika Frigo wrote:
> My file has not only more than a million values, but more than a million
> rows and more or less 30 columns (it is a production dataset for cows); in fact,
> with read.table I'm not able to import it.
> It is an xls file.
read.table() expects plain text -- e.g. CSV, or tab-separated values in the
case of read.delim(). If your file is in xls format, the simplest option
would be to export the data to CSV format from Excel.
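For example, after exporting from Excel (the file name "cows.csv" is just a
placeholder; whether Excel writes commas or semicolons depends on your
locale):

  ## comma-separated export
  cows <- read.csv("cows.csv", header = TRUE)
  ## or, for a semicolon-separated export with decimal commas:
  # cows <- read.csv2("cows.csv", header = TRUE)
  str(cows)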
If for some reason that is not an option, please have a look at the "R
Data Import/Export" manual.
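The manual describes a few ways to read xls files directly; one of them is
read.xls() from the gdata package. A rough sketch (assuming gdata is
installed and the data sit on the first worksheet):

  library(gdata)
  cows <- read.xls("cows.xls", sheet = 1)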
Of course, neither will solve the problem of not enough memory if your
file is simply too large. In that case you may want to put your
data into a database and have R connect to it and retrieve the data in
smaller chunks as required.
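Something along these lines would work with SQLite via the DBI and RSQLite
packages (file and table names are made up here; this assumes the data have
already been loaded into a table called "cows"):

  library(RSQLite)
  con <- dbConnect(SQLite(), dbname = "cows.db")
  res <- dbSendQuery(con, "SELECT * FROM cows")
  while (!dbHasCompleted(res)) {
      chunk <- fetch(res, n = 10000)   # retrieve 10,000 rows at a time
      ## ... work on 'chunk' here ...
  }
  dbClearResult(res)
  dbDisconnect(con)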
cu
Philipp
--
Dr. Philipp Pagel Tel. +49-8161-71 2131
Lehrstuhl für Genomorientierte Bioinformatik Fax. +49-8161-71 2186
Technische Universität München
Wissenschaftszentrum Weihenstephan
85350 Freising, Germany
and
Institut für Bioinformatik und Systembiologie / MIPS
Helmholtz Zentrum München -
Deutsches Forschungszentrum für Gesundheit und Umwelt
Ingolstädter Landstrasse 1
85764 Neuherberg, Germany
http://mips.gsf.de/staff/pagel