[R] Huge memory consumption with foreign and RPgSQL

Perttu Muurimäki Perttu.Muurimaki at Helsinki.Fi
Wed Jan 17 10:06:54 CET 2001


I know this is something R isn't meant to do well, but I tried it anyway :)

I have an SPSS data file (31 MB in size). When I converted it to an R
object with read.spss("datafile.sav") I ended up with a .RData file that
was 229 MB. Is this considered normal?
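
Roughly what I ran (a minimal sketch from memory; the object name and the
explicit size check / save.image() step are approximate, not an exact
transcript):

    library(foreign)                   # provides read.spss()
    dat <- read.spss("datafile.sav")   # the 31 MB SPSS file
    print(object.size(dat))            # in-memory size of the object, in bytes
    save.image()                       # writing the workspace gave a ~229 MB .RData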

Then I tried to dump that object into a PostgreSQL database with the
RPgSQL package's db.write.table(object) function. (Memory ran out the
first time I tried to convert the SPSS file into an R object, so I was
quite prepared for the database manoeuvre: I increased the swap space to
2500 MB, running Linux.) The process kept going and going and getting
bigger and bigger. After 6 hours and 30 minutes I aborted it; by that
time the process had grown to 1400 MB. Again, is this considered normal?
And furthermore, am I likely to succeed if I'm patient enough?
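
The database step, roughly (again only a sketch; the connection argument
below is a placeholder and the exact RPgSQL argument names may not match
what I actually typed):

    library(RPgSQL)
    db.connect(dbname = "spssdata")    # placeholder database name
    db.write.table(dat)                # this is the call that kept growing
    db.disconnect()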

-perttu-

