[R] (nlme, lme, glmmML, or glmmPQL) mixed-effect models with large spatial data sets
Seth
sjmyers at syr.edu
Sat Jan 23 06:53:09 CET 2010
Hi,
I have a spatial data set with many observations (~50,000) and would like
to keep as much of the data as possible. There is spatial dependence, so
I am attempting a mixed model in R with a spherical variogram defining the
correlation as a function of the distance between points. I have tried
nlme, lme, glmmML, and glmmPQL. In every case the matrix needed (apparently
(N^2 - N)/2 unique entries; a dense 50,000 x 50,000 double matrix alone
would take 50000^2 * 8 bytes, roughly 20 GB) is too large for my machine
to handle even when maxed out (memory.limit() and virtual memory on Vista).
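
For concreteness, here is a minimal sketch of the kind of call I have
been attempting (glmmPQL and corSpher are real functions from MASS and
nlme; mydata, the response y01, covariates x1/x2, coordinates x/y, the
grouping factor g, and the variogram parameters are placeholders for my
actual data):

    library(MASS)   # glmmPQL
    library(nlme)   # corSpher

    ## Spatial logistic mixed model: spherical correlation on distance,
    ## with range and nugget taken from my fitted variogram (placeholders).
    fit <- glmmPQL(y01 ~ x1 + x2,
                   random      = ~ 1 | g,
                   family      = binomial,
                   data        = mydata,
                   correlation = corSpher(value  = c(1000, 0.1),
                                          form   = ~ x + y | g,
                                          nugget = TRUE))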
Past the range of my variogram (which I have a good estimate of), the
matrix that R is trying to allocate will contain zeros (I believe), so it
seems wasteful to allocate the full dense matrix. Is there a way to 'trim'
it so that the matrix size (and hopefully the processing time) is
decreased?
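
To illustrate what I mean: under a spherical model the correlation is
exactly zero beyond the range, so most of the N x N matrix need not be
stored. A sketch with the Matrix package (coords and rng are placeholders;
note that dist() itself still builds the dense matrix, so this only shows
the structure and would need a neighbour search to be practical at
N = 50,000):

    library(Matrix)

    ## Spherical correlation: 1 - 1.5*(h/a) + 0.5*(h/a)^3 within the
    ## range a, exactly 0 beyond it.
    sph_corr <- function(coords, rng) {
      d <- as.matrix(dist(coords))      # dense here; sketch only
      r <- ifelse(d < rng,
                  1 - 1.5 * (d / rng) + 0.5 * (d / rng)^3,
                  0)
      Matrix(r, sparse = TRUE)          # zeros beyond range not stored
    }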
Further, it seems the matrix is being filled with double-precision values.
Is there a way to lessen the precision and so save memory?
If I do find a way (though I will probably be forced to decrease N): for a
logistic regression, which of the functions I mentioned is likely to
execute more quickly with the usual settings and output? I'm asking for a
rough idea in advance because of processing-time limits.
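
To get a rough answer myself, my plan is to time each candidate on a small
random subsample first (a sketch; the data, formula, and subsample size
are placeholders):

    ## Rough timing comparison on a 2,000-row subsample before
    ## committing to the full data set.
    sub <- mydata[sample(nrow(mydata), 2000), ]
    system.time(
      glmmPQL(y01 ~ x1 + x2, random = ~ 1 | g,
              family = binomial, data = sub,
              correlation = corSpher(form = ~ x + y | g))
    )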
I believe glmmPQL will likely be slower due to its repeated calls to lme.
Thanks for any advice/insight. -seth