[R] VAR (pckg: vars) and memory problem
herrdittmann at yahoo.co.uk
Sat Aug 15 15:46:56 CEST 2009
Hi all,
When I tried to estimate a VAR (package vars) with 5 lags on a rather large dataset:
> dim(trial.var)
[1] 20388 2
I ran into memory troubles:
> summary(VAR(trial.var, type="none", p=5))
Error: cannot allocate vector of size 3.1 Gb
In addition: Warning messages:
1: In diag(resids %*% solve(Sigma) %*% t(resids)) :
Reached total allocation of 1535Mb: see help(memory.size)
2: In diag(resids %*% solve(Sigma) %*% t(resids)) :
Reached total allocation of 1535Mb: see help(memory.size)
3: In diag(resids %*% solve(Sigma) %*% t(resids)) :
Reached total allocation of 1535Mb: see help(memory.size)
4: In diag(resids %*% solve(Sigma) %*% t(resids)) :
Reached total allocation of 1535Mb: see help(memory.size)
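For what it's worth, the failing call in the warnings explains the 3.1 Gb figure: diag(resids %*% solve(Sigma) %*% t(resids)) materialises the full T-by-T product (with T = 20388 - 5 = 20383 residual rows, that is 20383^2 doubles, about 3.1 Gb) only to keep its diagonal. The same diagonal can be computed row by row without ever forming the big matrix. A base-R sketch of the identity, using small stand-in matrices in place of the real resids and Sigma:

```r
# diag(A %*% B %*% t(A)) equals rowSums((A %*% B) * A):
# each diagonal entry is the inner product of row i of A %*% B with row i of A,
# so only a T-by-K intermediate is needed, never the T-by-T matrix.
set.seed(1)
resids <- matrix(rnorm(12), nrow = 6, ncol = 2)            # stand-in residuals
Sigma  <- crossprod(matrix(rnorm(4), 2, 2)) + diag(2)      # stand-in covariance

slow <- diag(resids %*% solve(Sigma) %*% t(resids))        # forms 6 x 6 here,
                                                           # 20383 x 20383 in the real case
fast <- rowSums((resids %*% solve(Sigma)) * resids)        # forms only 6 x 2

all.equal(slow, fast)
```

This does not change what summary.VAR() itself does internally, but it shows the computation is feasible in principle on far less memory.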
Luckily, I was able to slice the dataset into individual days of ca. 3000 rows each and estimate a VAR on each subset.
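The per-day workaround can be sketched with base R's split() plus lapply(); the day index below is a hypothetical stand-in for real timestamps, and VAR() is from the vars package:

```r
# Stand-in for the poster's 20388 x 2 series; replace with the real data.
trial.var <- matrix(rnorm(2 * 20388), ncol = 2)

# Hypothetical day labels giving chunks of ca. 3000 rows each.
day <- rep(seq_len(ceiling(20388 / 3000)), each = 3000)[seq_len(20388)]

# Split into daily data frames and fit a VAR on each chunk.
chunks <- split(as.data.frame(trial.var), day)
# fits <- lapply(chunks, function(d) vars::VAR(d, type = "none", p = 5))
sapply(chunks, nrow)   # ca. 3000 rows per day
```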
Now, I nonetheless would like to run the VAR over the whole set.
Is there any way I can increase the memory available to R, perhaps by forcing it? I am running R on an XP box with 1 GB of RAM.
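On Windows the allocation cap the warning mentions can be inspected and raised with memory.limit() (Windows-only; help(memory.size), which the warning points to, documents both). A sketch, noting that a 32-bit process tops out around 2-3 Gb no matter how much RAM or swap is present:

```r
# Windows-only: query and raise R's allocation cap (values in Mb).
memory.limit()             # current cap, e.g. 1535
memory.limit(size = 2047)  # request roughly the 32-bit XP per-process maximum
```

Even at that ceiling a single 3.1 Gb vector will not fit in a 32-bit session, so chunking (or the row-wise computation) remains necessary for the full set.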
Many thanks for any pointers.
Bernd
------------------
More information about the R-help mailing list