[R] Execution halts after some time.

Adaikalavan Ramasamy gisar at nus.edu.sg
Fri Mar 14 09:53:52 CET 2003


Dear all,

I fit independent GLMs for a 2x2 factorial problem on a data matrix of
size 9500 x 12 (genes x arrays) and get 9500 observed t-values using the
apply() function. Now I wish to get the permutation p-values, so I
randomly sample the class labels and repeat the GLM fitting to get the
t-values from which I can compute the p-values. This is done using a
for() loop. Is there a more efficient way to do this? Each loop
currently takes approximately 5 minutes.
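For reference, the core of each permutation iteration looks roughly like
this (a simplified sketch, not my actual script; 'dat' is the 9500 x 12
matrix, 'labels' holds the two design factors, and 'B' is the number of
permutations per run):

## one permutation per pass through the loop; B = 100 per chunk at the moment
perm.t <- matrix(NA, nrow = nrow(dat), ncol = B)

for (b in 1:B) {
  idx <- sample(ncol(dat))                 # permute the array (column) labels
  perm.labels <- labels[idx, ]

  ## refit the GLM gene by gene and keep the t-value of the term of interest
  perm.t[, b] <- apply(dat, 1, function(y) {
    fit <- glm(y ~ factorA * factorB, data = perm.labels)
    summary(fit)$coefficients[2, "t value"]
  })
}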

More importantly, I need to repeat this at least 1000 times, which
requires 3-4 days, but the process halts after some time.

To isolate the problem, I rewrote the script as 10 chunks of 100 loops
each. The first 2 chunks run fine and the results are OK, but on the
third (sometimes fourth, fifth or sixth) chunk I get the following
error message:

Error in FUN(newX[, i], ...) : subscript out of bounds
Execution halted

Does R have a "time out" when I use 'R --no-save < script.file' on the
UNIX platform?

I have checked with my system administrator and according to him there
is no upper limit on process time. I have explicitly removed every
unnecessary object at the end of each loop to conserve memory. I have
tried the same on Windows, with different chunk sizes and on different
machines. Sometimes it runs fine to completion, and when it dies the
failure does not appear systematic.
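At the end of each iteration the cleanup is along these lines (object
names are just examples from the sketch above):

rm(idx, perm.labels)   # drop the per-iteration objects
gc()                   # and ask R to release the memory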

Now I am reduced to writing scripts with chunks of 100 loops and then
collecting the chunks that were successful. I have to repeat the 1000
loops for many different experiments, and it is getting very tedious.
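The current workaround looks roughly like this (a sketch; the file names
and the run.permutations() wrapper are just placeholders for the loop
above):

## each script runs one chunk of 100 permutations and saves its result
chunk <- 3                                  # set by hand in each script
perm.t <- run.permutations(B = 100)         # the for() loop sketched earlier
write.table(perm.t, file = paste("perm_chunk_", chunk, ".txt", sep = ""))

## afterwards, read back whichever chunks completed and bind them together
files <- list.files(pattern = "^perm_chunk_.*\\.txt$")
all.t <- do.call("cbind", lapply(files, function(f) as.matrix(read.table(f))))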

If you have any ideas or have had a similar experience, please let me
know. Thank you.


Regards, Adai.


