[R] parallel execution in R
Uwe Ligges
ligges at statistik.tu-dortmund.de
Tue Feb 26 15:40:19 CET 2013
On 26.02.2013 14:00, Alaios wrote:
> Dear all,
> I have a piece of code that I want to run in parallel (I am working on a system with 16 cores)
>
>
> foreach(i = seq(-93, -73, length.out = 21)) %dopar% {
>     threshold <- i
>     print(i)
>     do_analysis1(i, path)
>     do_analysis2(i, path)
>     do_something_else_analysis1(i, path)
>     something_else_now(i, path)
> }
We do not know how your cluster / parallel backend was set up, hence cannot tell why only one core is used.
I'd just use the parallel package (part of base R) and do:
library("parallel")
cl <- makeCluster(.....)
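## e.g. makeCluster(16) to match your 16 cores; with the default PSOCK
## cluster the workers also need the objects used below, e.g. (untested):
## clusterExport(cl, c("path", "do_analysis1", "do_analysis2",
##                     "do_something_else_analysis1", "something_else_now"))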
result <- parSapply(cl, seq(-93, -73, length.out = 21), function(i) {
    threshold <- i
    print(i)
    do_analysis1(i, path)
    do_analysis2(i, path)
    do_something_else_analysis1(i, path)
    something_else_now(i, path)
})
stopCluster(cl)
(untested, of course)
Uwe Ligges
>
> as you can see I have already tried to make this run in parallel, meaning that for every i value each of the 16 processors should take a block of the body such as:
>
> threshold <- i
> print(i)
> do_analysis1(i, path)
> do_analysis2(i, path)
> do_something_else_analysis1(i, path)
> something_else_now(i, path)
>
> and execute it. Unfortunately this does not work and only one processor looks utilized.
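A minimal sketch of one likely fix, assuming the doParallel package is installed: %dopar% falls back to sequential execution (with a warning) unless a parallel backend has been registered first.

library(foreach)
library(doParallel)

## register a backend with 16 workers, matching the 16 cores mentioned above
registerDoParallel(cores = 16)

## now %dopar% distributes the iterations instead of running them sequentially
foreach(i = seq(-93, -73, length.out = 21)) %dopar% {
    threshold <- i
    do_analysis1(i, path)
    do_analysis2(i, path)
    do_something_else_analysis1(i, path)
    something_else_now(i, path)
}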
>
> Alternatively, mclapply has worked well for me in the past, but in this case I am not sure how to convert the serial execution of the loop body into a form compatible with mclapply.
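For comparison, a minimal mclapply sketch, assuming path and the analysis functions are defined in the current session (mclapply forks the R process, so it does not parallelize on Windows):

library(parallel)

results <- mclapply(seq(-93, -73, length.out = 21), function(i) {
    threshold <- i
    do_analysis1(i, path)
    do_analysis2(i, path)
    do_something_else_analysis1(i, path)
    something_else_now(i, path)
}, mc.cores = 16)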
>
> I would like to thank you in advance for your help
>
> Regards
> Alex
>