[R] Suggestions for poor man's parallel processing
David Kane <a296180 at mica.fmr.com>
Wed May 8 14:45:47 CEST 2002
Almost all of the heavy crunching I do in R is like:
> for(i in long.list){
+ do.something(i)
+ }
> collect.results()
Since all the invocations of do.something are independent of one another, there
is no reason that I can't run them in parallel. Since my machine has four
processors, a natural way to do this is to divide up long.list into 4 pieces
and then start 4 jobs, each of which would process 1/4 of the items. I could
then wait for the four jobs to finish (waiting for tag files and the like),
collect the results, and go on my happy way. I might do this all within R
(using system calls to fork off other R processes?) or by using Perl as a
wrapper.
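A hedged sketch of the "fork off other R processes" idea, entirely within R: split long.list into one chunk per processor, write each chunk to a file, launch a background worker per chunk via system(), and poll for tag files. "long.list", "do.something", and "collect.results" are the names from the loop above; the worker script "worker.R", the "JOB" environment variable, and the tag-file naming are assumptions of this sketch, not a tested recipe.

```r
## Split the work into one chunk per processor.  The worker script
## (hypothetical "worker.R") would read JOB from Sys.getenv(), load()
## its chunk file, run do.something() over it, save its results, and
## create a "doneN" tag file when finished.
n.jobs <- 4
chunks <- split(long.list, rep(1:n.jobs, length.out = length(long.list)))

for (i in 1:n.jobs) {
    chunk <- chunks[[i]]
    save(chunk, file = paste("chunk", i, ".RData", sep = ""))
    ## "JOB=i" is a shell environment-variable prefix telling the
    ## worker which chunk is its own; the trailing "&" backgrounds it.
    system(paste("JOB=", i, " R CMD BATCH --no-save worker.R worker",
                 i, ".Rout &", sep = ""))
}

## Wait for all the tag files, then gather the pieces.
while (!all(file.exists(paste("done", 1:n.jobs, sep = ""))))
    Sys.sleep(30)
collect.results()
```

The same dispatch loop could live in a Perl or shell wrapper instead; the only R-specific parts are save()/load() for moving chunks between processes.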
But surely others have faced and solved this problem already! I do not
*think* that I want to go into the details of RPVM since my needs are so
limited. Does anyone have any advice for me? Various postings to R-help have
hinted at ideas, but I couldn't find anything definitive. I will summarize for
the list.
To the extent that it matters:
> R.version
         _
platform sparc-sun-solaris2.6
arch     sparc
os       solaris2.6
system   sparc, solaris2.6
status
major    1
minor    5.0
year     2002
month    04
day      29
language R
Regards,
Dave Kane
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !) To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._