[R] After writing data in MMF using SEXP structure, can I reference it in R?
나여나
dllmain at hanmail.net
Mon Jul 26 04:35:40 CEST 2010
Hi all,
After writing data into an MMF (memory-mapped file) using the SEXP structure, can I
reference it in R?
And if the input data are larger than 2GB, can I reference the MMF data in R?
my work environment :
R version : 2.11.1
OS : WinXP Pro sp3
Thanks and best regards.
Park, Young-Ju
from Korea.
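Not a direct answer to the SEXP/MMF part of the question, but for the larger-than-2GB part a hedged sketch of one common workaround: the 'bigmemory' package (the 'ff' package is a similar option) creates a file-backed matrix that is memory-mapped on demand, so a dataset larger than RAM or the 32-bit 2GB limit can still be indexed from R. The file names below are invented for illustration.

```r
# Sketch, assuming the data can be (re)written in bigmemory's own
# file layout rather than read from an arbitrary SEXP dump.
library(bigmemory)
x <- filebacked.big.matrix(nrow = 1000, ncol = 10, type = "double",
                           backingfile = "mmf_demo.bin",
                           descriptorfile = "mmf_demo.desc")
x[1, 1] <- 3.14   # writes go straight to the mapped backing file
x[1, 1]           # reads come back through the same mapping
```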
---------[ Original message ]----------
Subject : R-help Digest, Vol 89, Issue 25
Date : Sunday, 25 Jul 2010, 19:00:07 +0900
From : r-help-request at r-project.org
To : r-help at r-project.org
Send R-help mailing list submissions to
r-help at r-project.org
To subscribe or unsubscribe via the World Wide Web, visit
https://stat.ethz.ch/mailman/listinfo/r-help
or, via email, send a message with subject or body 'help' to
r-help-request at r-project.org
You can reach the person managing the list at
r-help-owner at r-project.org
When replying, please edit your Subject line so it is more specific
than "Re: Contents of R-help digest..."
Today's Topics:
1. Re: how to calculate the product of every two elements in two
vectors (Dennis Murphy)
2. Re: Question regarding panel data diagnostic
(Setlhare Lekgatlhamang)
3. Re: Question regarding panel data diagnostic
(Setlhare Lekgatlhamang)
4. Re: Question regarding panel data diagnostic
(Setlhare Lekgatlhamang)
5. Re: Trouble retrieving the second largest value from each row
of a data.frame (David Winsemius)
6. local polynomial with different kernel functions
(assaedi76 assaedi76)
7. Weights in mixed models (David R.)
8. Re: Odp: Help me with prediction in linear model
(Research student)
9. Re: union data in column (Hadley Wickham)
10. Re: UseR! 2010 - my impressions (Frank E Harrell Jr)
11. Re: , Updating Table (Charles C. Berry)
12. Re: , Updating Table (Duncan Murdoch)
13. Re: glm - prediction of a factor with several levels (Ben Bolker)
14. Doubt about a population competition function
(Bruno Bastos Gonçalves)
15. Doubt about a population competition function (Gmail)
16. Book on R's Programming Language (Matt Stati)
17. Re: how to calculate the product of every two elements in two
vectors (Henrique Dallazuanna)
18. Re: Book on R's Programming Language (Joshua Wiley)
19. Re: how to calculate the product of every two elements in two
vectors (Gabor Grothendieck)
20. Re: Book on R's Programming Language (Joseph Magagnoli)
21. Re: Constrain density to 0 at 0? (Greg Snow)
22. matest function for multiple factors (shabnam k)
23. Re: How to deal with more than 6GB dataset using R? (Greg Snow)
24. Using R to fill ETM+ data gaps? (Abdi, Abdulhakim)
25. How to generate a sequence of dates without hardcoding the
year (Felipe Carrillo)
26. Re: How to generate a sequence of dates without hardcoding
the year (Henrique Dallazuanna)
27. Re: Trouble retrieving the second largest value from each row
of a data.frame (mpward at illinois.edu)
28. Re: (no subject) (Paul Smith)
29. Re: How to generate a sequence of dates without hardcoding
the year (jim holtman)
30. Re: glm - prediction of a factor with several levels (zachmohr)
31. Re: Trouble retrieving the second largest value from each row
of a data.frame (David Winsemius)
32. Re: Trouble retrieving the second largest value from each row
of a data.frame (Joshua Wiley)
33. Re: Trouble retrieving the second largest value from each row
of a data.frame (David Winsemius)
34. Re: Trouble retrieving the second largest value from each row
of a data.frame (David Winsemius)
35. c-statistics 95% CI for cox regression model (paaventhan jeyaganth)
36. Re: UseR! 2010 - my impressions (Dirk Eddelbuettel)
37. Re: c-statistics 95% CI for cox regression model
(Frank E Harrell Jr)
38. Equivalent to go-to statement (Michael Haenlein)
39. Outlier Problem in Survreg Function (Vipul Agarwal)
40. Re: Equivalent to go-to statement (Gabor Grothendieck)
----------------------------------------------------------------------
Message: 1
Date: Sat, 24 Jul 2010 03:02:47 -0700
From: Dennis Murphy <djmuser at gmail.com>
To: aegea <gcheer3 at gmail.com>
Cc: r-help at r-project.org
Subject: Re: [R] how to calculate the product of every two elements in
two vectors
Message-ID:
<AANLkTinUJjoCiG47ptPq5Eo_fXYZXuQbUzB=+KLCO6QX at mail.gmail.com>
Content-Type: text/plain
as.vector(t(outer(A, B)))
[1] 9 10 11 12 18 20 22 24 27 30 33 36
HTH,
Dennis
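Dennis's one-liner, unpacked on the poster's own vectors as a minimal sketch:

```r
A <- c(1, 2, 3)
B <- c(9, 10, 11, 12)
# outer(A, B) is the 3 x 4 matrix with entry [i, j] = A[i] * B[j];
# transposing before flattening makes B vary fastest, giving the
# requested order 1*9, 1*10, ..., 3*11, 3*12.
C <- as.vector(t(outer(A, B)))
C  # 9 10 11 12 18 20 22 24 27 30 33 36
```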
On Fri, Jul 23, 2010 at 8:11 AM, aegea <gcheer3 at gmail.com> wrote:
>
> Thanks in advance!
>
> A=c(1, 2, 3)
> B=c (9, 10, 11, 12)
>
> I want to get C=c(1*9, 1*10, 1*11, 1*12, ....., 3*9, 3*10, 3*11, 3*12)?
> C is still a vector with 12 elements
> Is there a way to do that?
> --
> View this message in context:
>
http://r.789695.n4.nabble.com/how-to-calculate-the-product-of-every-two-elements-in-two-vectors-tp2300299p2300299.html
> Sent from the R help mailing list archive at Nabble.com.
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
[[alternative HTML version deleted]]
------------------------------
Message: 2
Date: Sat, 24 Jul 2010 12:54:11 +0200
From: "Setlhare Lekgatlhamang" <SetlhareL at bob.bw>
To: "amatoallah ouchen" <at.ouchen at gmail.com>,
<r-help at r-project.org>
Subject: Re: [R] Question regarding panel data diagnostic
Message-ID:
<25D1D72D6E19D144AB813C9C582E16CF03F7EA27 at BOB-EXCHANGE.bob.bw>
Content-Type: text/plain; charset="iso-8859-1"
My thought is this:
It depends on what you have in the panel. Are your data cross-section data
observed over ten years for, say, 3 countries (or regions within the same
country)? If so, yes, you can test the integration properties (what people
usually call a unit-root test) and then test for cointegration. But if the
data are quarterly or monthly, these techniques are not relevant.
Hope this helps.
Lexi
-----Original Message-----
From: r-help-bounces at r-project.org
[mailto:r-help-bounces at r-project.org] On Behalf Of amatoallah ouchen
Sent: Friday, July 23, 2010 12:18 AM
To: r-help at r-project.org
Subject: [R] Question regarding panel data diagnostic
Good day R-listers,
I'm currently working on a panel data analysis (N=17, T=5). In order
to check for the spurious regression problem, I have to test for
stationarity, but I've read somewhere that I needn't test for it since
my T<10. What do you think? If so, is there another test I should
perform in such a case (a kind of cointegration test for small T)?
Any hint would be highly appreciated.
Ama.
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/
______________________________________________
R-help at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
DISCLAIMER:\ Sample Disclaimer added in a VBScript.\ ...{{dropped:3}}
------------------------------
Message: 3
Date: Sat, 24 Jul 2010 13:00:05 +0200
From: "Setlhare Lekgatlhamang" <SetlhareL at bob.bw>
To: "Setlhare Lekgatlhamang" <SetlhareL at bob.bw>, "amatoallah ouchen"
<at.ouchen at gmail.com>, <r-help at r-project.org>
Subject: Re: [R] Question regarding panel data diagnostic
Message-ID:
<25D1D72D6E19D144AB813C9C582E16CF03F7EA28 at BOB-EXCHANGE.bob.bw>
Content-Type: text/plain; charset="us-ascii"
Let me correct an omission in my response below. The last sentence
should read "But if the data are 10 quarterly or monthly values, these
techniques are not relevant".
Cheers
Lexi
-----Original Message-----
From: r-help-bounces at r-project.org
[mailto:r-help-bounces at r-project.org]
On Behalf Of Setlhare Lekgatlhamang
Sent: Saturday, July 24, 2010 12:54 PM
To: amatoallah ouchen; r-help at r-project.org
Subject: Re: [R] Question regarding panel data diagnostic
My thought is this:
It depends on what you have in the panel. Are your data cross-section
data observed over ten years for, say, 3 countries (or regions within
the same country)? If so, yes you can perform integration properties
(what people usually call unit root test) and then test for
cointegration. But if the data are quarterly or monthly, these
techniques are not relevant.
Hope this helps.
Lexi
-----Original Message-----
From: r-help-bounces at r-project.org
[mailto:r-help-bounces at r-project.org]
On Behalf Of amatoallah ouchen
Sent: Friday, July 23, 2010 12:18 AM
To: r-help at r-project.org
Subject: [R] Question regarding panel data diagnostic
Good day R-listers,
I'm currently working on a panel data analysis (N=17, T=5), in order
to check for the spurious regression problem, i have to test for
stationarity but i've read somewhere that i needn't to test for it as
my T<10 , what do you think? if yes is there any other test i have
to perform in such case (a kind of cointegration test for small T?)
Any hint would be highly appreciated.
Ama.
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/
______________________________________________
R-help at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
DISCLAIMER:\ Sample Disclaimer added in a VBScript.\ ......{{dropped:14}}
------------------------------
Message: 4
Date: Sat, 24 Jul 2010 13:01:24 +0200
From: "Setlhare Lekgatlhamang" <SetlhareL at bob.bw>
To: "amatoallah ouchen" <at.ouchen at gmail.com>,
<r-help at r-project.org>
Subject: Re: [R] Question regarding panel data diagnostic
Message-ID:
<25D1D72D6E19D144AB813C9C582E16CF03F7EA29 at BOB-EXCHANGE.bob.bw>
Content-Type: text/plain; charset="us-ascii"
Let me correct an omission in my response below. The last sentence
should read "But if the data are 10 quarterly or monthly values, these
techniques are not relevant".
Cheers
Lexi
-----Original Message-----
From: r-help-bounces at r-project.org
[mailto:r-help-bounces at r-project.org]
On Behalf Of Setlhare Lekgatlhamang
Sent: Saturday, July 24, 2010 12:54 PM
To: amatoallah ouchen; r-help at r-project.org
Subject: Re: [R] Question regarding panel data diagnostic
My thought is this:
It depends on what you have in the panel. Are your data cross-section
data observed over ten years for, say, 3 countries (or regions within
the same country)? If so, yes you can perform integration properties
(what people usually call unit root test) and then test for
cointegration. But if the data are quarterly or monthly, these
techniques are not relevant.
Hope this helps.
Lexi
-----Original Message-----
From: r-help-bounces at r-project.org
[mailto:r-help-bounces at r-project.org]
On Behalf Of amatoallah ouchen
Sent: Friday, July 23, 2010 12:18 AM
To: r-help at r-project.org
Subject: [R] Question regarding panel data diagnostic
Good day R-listers,
I'm currently working on a panel data analysis (N=17, T=5), in order
to check for the spurious regression problem, i have to test for
stationarity but i've read somewhere that i needn't to test for it as
my T<10 , what do you think? if yes is there any other test i have
to perform in such case (a kind of cointegration test for small T?)
Any hint would be highly appreciated.
Ama.
*
* For searches and help try:
* http://www.stata.com/help.cgi?search
* http://www.stata.com/support/statalist/faq
* http://www.ats.ucla.edu/stat/stata/
______________________________________________
R-help at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
DISCLAIMER:\ Sample Disclaimer added in a VBScript.\ ......{{dropped:14}}
------------------------------
Message: 5
Date: Sat, 24 Jul 2010 08:40:05 -0400
From: David Winsemius <dwinsemius at comcast.net>
To: <mpward at illinois.edu>
Cc: r-help at r-project.org
Subject: Re: [R] Trouble retrieving the second largest value from each
row of a data.frame
Message-ID: <D09340C5-3B64-47FA-A168-8EA347F79747 at comcast.net>
Content-Type: text/plain; charset=US-ASCII; format=flowed; delsp=yes
On Jul 23, 2010, at 9:20 PM, <mpward at illinois.edu> wrote:
> I have a data frame with a couple million lines and want to retrieve
> the largest and second largest values in each row, along with the
> label of the column these values are in. For example
>
> row 1
> strongest=-11072
> secondstrongest=-11707
> strongestantenna=value120
> secondstrongantenna=value60
>
> Below is the code I am using and a truncated data.frame. Retrieving
> the largest value was easy, but I have been getting errors every way
> I have tried to retrieve the second largest value. I have not even
> tried to retrieve the labels for the value yet.
>
> Any help would be appreciated
> Mike
Using Holtman's extract of your data with x as the name and the order
function to generate an index to names of the dataframe:
> t(apply(x, 1, sort, decreasing=TRUE)[1:3, ])
     [,1]   [,2]   [,3]
1 -11072 -11707 -12471
2 -11176 -11799 -12838
3 -11113 -11778 -12439
4 -11071 -11561 -11653
5 -11067 -11638 -12834
6 -11068 -11698 -12430
7 -11092 -11607 -11709
8 -11061 -11426 -11665
9 -11137 -11736 -12570
10 -11146 -11779 -12537
Putting it all together:
matrix( paste( t(apply(x, 1, sort, decreasing=TRUE)[1:3, ]),
names(x)[ t(apply(x, 1, order, decreasing=TRUE)
[1:3, ]) ]),
ncol=3)
      [,1]              [,2]              [,3]
 [1,] "-11072 value120" "-11707 value60"  "-12471 value180"
 [2,] "-11176 value120" "-11799 value180" "-12838 value0"
 [3,] "-11113 value120" "-11778 value60"  "-12439 value180"
 [4,] "-11071 value120" "-11561 value240" "-11653 value60"
 [5,] "-11067 value120" "-11638 value180" "-12834 value0"
 [6,] "-11068 value0"   "-11698 value60"  "-12430 value120"
 [7,] "-11092 value120" "-11607 value240" "-11709 value180"
 [8,] "-11061 value120" "-11426 value240" "-11665 value60"
 [9,] "-11137 value120" "-11736 value60"  "-12570 value180"
[10,] "-11146 value300" "-11779 value0"   "-12537 value180"
--
David.
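A minimal, self-contained sketch of the same idea on a two-row slice of the data (values copied from the thread): sort() gives the top values, order() gives the column positions, so the same apply() call run twice recovers both the values and their antenna labels.

```r
x <- data.frame(value0   = c(-13007, -12838),
                value60  = c(-11707, -13210),
                value120 = c(-11072, -11176))
top2    <- t(apply(x, 1, sort,  decreasing = TRUE))[, 1:2]
which2  <- t(apply(x, 1, order, decreasing = TRUE))[, 1:2]
labels2 <- matrix(names(x)[which2], ncol = 2)
top2[1, ]     # -11072 -11707
labels2[1, ]  # "value120" "value60"
```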
>
>
>> data<-data.frame(value0, value60, value120, value180, value240, value300)
>> data
> value0 value60 value120 value180 value240 value300
> 1 -13007 -11707 -11072 -12471 -12838 -13357
> 2 -12838 -13210 -11176 -11799 -13210 -13845
> 3 -12880 -11778 -11113 -12439 -13089 -13880
> 4 -12805 -11653 -11071 -12385 -11561 -13317
> 5 -12834 -13527 -11067 -11638 -13527 -13873
> 6 -11068 -11698 -12430 -12430 -12430 -12814
> 7 -12807 -14068 -11092 -11709 -11607 -13025
> 8 -12770 -11665 -11061 -12373 -11426 -12805
> 9 -12988 -11736 -11137 -12570 -13467 -13739
> 10 -11779 -12873 -12973 -12537 -12973 -11146
>> #largest value in the row
>> strongest<-apply(data, 1, max)
>>
>>
>> #second largest value in the row
>> n<-function(data)(1/(min(1/(data[1, ]-max(data[1, ]))))+(max(data[1, ])))
>> secondstrongest<-apply(data, 1, n)
> Error in data[1, ] : incorrect number of dimensions
>>
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
------------------------------
Message: 6
Date: Sat, 24 Jul 2010 04:25:10 -0700 (PDT)
From: assaedi76 assaedi76 <assaedi76 at yahoo.com>
To: r-help at r-project.org
Subject: [R] local polynomial with different kernel functions
Message-ID: <853644.1608.qm at web45210.mail.sp1.yahoo.com>
Content-Type: text/plain
Hi, R users
I need to use the function (locpoly) to fit a local polynomial
regression model. The default kernel function is "normal", but I need
to use different kernel functions such as: uniform, triangular,
Epanechnikov, ...
Could someone help me define these functions to fit a local polynomial
regression model?
Email: assaedi76 at yahoo.com
Thanks a lot
[[alternative HTML version deleted]]
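As far as I know, KernSmooth's locpoly() is hard-coded to the Gaussian kernel, so one hedged workaround is to fit the local polynomial by hand: at each grid point, run weighted least squares with whatever kernel you like. The function names 'epanechnikov' and 'local_poly' below are invented for illustration, not part of any package.

```r
epanechnikov <- function(u) ifelse(abs(u) <= 1, 0.75 * (1 - u^2), 0)

local_poly <- function(x, y, grid, h, degree = 1, kernel = epanechnikov) {
  sapply(grid, function(x0) {
    w <- kernel((x - x0) / h)            # kernel weights around x0
    if (sum(w > 0) <= degree) return(NA) # not enough local data
    fit <- lm(y ~ poly(x - x0, degree, raw = TRUE), weights = w)
    unname(coef(fit)[1])                 # intercept = fitted value at x0
  })
}

set.seed(1)
x <- sort(runif(200))
y <- sin(2 * pi * x) + rnorm(200, sd = 0.2)
est <- local_poly(x, y, grid = seq(0.1, 0.9, by = 0.1), h = 0.1)
```

Swapping in a uniform or triangular kernel is then just a matter of passing a different one-line weight function.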
------------------------------
Message: 7
Date: Sat, 24 Jul 2010 04:14:44 -0700 (PDT)
From: "David R." <drbn at yahoo.com>
To: r-help at r-project.org
Subject: [R] Weights in mixed models
Message-ID: <217686.73973.qm at web113215.mail.gq1.yahoo.com>
Content-Type: text/plain; charset=iso-8859-1
Hello everyone,
I wonder if sample size can be used as the weight in a weighted mixed
model, or should I use just the inverse of the variance?
For example, 'lmer' in the 'lme4' package has a 'weights' option (as in
'lm' from 'stats'). In the lme4 help there is an example using herd
size as a weight in a mixed model.
So, if I have measurement data (e.g. height) for 10 groups (sample
sizes ranging from 30 to 3000 individuals per group), can I use this
number (N, the sample size) in the 'weights' option? Is this wrong?
Finally, what should I do if the results (coefficients) of weighting by
the inverse of the variance and by sample size are very different, even
opposite?
Thank you very much in advance
David
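Not an answer to the variance question, but a runnable sketch of the 'weights' mechanics being asked about (data and variable names invented): in lm/lmer the weights scale the residual precision, so a row that summarizes a larger sample is treated as more informative.

```r
library(lme4)
set.seed(42)
d <- data.frame(height = rnorm(20, mean = 170, sd = 5),
                group  = factor(rep(1:10, each = 2)),
                N      = rep(c(30, 60, 120, 500, 3000), 4))
# Residual variance for each row is assumed proportional to 1/N.
fit <- lmer(height ~ 1 + (1 | group), data = d, weights = N)
```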
------------------------------
Message: 8
Date: Sat, 24 Jul 2010 02:48:11 -0700 (PDT)
From: Research student <vijayamahantesh_s at dell.com>
To: r-help at r-project.org
Subject: Re: [R] Odp: Help me with prediction in linear model
Message-ID: <1279964891930-2300991.post at n4.nabble.com>
Content-Type: text/plain; charset=us-ascii
Thanks Murphy and Pikal,
I need another favor: for fitting the first Fourier frequency I used
the following. Please advise on this.
beer_monthly has 400+ records.
Example:
> head(beer_monthly)
   beer
1  93.2
2  96.0
3  95.2
4  77.1
5  70.9
6  64.8
time<-seq(1956, 1995.2, length=length(beer_monthly))
sin.t<-sin(2*pi*time)
cos.t<-cos(2*pi*time)
beer_fit_fourier=lm(beer_monthly[, 1]~poly(time, 2)+sin.t+cos.t) #this is not working
beer_fit_fourier=lm(beer_monthly[, 1]~time+time2+sin.t+cos.t) #it is working
#prediction is not working
tpred_four <- data.frame(time = seq(1995, 1998, length = 20))
predict(beer_fit_fourier, newdata = tpred_four)
Is there any way to fit the first Fourier frequency?
Please assist.
Thanks in advance
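A hedged guess at why the prediction step fails, on invented data: sin.t and cos.t are free-standing vectors, so predict() cannot rebuild them from newdata. Writing every term as an expression in 'time' inside the formula lets predict() recompute the regressors from a newdata frame that contains only 'time'.

```r
set.seed(1)
time <- seq(1956, 1995.2, length.out = 471)
beer <- 100 + 0.5 * (time - 1956) + 5 * sin(2 * pi * time) +
        3 * cos(2 * pi * time) + rnorm(471)
# All regressors are functions of 'time', so newdata needs only 'time'.
fit  <- lm(beer ~ time + I(time^2) + sin(2 * pi * time) + cos(2 * pi * time))
newd <- data.frame(time = seq(1995, 1998, length.out = 20))
pred <- predict(fit, newdata = newd)
length(pred)  # 20
```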
--
View this message in context:
http://r.789695.n4.nabble.com/Help-me-with-prediction-in-linear-model-tp2297313p2300991.html
Sent from the R help mailing list archive at Nabble.com.
------------------------------
Message: 9
Date: Sat, 24 Jul 2010 07:53:23 -0500
From: Hadley Wickham <hadley at rice.edu>
To: Jeff Newmiller <jdnewmil at dcn.davis.ca.us>
Cc: r-help at r-project.org, Fahim Md <fahim.md at gmail.com>
Subject: Re: [R] union data in column
Message-ID:
<AANLkTi=eA4KHr2q7fija+qGbTYnHJPLrLLHgw25+Ki=z at mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
On Sat, Jul 24, 2010 at 2:23 AM, Jeff Newmiller
<jdnewmil at dcn.davis.ca.us> wrote:
> Fahim Md wrote:
>>
>> Is there any function/way to merge/unite the following data
>>
>> GENEID    col1   col2   col3   col4
>> G234064      1      0      0      0
>> G234064      1      0      0      0
>> G234064      1      0      0      0
>> G234064      0      1      0      0
>> G234065      0      1      0      0
>> G234065      0      1      0      0
>> G234065      0      1      0      0
>> G234065      0      0      1      0
>> G234065      0      0      1      0
>> G234065      0      0      0      1
>>
>> into
>>
>> GENEID    col1   col2   col3   col4
>> G234064      1      1      0      0   // 1 appears in col1 and col2 above, rest are zero
>> G234065      0      1      1      1   // 1 appears in col2, 3 and 4 above.
>>
>> Thanks
>
> Warning on terminology: there is a "merge" function in R that lines up rows
> from different tables to make a new set of longer rows (more columns). The
> usual term for combining column values from multiple rows is "aggregation".
>
> In addition to the example offered by Jim Holtzman, here are some other
> options in no particular order:
>
> x <- read.table(textConnection(" GENEID col1 col2 col3 col4
> G234064 1 0 0 0
> G234064 1 0 0 0
> G234064 1 0 0 0
> G234064 0 1 0 0
> G234065 0 1 0 0
> G234065 0 1 0 0
> G234065 0 1 0 0
> G234065 0 0 1 0
> G234065 0 0 1 0
> G234065 0 0 0 1
> "), header=TRUE, as.is=TRUE, row.names=NULL)
> closeAllConnections()
>
> # syntactic repackaging of Jim's basic approach
> library(plyr)
> ddply( x, .(GENEID), function(df)
> {with(df, as.integer(c(col1=any(col1), col2=any(col2), col3=any(col3),
col4=any(col4))))}
> )
You can do this a little more succinctly with colwise:
any_1 <- function(x) as.integer(any(x))
ddply(x, "GENEID", numcolwise(any_1))
Hadley
--
Assistant Professor / Dobelman Family Junior Chair
Department of Statistics / Rice University
http://had.co.nz/
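The same aggregation is also possible in base R, for readers without plyr; a sketch on a cut-down version of the example data:

```r
x <- data.frame(GENEID = c("G234064", "G234064", "G234065", "G234065"),
                col1 = c(1, 1, 0, 0),
                col2 = c(0, 1, 1, 0),
                col3 = c(0, 0, 0, 1))
# One row per GENEID; a column is 1 if any of its rows contained a 1.
any_1 <- function(v) as.integer(any(v))
agg <- aggregate(x[-1], by = x["GENEID"], FUN = any_1)
```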
------------------------------
Message: 10
Date: Sat, 24 Jul 2010 08:55:01 -0500
From: Frank E Harrell Jr <f.harrell at Vanderbilt.Edu>
To: Ravi Varadhan <rvaradhan at jhmi.edu>
Cc: "r-help at r-project.org" <r-help at r-project.org>
Subject: Re: [R] UseR! 2010 - my impressions
Message-ID: <4C4AF0B5.6070300 at vanderbilt.edu>
Content-Type: text/plain; charset="ISO-8859-1"; format=flowed
On 07/23/2010 06:50 PM, Ravi Varadhan wrote:
> Dear UseRs!,
>
> Everything about UseR! 2010 was terrific! I really mean "everything" -
the tutorials, invited talks, kaleidoscope sessions, focus sessions,
breakfast, snacks, lunch, conference dinner, shuttle services, and the
participants. The organization was fabulous. NIST were gracious hosts, and
provided top notch facilities. The rousing speech by Antonio Possolo, who
is the chief of Statistical Engineering Division at NIST, set the tempo
for the entire conference. Excellent invited lectures by Luke Tierney,
Frank Harrell, Mark Handcock, Diethelm Wurtz, Uwe Ligges, and Fritz
Leisch. All the sessions that I attended had many interesting ideas and
useful contributions. During the whole time that I was there, I could not
help but get the feeling that I am a part of something great.
>
> Before I end, let me add a few words about a special person. This
conference would not have been as great as it was without the tireless
efforts of Kate Mullen. The great thing about Kate is that she did so much
without ever hogging the limelight. Thank you, Kate and thank you NIST!
>
> I cannot wait for UseR!2011!
>
> Best,
> Ravi.
>
> ____________________________________________________________________
>
> Ravi Varadhan, Ph.D.
> Assistant Professor,
> Division of Geriatric Medicine and Gerontology
> School of Medicine
> Johns Hopkins University
>
> Ph. (410) 502-2619
> email: rvaradhan at jhmi.edu
I want to echo what Ravi said. The talks were terrific (thanks to the
program committee and the speakers) and Kate Mullen and her team did an
extraordinary job in putting the conference together and running it. I
am proud to have been a part of it. Thank you all!
Frank
--
Frank E Harrell Jr Professor and Chairman School of Medicine
Department of Biostatistics Vanderbilt University
------------------------------
Message: 11
Date: Sat, 24 Jul 2010 08:25:59 -0700
From: "Charles C. Berry" <cberry at tajo.ucsd.edu>
To: Marcus Liu <marcusliu667 at yahoo.com>
Cc: r-help at r-project.org
Subject: Re: [R] , Updating Table
Message-ID: <Pine.LNX.4.64.1007240817250.21422 at tajo.ucsd.edu>
Content-Type: text/plain; charset="x-unknown"; Format="flowed"
On Fri, 23 Jul 2010, Marcus Liu wrote:
> Hi everyone,
>
> Is there any command for updating a table within a loop?
"Loops? We don't need no stinking loops!"
(From 'The Good, the Bad, and the Rgly')
tab <- table(data.raw, findInterval(seq(along=data.raw), ind+1 ) )
tab %*% upper.tri(tab, diag=T)
or
tab2 <- tapply( factor(data.raw), findInterval(seq(along=data.raw), ind+1 ), table)
Reduce( "+", tab2, accum=TRUE )
HTH,
Chuck
p.s. See the posting guide re including a reproducible example with
requests like yours.
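Chuck's second suggestion, made concrete on a small invented vector: factor() fixes the level set so every per-interval table has the same categories, and Reduce(accumulate = TRUE) yields the running totals.

```r
data.raw <- c("A", "B", "A", "C", "B", "A")
ind <- c(3, 6)   # cut points: first 3 values, then the next 3
tab2 <- tapply(factor(data.raw),
               findInterval(seq(along = data.raw), ind + 1), table)
cum <- Reduce("+", tab2, accumulate = TRUE)
cum[[2]]  # counts updated through the second block: A=3 B=2 C=1
```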
> For instance, at i, I have a table as ZZ = table(data.raw[1:ind[i]])
> where "ind" = c(10, 20, 30, ...). Then, ZZ will be as follows
>
> "A" "B" "C"
>  3   10   2
>
> At (i + 1), ZZ = table(data.raw[(ind[i]+1):ind[i+1]])
>
> "A" "B" "D"
>  4   7   8
>
> Is there any command that can update the table ZZ each time so that
> in the above example, ZZ will be
>
> "A" "B" "C" "D"
>  7   17   2   8
>
> Thanks.
>
> liu
>
>
>
>
> [[alternative HTML version deleted]]
>
>
Charles C. Berry (858) 534-2098
Dept of Family/Preventive Medicine
E mailto:cberry at tajo.ucsd.edu UC San Diego
http://famprevmed.ucsd.edu/faculty/cberry/ La Jolla, San Diego 92093-0901
------------------------------
Message: 12
Date: Sat, 24 Jul 2010 11:51:05 -0400
From: Duncan Murdoch <murdoch.duncan at gmail.com>
To: "Charles C. Berry" <cberry at tajo.ucsd.edu>
Cc: r-help at r-project.org
Subject: Re: [R] , Updating Table
Message-ID: <4C4B0BE9.7050409 at gmail.com>
Content-Type: text/plain; charset=UTF-8; format=flowed
On 24/07/2010 11:25 AM, Charles C. Berry wrote:
> On Fri, 23 Jul 2010, Marcus Liu wrote:
>
>> Hi everyone,
>>
>> Is there any command for updating table withing a loop??
>
> "Loops? We don't need no stinking loops!"
> (From 'The Good, the Bad, and the Rgly')
Actually, that quote comes from the TreasR of the SieRa MadRe.
Duncan Murdoch
> tab <- table(data.raw, findInterval(seq(along=data.raw), ind+1 ) )
> tab %*% upper.tri(tab, diag=T)
>
> or
>
> tab2 <- tapply( factor(data.raw), findInterval(seq(along=data.raw), ind+1 ), table)
> Reduce( "+", tab2, accum=TRUE )
>
> HTH,
>
> Chuck
>
> p.s. See the posting guide re including a reproducible example with
> requests like yours.
>
>> For instance, at i, I have a table as ZZ = table(data.raw[1:ind[i]])
>> where "ind" = c(10, 20, 30, ...). Then, ZZ will be as follows
>>
>> "A" "B" "C"
>>  3   10   2
>>
>> At (i + 1), ZZ = table(data.raw[(ind[i]+1):ind[i+1]])
>>
>> "A" "B" "D"
>>  4   7   8
>>
>> Is there any command that can update the table ZZ each time so that
>> in the above example, ZZ will be
>>
>> "A" "B" "C" "D"
>>  7   17   2   8
>>
>> Thanks.
>>
>> liu
>>
>>
>>
>>
>> [[alternative HTML version deleted]]
>>
>>
>
> Charles C. Berry (858) 534-2098
> Dept of Family/Preventive Medicine
> E mailto:cberry at tajo.ucsd.edu UC San Diego
> http://famprevmed.ucsd.edu/faculty/cberry/ La Jolla, San Diego 92093-0901
>
>
>
> ------------------------------------------------------------------------
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
------------------------------
Message: 13
Date: Sat, 24 Jul 2010 15:52:52 +0000 (UTC)
From: Ben Bolker <bbolker at gmail.com>
To: r-help at stat.math.ethz.ch
Subject: Re: [R] glm - prediction of a factor with several levels
Message-ID: <loom.20100724T175114-259 at post.gmane.org>
Content-Type: text/plain; charset=us-ascii
blackscorpio <olivier.collignon <at> live.fr> writes:
> I'm currently attempting to predict the occurrence of an event (a factor
> with more than 2 levels) from several continuous predictors. The model
> being ordinal, I expected the glm function to return several intercepts,
> which is not the case in my results (I only have one intercept). I
> finally managed to perform an ordinal polytomous logistic regression
> with the polr function, which gives several intercepts.
> But does anyone know what model glm fitted and why only one intercept
> was given?
It's not sufficiently clear (to me at least) what you're trying to
do. Please provide a minimal reproducible example ... As far as I know,
polr is the right way to do ordinal regression; it's not clear how you
were trying to use glm to do it.
Ben Bolker
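For what it's worth, a minimal invented example of the contrast Ben describes: polr() returns one cutpoint per boundary between ordered levels, which is what the poster expected to see.

```r
library(MASS)
set.seed(1)
x <- rnorm(200)
# Three ordered outcome levels carved from a noisy latent variable.
y <- cut(x + rnorm(200), breaks = c(-Inf, -1, 1, Inf),
         labels = c("low", "mid", "high"), ordered_result = TRUE)
fit <- polr(y ~ x)
fit$zeta  # two intercepts (cutpoints) for the three ordered levels
```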
------------------------------
Message: 14
Date: Sat, 24 Jul 2010 12:13:30 -0300
From: Bruno Bastos Gonçalves <brubruzao at hotmail.com>
To: <r-help at r-project.org>
Subject: [R] Doubt about a population competition function
Message-ID: <SNT111-DS23835E9F6F65737C7B0BEDBBA40 at phx.gbl>
Content-Type: text/plain
Hi,
I'm writing a function that describes two populations in competition.
This is the function that I wrote:
exclusao <- function(n10, n20, k1, k2, alfa, beta, t){
  n1 <- k1-(alfa*n20)
  n2 <- k2-(beta*n10)
  if(t==0){
    plot(t, n10, type='b', xlim=range(c(1:t)), ylim=range(n10, n20),
         xlab='tempo', ylab='tamanho populacional')
    points(t, n20, type='b', col="red")
    points(t, n10, type="b", col="black")
    legend("topleft", c("Pop1", "Pop2"), cex=0.8,
           col=c("black", "red"), pch=21:21, lty=1:1)
  }
  if(t>0){
    for (i in 1:t){
      n1[i==1] <- n1
      n2[i==1] <- n2
      n1[i+1] <- k1-alfa*n2[i]
      n2[i+1] <- k2-beta*n1[i]
      if(n1[i]==0){n1[i:t]==0}
      if(n2[i]==0){n2[i:t]==0}
    }
    plot(c(1:t), n1[1:t], type='b', xlim=range(c(1:t)),
         ylim=range(n1[1:t], n2[1:t]), xlab='tempo',
         ylab='tamanho populacional')
    points(c(1:t), n2[1:t], type='b', col="red")
    legend("topleft", c("Pop1", "Pop2"), cex=0.8,
           col=c("black", "red"), pch=21:21, lty=1:1)
  }
}
Here n10 and n20 are the sizes of populations 1 and 2 at time 0, k1 and
k2 are their carrying capacities, alfa is the competition coefficient
of population 2 on population 1, beta is the competition coefficient of
population 1 on population 2, and t is the time.
When a population reaches 0 (ZERO), I want that population to stay at 0
(ZERO) until the end of "t". I have tried putting "if(n1[i]==0){n1[i:t]==0}
if(n2[i]==0){n2[i:t]==0}" after "n2[i+1]<-k2-beta*n1[i]" inside the for
loop, but nothing happens. What should I do?
Thanks
Bruno
[[alternative HTML version deleted]]
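A hedged guess at why "nothing happens": the zeroing lines use the comparison operator == where the assignment operator <- is needed, so the result is computed and thrown away. A tiny standalone demonstration:

```r
n1 <- c(5, 2, 0, 4, 7)
t <- 5
i <- 3
n1[i:t] == 0   # comparison: returns a logical vector, changes nothing
n1[i:t] <- 0   # assignment: actually zeroes the tail of the vector
n1             # 5 2 0 0 0
```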
------------------------------
Message: 15
Date: Sat, 24 Jul 2010 12:15:24 -0300
From: "Gmail" <goncalves.b.b at gmail.com>
To: "r-help at r-project.org"@stat.math.ethz.ch
Subject: [R] Doubt about a population competition function
Message-ID: <822A5DC79A42471AB493F6C4A0817C8E at NotebookBruno>
Content-Type: text/plain
Hi,
I'm writing a function that describes two populations in competition.
This is the function that I wrote:
exclusao <- function(n10, n20, k1, k2, alfa, beta, t){
  n1 <- k1-(alfa*n20)
  n2 <- k2-(beta*n10)
  if(t==0){
    plot(t, n10, type='b', xlim=range(c(1:t)), ylim=range(n10, n20),
         xlab='tempo', ylab='tamanho populacional')
    points(t, n20, type='b', col="red")
    points(t, n10, type="b", col="black")
    legend("topleft", c("Pop1", "Pop2"), cex=0.8,
           col=c("black", "red"), pch=21:21, lty=1:1)
  }
  if(t>0){
    for (i in 1:t){
      n1[i==1] <- n1
      n2[i==1] <- n2
      n1[i+1] <- k1-alfa*n2[i]
      n2[i+1] <- k2-beta*n1[i]
      if(n1[i]==0){n1[i:t]==0}
      if(n2[i]==0){n2[i:t]==0}
    }
    plot(c(1:t), n1[1:t], type='b', xlim=range(c(1:t)),
         ylim=range(n1[1:t], n2[1:t]), xlab='tempo',
         ylab='tamanho populacional')
    points(c(1:t), n2[1:t], type='b', col="red")
    legend("topleft", c("Pop1", "Pop2"), cex=0.8,
           col=c("black", "red"), pch=21:21, lty=1:1)
  }
}
Here n10 and n20 are the sizes of populations 1 and 2 at time 0, k1 and
k2 are their carrying capacities, alfa is the competition coefficient
of population 2 on population 1, beta is the competition coefficient of
population 1 on population 2, and t is the time.
When a population reaches 0 (ZERO), I want that population to stay at 0
(ZERO) until the end of "t". I have tried putting "if(n1[i]==0){n1[i:t]==0}
if(n2[i]==0){n2[i:t]==0}" after "n2[i+1]<-k2-beta*n1[i]" inside the for
loop, but nothing happens. What should I do?
Thanks
Bruno
[[alternative HTML version deleted]]
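[Editor's note] The likely culprit in the quoted attempt is `==` (comparison)
where `<-` (assignment) is needed: `n1[i:t]==0` tests for equality and
discards the result. A minimal sketch of a corrected loop, keeping the
poster's variable names; the `max(..., 0)` clamp is an added assumption so a
population can actually reach zero instead of going negative:

```r
exclusao_loop <- function(n10, n20, k1, k2, alfa, beta, t) {
  n1 <- numeric(t + 1); n2 <- numeric(t + 1)
  n1[1] <- n10; n2[1] <- n20
  for (i in 1:t) {
    # Clamp at 0 so a population cannot go negative (added assumption).
    n1[i + 1] <- max(k1 - alfa * n2[i], 0)
    n2[i + 1] <- max(k2 - beta * n1[i], 0)
    # Use <- (assignment), not == (comparison), to pin the series at 0.
    if (n1[i] == 0) n1[i:(t + 1)] <- 0
    if (n2[i] == 0) n2[i:(t + 1)] <- 0
  }
  list(n1 = n1, n2 = n2)
}
```

With, e.g., k2 = 5 and beta = 1, population 2 is driven to zero on the first
step and then stays there for the rest of the run.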
------------------------------
Message: 16
Date: Sat, 24 Jul 2010 09:39:10 -0700 (PDT)
From: Matt Stati <[109]mattstati at yahoo.com>
To: [110]r-help at r-project.org
Subject: [R] Book on R's Programming Language
Message-ID: <[111]289386.7557.qm at web43507.mail.sp1.yahoo.com>
Content-Type: text/plain
Can someone please recommend to me a book on the programming language that
R is based on? I'm looking for a foundational book that teaches the logic
of the S language. It seems that knowing the underpinnings of the language
can only make using R a bit easier.
Any leads are greatly appreciated . . .
Matt.
[[alternative HTML version deleted]]
------------------------------
Message: 17
Date: Sat, 24 Jul 2010 13:39:53 -0300
From: Henrique Dallazuanna <[112]wwwhsd at gmail.com>
To: aegea <[113]gcheer3 at gmail.com>
Cc: [114]r-help at r-project.org
Subject: Re: [R] how to calculate the product of every two elements in
two vectors
Message-ID:
<AANLkTi=DCF=Jv9gzLQK8p=[115]XrwenTLja3ZTrwKE9USm4z at mail.gmail.com>
Content-Type: text/plain
Try this:
c(as.matrix(B) %*% A)
On Fri, Jul 23, 2010 at 12:11 PM, aegea <[116]gcheer3 at gmail.com> wrote:
>
> Thanks in advance!
>
> A=c(1, 2, 3)
> B=c (9, 10, 11, 12)
>
> I want to get C=c(1*9, 1*10, 1*11, 1*12, ....., 3*9, 3*10, 3*11, 3*12)?
> C is still a vector with 12 elements
> Is there a way to do that?
> --
> View this message in context:
>
[117]http://r.789695.n4.nabble.com/how-to-calculate-the-product-of-every-t
wo-elements-in-two-vectors-tp2300299p2300299.html
> Sent from the R help mailing list archive at Nabble.com.
>
> ______________________________________________
> [118]R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> [119]http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Henrique Dallazuanna
Curitiba-Paraná-Brasil
25°25' 40" S 49°16' 22" O
[[alternative HTML version deleted]]
------------------------------
Message: 18
Date: Sat, 24 Jul 2010 10:13:52 -0700
From: Joshua Wiley <[120]jwiley.psych at gmail.com>
To: Matt Stati <[121]mattstati at yahoo.com>
Cc: [122]r-help at r-project.org
Subject: Re: [R] Book on R's Programming Language
Message-ID:
<AANLkTiki+9endGiR_T5pf9e8FofXbcyE5AL+-+[123]v0shNs at mail.gmail.com>
Content-Type: text/plain; charset=UTF-8
Hi Matt,
[124]http://www.r-project.org/doc/bib/R-books.html
lists a variety of books and seems to include most books dedicated to R
(my searches through Google, Amazon, and Barnes & Noble didn't really
turn up others). I have always been under the impression that
Programming with Data (the Green Book) is a classic.
[125]http://cran.r-project.org/manuals.html
has the official manuals
Similar questions have been asked several times on this list so you
can also search for previous threads (e.g.,
[126]http://tolstoy.newcastle.edu.au/R/help/04/06/0063.html )
Best regards,
Josh
On Sat, Jul 24, 2010 at 9:39 AM, Matt Stati <[127]mattstati at yahoo.com>
wrote:
> Can someone please recommend to me a book on the programming language
> that R is based on? I'm looking for a foundational book that teaches the
> logic of the S language. It seems that knowing the underpinnings of the
> language can only make using R a bit easier.
>
> Any leads are greatly appreciated . . .
>
> Matt.
>
>
>
>
>        [[alternative HTML version deleted]]
>
> ______________________________________________
> [128]R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
[129]http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Joshua Wiley
Ph.D. Student, Health Psychology
University of California, Los Angeles
[130]http://www.joshuawiley.com/
------------------------------
Message: 19
Date: Sat, 24 Jul 2010 13:15:34 -0400
From: Gabor Grothendieck <[131]ggrothendieck at gmail.com>
To: aegea <[132]gcheer3 at gmail.com>
Cc: [133]r-help at r-project.org
Subject: Re: [R] how to calculate the product of every two elements in
two vectors
Message-ID:
<AANLkTinRzLax+[134]dPbmg9YRPUcE2fQH_0Sqs2UJKFVJ3BS at mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
On Fri, Jul 23, 2010 at 11:11 AM, aegea <[135]gcheer3 at gmail.com> wrote:
>
> Thanks in advance!
>
> A=c(1, 2, 3)
> B=c (9, 10, 11, 12)
>
> I want to get C=c(1*9, 1*10, 1*11, 1*12, ....., 3*9, 3*10, 3*11, 3*12)?
> C is still a vector with 12 elements
> Is there a way to do that?
Here are yet a few more. The first one is the only one so far that
uses a single function and the last two are slight variations of ones
already posted.
kronecker(A, B)
c(tcrossprod(B, A))
c(outer(B, A))
c(B %o% A)
Here is a speed comparison (benchmark() here is from the rbenchmark
package). The fastest are as.matrix, outer and %o%. They are so close that
random fluctuations might easily change their order, and since %o% involves
the fewest keystrokes it might be a good overall choice. Although not among
the fastest, the kronecker solution is the simplest since it involves only
a single function call, so it might be preferred on that count.
> A <- B <- 1:400
> out <- benchmark(
+ as.matrix = c(as.matrix(B) %*% A),
+ crossprod = c(tcrossprod(B, A)),
+ outer = c(outer(B, A)),
+ o = c(B %o% A),
+ kronecker = kronecker(A, B),
+ touter = as.vector(t(outer(A, B))))
> out[order(out$relative), ]
       test replications elapsed relative user.self sys.self user.child sys.child
1 as.matrix          100    0.92 1.000000      0.62     0.28         NA        NA
3     outer          100    0.93 1.010870      0.59     0.35         NA        NA
4         o          100    0.94 1.021739      0.66     0.28         NA        NA
2 crossprod          100    1.11 1.206522      0.67     0.43         NA        NA
5 kronecker          100    1.45 1.576087      1.25     0.21         NA        NA
6    touter          100    1.84 2.000000      1.40     0.43         NA        NA
------------------------------
Message: 20
Date: Sat, 24 Jul 2010 11:55:08 -0500
From: Joseph Magagnoli <[136]jcm331 at gmail.com>
To: Matt Stati <[137]mattstati at yahoo.com>, rhelp
<[138]r-help at r-project.org>
Subject: Re: [R] Book on R's Programming Language
Message-ID:
<AANLkTi=+umQatto9rVeujEwjNyqOSiHOas+[139]eWg9Wcb80 at mail.gmail.com>
Content-Type: text/plain
Matt,
you might want to check out programming with data by John Chambers.
[140]http://www.amazon.com/Programming-Data-Guide-S-Language/dp/0387985034/ref=sr_1_1?ie=UTF8&s=books&qid=1279990404&sr=8-1
Best Joe
On Sat, Jul 24, 2010 at 11:39 AM, Matt Stati <[141]mattstati at yahoo.com>
wrote:
> Can someone please recommend to me a book on the programming language
> that R is based on? I'm looking for a foundational book that teaches the
> logic of the S language. It seems that knowing the underpinnings of the
> language can only make using R a bit easier.
>
> Any leads are greatly appreciated . . .
>
> Matt.
>
>
>
>
> [[alternative HTML version deleted]]
>
> ______________________________________________
> [142]R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
>
[143]http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Joseph C. Magagnoli
Doctoral Student
Department of Political Science
University of North Texas
1155 Union Circle #305340
Denton, Texas 76203-5017
Email: [145]jcm0250 at unt.edu
[[alternative HTML version deleted]]
------------------------------
Message: 21
Date: Sat, 24 Jul 2010 12:55:18 -0600
From: Greg Snow <[146]Greg.Snow at imail.org>
To: "Farley, Robert" <[147]FarleyR at metro.net>, "[148]r-help at r-project.org"
<[149]r-help at r-project.org>
Subject: Re: [R] Constrain density to 0 at 0?
Message-ID:
<[150]B37C0A15B8FB3C468B5BC7EBC7DA14CC633A53DF11 at LP-EXMBVS10.CO.IHC.COM>
Content-Type: text/plain; charset="us-ascii"
Look at the logspline package. This is a different approach to density
estimation from the kernel densities used by 'density', but does allow you
to set fixed boundaries.
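[Editor's note] For illustration, a minimal sketch of that approach. It
assumes the logspline package is installed; the data here are simulated, not
from the original post:

```r
library(logspline)

set.seed(42)
x <- rexp(500)                    # positive-valued example data

# Fix the lower support boundary at 0, so the estimated density
# carries no mass below zero (the original complaint about density()).
fit <- logspline(x, lbound = 0)
plot(fit, xlim = c(0, 10))
```

Note that, unlike density(), logspline() does not appear to accept
observation weights, so the weighted estimates in the original code would
need a different treatment.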
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[151]greg.snow at imail.org
801.408.8111
> -----Original Message-----
> From: [152]r-help-bounces at r-project.org [mailto:r-help-bounces at r-
> project.org] On Behalf Of Farley, Robert
> Sent: Monday, July 19, 2010 7:57 PM
> To: [153]r-help at r-project.org
> Subject: [R] Constrain density to 0 at 0?
>
> I'm plotting some trip length frequencies using the following code:
>
> plot( density(zTestData$Distance, weights=zTestData$Actual),
> xlim=c(0, 10),
> main="Test TLFD",
> xlab="Distance",
> col=6 )
> lines(density(zTestData$Distance, weights=zTestData$FlatWeight), col=2)
> lines(density(zTestData$Distance, weights=zTestData$BrdWeight ), col=3)
>
> which works fine except the distances are all positive, but the
> densities don't drop to 0 until around -2 or -3.
>
> Is there a way for me to "force" the density plot to 0 at 0?
>
>
>
> Thanks
>
>
>
> Robert Farley
> Metro
> 1 Gateway Plaza
> Mail Stop 99-23-7
> Los Angeles, CA 90012-2952
> Voice: (213)922-2532
> Fax: (213)922-2868
> www.Metro.net
>
>
>
> [[alternative HTML version deleted]]
>
> ______________________________________________
> [154]R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide [155]http://www.R-project.org/posting-
> guide.html
> and provide commented, minimal, self-contained, reproducible code.
------------------------------
Message: 22
Date: Sun, 25 Jul 2010 01:16:57 +0530
From: shabnam k <[156]shabnambioinfo at gmail.com>
To: r-help <[157]r-help at r-project.org>
Subject: [R] matest function for multiple factors
Message-ID:
<AANLkTi=mQ5bhntKQbfUapHpWNWpVLn+[158]Hzat_FD6jpZF6 at mail.gmail.com>
Content-Type: text/plain
Hi,
I am using the maanova package for analysis. My dataset has two fixed
factors, time and treatment, and sample as a random factor. I am able to
get the madata object and the fitmaanova object, but I am unable to do an
F-test with the two factors together; I have only done the F-test
separately for each factor.
fit.full.mix <- fitmaanova(madata, formula = ~Sample+Time+Treatment,
random = ~Sample)
ftest.all = matest(madata, fit.full.mix, test.method=c(1, 1),
shuffle.method="sample", term="Time+Treatment", n.perm= 100)
Can you please suggest how to represent multiple factors simultaneously
in the 'term' argument of the function above?
[[alternative HTML version deleted]]
------------------------------
Message: 23
Date: Sat, 24 Jul 2010 13:55:59 -0600
From: Greg Snow <[159]Greg.Snow at imail.org>
To: "[160]babyfoxlove1 at sina.com" <[161]babyfoxlove1 at sina.com>,
"[162]r-help at r-project.org" <[163]r-help at r-project.org>
Subject: Re: [R] How to deal with more than 6GB dataset using R?
Message-ID:
<[164]B37C0A15B8FB3C468B5BC7EBC7DA14CC633A53DF25 at LP-EXMBVS10.CO.IHC.COM>
Content-Type: text/plain; charset="us-ascii"
You may want to look at the biglm package as another way to fit regression
models on very large data sets.
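[Editor's note] A sketch of the chunked-fitting pattern biglm is designed
for: the model is initialized on a first chunk and then updated chunk by
chunk, so the full file never has to sit in memory. The file name, column
names, and chunk size below are placeholders, not from the original post:

```r
library(biglm)

con <- file("bigdata.csv", "r")            # hypothetical file name
chunk <- read.csv(con, nrows = 100000)     # first chunk, with header
fit <- biglm(y ~ x1 + x2, data = chunk)    # initial fit on that chunk
cols <- names(chunk)

repeat {
  # Subsequent chunks have no header; read.csv errors at end of file,
  # which we turn into NULL to stop the loop.
  chunk <- tryCatch(read.csv(con, nrows = 100000, header = FALSE,
                             col.names = cols),
                    error = function(e) NULL)
  if (is.null(chunk) || nrow(chunk) == 0) break
  fit <- update(fit, chunk)                # fold the new rows into the fit
}
close(con)
summary(fit)
```

The coefficient estimates are identical to an in-memory lm() fit on the
full data; only the residual diagnostics are reduced.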
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[165]greg.snow at imail.org
801.408.8111
> -----Original Message-----
> From: [166]r-help-bounces at r-project.org [mailto:r-help-bounces at r-
> project.org] On Behalf Of [167]babyfoxlove1 at sina.com
> Sent: Friday, July 23, 2010 10:10 AM
> To: [168]r-help at r-project.org
> Subject: [R] How to deal with more than 6GB dataset using R?
>
> Hi there,
>
> Sorry to bother those who are not interested in this problem.
>
> I'm dealing with a large data set, more than 6 GB file, and doing
> regression test with those data. I was wondering are there any
> efficient ways to read those data? Instead of just using read.table()?
> BTW, I'm using a 64bit version desktop and a 64bit version R, and the
> memory for the desktop is enough for me to use.
> Thanks.
>
>
> --Gin
>
> [[alternative HTML version deleted]]
>
> ______________________________________________
> [169]R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide [170]http://www.R-project.org/posting-
> guide.html
> and provide commented, minimal, self-contained, reproducible code.
------------------------------
Message: 24
Date: Sat, 24 Jul 2010 17:03:44 -0400
From: "Abdi, Abdulhakim" <[171]AbdiA at si.edu>
To: "[172]r-help at r-project.org" <[173]r-help at r-project.org>
Subject: [R] Using R to fill ETM+ data gaps?
Message-ID:
<[174]97679C0A11332E48A01E0D463E8B3FF103AF0CE275 at SI-MSEV02.US.SINET.SI.EDU
>
Content-Type: text/plain; charset="us-ascii"
Hi,
I was wondering if anyone knows of a method (or whether it is possible) to
use R to interpolate Landsat ETM+ data gaps.
Regards,
Hakim Abdi
_________________________________________
Abdulhakim Abdi, M.Sc.
Conservation GIS/Remote Sensing Lab
Smithsonian Conservation Biology Institute
1500 Remount Road
Front Royal, VA 22630
phone: +1 540 635 6578
mobile: +1 747 224 7006
fax: +1 540 635 6506 (Attn: ABDI/GIS Lab)
email: [175]abdia at si.edu
[176]http://nationalzoo.si.edu/SCBI/ConservationGIS/
------------------------------
Message: 25
Date: Sat, 24 Jul 2010 14:07:25 -0700 (PDT)
From: Felipe Carrillo <[177]mazatlanmexico at yahoo.com>
To: [178]r-help at stat.math.ethz.ch
Subject: [R] How to generate a sequence of dates without hardcoding
the year
Message-ID: <[179]418059.32636.qm at web56602.mail.re3.yahoo.com>
Content-Type: text/plain; charset=iso-8859-1
Hi:
I have a dataframe named 'spring' and I am trying to add a new variable
named
'IdDate'
This line of code works fine:
spring$idDate <- seq(as.Date("2008-07-01"), as.Date("2009-06-30"),
by="week")
But I don't want to hardcode the year because it will be used again the
following year
Is it possible to just generate dates with the month and day?
I tried the code below:
seq(as.Date("7-1", "%B%d"), as.Date("6-30", "%B%d"), by="week")
and got this error message:
Error in seq.int(0, to - from, by) : 'to' must be finite
Thanks for any pointers
Felipe D. Carrillo
Supervisory Fishery Biologist
Department of the Interior
US Fish & Wildlife Service
California, USA
------------------------------
Message: 26
Date: Sat, 24 Jul 2010 18:09:45 -0300
From: Henrique Dallazuanna <[180]wwwhsd at gmail.com>
To: Felipe Carrillo <[181]mazatlanmexico at yahoo.com>
Cc: [182]r-help at stat.math.ethz.ch
Subject: Re: [R] How to generate a sequence of dates without
hardcoding the year
Message-ID:
<[183]AANLkTimd6DLagDiHKbFuF3hE5dES4d6DmC5T50Vn25N3 at mail.gmail.com>
Content-Type: text/plain
Try this:
format(seq(as.Date("2008-07-01"), as.Date("2009-06-30"), by="week"),
"%d/%m")
On Sat, Jul 24, 2010 at 6:07 PM, Felipe Carrillo
<[184]mazatlanmexico at yahoo.com>wrote:
> Hi:
> I have a dataframe named 'spring' and I am trying to add a new variable
> named
> 'IdDate'
> This line of code works fine:
> spring$idDate <- seq(as.Date("2008-07-01"), as.Date("2009-06-30"),
by="week")
>
> But I don't want to hardcode the year because it will be used again the
> following year
> Is it possible to just generate dates with the month and day?
>
> I tried the code below:
> seq(as.Date("7-1", "%B%d"), as.Date("6-30", "%B%d"), by="week")
>
> and got this error message:
> Error in seq.int(0, to - from, by) : 'to' must be finite
> Thanks for any pointers
>
>
> Felipe D. Carrillo
> Supervisory Fishery Biologist
> Department of the Interior
> US Fish & Wildlife Service
> California, USA
>
>
>
>
> ______________________________________________
> [185]R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> [186]http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Henrique Dallazuanna
Curitiba-Paraná-Brasil
25°25' 40" S 49°16' 22" O
[[alternative HTML version deleted]]
------------------------------
Message: 27
Date: Sat, 24 Jul 2010 15:54:51 -0500 (CDT)
From: <[187]mpward at illinois.edu>
To: "Joshua Wiley" <[188]jwiley.psych at gmail.com>
Cc: [189]r-help at r-project.org
Subject: Re: [R] Trouble retrieving the second largest value from each
row of a data.frame
Message-ID: <[190]20100724155451.CHG28413 at expms6.cites.uiuc.edu>
Content-Type: text/plain; charset=iso-8859-1
THANKS, but I have one issue and one question.
For some reason the "secondstrongest" value for rows 3 and 6 is incorrect
(it equals the strongest); the remaining rows are correct??
These data are being used to track radio-tagged birds; they come from
automated radio telemetry receivers. I will be applying the following
formula:
diff <- ((strongest- secondstrongest)/100)
bearingdiff <-30-(-0.0624*(diff**2))-(2.8346*diff)
Then bearingdiff is added to the strongestantenna bearing (value0 =
0 degrees) if the secondstrongestantenna is greater (e.g. value0 and
value60), or subtracted from it if the secondstrongestantenna is smaller
than the strongestantenna. The only exception is that if value0 (0 degrees)
is strongest and value300 (360 degrees) is the secondstrongestantenna, then
the bearing is 360-bearingdiff. Also, the strongestantenna and
secondstrongestantenna have to be next to each other (e.g. value0 with
value60, value240 with value300, value0 with value300), or the result
should be NA. I have been trying to use a series of if/else statements to
produce these bearings, but all I am producing is errors. Any suggestion
would be appreciated.
Again, THANKS for your efforts.
Mike
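[Editor's note] Since if/else is not vectorized over rows, ifelse() is the
natural tool for rules like these. Below is a rough sketch of the logic
described above, run on a small made-up data frame in the shape my.finder()
returns. The handling of the case where value300 is strongest and value0 is
second strongest is an assumption; the post only specifies the
value0-strongest wrap case:

```r
# Toy data in the shape returned by my.finder(); values are illustrative.
dat <- data.frame(strongest           = c(-11072, -11061),
                  secondstrongest     = c(-11707, -11426),
                  strongestantenna    = c("value120", "value120"),
                  secondstrongantenna = c("value60",  "value240"),
                  stringsAsFactors = FALSE)

vals <- c("value0", "value60", "value120", "value180", "value240", "value300")
ang1 <- (match(dat$strongestantenna,    vals) - 1) * 60   # antenna bearings
ang2 <- (match(dat$secondstrongantenna, vals) - 1) * 60

d           <- (dat$strongest - dat$secondstrongest) / 100
bearingdiff <- 30 - (-0.0624 * d^2) - 2.8346 * d

gap <- abs(ang1 - ang2)
dat$bearing <- ifelse(gap == 300,              # value0/value300 wrap-around
                ifelse(ang1 == 0, 360 - bearingdiff, ang1 + bearingdiff),
                ifelse(gap != 60, NA,          # non-adjacent antennas -> NA
                ifelse(ang2 > ang1, ang1 + bearingdiff, ang1 - bearingdiff)))
```

Row 1 (value120/value60, adjacent) gets a bearing below 120 degrees; row 2
(value120/value240 are not adjacent in this scheme) gets NA.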
---- Original message ----
>Date: Fri, 23 Jul 2010 23:01:56 -0700
>From: Joshua Wiley <[191]jwiley.psych at gmail.com>
>Subject: Re: [R] Trouble retrieving the second largest value from each
row of a data.frame
>To: [192]mpward at illinois.edu
>Cc: [193]r-help at r-project.org
>
>Hi,
>
>Here is a little function that will do what you want and return a nice
output:
>
>#Function To calculate top two values and return
>my.finder <- function(mydata) {
> my.fun <- function(data) {
> strongest <- which.max(data)
> secondstrongest <- which.max(data[-strongest])
> strongestantenna <- names(data)[strongest]
> secondstrongantenna <- names(data[-strongest])[secondstrongest]
> value <- matrix(c(data[strongest], data[secondstrongest],
> strongestantenna, secondstrongantenna), ncol =4)
> return(value)
> }
> dat <- apply(mydata, 1, my.fun)
> dat <- t(dat)
> dat <- as.data.frame(dat, stringsAsFactors = FALSE)
> colnames(dat) <- c("strongest", "secondstrongest",
> "strongestantenna", "secondstrongantenna")
> dat[ , "strongest"] <- as.numeric(dat[ , "strongest"])
> dat[ , "secondstrongest"] <- as.numeric(dat[ , "secondstrongest"])
> return(dat)
>}
>
>
>#Using your example data:
>
>yourdata <- structure(list(value0 = c(-13007L, -12838L, -12880L, -12805L,
>-12834L, -11068L, -12807L, -12770L, -12988L, -11779L), value60 =
c(-11707L,
>-13210L, -11778L, -11653L, -13527L, -11698L, -14068L, -11665L,
>-11736L, -12873L), value120 = c(-11072L, -11176L, -11113L, -11071L,
>-11067L, -12430L, -11092L, -11061L, -11137L, -12973L), value180 =
c(-12471L,
>-11799L, -12439L, -12385L, -11638L, -12430L, -11709L, -12373L,
>-12570L, -12537L), value240 = c(-12838L, -13210L, -13089L, -11561L,
>-13527L, -12430L, -11607L, -11426L, -13467L, -12973L), value300 =
c(-13357L,
>-13845L, -13880L, -13317L, -13873L, -12814L, -13025L, -12805L,
>-13739L, -11146L)), .Names = c("value0", "value60", "value120",
>"value180", "value240", "value300"), class = "data.frame", row.names =
c("1",
>"2", "3", "4", "5", "6", "7", "8", "9", "10"))
>
>my.finder(yourdata) #and what you want is in a nicely labeled data frame
>
>#A potential problem is that it is not very efficient
>
>#Here is a test using a matrix of 100, 000 rows
>#sampled from the same range as your data
>#with the same number of columns
>
>data.test <- matrix(
> sample(seq(min(yourdata), max(yourdata)), size = 500000, replace =
TRUE),
> ncol = 5)
>
>system.time(my.finder(data.test))
>
>#On my system I get
>
>> system.time(my.finder(data.test))
> user system elapsed
> 2.89 0.00 2.89
>
>Hope that helps,
>
>Josh
>
>
>
>On Fri, Jul 23, 2010 at 6:20 PM, <[194]mpward at illinois.edu> wrote:
>> I have a data frame with a couple million lines and want to retrieve
>> the largest and second largest values in each row, along with the label
>> of the column these values are in. For example
>>
>> row 1
>> strongest=-11072
>> secondstrongest=-11707
>> strongestantenna=value120
>> secondstrongantenna=value60
>>
>> Below is the code I am using and a truncated data.frame. Retrieving the
>> largest value was easy, but I have been getting errors every way I have
>> tried to retrieve the second largest value. I have not even tried to
>> retrieve the labels for the value yet.
>>
>> Any help would be appreciated
>> Mike
>>
>>
>>> data<-data.frame(value0, value60, value120, value180, value240,
value300)
>>> data
>>    value0 value60 value120 value180 value240 value300
>> 1  -13007  -11707   -11072   -12471   -12838   -13357
>> 2  -12838  -13210   -11176   -11799   -13210   -13845
>> 3  -12880  -11778   -11113   -12439   -13089   -13880
>> 4  -12805  -11653   -11071   -12385   -11561   -13317
>> 5  -12834  -13527   -11067   -11638   -13527   -13873
>> 6  -11068  -11698   -12430   -12430   -12430   -12814
>> 7  -12807  -14068   -11092   -11709   -11607   -13025
>> 8  -12770  -11665   -11061   -12373   -11426   -12805
>> 9  -12988  -11736   -11137   -12570   -13467   -13739
>> 10 -11779  -12873   -12973   -12537   -12973   -11146
>>> #largest value in the row
>>> strongest<-apply(data, 1, max)
>>>
>>>
>>> #second largest value in the row
>>> n<-function(data)(1/(min(1/(data[1, ]-max(data[1, ]))))+ (max(data[1, ])))
>>> secondstrongest<-apply(data, 1, n)
>> Error in data[1, ] : incorrect number of dimensions
>>>
>>
>> ______________________________________________
>> [195]R-help at r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide
[196]http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
>>
>
>
>
>--
>Joshua Wiley
>Ph.D. Student, Health Psychology
>University of California, Los Angeles
>[197]http://www.joshuawiley.com/
------------------------------
Message: 28
Date: Sat, 24 Jul 2010 22:21:21 +0100
From: Paul Smith <[198]phhs80 at gmail.com>
To: [199]r-help at r-project.org
Subject: Re: [R] (no subject)
Message-ID:
<AANLkTi=qKpbPZutDNr9n8af1jGFsLfTqT1C7Nm8Ko+[200]oH at mail.gmail.com>
Content-Type: text/plain; charset=UTF-8
2010/7/23 w ? <[201]hw_joyce_cn at hotmail.com>:
> I am using constrOptim to maximize a function with four constraints, but
> the answer never moves from the starting value and there is only one
> outer iteration. The function is defined as follows:
> tm<-function(p){
> p1<-p[1]; p2<-p[2]; p3<-p[3];
> p4<-1-p1-p2-p3;
> p1*p2*p3*p4}
>
> ##the constraints are p1>=0; p2>=0; p3>=0 and p4>=0 i.e. p1+p2+p3<=1
> start<-c(0.9999991, 0.0000001, 0.0000001)
> dev<-rbind(diag(3), -diag(3), rep(-1, 3))
> bvec<-c(rep(0, 3), rep(-1, 4))
> constrOptim(start, tm, NULL, ui=dev, ci=bvec, control=list(maxit=10000))
>
> Am I missing something obvious that causes the problem, or is there a
> bug in constrOptim? Could you please help me out?
Wenwen,
I believe that the reason why constrOptim behaves as described is
related to the fact that
(p1, p2, p3) = (1, 0, 0)
is a stationary point and you use it as a starting point. Try a
different starting point.
If the objective is to maximize, then you should use the following
command:
constrOptim(start, tm, NULL, ui=dev, ci=bvec, control=list(maxit=10000,
fnscale=-1))
(Notice fnscale=-1.)
Finally, whenever you ask something on this list, please use a
meaningful title for your message, as it will dramatically increase
the chances of you getting an answer.
Good luck,
Paul
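[Editor's note] To illustrate the two suggestions together, here is the same
problem run from an interior starting point with fnscale = -1 (a sketch; by
the AM-GM inequality the maximum is at p1 = p2 = p3 = p4 = 1/4):

```r
# Objective: p1*p2*p3*p4 with p4 = 1 - p1 - p2 - p3 (same as the post).
tm <- function(p) { p4 <- 1 - sum(p); prod(p) * p4 }

dev  <- rbind(diag(3), -diag(3), rep(-1, 3))   # p_i >= 0, p_i <= 1, sum <= 1
bvec <- c(rep(0, 3), rep(-1, 4))

start <- c(0.3, 0.3, 0.2)                      # strictly inside the simplex
res <- constrOptim(start, tm, NULL, ui = dev, ci = bvec,
                   control = list(maxit = 10000, fnscale = -1))
res$par                                        # close to (0.25, 0.25, 0.25)
```

From the original start (0.9999991, ...) the optimizer sits on a stationary
boundary point, which is why it never moved.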
------------------------------
Message: 29
Date: Sat, 24 Jul 2010 19:02:57 -0400
From: jim holtman <[202]jholtman at gmail.com>
To: Felipe Carrillo <[203]mazatlanmexico at yahoo.com>
Cc: [204]r-help at stat.math.ethz.ch
Subject: Re: [R] How to generate a sequence of dates without
hardcoding the year
Message-ID:
<AANLkTikBNcm2=EzKaWMGNez-b53QhYYe_9CuK4LRN=[205]3P at mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
Is this what you want, if you assume that a date given without a year falls
in the current year:
> seq(as.Date("7-1", "%m-%d"), by="week", length=52)
[1] "2010-07-01" "2010-07-08" "2010-07-15" "2010-07-22" "2010-07-29"
"2010-08-05" "2010-08-12" "2010-08-19"
[9] "2010-08-26" "2010-09-02" "2010-09-09" "2010-09-16" "2010-09-23"
"2010-09-30" "2010-10-07" "2010-10-14"
[17] "2010-10-21" "2010-10-28" "2010-11-04" "2010-11-11" "2010-11-18"
"2010-11-25" "2010-12-02" "2010-12-09"
[25] "2010-12-16" "2010-12-23" "2010-12-30" "2011-01-06" "2011-01-13"
"2011-01-20" "2011-01-27" "2011-02-03"
[33] "2011-02-10" "2011-02-17" "2011-02-24" "2011-03-03" "2011-03-10"
"2011-03-17" "2011-03-24" "2011-03-31"
[41] "2011-04-07" "2011-04-14" "2011-04-21" "2011-04-28" "2011-05-05"
"2011-05-12" "2011-05-19" "2011-05-26"
[49] "2011-06-02" "2011-06-09" "2011-06-16" "2011-06-23"
>
On Sat, Jul 24, 2010 at 5:07 PM, Felipe Carrillo
<[206]mazatlanmexico at yahoo.com> wrote:
> Hi:
> I have a dataframe named 'spring' and I am trying to add a new variable
named
> 'IdDate'
> This line of code works fine:
> spring$idDate <- seq(as.Date("2008-07-01"), as.Date("2009-06-30"),
by="week")
>
> But I don't want to hardcode the year because it will be used again the
> following year
> Is it possible to just generate dates with the month and day?
>
> I tried the code below:
> seq(as.Date("7-1", "%B%d"), as.Date("6-30", "%B%d"), by="week")
>
> and got this error message:
> Error in seq.int(0, to - from, by) : 'to' must be finite
> Thanks for any pointers
>
>
> Felipe D. Carrillo
> Supervisory Fishery Biologist
> Department of the Interior
> US Fish & Wildlife Service
> California, USA
>
>
>
>
> ______________________________________________
> [207]R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
[208]http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
--
Jim Holtman
Cincinnati, OH
+1 513 646 9390
What is the problem that you are trying to solve?
------------------------------
Message: 30
Date: Sat, 24 Jul 2010 16:27:06 -0700 (PDT)
From: zachmohr <[209]zachmohr at gmail.com>
To: [210]r-help at r-project.org
Subject: Re: [R] glm - prediction of a factor with several levels
Message-ID:
<[211]AANLkTinzngqO3S_p-E25N3TTWEdiQBBS4CvzErgywj1p at mail.gmail.com>
Content-Type: text/plain
As far as I know, glm only works with dichotomous or count data. polr in
the MASS package works for ordinal dependent variables, and so does lrm in
the Design package. I would assume that the model produced by glm is a
dichotomous version of your model, but I am not sure. Only one intercept is
given because glm with a logit link produces a dichotomous model rather
than an ordered logistic regression. My suggestion is to use polr or lrm.
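[Editor's note] For concreteness, a small sketch of the polr route (MASS
ships with R; the data below are simulated, not from the original question):

```r
library(MASS)

set.seed(1)
x <- rnorm(200)
# Simulate a 3-level ordered outcome that depends on x.
y <- cut(x + rnorm(200), breaks = c(-Inf, -0.5, 0.5, Inf),
         labels = c("low", "mid", "high"), ordered_result = TRUE)

fit <- polr(y ~ x)   # proportional-odds (ordered) logistic regression
summary(fit)         # note: one intercept (zeta) per cut-point, here two
```

The several intercepts the original poster expected appear as the zeta
values, one per boundary between adjacent outcome levels.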
On Fri, Jul 23, 2010 at 6:15 PM, blackscorpio [via R] <
ml-node+[212]2300793-1751019155-246278 at n4.nabble.com> wrote:
> Dear community,
> I'm currently attempting to predict the occurrence of an event (factor)
> having more than 2 levels with several continuous predictors. The model
> being ordinal, I was expecting the glm function to return several
> intercepts, which is not the case when looking at my results (I only
> have one intercept). I finally managed to perform an ordinal polytomous
> logistic regression with the polr function, which gives several
> intercepts. But does anyone know what model was fitted by glm and why
> only one intercept was given?
> Thanks a lot for your help !
>
>
> ------------------------------
> View message @
>
[214]http://r.789695.n4.nabble.com/glm-prediction-of-a-factor-with-several
-levels-tp2300793p2300793.html
> To unsubscribe from R, click here< (link removed) ==>.
>
>
>
--
Department of Public Administration
University of Kansas
4060 Wesco Hall
Office W
Lawrence KS 66045-3177
Phone: (785) 813-1384
Email: [215]zachmohr at gmail.com
--
View this message in context:
[216]http://r.789695.n4.nabble.com/glm-prediction-of-a-factor-with-several
-levels-tp2300793p2301324.html
Sent from the R help mailing list archive at Nabble.com.
[[alternative HTML version deleted]]
------------------------------
Message: 31
Date: Sat, 24 Jul 2010 20:09:32 -0400
From: David Winsemius <[217]dwinsemius at comcast.net>
To: <[218]mpward at illinois.edu>
Cc: [219]r-help at r-project.org
Subject: Re: [R] Trouble retrieving the second largest value from each
row of a data.frame
Message-ID: <[220]52EA484F-C066-4ACC-B5BC-1A3A20876D9E at comcast.net>
Content-Type: text/plain; charset=US-ASCII; format=flowed; delsp=yes
On Jul 24, 2010, at 4:54 PM, <[221]mpward at illinois.edu> wrote:
> THANKS, but I have one issue and one question.
>
> For some reason the "secondstrongest" value for row 3 and 6 are
> incorrect (they are the strongest) the remaining 10 are correct??
In my run of Wiley's code I instead get identical values for rows
2, 5, and 6. Holtman's and my solutions did not suffer from that defect,
although mine suffered from my misreading of your request (I thought
you wanted the top 3). The fix is trivial.
>
> These data are being used to track radio-tagged birds, they are from
> automated radio telemetry receivers. I will applying the following
> formula
>
> diff <- ((strongest- secondstrongest)/100)
> bearingdiff <-30-(-0.0624*(diff**2))-(2.8346*diff)
vals <- c("value0", "value60", "value120", "value180", "value240",
"value300")
value.str2 <- (match(yourdata$secondstrongestantenna, vals)-1)*60
value.str1 <- (match(yourdata$strongestantenna, vals)-1)*60
change.ind <- abs(match(yourdata$strongestantenna, vals) -
                  match(yourdata$secondstrongestantenna, vals))
>
> A) Then the bearing diff is added to strongestantenna (value0 =
> 0degrees) if the secondstrongestatenna is greater (eg value0 and
> value60),
> B) or if the secondstrongestantenna is smaller than the
> strongestantenna,
> then the bearingdiff is substracted from the strongestantenna.
>
> C) The only exception is that if value0 (0degrees) is strongest and
> value300(360degrees) is the secondstrongestantenna then the bearing
> is 360-bearingdiff.
> D) Also the strongestantenna and secondstrongestantenna have to be
> next to each other (e.g. value0 with value60, value240 with
> value300, value0 with value300) or the results should be NA.
After setting finalbearing with A, B, and C then:
yourdata$finalbearing <- with(yourdata, ifelse (
change.ind <5 & change.ind > 1 ,
NA, finalbearing) )
> I have been trying to use a series of if, else statements to produce
> these bearing, but all I am producing is errors. Any suggestion
> would be appreciated.
>
> Again THANKS for you efforts.
>
> Mike
>
> ---- Original message ----
>> Date: Fri, 23 Jul 2010 23:01:56 -0700
>> From: Joshua Wiley <[222]jwiley.psych at gmail.com>
>> Subject: Re: [R] Trouble retrieving the second largest value from
>> each row of a data.frame
>> To: [223]mpward at illinois.edu
>> Cc: [224]r-help at r-project.org
>>
>> Hi,
>>
>> Here is a little function that will do what you want and return a
>> nice output:
>>
>> #Function To calculate top two values and return
>> my.finder <- function(mydata) {
>>   my.fun <- function(data) {
>>     strongest <- which.max(data)
>>     secondstrongest <- which.max(data[-strongest])
>>     strongestantenna <- names(data)[strongest]
>>     secondstrongantenna <- names(data[-strongest])[secondstrongest]
>>     value <- matrix(c(data[strongest], data[secondstrongest],
>>                       strongestantenna, secondstrongantenna), ncol = 4)
>>     return(value)
>>   }
>>   dat <- apply(mydata, 1, my.fun)
>>   dat <- t(dat)
>>   dat <- as.data.frame(dat, stringsAsFactors = FALSE)
>>   colnames(dat) <- c("strongest", "secondstrongest",
>>                      "strongestantenna", "secondstrongantenna")
>>   dat[, "strongest"] <- as.numeric(dat[, "strongest"])
>>   dat[, "secondstrongest"] <- as.numeric(dat[, "secondstrongest"])
>>   return(dat)
>> }
>>
>>
>> #Using your example data:
>>
>> yourdata <- structure(list(value0 = c(-13007L, -12838L, -12880L,
>> -12805L,
>> -12834L, -11068L, -12807L, -12770L, -12988L, -11779L), value60 =
>> c(-11707L,
>> -13210L, -11778L, -11653L, -13527L, -11698L, -14068L, -11665L,
>> -11736L, -12873L), value120 = c(-11072L, -11176L, -11113L, -11071L,
>> -11067L, -12430L, -11092L, -11061L, -11137L, -12973L), value180 =
>> c(-12471L,
>> -11799L, -12439L, -12385L, -11638L, -12430L, -11709L, -12373L,
>> -12570L, -12537L), value240 = c(-12838L, -13210L, -13089L, -11561L,
>> -13527L, -12430L, -11607L, -11426L, -13467L, -12973L), value300 =
>> c(-13357L,
>> -13845L, -13880L, -13317L, -13873L, -12814L, -13025L, -12805L,
>> -13739L, -11146L)), .Names = c("value0", "value60", "value120",
>> "value180", "value240", "value300"), class = "data.frame",
>> row.names = c("1",
>> "2", "3", "4", "5", "6", "7", "8", "9", "10"))
>>
>> my.finder(yourdata) #and what you want is in a nicely labeled data
>> frame
>>
>> #A potential problem is that it is not very efficient
>>
>> #Here is a test using a matrix of 100, 000 rows
>> #sampled from the same range as your data
>> #with the same number of columns
>>
>> data.test <- matrix(
>> sample(seq(min(yourdata), max(yourdata)), size = 500000, replace =
>> TRUE),
>> ncol = 5)
>>
>> system.time(my.finder(data.test))
>>
>> #On my system I get
>>
>>> system.time(my.finder(data.test))
>> user system elapsed
>> 2.89 0.00 2.89
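Josh flags the efficiency problem above; as an editor's aside, a vectorized alternative built on base R's `max.col()` avoids the per-row `apply()` entirely. The name `top_two` and its exact interface are hypothetical, not from the thread.

```r
## Editor's sketch: top-two extraction without apply(), via max.col().
## ties.method = "first" makes ties deterministic.
top_two <- function(m) {
  m <- as.matrix(m)
  rows <- seq_len(nrow(m))
  i1 <- max.col(m, ties.method = "first")   # column index of each row's max
  m2 <- m
  m2[cbind(rows, i1)] <- -Inf               # mask the max, then repeat
  i2 <- max.col(m2, ties.method = "first")
  data.frame(strongest           = m[cbind(rows, i1)],
             secondstrongest     = m[cbind(rows, i2)],
             strongestantenna    = colnames(m)[i1],
             secondstrongantenna = colnames(m)[i2],
             stringsAsFactors    = FALSE)
}
```

Because `max.col()` works on the whole matrix at once, this scales much better to the "couple million lines" mentioned below than a row-by-row function call.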
>>
>> Hope that helps,
>>
>> Josh
>>
>>
>>
>> On Fri, Jul 23, 2010 at 6:20 PM, <[225]mpward at illinois.edu> wrote:
>>> I have a data frame with a couple million lines and want to
>>> retrieve the largest and second largest values in each row, along
>>> with the label of the column these values are in. For example
>>>
>>> row 1
>>> strongest=-11072
>>> secondstrongest=-11707
>>> strongestantenna=value120
>>> secondstrongantenna=value60
>>>
>>> Below is the code I am using and a truncated data.frame.
>>> Retrieving the largest value was easy, but I have been getting
>>> errors every way I have tried to retrieve the second largest
>>> value. I have not even tried to retrieve the labels for the value
>>> yet.
>>>
>>> Any help would be appreciated
>>> Mike
>>>
>>>
>>>> data<-
>>>> data.frame(value0, value60, value120, value180, value240, value300)
>>>> data
>>> value0 value60 value120 value180 value240 value300
>>> 1 -13007 -11707 -11072 -12471 -12838 -13357
>>> 2 -12838 -13210 -11176 -11799 -13210 -13845
>>> 3 -12880 -11778 -11113 -12439 -13089 -13880
>>> 4 -12805 -11653 -11071 -12385 -11561 -13317
>>> 5 -12834 -13527 -11067 -11638 -13527 -13873
>>> 6 -11068 -11698 -12430 -12430 -12430 -12814
>>> 7 -12807 -14068 -11092 -11709 -11607 -13025
>>> 8 -12770 -11665 -11061 -12373 -11426 -12805
>>> 9 -12988 -11736 -11137 -12570 -13467 -13739
>>> 10 -11779 -12873 -12973 -12537 -12973 -11146
>>>> #largest value in the row
>>>> strongest<-apply(data, 1, max)
>>>>
>>>>
>>>> #second largest value in the row
>>>> n<-function(data)(1/(min(1/(data[1, ]-max(data[1, ]))))+
>>>> (max(data[1, ])))
>>>> secondstrongest<-apply(data, 1, n)
>>> Error in data[1, ] : incorrect number of dimensions
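The error arises because `apply()` passes each row to the supplied function as a plain, dimensionless vector, so `data[1, ]` inside it fails. A minimal editor's sketch of a working alternative (the sort-based approach is an assumption, not from this message; the two toy rows stand in for the truncated data):

```r
## apply() hands FUN each row as a plain vector; index it directly.
## Second-largest per row, by sorting each row in decreasing order:
data <- data.frame(value0 = c(-13007, -12838), value60 = c(-11707, -13210),
                   value120 = c(-11072, -11176))  # toy stand-in rows
secondstrongest <- apply(data, 1, function(row) sort(row, decreasing = TRUE)[2])
```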
>>>>
>>>
>>> ______________________________________________
>>> [226]R-help at r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide
[227]http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
>>>
>>
>>
>>
>> --
>> Joshua Wiley
>> Ph.D. Student, Health Psychology
>> University of California, Los Angeles
>> [228]http://www.joshuawiley.com/
>
> ______________________________________________
> [229]R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
[230]http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
------------------------------
Message: 32
Date: Sat, 24 Jul 2010 17:57:10 -0700
From: Joshua Wiley <[231]jwiley.psych at gmail.com>
To: [232]mpward at illinois.edu
Cc: [233]r-help at r-project.org
Subject: Re: [R] Trouble retrieving the second largest value from each
row of a data.frame
Message-ID:
<AANLkTikG3SgD+af50n2dsHMA3+[234]fa9gCQLuYBAaMS_P8w at mail.gmail.com>
Content-Type: text/plain; charset=UTF-8
On Sat, Jul 24, 2010 at 5:09 PM, David Winsemius
<[235]dwinsemius at comcast.net> wrote:
>
> On Jul 24, 2010, at 4:54 PM, <[236]mpward at illinois.edu> wrote:
>
>> THANKS, but I have one issue and one question.
>>
>> For some reason the "secondstrongest" value for row 3 and 6 are
>> incorrect (they are the strongest) the remaining 10 are correct??
>
> In my run of Wiley's code I instead get identical values for rows 2, 5, 6.
Yes, my apologies; I neglected a [-strongest] when extracting the
second highest value. I included a corrected form below; however,
Winsemius' code is cleaner, not to mention easier to generalize, so I
see no reason not to use that option. You might consider using a
different object name than 'diff' since it is also the name of a
function.
Josh
#######
my.finder <- function(mydata) {
my.fun <- function(data) {
strongest <- which.max(data)
secondstrongest <- which.max(data[-strongest])
strongestantenna <- names(data)[strongest]
secondstrongantenna <- names(data[-strongest])[secondstrongest]
value <- matrix(c(data[strongest], data[-strongest][secondstrongest],
strongestantenna, secondstrongantenna), ncol =4)
return(value)
}
dat <- apply(mydata, 1, my.fun)
dat <- t(dat)
dat <- as.data.frame(dat, stringsAsFactors = FALSE)
colnames(dat) <- c("strongest", "secondstrongest",
"strongestantenna", "secondstrongantenna")
dat[ , "strongest"] <- as.numeric(dat[ , "strongest"])
dat[ , "secondstrongest"] <- as.numeric(dat[ , "secondstrongest"])
return(dat)
}
> Holtman's and my solutions did not suffer from that defect, although
> mine suffered from my misreading of your request, thinking that you
> wanted the top 3. The fix is trivial.
>>
>> These data are being used to track radio-tagged birds, they are from
>> automated radio telemetry receivers. I will be applying the following
>> formula
>>
>> diff <- ((strongest- secondstrongest)/100)
>> bearingdiff <-30-(-0.0624*(diff**2))-(2.8346*diff)
>
> vals <- c("value0", "value60", "value120", "value180", "value240",
> "value300")
> value.str2 <- (match(yourdata$secondstrongestantenna, vals)-1)*60
> value.str1 <- (match(yourdata$strongestantenna, vals)-1)*60
> change.ind <- abs(match(yourdata, vals) - which(match(yourdata, vals) )
>
>>
>> A) Then the bearing diff is added to strongestantenna (value0 =
>> 0degrees) if the secondstrongestantenna is greater (e.g. value0 and
>> value60),
>
>> B) or if the secondstrongestantenna is smaller than the
>> strongestantenna, then the bearingdiff is subtracted from the
>> strongestantenna.
>
>>
>> C) The only exception is that if value0 (0degrees) is strongest and
>> value300(360degrees) is the secondstrongestantenna then the bearing is
>> 360-bearingdiff.
>
>
>> D) Also the strongestantenna and secondstrongestantenna have to be
>> next to each other (e.g. value0 with value60, value240 with value300,
>> value0 with value300) or the results should be NA.
>
> After setting finalbearing with A, B, and C then:
> yourdata$finalbearing <- with(yourdata, ifelse(
>     change.ind < 5 & change.ind > 1,
>     NA, finalbearing))
>
>> I have been trying to use a series of if, else statements to produce
>> these bearings, but all I am producing is errors. Any suggestion
>> would be appreciated.
>
>
>>
>> Again THANKS for your efforts.
>>
>> Mike
>>
>> ---- Original message ----
>>>
>>> Date: Fri, 23 Jul 2010 23:01:56 -0700
>>> From: Joshua Wiley <[237]jwiley.psych at gmail.com>
>>> Subject: Re: [R] Trouble retrieving the second largest value from each
>>> row of a data.frame
>>> To: [238]mpward at illinois.edu
>>> Cc: [239]r-help at r-project.org
>>>
>>> Hi,
>>>
>>> Here is a little function that will do what you want and return a nice
>>> output:
>>>
>>> #Function To calculate top two values and return
>>> my.finder <- function(mydata) {
>>> my.fun <- function(data) {
>>>   strongest <- which.max(data)
>>>   secondstrongest <- which.max(data[-strongest])
>>>   strongestantenna <- names(data)[strongest]
>>>   secondstrongantenna <- names(data[-strongest])[secondstrongest]
>>>   value <- matrix(c(data[strongest], data[secondstrongest],
>>>                     strongestantenna, secondstrongantenna), ncol = 4)
>>>   return(value)
>>> }
>>> dat <- apply(mydata, 1, my.fun)
>>> dat <- t(dat)
>>> dat <- as.data.frame(dat, stringsAsFactors = FALSE)
>>> colnames(dat) <- c("strongest", "secondstrongest",
>>>                    "strongestantenna", "secondstrongantenna")
>>> dat[ , "strongest"] <- as.numeric(dat[ , "strongest"])
>>> dat[ , "secondstrongest"] <- as.numeric(dat[ , "secondstrongest"])
>>> return(dat)
>>> }
>>>
>>>
>>> #Using your example data:
>>>
>>> yourdata <- structure(list(value0 = c(-13007L, -12838L, -12880L,
>>> -12805L,
>>> -12834L, -11068L, -12807L, -12770L, -12988L, -11779L), value60 =
>>> c(-11707L,
>>> -13210L, -11778L, -11653L, -13527L, -11698L, -14068L, -11665L,
>>> -11736L, -12873L), value120 = c(-11072L, -11176L, -11113L, -11071L,
>>> -11067L, -12430L, -11092L, -11061L, -11137L, -12973L), value180 =
>>> c(-12471L,
>>> -11799L, -12439L, -12385L, -11638L, -12430L, -11709L, -12373L,
>>> -12570L, -12537L), value240 = c(-12838L, -13210L, -13089L, -11561L,
>>> -13527L, -12430L, -11607L, -11426L, -13467L, -12973L), value300 =
>>> c(-13357L,
>>> -13845L, -13880L, -13317L, -13873L, -12814L, -13025L, -12805L,
>>> -13739L, -11146L)), .Names = c("value0", "value60", "value120",
>>> "value180", "value240", "value300"), class = "data.frame",
>>> row.names = c("1",
>>> "2", "3", "4", "5", "6", "7", "8", "9", "10"))
>>>
>>> my.finder(yourdata) #and what you want is in a nicely labeled data
>>> frame
>>>
>>> #A potential problem is that it is not very efficient
>>>
>>> #Here is a test using a matrix of 100, 000 rows
>>> #sampled from the same range as your data
>>> #with the same number of columns
>>>
>>> data.test <- matrix(
>>> sample(seq(min(yourdata), max(yourdata)), size = 500000, replace =
>>> TRUE),
>>> ncol = 5)
>>>
>>> system.time(my.finder(data.test))
>>>
>>> #On my system I get
>>>
>>>> system.time(my.finder(data.test))
>>>
>>>   user  system elapsed
>>>   2.89    0.00    2.89
>>>
>>> Hope that helps,
>>>
>>> Josh
>>>
>>>
>>>
>>> On Fri, Jul 23, 2010 at 6:20 PM, <[240]mpward at illinois.edu> wrote:
>>>>
>>>> I have a data frame with a couple million lines and want to
>>>> retrieve the largest and second largest values in each row, along
>>>> with the label of the column these values are in. For example
>>>>
>>>> row 1
>>>> strongest=-11072
>>>> secondstrongest=-11707
>>>> strongestantenna=value120
>>>> secondstrongantenna=value60
>>>>
>>>> Below is the code I am using and a truncated data.frame.
>>>> Retrieving the largest value was easy, but I have been getting
>>>> errors every way I have tried to retrieve the second largest
>>>> value. I have not even tried to retrieve the labels for the value
>>>> yet.
>>>>
>>>> Any help would be appreciated
>>>> Mike
>>>>
>>>>
>>>>> data<-data.frame(value0, value60, value120, value180, value240,
>>>>> value300)
>>>>> data
>>>>
>>>>    value0 value60 value120 value180 value240 value300
>>>> 1  -13007  -11707   -11072   -12471   -12838   -13357
>>>> 2  -12838  -13210   -11176   -11799   -13210   -13845
>>>> 3  -12880  -11778   -11113   -12439   -13089   -13880
>>>> 4  -12805  -11653   -11071   -12385   -11561   -13317
>>>> 5  -12834  -13527   -11067   -11638   -13527   -13873
>>>> 6  -11068  -11698   -12430   -12430   -12430   -12814
>>>> 7  -12807  -14068   -11092   -11709   -11607   -13025
>>>> 8  -12770  -11665   -11061   -12373   -11426   -12805
>>>> 9  -12988  -11736   -11137   -12570   -13467   -13739
>>>> 10 -11779  -12873   -12973   -12537   -12973   -11146
>>>>>
>>>>> #largest value in the row
>>>>> strongest<-apply(data, 1, max)
>>>>>
>>>>>
>>>>> #second largest value in the row
>>>>> n<-function(data)(1/(min(1/(data[1, ]-max(data[1, ]))))+
>>>>> (max(data[1, ])))
>>>>> secondstrongest<-apply(data, 1, n)
>>>>
>>>> Error in data[1, ] : incorrect number of dimensions
>>>>>
>>>>
>>>> ______________________________________________
>>>> [241]R-help at r-project.org mailing list
>>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>>> PLEASE do read the posting guide
>>>> [242]http://www.R-project.org/posting-guide.html
>>>> and provide commented, minimal, self-contained, reproducible code.
>>>>
>>>
>>>
>>>
>>> --
>>> Joshua Wiley
>>> Ph.D. Student, Health Psychology
>>> University of California, Los Angeles
>>> [243]http://www.joshuawiley.com/
>>
>> ______________________________________________
>> [244]R-help at r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide
>> [245]http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
>
>
--
Joshua Wiley
Ph.D. Student, Health Psychology
University of California, Los Angeles
[246]http://www.joshuawiley.com/
------------------------------
Message: 33
Date: Sat, 24 Jul 2010 21:27:17 -0400
From: David Winsemius <[247]dwinsemius at comcast.net>
To: "[248]r-help at r-project.org list" <[249]r-help at r-project.org>
Cc: [250]mpward at illinois.edu
Subject: Re: [R] Trouble retrieving the second largest value from each
row of a data.frame
Message-ID: <[251]2B19FDC3-4358-4731-87C7-89399E5DD75E at comcast.net>
Content-Type: text/plain; charset=US-ASCII; format=flowed; delsp=yes
On Jul 24, 2010, at 8:09 PM, David Winsemius wrote:
>
> On Jul 24, 2010, at 4:54 PM, <[252]mpward at illinois.edu> wrote:
>
>> THANKS, but I have one issue and one question.
>>
>> For some reason the "secondstrongest" value for row 3 and 6 are
>> incorrect (they are the strongest) the remaining 10 are correct??
>
> In my run of Wiley's code I instead get identical values for rows
> 2, 5, 6. Holtman's and my solutions did not suffer from that defect,
> although mine suffered from my misreading of your request, thinking
> that you wanted the top 3. The fix is trivial
>>
>> These data are being used to track radio-tagged birds, they are
>> from automated radio telemetry receivers. I will be applying the
>> following formula
>>
>> diff <- ((strongest- secondstrongest)/100)
>> bearingdiff <-30-(-0.0624*(diff**2))-(2.8346*diff)
>
> vals <- c("value0", "value60", "value120", "value180", "value240",
> "value300")
> value.str2 <- (match(yourdata$secondstrongestantenna, vals)-1)*60
> value.str1 <- (match(yourdata$strongestantenna, vals)-1)*60
> change.ind <- abs(match(yourdata, vals) - which(match(yourdata,
> vals) )
OOOPs should have been
change.ind <- abs(match(yourdata, vals) - match(yourdata, vals) )
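Even this corrected line subtracts a quantity from itself (so it is always zero), and `match()` applied to a whole data frame does not look up the antenna columns. A hedged editor's reading of what rule D actually needs, using hypothetical toy rows in place of `my.finder()`'s output:

```r
## Column-index gap between the two antennas, as rule D needs it.
vals <- c("value0", "value60", "value120", "value180", "value240", "value300")
yourdata <- data.frame(strongestantenna    = c("value120", "value120", "value0"),
                       secondstrongantenna = c("value60",  "value240", "value300"),
                       stringsAsFactors = FALSE)  # toy stand-in rows
change.ind <- abs(match(yourdata$strongestantenna, vals) -
                  match(yourdata$secondstrongantenna, vals))
## 1 = adjacent, 5 = the 0/300 wrap; anything else should become NA
```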
>
>>
>> A) Then the bearing diff is added to strongestantenna (value0 =
>> 0degrees) if the secondstrongestantenna is greater (e.g. value0 and
>> value60),
>
>> B) or if the secondstrongestantenna is smaller than the
>> strongestantenna,
>> then the bearingdiff is subtracted from the strongestantenna.
yourdata$finalbearing <- with(yourdata, ifelse (value.str2>value.str1,
bearingdiff+value.str1, value.str1-bearingdiff) )
>
>>
>> C) The only exception is that if value0 (0degrees) is strongest and
>> value300(360degrees) is the secondstrongestantenna then the bearing
>> is 360-bearingdiff.
>
yourdata$finalbearing <- with(yourdata, ifelse (strongestantenna ==
"value0" & secondstrongantenna == "value300", 360- bearingdiff,
finalbearing) );
>> D) Also the strongestantenna and secondstrongestantenna have to be
>> next to each other (e.g. value0 with value60, value240 with
>> value300, value0 with value300) or the results should be NA.
>
> After setting finalbearing with A, B, and C then:
> yourdata$finalbearing <- with(yourdata, ifelse (
> change.ind <5 & change.ind > 1 ,
> NA, finalbearing) )
> yourdata
   strongest secondstrongest strongestantenna secondstrongantenna finalbearing
1     -11072          -11707         value120             value60    -11086.52
2     -11176          -11799         value120            value180    -11190.76
3     -11113          -11778         value120             value60    -11126.91
4     -11071          -11561         value120            value240           NA
5     -11067          -11638         value120            value180    -11082.85
6     -11068          -11698           value0             value60    -11082.62
7     -11092          -11607         value120            value240           NA
8     -11061          -11426         value120            value240           NA
9     -11137          -11736         value120             value60    -11152.26
10    -11146          -11779         value300              value0    -11160.56
>
>> I have been trying to use a series of if, else statements to produce
>> these bearings,
ifelse is the correct construct for processing vectors
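A minimal illustration of David's point: `if` inspects only the first element of a vector, whereas `ifelse` evaluates its condition elementwise.

```r
## if() vs ifelse() on a vector: only ifelse() acts elementwise.
x <- c(-2, 3, -5)
ifelse(x > 0, "pos", "neg")  # "neg" "pos" "neg"
```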
--
David.
>> but all I am producing is errors. Any suggestion would be
>> appreciated.
>
>
>>
>> Again THANKS for your efforts.
>>
>> Mike
>>
>> ---- Original message ----
>>> Date: Fri, 23 Jul 2010 23:01:56 -0700
>>> From: Joshua Wiley <[253]jwiley.psych at gmail.com>
>>> Subject: Re: [R] Trouble retrieving the second largest value from
>>> each row of a data.frame
>>> To: [254]mpward at illinois.edu
>>> Cc: [255]r-help at r-project.org
>>>
>>> Hi,
>>>
>>> Here is a little function that will do what you want and return a
>>> nice output:
>>>
>>> #Function To calculate top two values and return
>>> my.finder <- function(mydata) {
>>> my.fun <- function(data) {
>>> strongest <- which.max(data)
>>> secondstrongest <- which.max(data[-strongest])
>>> strongestantenna <- names(data)[strongest]
>>> secondstrongantenna <- names(data[-strongest])[secondstrongest]
>>> value <- matrix(c(data[strongest], data[secondstrongest],
>>> strongestantenna, secondstrongantenna), ncol =4)
>>> return(value)
>>> }
>>> dat <- apply(mydata, 1, my.fun)
>>> dat <- t(dat)
>>> dat <- as.data.frame(dat, stringsAsFactors = FALSE)
>>> colnames(dat) <- c("strongest", "secondstrongest",
>>> "strongestantenna", "secondstrongantenna")
>>> dat[ , "strongest"] <- as.numeric(dat[ , "strongest"])
>>> dat[ , "secondstrongest"] <- as.numeric(dat[ , "secondstrongest"])
>>> return(dat)
>>> }
>>>
>>>
>>> #Using your example data:
>>>
>>> yourdata <- structure(list(value0 = c(-13007L, -12838L, -12880L,
>>> -12805L,
>>> -12834L, -11068L, -12807L, -12770L, -12988L, -11779L), value60 =
>>> c(-11707L,
>>> -13210L, -11778L, -11653L, -13527L, -11698L, -14068L, -11665L,
>>> -11736L, -12873L), value120 = c(-11072L, -11176L, -11113L, -11071L,
>>> -11067L, -12430L, -11092L, -11061L, -11137L, -12973L), value180 =
>>> c(-12471L,
>>> -11799L, -12439L, -12385L, -11638L, -12430L, -11709L, -12373L,
>>> -12570L, -12537L), value240 = c(-12838L, -13210L, -13089L, -11561L,
>>> -13527L, -12430L, -11607L, -11426L, -13467L, -12973L), value300 =
>>> c(-13357L,
>>> -13845L, -13880L, -13317L, -13873L, -12814L, -13025L, -12805L,
>>> -13739L, -11146L)), .Names = c("value0", "value60", "value120",
>>> "value180", "value240", "value300"), class = "data.frame",
>>> row.names = c("1",
>>> "2", "3", "4", "5", "6", "7", "8", "9", "10"))
>>>
>>> my.finder(yourdata) #and what you want is in a nicely labeled data
>>> frame
>>>
>>> #A potential problem is that it is not very efficient
>>>
>>> #Here is a test using a matrix of 100, 000 rows
>>> #sampled from the same range as your data
>>> #with the same number of columns
>>>
>>> data.test <- matrix(
>>> sample(seq(min(yourdata), max(yourdata)), size = 500000, replace =
>>> TRUE),
>>> ncol = 5)
>>>
>>> system.time(my.finder(data.test))
>>>
>>> #On my system I get
>>>
>>>> system.time(my.finder(data.test))
>>> user system elapsed
>>> 2.89 0.00 2.89
>>>
>>> Hope that helps,
>>>
>>> Josh
>>>
>>>
>>>
>>> On Fri, Jul 23, 2010 at 6:20 PM, <[256]mpward at illinois.edu> wrote:
>>>> I have a data frame with a couple million lines and want to
>>>> retrieve the largest and second largest values in each row, along
>>>> with the label of the column these values are in. For example
>>>>
>>>> row 1
>>>> strongest=-11072
>>>> secondstrongest=-11707
>>>> strongestantenna=value120
>>>> secondstrongantenna=value60
>>>>
>>>> Below is the code I am using and a truncated data.frame.
>>>> Retrieving the largest value was easy, but I have been getting
>>>> errors every way I have tried to retrieve the second largest
>>>> value. I have not even tried to retrieve the labels for the
>>>> value yet.
>>>>
>>>> Any help would be appreciated
>>>> Mike
>>>>
>>>>
>>>>> data<-
>>>>> data.frame(value0, value60, value120, value180, value240, value300)
>>>>> data
>>>> value0 value60 value120 value180 value240 value300
>>>> 1 -13007 -11707 -11072 -12471 -12838 -13357
>>>> 2 -12838 -13210 -11176 -11799 -13210 -13845
>>>> 3 -12880 -11778 -11113 -12439 -13089 -13880
>>>> 4 -12805 -11653 -11071 -12385 -11561 -13317
>>>> 5 -12834 -13527 -11067 -11638 -13527 -13873
>>>> 6 -11068 -11698 -12430 -12430 -12430 -12814
>>>> 7 -12807 -14068 -11092 -11709 -11607 -13025
>>>> 8 -12770 -11665 -11061 -12373 -11426 -12805
>>>> 9 -12988 -11736 -11137 -12570 -13467 -13739
>>>> 10 -11779 -12873 -12973 -12537 -12973 -11146
>>>>> #largest value in the row
>>>>> strongest<-apply(data, 1, max)
>>>>>
>>>>>
>>>>> #second largest value in the row
>>>>> n<-function(data)(1/(min(1/(data[1, ]-max(data[1, ]))))+
>>>>> (max(data[1, ])))
>>>>> secondstrongest<-apply(data, 1, n)
>>>> Error in data[1, ] : incorrect number of dimensions
>>>>>
>>>>
>>>> ______________________________________________
>>>> [257]R-help at r-project.org mailing list
>>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>>> PLEASE do read the posting guide
[258]http://www.R-project.org/posting-guide.html
>>>> and provide commented, minimal, self-contained, reproducible code.
>>>>
>>>
>>>
>>>
>>> --
>>> Joshua Wiley
>>> Ph.D. Student, Health Psychology
>>> University of California, Los Angeles
>>> [259]http://www.joshuawiley.com/
>>
>> ______________________________________________
>> [260]R-help at r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide
[261]http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
>
> ______________________________________________
> [262]R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
[263]http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
------------------------------
Message: 34
Date: Sat, 24 Jul 2010 21:48:12 -0400
From: David Winsemius <[264]dwinsemius at comcast.net>
To: David Winsemius <[265]dwinsemius at comcast.net>
Cc: "[266]r-help at r-project.org list" <[267]r-help at r-project.org>,
[268]mpward at illinois.edu
Subject: Re: [R] Trouble retrieving the second largest value from each
row of a data.frame
Message-ID: <[269]9B25E777-650F-4419-92F0-9319A2B753B4 at comcast.net>
Content-Type: text/plain; charset=US-ASCII; format=flowed; delsp=yes
On Jul 24, 2010, at 9:27 PM, David Winsemius wrote:
>
> On Jul 24, 2010, at 8:09 PM, David Winsemius wrote:
>
>>
>> On Jul 24, 2010, at 4:54 PM, <[270]mpward at illinois.edu> wrote:
>>
>>> THANKS, but I have one issue and one question.
>>>
>>> For some reason the "secondstrongest" value for row 3 and 6 are
>>> incorrect (they are the strongest) the remaining 10 are correct??
>>
>> In my run of Wiley's code I instead get identical values for rows
>> 2, 5, 6. Holtman's and my solutions did not suffer from that defect,
>> although mine suffered from my misreading of your request, thinking
>> that you wanted the top 3. The fix is trivial
>>>
>>> These data are being used to track radio-tagged birds, they are
>>> from automated radio telemetry receivers. I will be applying the
>>> following formula
>>>
>>> diff <- ((strongest- secondstrongest)/100)
>>> bearingdiff <-30-(-0.0624*(diff**2))-(2.8346*diff)
>>
>> vals <- c("value0", "value60", "value120", "value180", "value240",
>> "value300")
>> value.str2 <- (match(yourdata$secondstrongestantenna, vals)-1)*60
Had a misspelling ... rather:
match(yourdata$secondstrongantenna, vals)
>> value.str1 <- (match(yourdata$strongestantenna, vals)-1)*60
>> change.ind <- abs(match(yourdata, vals) - which(match(yourdata,
>> vals) )
>
> OOOPs should have been
>
> change.ind <- abs(match(yourdata, vals) - match(yourdata, vals) )
>
>
>>
>>>
>>> A) Then the bearing diff is added to strongestantenna (value0 =
>>> 0degrees) if the secondstrongestantenna is greater (e.g. value0
>>> and value60),
>>
>>> B) or if the secondstrongestantenna is smaller than the
>>> strongestantenna,
>>> then the bearingdiff is subtracted from the strongestantenna.
>
> yourdata$finalbearing <- with(yourdata, ifelse(value.str2 > value.str1,
>     bearingdiff + value.str1, value.str1 - bearingdiff))
>
>
>>
>>>
>>> C) The only exception is that if value0 (0degrees) is strongest
>>> and value300(360degrees) is the secondstrongestantenna then the
>>> bearing is 360-bearingdiff.
>>
>
> yourdata$finalbearing <- with(yourdata, ifelse (strongestantenna ==
> "value0" & secondstrongantenna == "value300", 360- bearingdiff,
> finalbearing) );
>
>
>>> D) Also the strongestantenna and secondstrongestantenna have to be
>>> next to each other (e.g. value0 with value60, value240 with
>>> value300, value0 with value300) or the results should be NA.
>>
>> After setting finalbearing with A, B, and C then:
>> yourdata$finalbearing <- with(yourdata, ifelse (
>> change.ind <5 & change.ind > 1 ,
>> NA, finalbearing) )
>
Better result with proper creation of value.str2:
yourdata
   strongest secondstrongest strongestantenna secondstrongantenna finalbearing
1     -11072          -11707         value120             value60    105.48359
2     -11176          -11799         value120            value180    134.76237
3     -11113          -11778         value120             value60    106.09061
4     -11071          -11561         value120            value240           NA
5     -11067          -11638         value120            value180    135.84893
6     -11068          -11698           value0             value60     14.61868
7     -11092          -11607         value120            value240           NA
8     -11061          -11426         value120            value240           NA
9     -11137          -11736         value120             value60    104.74034
10    -11146          -11779         value300              value0    285.44272
>>
>>> I have been trying to use a series of if, else statements to
>>> produce these bearings,
>
> ifelse is the correct construct for processing vectors
>
> --
> David.
>>> but all I am producing is errors. Any suggestion would be
>>> appreciated.
>>
>>
>>>
>>> Again THANKS for your efforts.
>>>
>>> Mike
>>>
>>> ---- Original message ----
>>>> Date: Fri, 23 Jul 2010 23:01:56 -0700
>>>> From: Joshua Wiley <[271]jwiley.psych at gmail.com>
>>>> Subject: Re: [R] Trouble retrieving the second largest value from
>>>> each row of a data.frame
>>>> To: [272]mpward at illinois.edu
>>>> Cc: [273]r-help at r-project.org
>>>>
>>>> Hi,
>>>>
>>>> Here is a little function that will do what you want and return a
>>>> nice output:
>>>>
>>>> #Function To calculate top two values and return
>>>> my.finder <- function(mydata) {
>>>> my.fun <- function(data) {
>>>> strongest <- which.max(data)
>>>> secondstrongest <- which.max(data[-strongest])
>>>> strongestantenna <- names(data)[strongest]
>>>> secondstrongantenna <- names(data[-strongest])[secondstrongest]
>>>> value <- matrix(c(data[strongest], data[secondstrongest],
>>>> strongestantenna, secondstrongantenna), ncol =4)
>>>> return(value)
>>>> }
>>>> dat <- apply(mydata, 1, my.fun)
>>>> dat <- t(dat)
>>>> dat <- as.data.frame(dat, stringsAsFactors = FALSE)
>>>> colnames(dat) <- c("strongest", "secondstrongest",
>>>> "strongestantenna", "secondstrongantenna")
>>>> dat[ , "strongest"] <- as.numeric(dat[ , "strongest"])
>>>> dat[ , "secondstrongest"] <- as.numeric(dat[ , "secondstrongest"])
>>>> return(dat)
>>>> }
>>>>
>>>>
>>>> #Using your example data:
>>>>
>>>> yourdata <- structure(list(value0 = c(-13007L, -12838L, -12880L,
>>>> -12805L,
>>>> -12834L, -11068L, -12807L, -12770L, -12988L, -11779L), value60 =
>>>> c(-11707L,
>>>> -13210L, -11778L, -11653L, -13527L, -11698L, -14068L, -11665L,
>>>> -11736L, -12873L), value120 = c(-11072L, -11176L, -11113L, -11071L,
>>>> -11067L, -12430L, -11092L, -11061L, -11137L, -12973L), value180 =
>>>> c(-12471L,
>>>> -11799L, -12439L, -12385L, -11638L, -12430L, -11709L, -12373L,
>>>> -12570L, -12537L), value240 = c(-12838L, -13210L, -13089L, -11561L,
>>>> -13527L, -12430L, -11607L, -11426L, -13467L, -12973L), value300 =
>>>> c(-13357L,
>>>> -13845L, -13880L, -13317L, -13873L, -12814L, -13025L, -12805L,
>>>> -13739L, -11146L)), .Names = c("value0", "value60", "value120",
>>>> "value180", "value240", "value300"), class = "data.frame",
>>>> row.names = c("1",
>>>> "2", "3", "4", "5", "6", "7", "8", "9", "10"))
>>>>
>>>> my.finder(yourdata) #and what you want is in a nicely labeled
>>>> data frame
>>>>
>>>> #A potential problem is that it is not very efficient
>>>>
>>>> #Here is a test using a matrix of 100, 000 rows
>>>> #sampled from the same range as your data
>>>> #with the same number of columns
>>>>
>>>> data.test <- matrix(
>>>> sample(seq(min(yourdata), max(yourdata)), size = 500000, replace =
>>>> TRUE),
>>>> ncol = 5)
>>>>
>>>> system.time(my.finder(data.test))
>>>>
>>>> #On my system I get
>>>>
>>>>> system.time(my.finder(data.test))
>>>> user system elapsed
>>>> 2.89 0.00 2.89
>>>>
>>>> Hope that helps,
>>>>
>>>> Josh
>>>>
>>>>
>>>>
>>>> On Fri, Jul 23, 2010 at 6:20 PM, <[274]mpward at illinois.edu> wrote:
>>>>> I have a data frame with a couple million lines and want to
>>>>> retrieve the largest and second largest values in each row,
>>>>> along with the label of the column these values are in. For
>>>>> example
>>>>>
>>>>> row 1
>>>>> strongest=-11072
>>>>> secondstrongest=-11707
>>>>> strongestantenna=value120
>>>>> secondstrongantenna=value60
>>>>>
>>>>> Below is the code I am using and a truncated data.frame.
>>>>> Retrieving the largest value was easy, but I have been getting
>>>>> errors every way I have tried to retrieve the second largest
>>>>> value. I have not even tried to retrieve the labels for the
>>>>> value yet.
>>>>>
>>>>> Any help would be appreciated
>>>>> Mike
>>>>>
>>>>>
>>>>>> data<-
>>>>>> data.frame(value0, value60, value120, value180, value240, value300)
>>>>>> data
>>>>> value0 value60 value120 value180 value240 value300
>>>>> 1 -13007 -11707 -11072 -12471 -12838 -13357
>>>>> 2 -12838 -13210 -11176 -11799 -13210 -13845
>>>>> 3 -12880 -11778 -11113 -12439 -13089 -13880
>>>>> 4 -12805 -11653 -11071 -12385 -11561 -13317
>>>>> 5 -12834 -13527 -11067 -11638 -13527 -13873
>>>>> 6 -11068 -11698 -12430 -12430 -12430 -12814
>>>>> 7 -12807 -14068 -11092 -11709 -11607 -13025
>>>>> 8 -12770 -11665 -11061 -12373 -11426 -12805
>>>>> 9 -12988 -11736 -11137 -12570 -13467 -13739
>>>>> 10 -11779 -12873 -12973 -12537 -12973 -11146
>>>>>> #largest value in the row
>>>>>> strongest<-apply(data, 1, max)
>>>>>>
>>>>>>
>>>>>> #second largest value in the row
>>>>>> n<-function(data)(1/(min(1/(data[1, ]-max(data[1, ]))))+
>>>>>> (max(data[1, ])))
>>>>>> secondstrongest<-apply(data, 1, n)
>>>>> Error in data[1, ] : incorrect number of dimensions
>>>>>>
>>>>>
>>>>> ______________________________________________
>>>>> [275]R-help at r-project.org mailing list
>>>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>>>> PLEASE do read the posting guide
[276]http://www.R-project.org/posting-guide.html
>>>>> and provide commented, minimal, self-contained, reproducible code.
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Joshua Wiley
>>>> Ph.D. Student, Health Psychology
>>>> University of California, Los Angeles
>>>> [277]http://www.joshuawiley.com/
>>>
>>> ______________________________________________
>>> [278]R-help at r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide
[279]http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
>>
>
------------------------------
Message: 35
Date: Sat, 24 Jul 2010 22:51:50 -0400
From: paaventhan jeyaganth <paaveenthan at hotmail.com>
To: r <r-help at r-project.org>
Subject: [R] c-statistics 95% CI for cox regression model
Message-ID: <BLU140-W10DC8DB498004130C2840DB4A50 at phx.gbl>
Content-Type: text/plain
Dear all,
How can I calculate the C-statistic and its 95% confidence interval
for a Cox regression model?
Thanks very much for any help.
Paaveenthan
_________________________________________________________________
[[elided Hotmail spam]]
[[alternative HTML version deleted]]
------------------------------
Message: 36
Date: Sat, 24 Jul 2010 22:48:59 -0500
From: Dirk Eddelbuettel <edd at debian.org>
To: Frank E Harrell Jr <f.harrell at vanderbilt.edu>
Cc: "r-help at r-project.org" <r-help at r-project.org>
Subject: Re: [R] UseR! 2010 - my impressions
Message-ID: <20100725034859.GA11668 at eddelbuettel.com>
Content-Type: text/plain; charset=us-ascii
On Sat, Jul 24, 2010 at 08:55:01AM -0500, Frank E Harrell Jr wrote:
> On 07/23/2010 06:50 PM, Ravi Varadhan wrote:
> I want to echo what Ravi said. The talks were terrific (thanks to
> the program committee and the speakers) and Kate Mullen and her team
> did an extraordinary job in putting the conference together and
> running it. I am proud to have been a part of it. Thank you all!
Not much to add to this, so I just leave it at "Yup!".
Thanks for useR! 2010. A job well done, and then some.
--
Dirk Eddelbuettel | edd at debian.org | http://dirk.eddelbuettel.com
------------------------------
Message: 37
Date: Sat, 24 Jul 2010 23:19:54 -0500
From: Frank E Harrell Jr <f.harrell at Vanderbilt.Edu>
To: paaventhan jeyaganth <paaveenthan at hotmail.com>
Cc: r <r-help at r-project.org>
Subject: Re: [R] c-statistics 95% CI for cox regression model
Message-ID: <4C4BBB6A.9090400 at vanderbilt.edu>
Content-Type: text/plain; charset="ISO-8859-1"; format=flowed
On 07/24/2010 09:51 PM, paaventhan jeyaganth wrote:
>
> Dear all,
> how can i do the calculate the C-statistics
> 95% confidences interval for the cox regression model.
> Thanks very much for your any help.
> Paaveenthan
install.packages('Hmisc')
require(Hmisc)
?rcorr.cens (there is an example at the bottom)
Frank
--
Frank E Harrell Jr Professor and Chairman School of Medicine
Department of Biostatistics Vanderbilt University
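[A sketch of the rcorr.cens route suggested above, using the bundled lung data and an illustrative model rather than the poster's data: rcorr.cens() reports the C index along with Dxy and the S.D. of Dxy; since C = (Dxy + 1)/2, the standard error of C is S.D./2, giving an approximate normal-theory 95% interval.]

```r
library(survival)
library(Hmisc)

# Fit an illustrative Cox model on the bundled lung data.
fit <- coxph(Surv(time, status) ~ age + sex, data = lung)

# rcorr.cens() expects a predictor where larger values mean longer
# survival, so negate the linear predictor (a risk score).
w  <- rcorr.cens(-predict(fit), Surv(lung$time, lung$status))
C  <- unname(w["C Index"])
se <- unname(w["S.D."]) / 2   # reported S.D. is for Dxy; C = (Dxy + 1)/2
c(C = C, lower = C - 1.96 * se, upper = C + 1.96 * se)
```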
------------------------------
Message: 38
Date: Sun, 25 Jul 2010 07:57:35 +0200
From: Michael Haenlein <haenlein at escpeurope.eu>
To: r-help at r-project.org
Subject: [R] Equivalent to go-to statement
Message-ID:
<AANLkTimX1jOLHX6AkfzDqQEJR4LK5G_-yFfDhZK_U5_i at mail.gmail.com>
Content-Type: text/plain
Dear all,
I'm working with code that consists of two parts: In Part 1 I generate
a random graph using the igraph library (which represents the
relationships between different nodes) and a vector (which represents a
certain characteristic for each node):
library(igraph)
g <- watts.strogatz.game(1, 100, 5, 0.05)
z <- rlnorm(100, 0, 1)
In Part 2 I iteratively change the elements of z in order to reach a
certain value of a certain target variable. I do this using a while
statement:
while (target_variable < threshold) {## adapt z}
The problem is that in some rare cases this iterative procedure can
take very long (a couple of million iterations), depending on the
specific structure of the graph generated in Part 1. I would therefore
like to change Part 2 of my code so that once a certain threshold
number of iterations has been reached, the iterative process in Part 2
stops and goes back to Part 1 to generate a new graph structure. So my
idea is as follows:
- Run Part 1 and generate g and z
- Run Part 2 and iteratively modify z to maximize the target variable
- If Part 2 completes in fewer than X steps, go to Part 3
- If Part 2 takes more than X steps, go back to Part 1 and start again
I think that R does not have a function like "go-to" or "go-back".
Does anybody know of a convenient way of doing this?
Thanks very much for your help,
Michael
[[alternative HTML version deleted]]
------------------------------
Message: 39
Date: Sat, 24 Jul 2010 23:24:40 -0700 (PDT)
From: Vipul Agarwal <iitkvipul at gmail.com>
To: r-help at r-project.org
Subject: [R] Outlier Problem in Survreg Function
Message-ID: <1280039080326-2301422.post at n4.nabble.com>
Content-Type: text/plain; charset=us-ascii
Hi Everyone,
I have recently started using R and am working on survival analysis
using the function survreg.
I am facing a strange problem. One of the covariates in my analysis has
outliers, because of which survreg is giving incorrect results.
However, when I remove the outliers or scale down the values of the
covariate, it gives correct results. Below are the distribution of the
variable and the results:
Min. 1st Qu. Median Mean 3rd Qu. Max.
0 30000 54500 95450 123000 1650000
Survreg Results
survreg(formula = Surv(TIME_TO_FAILURE, CENSOR_DEFAULT) ~ ADVANCE,
data = data)
Coefficients:
(Intercept) ADVANCE
0.000000 -6.385336
Scale= 0.9785933
Loglik(model)= -40227366 Loglik(intercept only)= -914141
Chisq= -78626451 on 1 degrees of freedom, p= 1
n=198099 (885 observations deleted due to missingness)
Survreg Results after scaling down the variable by 10
survreg(formula = Surv(TIME_TO_FAILURE, CENSOR_DEFAULT) ~ ADVANCE_SCALED,
data = data)
Coefficients:
(Intercept) ADVANCE_SCALED
4.132962e+00 -2.181577e-05
Scale= 0.9428758
Loglik(model)= -909139.4 Loglik(intercept only)= -914141
Chisq= 10003.19 on 1 degrees of freedom, p= 0
n=198099 (885 observations deleted due to missingness)
Survreg Results after removing the outliers (5% of the obs)
data <- subset(data, data$ADVANCE <= 200000)
> survreg(Surv(TIME_TO_FAILURE, CENSOR_DEFAULT) ~ ADVANCE , data = data )
Call:
survreg(formula = Surv(TIME_TO_FAILURE, CENSOR_DEFAULT) ~ ADVANCE,
data = data)
Coefficients:
(Intercept) ADVANCE
4.224298e+00 -3.727171e-06
Scale= 0.9601186
Loglik(model)= -822521.9 Loglik(intercept only)= -825137.1
Chisq= 5230.49 on 1 degrees of freedom, p= 0
n=177332 (444 observations deleted due to missingness)
Please let me know if someone else has faced the same problem and what
the way around it is. Should I scale down the variable or remove the
outliers?
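[A sketch of the rescaling option, using the bundled lung data as an illustrative stand-in for the poster's data: dividing a large-magnitude covariate by a constant only rescales its coefficient and usually fixes convergence problems like the degenerate first fit above, without discarding observations the way outlier removal does.]

```r
library(survival)

# meal.cal runs into the thousands; fit it in units of 1000 kcal so the
# optimizer works with coefficients of moderate magnitude.
lung$meal.scaled <- lung$meal.cal / 1000
fit <- survreg(Surv(time, status) ~ meal.scaled, data = lung)
summary(fit)
```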
--
View this message in context:
http://r.789695.n4.nabble.com/Outlier-Problem-in-Survreg-Function-tp2301422p2301422.html
Sent from the R help mailing list archive at Nabble.com.
------------------------------
Message: 40
Date: Sun, 25 Jul 2010 02:43:24 -0400
From: Gabor Grothendieck <ggrothendieck at gmail.com>
To: Michael Haenlein <haenlein at escpeurope.eu>
Cc: r-help at r-project.org
Subject: Re: [R] Equivalent to go-to statement
Message-ID:
<AANLkTimG-oCXkzLqxpca6HnTbnErb4vbBEQKhYyOkaWs at mail.gmail.com>
Content-Type: text/plain; charset=ISO-8859-1
On Sun, Jul 25, 2010 at 1:57 AM, Michael Haenlein
<haenlein at escpeurope.eu> wrote:
> Dear all,
>
> I'm working with a code that consists of two parts: In Part 1 I'm
> generating a random graph using the igraph library (which represents
> the relationships between different nodes) and a vector (which
> represents a certain characteristic for each node):
>
> library(igraph)
> g <- watts.strogatz.game(1, 100, 5, 0.05)
> z <- rlnorm(100, 0, 1)
>
> In Part 2 I'm iteratively changing the elements of z in order to reach a
> certain value of a certain target variable. I'm doing this using a while
> statement:
>
> while (target_variable < threshold) {## adapt z}
>
> The problem is that in some rare cases this iterative procedure can take
> very long (a couple of million of iterations), depending on the specific
> structure of the graph generated in Part 1. I therefore would like to
> change Part 2 of my code in the sense that once a certain threshold
> number of iterations has been achieved, the iterative process in Part 2
> stops and goes back to Part 1 to generate a new graph structure. So my
> idea is as follows:
>
> - Run Part 1 and generate g and z
> - Run Part 2 and iteratively modify z to maximize the target variable
> - If Part 2 can be obtained in less than X steps, then go to Part 3
> - If Part 2 takes more than X steps then go back to Part 1 and start again
>
> I think that R does not have a function like "go-to" or "go-back".
>
> Does anybody know of a convenient way of doing this?
>
> Thanks very much for your help,
>
gotos can be replaced with loops. In this case, create a double loop
such that the outer loop does not repeat if the inner loop finishes
by reaching the target:
target_variable <- -Inf
while(target_variable < threshold) {
...
iter <- 0
while(target_variable < threshold && iter < max_iter) {
... update iter and target_variable ...
}
}
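[A self-contained toy version of that pattern, with dummy target logic standing in for the real code: the outer loop plays the role of Part 1 (a fresh restart) and the inner loop plays Part 2, giving up after max_iter steps so the outer loop can start over.]

```r
threshold <- 10
max_iter  <- 5

target_variable <- -Inf
while (target_variable < threshold) {
  target_variable <- 0                       # "Part 1": fresh start
  iter <- 0
  while (target_variable < threshold && iter < max_iter) {
    iter <- iter + 1                         # "Part 2": one update step
    target_variable <- target_variable + sample(0:3, 1)
  }
  # If the inner loop hit max_iter without reaching the threshold,
  # the outer loop condition is still TRUE and we restart.
}
```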
------------------------------
_______________________________________________
R-help at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
End of R-help Digest, Vol 89, Issue 25
**************************************
295. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=paaveenthan@hotmail.com
296. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=r-help@r-project.org
297. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=4C4BBB6A.9090400@vanderbilt.edu
298. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=haenlein@escpeurope.eu
299. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=r-help@r-project.org
300. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=AANLkTimX1jOLHX6AkfzDqQEJR4LK5G_-yFfDhZK_U5_i@mail.gmail.com
301. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=iitkvipul@gmail.com
302. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=r-help@r-project.org
303. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=1280039080326-2301422.post@n4.nabble.com
304. http://r.789695.n4.nabble.com/Outlier-Problem-in-Survreg-Function-tp2301422p2301422.html
305. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=ggrothendieck@gmail.com
306. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=haenlein@escpeurope.eu
307. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=r-help@r-project.org
308. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=AANLkTimG-oCXkzLqxpca6HnTbnErb4vbBEQKhYyOkaWs@mail.gmail.com
309. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=haenlein@escpeurope.eu
310. http://mail2.daum.net/hanmail/mail/MailComposeFrame.daum?TO=R-help@r-project.org
311. http://www.r-project.org/posting-guide.html
312. mailto:dllmain at hanmail.net
More information about the R-help mailing list