[R] greco-latin square
Richard M. Heiberger
rmh at temple.edu
Fri May 19 06:15:36 CEST 2006
## Thank you for a very nice example.  I would like to expand on two
## points in the discussion that Spencer Graves gave.
## Rich
## 1. Multiple regression is a sequential algorithm. In the
## sequential ANOVA table the sums of squares for non-orthogonal
## predictor variables depend on the order in which they are brought
## into the model. As part of the regression algorithm each of the
## predictor variables, in this case the dummy variables corresponding
## to the contrasts, is orthogonalized against all previous dummy
## variables.
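## (A minimal illustration with synthetic data, not part of the
## original analysis: with two correlated predictors the sequential
## sums of squares change when the order of entry is reversed.)
set.seed(101)                   # toy data for illustration only
x1 <- rnorm(20)
x2 <- x1 + rnorm(20, sd=0.5)    # x2 deliberately correlated with x1
y  <- x1 + x2 + rnorm(20)
anova(lm(y ~ x1 + x2))          # sequential SS with x1 entered first
anova(lm(y ~ x2 + x1))          # the SS for x1 and x2 both change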
## We can see that the algorithm has done its task by looking at the
## projection of Y=asin.Stimulus.ER onto the orthogonalized subspaces
## associated with each of the multi-degree-of-freedom terms in the
## model. We do so by projecting Y onto each of those subspaces with
## the proj() function.
m1 <- aov(asin.Stimulus.ER ~
            stimulusName + responseFinger + oom + mapping.code +
            mapping.code:Subject.n + stimulusName:mapping.code +
            stimulusName:mapping.code:Subject.n,
          data=w.tmp)
summary(m1)
m1.proj <- proj(m1)
tmp <- crossprod(m1.proj)
## zapsmall(tmp)
## abbreviate the term names so the crossproduct matrix prints compactly
dimnames(tmp) <- rep(list(abbreviate(dimnames(tmp)[[1]])), 2)
## The projections onto the subspaces are orthogonal to each other.
zapsmall(tmp, digits=3)
## > zapsmall(tmp, digits=3)
## (In) stmN rspF oom mpp. m.:S stN:. sN:.:
## (In) 0.2561 0.000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000
## stmN 0.0000 0.005 0.0000 0.0000 0.0000 0.0000 0.0000 0.0000
## rspF 0.0000 0.000 0.0337 0.0000 0.0000 0.0000 0.0000 0.0000
## oom 0.0000 0.000 0.0000 0.0061 0.0000 0.0000 0.0000 0.0000
## mpp. 0.0000 0.000 0.0000 0.0000 0.0282 0.0000 0.0000 0.0000
## m.:S 0.0000 0.000 0.0000 0.0000 0.0000 0.0204 0.0000 0.0000
## stN:. 0.0000 0.000 0.0000 0.0000 0.0000 0.0000 0.0223 0.0000
## sN:.: 0.0000 0.000 0.0000 0.0000 0.0000 0.0000 0.0000 0.0139
## >
## The sums of squares of the projections are exactly the sums of
## squares in the ANOVA table; a direct comparison against summary(m1)
## follows the output below.
as.matrix(apply(m1.proj, 2, function(x) sum(x^2)))
## > as.matrix(apply(m1.proj, 2, function(x) sum(x^2)))
## [,1]
## (Intercept) 0.256144760
## stimulusName 0.005001724
## responseFinger 0.033730254
## oom 0.006146134
## mapping.code 0.028189996
## mapping.code:Subject.n 0.020374325
## stimulusName:mapping.code 0.022317815
## stimulusName:mapping.code:Subject.n 0.013902509
## >
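## (A small consistency check, assuming the fit above: the projection
## sums of squares should match the "Sum Sq" column of the ordinary
## ANOVA table for m1.)
ss.proj  <- apply(m1.proj, 2, function(x) sum(x^2))[-1]   # drop (Intercept)
ss.table <- summary(m1)[[1]][, "Sum Sq"]
all.equal(unname(ss.proj), unname(ss.table[seq_along(ss.proj)]))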
## 2. We can ask the summary function to do all the right tests.
## There is no need to do them by hand.  The Error() term works
## directly with the orthogonal subspaces: it splits the analysis into
## strata, and summary() then uses the residual mean square of each
## stratum as the denominator for the F tests of the terms in that
## stratum.
m1e <- aov(asin.Stimulus.ER ~
             (stimulusName + mapping.code + responseFinger + oom)^2 +
             Error(mapping.code:Subject.n),
           data=w.tmp)
summary(m1e)
## > m1e <- aov(asin.Stimulus.ER ~
## + (stimulusName + mapping.code + responseFinger + oom)^2 +
## + Error(mapping.code:Subject.n),
## + data=w.tmp)
## Warning message:
## Error() model is singular in: aov(asin.Stimulus.ER ~ (stimulusName + mapping.code + responseFinger +
## >
## > summary(m1e)
##
## Error: mapping.code:Subject.n
## Df Sum Sq Mean Sq F value Pr(>F)
## mapping.code 3 0.0281900 0.0093967 1.8448 0.2794
## Residuals 4 0.0203743 0.0050936
##
## Error: Within
## Df Sum Sq Mean Sq F value Pr(>F)
## stimulusName 3 0.005002 0.001667 1.4391 0.280137
## responseFinger 3 0.033730 0.011243 9.7048 0.001569 **
## oom 3 0.006146 0.002049 1.7684 0.206609
## stimulusName:mapping.code 3 0.022318 0.007439 6.4212 0.007677 **
## Residuals 12 0.013903 0.001159
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## >
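## Note that in the mapping.code:Subject.n stratum, mapping.code is
## tested against that stratum's residual mean square
## (0.0093967 / 0.0050936 = 1.8448), while the terms in the Within
## stratum are tested against the Within residual mean square.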
## The warning about the singular model is an indication that more
## dummy variables have been generated than are needed.  This is
## consistent with your observation that any one of the two-way
## interactions could be used to capture those interactions.
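## (A small synthetic sketch of the same aliasing, not the original
## data: in a 3x3 Graeco-Latin square the four main effects already
## use all 8 non-intercept degrees of freedom, so a two-way
## interaction adds no estimable columns and is completely aliased
## with terms already in the model.)
gls <- expand.grid(row=factor(1:3), col=factor(1:3))
gls$latin <- factor((as.numeric(gls$row) +     as.numeric(gls$col)) %% 3)
gls$greek <- factor((as.numeric(gls$row) + 2 * as.numeric(gls$col)) %% 3)
gls$y <- rnorm(9)                      # values irrelevant to the aliasing
m.gls <- lm(y ~ row + col + latin + greek + row:col, data=gls)
alias(m.gls)    # shows the row:col columns as combinations of earlier terms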