[R] Total effect of X on Y under presence of interaction effects
Matthew Keller
mckellercran at gmail.com
Thu May 12 00:26:04 CEST 2011
Not to rehash an old statistical argument, but I think David's reply
here is too strong ("In the presence of interactions there is little
point in attempting to assign meaning to individual coefficients.").
As David notes, the "simple effect" of a coefficient (e.g., the one
for a) has an interpretation: it is the predicted effect of a when b,
c, and d are all zero. If the zero levels of b, c, and d are
meaningful (e.g., if you have centered all your variables so that the
mean of each one is zero), then the coefficient of a is the predicted
slope of a at the mean level of all the other predictors...
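
With the full a*b*c*d model, the slope of y with respect to a is
b_a + b_a:b*b + b_a:c*c + b_a:d*d plus the three- and four-way terms,
so "the" effect of a depends on where b, c, and d sit; centering just
makes the means the reference point. A minimal sketch of the centering
idea (untested; it assumes the variables live in a data frame `dat`):

## center each predictor at its own mean
dat$a.c <- dat$a - mean(dat$a)
dat$b.c <- dat$b - mean(dat$b)
dat$c.c <- dat$c - mean(dat$c)
dat$d.c <- dat$d - mean(dat$d)

## refit the full factorial model on the centered predictors; the
## coefficient of a.c is the predicted slope of a at the means of b, c, d
fit.c <- lm(y ~ a.c * b.c * c.c * d.c, data = dat)
summary(fit.c)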
Matt
On Wed, May 11, 2011 at 2:40 PM, Greg Snow <Greg.Snow at imail.org> wrote:
> Just to add to what David already said, you might want to look at the Predict.Plot and TkPredict functions in the TeachingDemos package, which give a simple interface for visualizing predicted values from regression models.
>
> Such plots are much more informative than a single number that tries to capture a total effect.
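>
> Along the same lines, here is an untested base-R sketch (not the
> Predict.Plot interface itself) that plots predicted y against one
> predictor while the others are held at their means, assuming the
> variables live in a data frame `dat` and the fitted model is `fit`:
>
> nd <- data.frame(a = seq(min(dat$a), max(dat$a), length.out = 100),
>                  b = mean(dat$b), c = mean(dat$c), d = mean(dat$d))
> nd$yhat <- predict(fit, newdata = nd)
> plot(yhat ~ a, data = nd, type = "l",
>      xlab = "a", ylab = "predicted y (b, c, d at their means)")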
>
> --
> Gregory (Greg) L. Snow Ph.D.
> Statistical Data Center
> Intermountain Healthcare
> greg.snow at imail.org
> 801.408.8111
>
>
>> -----Original Message-----
>> From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-
>> project.org] On Behalf Of David Winsemius
>> Sent: Wednesday, May 11, 2011 7:48 AM
>> To: Michael Haenlein
>> Cc: r-help at r-project.org
>> Subject: Re: [R] Total effect of X on Y under presence of interaction
>> effects
>>
>>
>> On May 11, 2011, at 4:26 AM, Michael Haenlein wrote:
>>
>> > Dear all,
>> >
>> > This is probably more a statistics question than an R question,
>> > but perhaps somebody here can help me nevertheless.
>> >
>> > I'm running a regression with four predictors (a, b, c, d) and all
>> > their interaction effects using lm. Based on theory I assume that a
>> > influences y positively. In my output (see below), however, I see a
>> > negative regression coefficient for a, while several of the
>> > interaction effects of a with b, c and d have positive signs. I
>> > don't really understand this. Do I have to add up the coefficient
>> > for the main effect and those of all the interaction effects to get
>> > the total effect of a on y? Or am I doing something wrong here?
>>
>> In the presence of interactions there is little point in attempting to
>> assign meaning to individual coefficients. You need to use predict()
>> (possibly with graphical or tabular displays) to produce estimates for
>> one or two variables at relevant levels of the other variables.
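>>
>> For example (untested, and assuming the variables sit in a data frame
>> `dat` and the fitted lm object is `fit`):
>>
>> ## predictions over a grid: a and d at their quartiles, b and c at
>> ## their means
>> nd <- expand.grid(a = quantile(dat$a, c(.25, .5, .75)),
>>                   d = quantile(dat$d, c(.25, .5, .75)),
>>                   b = mean(dat$b), c = mean(dat$c))
>> nd$yhat <- predict(fit, newdata = nd)
>> ## compact tabular display: rows = a, columns = d
>> round(xtabs(yhat ~ a + d, data = nd), 2)
>> ## if the change in yhat down the rows differs across columns, the
>> ## "effect of a" depends on d, and no single total effect exists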
>>
>> The other aspect about which your model is not informative is the
>> possibility that some of these predictors have non-linear associations
>> with `y`.
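>>
>> One way to probe that (a sketch, not something I have run against your
>> data; it again assumes a data frame `dat`) is to let one predictor
>> enter flexibly and compare the fits:
>>
>> library(splines)
>> fit.lin <- lm(y ~ a * b * c * d, data = dat)
>> fit.ns  <- lm(y ~ ns(a, df = 3) * b * c * d, data = dat)
>> anova(fit.lin, fit.ns)    # does a flexible term for a improve the fit?
>> plot(fit.lin, which = 1)  # residuals vs fitted may also show curvature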
>>
>> (The coefficient for `a` examined in isolation might apply to a group
>> of subjects (or other units of analysis) in which the values of `b`,
>> `c`, and `d` were all held at zero. Is that even a situation that
>> would occur in your domain of investigation?)
>>
>> --
>> David.
>> >
>> > Thanks very much for your answer in advance,
>> >
>> > Regards,
>> >
>> > Michael
>> >
>> >
>> > Michael Haenlein
>> > Associate Professor of Marketing
>> > ESCP Europe
>> > Paris, France
>> >
>> >
>> >
>> > Call:
>> > lm(formula = y ~ a * b * c * d)
>> >
>> > Residuals:
>> >     Min      1Q  Median      3Q     Max
>> > -44.919  -5.184   0.294   5.232 115.984
>> >
>> > Coefficients:
>> >              Estimate  Std. Error  t value  Pr(>|t|)
>> > (Intercept)   27.3067      0.8181   33.379   < 2e-16 ***
>> > a            -11.0524      2.0602   -5.365  8.25e-08 ***
>> > b             -2.5950      0.4287   -6.053  1.47e-09 ***
>> > c            -22.0025      2.8833   -7.631  2.50e-14 ***
>> > d             20.5037      0.3189   64.292   < 2e-16 ***
>> > a:b           15.1411      1.1862   12.764   < 2e-16 ***
>> > a:c           26.8415      7.2484    3.703  0.000214 ***
>> > b:c            8.3127      1.5080    5.512  3.61e-08 ***
>> > a:d            6.6221      0.8061    8.215  2.33e-16 ***
>> > b:d           -2.0449      0.1629  -12.550   < 2e-16 ***
>> > c:d           10.0454      1.1506    8.731   < 2e-16 ***
>> > a:b:c          1.4137      4.1579    0.340  0.733862
>> > a:b:d         -6.1547      0.4572  -13.463   < 2e-16 ***
>> > a:c:d        -20.6848      2.8832   -7.174  7.69e-13 ***
>> > b:c:d         -3.4864      0.6041   -5.772  8.05e-09 ***
>> > a:b:c:d        5.6184      1.6539    3.397  0.000683 ***
>> > ---
>> > Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
>> >
>> > Residual standard error: 7.913 on 12272 degrees of freedom
>> > Multiple R-squared: 0.8845, Adjusted R-squared: 0.8844
>> > F-statistic: 6267 on 15 and 12272 DF, p-value: < 2.2e-16
>> >
>>
>> David Winsemius, MD
>> West Hartford, CT
>>
--
Matthew C Keller
Asst. Professor of Psychology
University of Colorado at Boulder
www.matthewckeller.com