[R] maximum-likelihood-estimation with mle()
Peter Dalgaard
pdalgd at gmail.com
Fri Sep 11 15:24:07 CEST 2015
You are being over-optimistic with your starting values and/or with constraints on the parameter space.
Your fit is diverging in sigma for some reason known only to nonlinear-optimizer gurus...
For me, it works either to put in an explicit constraint or to reparametrize with log(sigma).
E.g.
> fit <- mle(nLL, start=list(a=0, b=0, sigma=1), method="L-BFGS-B", lower=c(-Inf, -Inf, 0.001))
> summary(fit)
Maximum likelihood estimation
Call:
mle(minuslogl = nLL, start = list(a = 0, b = 0, sigma = 1), method = "L-BFGS-B",
    lower = c(-Inf, -Inf, 0.001))
Coefficients:
        Estimate Std. Error
a      3.0293645 0.01904026
b     -1.1516406 0.11814174
sigma  0.1729418 0.03866673
-2 log L: -6.717779
or
> nLL <- function(a, b, lsigma)
+ -sum(dnorm(y, mean=a*x+b, sd=exp(lsigma), log=TRUE))
> fit <- mle(nLL, start=list(a=0, b=0, lsigma=0))
> summary(fit)
Maximum likelihood estimation
Call:
mle(minuslogl = nLL, start = list(a = 0, b = 0, lsigma = 0))
Coefficients:
        Estimate Std. Error
a       3.029365 0.01903975
b      -1.151642 0.11813856
lsigma -1.754827 0.22360675
-2 log L: -6.717779
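As a sanity check, back-transforming lsigma recovers (up to numerical accuracy) the sigma estimate from the constrained fit:

exp(-1.754827)   # approx. 0.1729, agreeing with the sigma estimate above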
Both fits reproduce lm(), apart from degrees-of-freedom issues: lm() divides the residual sum of squares by n - 2, while the ML estimate divides by n.
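For instance (a sketch, using the x and y from the quoted post; fit.lm is just an illustrative name):

fit.lm <- lm(y ~ x)
coef(fit.lm)                             # matches a and b from the fits above
summary(fit.lm)$sigma                    # sqrt(RSS / (n - 2)), the unbiased version
sqrt(sum(resid(fit.lm)^2) / length(y))   # sqrt(RSS / n), the ML estimate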
To fix sigma, use fixed=list(sigma=0.19) in mle(). This also stabilizes the convergence (which it blinking well should, since it is now a purely quadratic problem).
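A sketch of that variant, assuming the sigma-parametrized nLL from the quoted post (fit2 is an illustrative name; any reasonable fixed value works):

nLL <- function(a, b, sigma)
    -sum(dnorm(y, mean = a*x + b, sd = sigma, log = TRUE))
fit2 <- mle(nLL, start = list(a = 0, b = 0), fixed = list(sigma = 0.19))
summary(fit2)   # only a and b are free; sigma is held at 0.19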
-pd
On 11 Sep 2015, at 14:20, Ronald Kölpin <ronald.koelpin at gmail.com> wrote:
> Hi everyone,
>
> I have a problem with maximum-likelihood-estimation in the following
> situation:
>
> Assume a functional relation y = f(x) (the specific form of f should be
> irrelevant). For my observations I assume (for simplicity) white noise,
> such that y_i = f(x_i) + epsilon_i, with the epsilon_i iid N(0,
> sigma^2). Y_i should then be N(f(x_i), sigma^2)-distributed and due to
> the iid assumption the density of Y = (Y_1, ..., Y_n) is simply the
> product of the individual densities; taking the log gives the sum of
> the logs of the individual densities.
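>
> In symbols, the negative log-likelihood to be minimized is
>
>     -log L = (n/2) log(2 pi sigma^2) + sum_i (y_i - f(x_i))^2 / (2 sigma^2)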
>
> I tried coding this in R with a simple example: f(x) = a*x + b (simple
> linear regression). This way I wanted to compare the results from my
> ML estimation (specifying the log-likelihood manually and estimating
> with mle()) with the results from using lm(y~x). In my example,
> however, it doesn't work:
>
> x <- 1:10
> y <- 3*x - 1 + rnorm(length(x), mean=0, sd=0.5)
>
> library("stats4")
> nLL <- function(a, b, sigma)
> {
> -sum(dnorm(y, mean=a*x+b, sd=sigma, log=TRUE))
> }
>
> fit <- mle(nLL, start=list(a=0, b=0, sigma=1), nobs=length(y))
>
> summary(lm(y~x))
> summary(fit)
>
> These should be the same but they aren't. I must have made some mistake
> specifying the (negative) log-likelihood (but I just don't see it). I
> also actually don't care much (at the moment) about estimating sigma,
> but I don't know of a way to specify the (negative) log-likelihood
> without also estimating sigma.
>
> Thanks a lot
> and kind regards
>
> Ronald Koelpin
>
--
Peter Dalgaard, Professor,
Center for Statistics, Copenhagen Business School
Solbjerg Plads 3, 2000 Frederiksberg, Denmark
Phone: (+45)38153501
Office: A 4.23
Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com