[R] Solved: linear regression example using MLE using optim()
Ajay Narottam Shah
ajayshah at mayin.org
Tue May 31 11:17:09 CEST 2005
Thanks to Gabor for setting me right. My code is as follows. I found
it useful for learning optim(), and you might find it similarly
useful. I will be most grateful if you can guide me on how to do this
better. Should one be using optim() or stats4::mle?
set.seed(101) # For replicability
# Setup problem
X <- cbind(1, runif(100))
theta.true <- c(2,3,1)
y <- X %*% theta.true[1:2] + rnorm(100)
# OLS --
d <- summary(lm(y ~ X[,2]))
theta.ols <- c(d$coefficients[,1], d$sigma)
# Switch to log sigma as the free parameter
theta.true[3] <- log(theta.true[3])
theta.ols[3] <- log(theta.ols[3])
# OLS likelihood function --
ols.lf <- function(theta, K, y, X) {
  beta  <- theta[1:K]
  sigma <- exp(theta[K+1])          # recover sigma from its log
  e <- (y - X %*% beta) / sigma     # standardised residuals
  logl <- sum(log(dnorm(e)/sigma))  # N(0,1) density rescaled by 1/sigma
  return(logl)
}
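One sanity check worth adding (my own, not in the original post): at the OLS coefficients and the *ML* estimate of sigma (RSS/n, rather than the unbiased RSS/(n-k) that `summary()` reports), `ols.lf` should reproduce `logLik()` from the `lm` fit exactly.

```r
set.seed(101)
X <- cbind(1, runif(100))
y <- X %*% c(2, 3) + rnorm(100)

ols.lf <- function(theta, K, y, X) {
  beta  <- theta[1:K]
  sigma <- exp(theta[K+1])
  e <- (y - X %*% beta) / sigma
  sum(log(dnorm(e)/sigma))
}

m <- lm(y ~ X[, 2])
sigma.mle <- sqrt(sum(resid(m)^2) / length(y))  # ML estimate: RSS/n
ols.lf(c(coef(m), log(sigma.mle)), 2, y, X)     # should equal logLik(m)
```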
# Experiment with the LF --
cat("Evaluating LogL at stupid theta : ", ols.lf(c(1,2,1), 2, y, X), "\n")
cat("Evaluating LogL at true params : ", ols.lf(theta.true, 2, y, X), "\n")
cat("Evaluating LogL at OLS estimates: ", ols.lf(theta.ols, 2, y, X), "\n")
optim(c(1,2,3),                            # Starting values
      ols.lf,                              # Likelihood function
      control=list(trace=1, fnscale=-1),   # fnscale=-1 => maximise; see ?optim
      K=2, y=y, X=X                        # "..." stuff into ols.lf()
      )
# optim() uses numerical (finite-difference) derivatives by default.
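A natural next step (my addition, not in the original post) is to ask `optim()` for the Hessian as well: with `fnscale=-1` the returned Hessian is that of the log-likelihood itself, so inverting its negative gives the usual asymptotic covariance matrix and standard errors.

```r
set.seed(101)
X <- cbind(1, runif(100))
y <- X %*% c(2, 3) + rnorm(100)

ols.lf <- function(theta, K, y, X) {
  beta  <- theta[1:K]
  sigma <- exp(theta[K+1])
  e <- (y - X %*% beta) / sigma
  sum(log(dnorm(e)/sigma))
}

fit <- optim(c(1, 2, 3), ols.lf,
             method = "BFGS",
             control = list(fnscale = -1),
             hessian = TRUE,              # numerically differentiated Hessian
             K = 2, y = y, X = X)

vc <- solve(-fit$hessian)                 # inverse negative Hessian ~ vcov
cbind(estimate = fit$par, se = sqrt(diag(vc)))
```

The first two standard errors should be close to (though not identical with) those from `summary(lm(y ~ X[,2]))`, since OLS and the Gaussian MLE differ only in the degrees-of-freedom correction for sigma.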
--
Ajay Shah, Consultant
Department of Economic Affairs, Ministry of Finance, New Delhi
ajayshah at mayin.org
http://www.mayin.org/ajayshah