[R] ML optimization question--unidimensional unfolding scaling
Peter Muhlberger
pmuhl830 at gmail.com
Thu Oct 13 22:30:48 CEST 2005
Hi Spencer: Thanks for your interest! Also, the posting guide was helpful.
I think my problem might be solved if I could find a way to terminate nlm or
optim runs from within the user-supplied objective function they call. The
optimization is unconstrained.
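One rough sketch of the kind of escape hatch I have in mind: keep the best
parameters seen so far in an enclosing environment, signal an error from
inside the objective when a restart is wanted, and catch it around the optim
call. Here negloglik() and restart_wanted() are just placeholders for my
actual functions.

make_obj <- function(negloglik, restart_wanted) {
  best <- list(par = NULL, value = Inf)
  obj <- function(par) {
    val <- negloglik(par)
    if (val < best$value) best <<- list(par = par, value = val)  # track best so far
    if (restart_wanted(par)) stop("restart requested")           # aborts this optim run
    val
  }
  list(obj = obj, best = function() best)
}

## Wrap optim() in tryCatch(); on the signalled error, restart from the
## (possibly modified) best parameters found so far.
# o <- make_obj(negloglik, restart_wanted)
# fit <- tryCatch(optim(start, o$obj, method = "BFGS"),
#                 error = function(e) optim(o$best()$par, o$obj, method = "BFGS"))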
I'm essentially using normal-like curves that translate observed values on a
set of variables (one curve per variable) into latent unfolded values. The
observed values are on the Y-axis and the latent values (hence the parameters
to be estimated) are on the X-axis. The problem is that an observed value can
map to two points on a curve, one on either side of the curve mean. Only one
of these points will actually be optimal across all observed variables, but
it's easy to show that most estimation methods will get stuck on the
non-optimal point if they find that one first: moving away from it, the
likelihood gets much worse before the routine can 'see' the optimal point on
the other side of the normal curve.
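To make the two-sided mapping concrete, here is a toy example (curve mean,
spread, and height made up): any observed value below the peak of a Gaussian
curve is reproduced by two latent positions, one on each side of the mean.

mu <- 0; s <- 1; a <- 1                   # curve mean, spread, height
curve_val <- function(x) a * exp(-(x - mu)^2 / (2 * s^2))
y_obs <- 0.4                              # an observed value below the peak
d <- s * sqrt(-2 * log(y_obs / a))        # distance of both roots from mu
curve_val(mu - d)                         # 0.4
curve_val(mu + d)                         # 0.4 -- same observed value, two latent candidates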
SANN might work, but I wonder how useful it would be for estimating hundreds
of parameters--a count driven by that latent scale.
My (possibly harebrained) thought for estimating this unfolding with a
gradient-based method is to run through some iterations and then check
whether a better solution exists on the 'other side' of the normal curves. If
it does, replace those parameters with the better ones. Because this makes
the likelihood jump, I'd probably have to restart the estimation (maybe). But
I see no way, from within the objective function called by nlm or optim, to
tell nlm or optim to terminate its current run. I could make the algorithm
recursive, but that eats up resources and would probably have to be
terminated with an error.
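In case it clarifies the idea, here is a rough outer-loop version that avoids
terminating optim from the inside: run a short burst of iterations, reflect
each latent parameter about its curve mean (theta_new = 2*mu - theta), keep
any reflection that lowers the negative log likelihood, and restart.
negloglik(), mu, and the starting values are placeholders, and reflecting
each parameter about a single mean is a simplification of my actual setup.

fit_with_reflection <- function(theta, negloglik, mu, n_outer = 20) {
  for (i in seq_len(n_outer)) {
    fit <- optim(theta, negloglik, method = "BFGS",
                 control = list(maxit = 25))        # short burst of iterations
    theta <- fit$par
    refl <- 2 * mu - theta                          # candidates on the other side
    base <- negloglik(theta)
    flip <- sapply(seq_along(theta), function(j) {
      th2 <- theta; th2[j] <- refl[j]
      negloglik(th2) < base                         # does flipping parameter j help?
    })
    if (!any(flip) && fit$convergence == 0) break   # nothing to flip and converged
    theta[flip] <- refl[flip]                       # jump to the better side
  }
  optim(theta, negloglik, method = "BFGS")          # final polish
}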
Peter
On 10/11/05 11:11 PM, "Spencer Graves" <spencer.graves at pdf.com> wrote:
> There may be a few problems where ML (or more generally Bayes) fails
> to give sensible answers, but they are relatively rare.
>
> What is your likelihood? How many parameters are you trying to
> estimate?
>
> Are you using constrained or unconstrained optimization? If
> constrained, I suggest you remove the constraints by appropriate
> transformation. When considering alternative transformations, I
> consider (a) what makes physical sense, and (b) which transformation
> produces a log likelihood that is closer to being parabolic.
>
> How are you calling "optim"? Have you tried "SANN" as well as
> "Nelder-Mead", "BFGS", and "CG"? If you are using constrained
> optimization, I suggest you move the constraints to Inf by appropriate
> transformation and use the other methods, as I just suggested.
>
> If you would still like more suggestions from this group, please
> provide more detail -- but as tersely as possible. The posting guide
> is, I believe, quite useful (www.R-project.org/posting-guide.html).
>
> spencer graves
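A minimal sketch of the quoted advice, assuming a hypothetical negloglik()
whose first parameter is a positive scale and a starting vector `start` (both
placeholders, not objects from this problem): map the constrained parameter
onto the whole real line with log()/exp() so the unconstrained methods apply,
then compare several optim() methods on the transformed problem.

negloglik_u <- function(par) negloglik(c(exp(par[1]), par[-1]))  # exp() keeps it > 0

start_u <- c(log(start[1]), start[-1])
fits <- lapply(c("Nelder-Mead", "BFGS", "CG", "SANN"),
               function(m) optim(start_u, negloglik_u, method = m))
sapply(fits, `[[`, "value")              # compare the minima reached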