gradient descent with noisy data

6 views (last 30 days)
baptiste
baptiste on 16 Jan 2017
Commented: baptiste on 17 Jan 2017
Hello. I am trying to fit a model to experimental data. The problem is that I am using a generative model, i.e. I simulate predictions for every set of parameters. It is very slow because every iteration takes about 20 seconds. Moreover, the predictions are a bit noisy, and MATLAB's gradient descent algorithms (fminsearch and fmincon) seem to have difficulty converging. Is there an algorithm known to be more robust (less sensitive to noise) than the others? Thanks. Baptiste

Answers (2)

Mohammad Abouali
Mohammad Abouali on 16 Jan 2017
Edited: Mohammad Abouali on 16 Jan 2017
Try one of the optimization methods in the Global Optimization Toolbox, such as Particle Swarm or Genetic Algorithm.
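For example, a minimal sketch of how those solvers might be called; myObjective, nvars, lb, and ub are placeholders for the poster's own simulation-based cost function and parameter bounds, not code from this thread:

nvars = 4;                        % number of model parameters (example value)
lb = zeros(1,nvars);              % lower bounds (example values)
ub = 10*ones(1,nvars);            % upper bounds (example values)
% Both solvers expect a function handle that takes a single parameter vector
xPSO = particleswarm(@myObjective, nvars, lb, ub);
xGA  = ga(@myObjective, nvars, [], [], [], [], lb, ub);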
  3 comments
Alan Weiss
Alan Weiss on 16 Jan 2017
Edited: Alan Weiss on 16 Jan 2017
No, all optimization functions (except lsqcurvefit) take exactly the same form of objective function. fminsearch passes a SINGLE argument to fun, but that single argument can be a vector or array.
So if it works for you with one solver but not another, then something else is going on. Please show us exactly how you are calling ga, fminsearch, and particleswarm. If you call
x = fminsearch(@(parameters)fun(parameters,data,...),x0)
then exactly that same function handle should work for all other solvers.
Alan Weiss
MATLAB mathematical toolbox documentation
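A minimal sketch of the pattern Alan describes, assuming placeholder names fun, data, x0, lb, and ub: the extra arguments are captured once in an anonymous function, and the resulting single-argument handle is reused unchanged by every solver.

objective = @(parameters) fun(parameters, data);       % capture the extra data once

x1 = fminsearch(objective, x0);                        % direct search
x2 = fmincon(objective, x0, [], [], [], [], lb, ub);   % constrained, gradient-based
x3 = particleswarm(objective, numel(x0), lb, ub);      % Global Optimization Toolbox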
baptiste
baptiste on 17 Jan 2017
OK, got it. It does work with "@(parameters)fun(parameters,data,...)".



John D'Errico
John D'Errico on 16 Jan 2017
First of all, fminsearch is NOT a gradient descent algorithm. Calling it that does not make it one.
Second, large residual problems are classically a bane for nonlinear least squares. This is well known. Ok, it should be well known, as I recall reading about the issues 35 years ago or so. For example:
Note the date.
Do you want to use particle swarms or genetic algorithms or any other stochastic optimizer? Not really a good idea, IMHO, since those schemes use LOTS of extra function evaluations while still walking downhill. They are as much (or as little) a gradient descent algorithm as fminsearch is. They are also generally slower to converge.
I don't have your model at hand, so it is somewhat difficult to make constructive suggestions. My first choice to improve robustness of large residual problems would be a partitioned nonlinear least squares tool. But that requires the ability to partition the unknowns into a conditionally linear subset, and an intrinsically nonlinear subset. Since your model is a simulation, that may well not be an option.
My second suggestion is to use a robust solver. nlinfit from the Statistics Toolbox does offer a robust option; a sketch is shown below.
Third, you will benefit greatly from good starting values for large residual problems.
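As a hedged illustration of the robust-solver suggestion above, nlinfit accepts a robust weight function through statset; here xdata, ydata, modelfun, and beta0 stand in for the poster's own data, model function, and starting values:

opts = statset('nlinfit');
opts.RobustWgtFun = 'bisquare';   % iteratively reweighted least squares
betaHat = nlinfit(xdata, ydata, modelfun, beta0, opts);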
  1 comment
baptiste
baptiste on 17 Jan 2017
OK, basically I have a dataset with 3 variables and I am trying to fit a single model to it. The model predicts the 3 variables jointly. For various reasons we use likelihoods: for each variable we compute the likelihood of observing our data given the model's prediction, and then we sum these likelihoods. We try to find the set of parameters that maximizes this overall likelihood. As I said, the model's predictions vary a little from simulation to simulation (the simulations are very slow). When I plot the overall likelihood I see that it basically stays put. Sometimes (e.g. if I increase the number of simulations to get more accurate predictions) it decreases and stabilizes at some value, approaching it exponentially, but it is never a good fit. I don't understand why it fails to converge. Sure, the simulations introduce some noise, but this noise is far smaller than the improvement the algorithm could make by fitting the curves correctly. Why does it sometimes not converge, and sometimes start to converge but stabilize far from the minimum?
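For concreteness, a sketch of the kind of summed-likelihood objective described here, under an assumed Gaussian noise model; simulateModel, data, sigma, nSims, and p0 are placeholders, not code from this thread:

% Minimizing the summed negative log-likelihood maximizes the overall likelihood
pHat = fminsearch(@(p) totalNegLogLik(p, data, sigma, nSims), p0);

function nll = totalNegLogLik(parameters, data, sigma, nSims)
% data and sigma hold the observations and assumed noise scales for the 3 variables
pred = simulateModel(parameters, nSims);   % noisy generative prediction (placeholder)
nll = 0;
for k = 1:3
    resid = data{k} - pred{k};
    % Gaussian negative log-likelihood for variable k (constants dropped)
    nll = nll + sum(resid.^2) / (2*sigma(k)^2);
end
end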

