Why is Bayesian regularization backpropagation (Neural Network Toolbox) so very very slow?
Empirically, on a challenging pattern recognition problem I'm working on, I've found that Bayesian regularization backpropagation (trainbr) outperforms more standard training functions such as trainlm, trainscg, and trainrp by quite a bit. But it takes extraordinarily longer to compute.
In its original formulation (MacKay, 1992), Bayesian regularization required calculation of the Hessian matrix, which is very computationally demanding and would account for the long runtime. However, Foresee and Hagan (1997) developed an alternative that claims to reduce the computational cost to something comparable to, e.g., trainlm. Both works are cited in the MATLAB documentation for trainbr, but is the latter actually implemented? Can I find a library somewhere that implements it?
I'm pretty confident that trainbr as implemented in the Neural Network Toolbox requires calculation of the Hessian, because it refuses to run on a GPU, citing lack of support for inversion of the (related) Jacobian as the reason. But I'd be happy to be educated on that.
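For reference, a minimal way to reproduce the timing gap looks something like the sketch below (cancer_dataset and the 10-unit hidden layer are just stand-ins, not my actual setup; note also that trainbr disables validation stops by default, so it tends to run many more epochs than trainlm):
[x, t] = cancer_dataset;               % stand-in classification data
net1 = feedforwardnet(10, 'trainlm');  % mse performance by default
net2 = feedforwardnet(10, 'trainbr');
net1.trainParam.showWindow = false;    % suppress the training GUI
net2.trainParam.showWindow = false;
tic; train(net1, x, t); tLM = toc;     % time each training run
tic; train(net2, x, t); tBR = toc;
fprintf('trainlm: %.1f s   trainbr: %.1f s\n', tLM, tBR);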
2 comments
Andrew Diamond
24 Jan 2018
Did you ever ping MATLAB support on this? As they say, "Inquiring minds want to know."
Answers (2)
Greg Heath
28 Aug 2016
Use the command
type trainbr
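A couple of standard MATLAB utilities can also help with locating and searching the source (these are general commands, nothing specific to trainbr):
which trainbr -all    % locate the implementation file on the path
edit trainbr          % open the source to search for the Foresee-Hagan update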
Thank you for formally accepting my answer
Greg
Mustafa Sobhy
22 Aug 2019
Because it requires the computation of the Hessian matrix of the performance index.
Source: "Gauss-Newton Approximation to Bayesian Learning" (Foresee and Hagan, 1997).
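To sketch roughly what that means (this is an illustrative stand-in, not the trainbr source): Foresee and Hagan replace the exact Hessian of the regularized performance index with a Gauss-Newton approximation built from the Jacobian J of the network errors, and then re-estimate the regularization parameters from the effective number of parameters.
n = 200;  N = 30;                     % number of errors / number of weights (stand-ins)
J = randn(n, N);                      % Jacobian of the errors w.r.t. the weights
e = randn(n, 1);                      % error vector
w = randn(N, 1);                      % weight vector
alpha = 0.01;  beta = 1;              % regularization hyperparameters
Ed = e'*e;  Ew = w'*w;                % sum of squared errors / sum of squared weights
H = 2*beta*(J'*J) + 2*alpha*eye(N);   % Gauss-Newton approximation of the Hessian
gamma = N - 2*alpha*trace(inv(H));    % effective number of parameters
alpha = gamma/(2*Ew);                 % MacKay re-estimation of alpha ...
beta  = (n - gamma)/(2*Ed);           % ... and of beta
Even with this approximation, the Jacobian still has to be formed and the N-by-N matrix factorized at each step, which is the per-iteration cost trainlm also pays, and it is the operation that reportedly blocks trainbr from running on a GPU.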
0 comments