Regularization tools for training large feed-forward neural networks using automatic differentiation
1998 (English). In: Optimization Methods & Software, Vol. 10, no. 1, pp. 49-69. Article in journal (Refereed). Published.
We describe regularization tools for training large-scale artificial feed-forward neural networks. We propose algorithms that explicitly solve a sequence of Tikhonov-regularized nonlinear least squares problems. For large-scale problems, new special-purpose automatic differentiation techniques are used within a conjugate gradient method to compute a truncated Gauss-Newton search direction. The algorithms developed exploit the structure of the problem in different ways and perform much better than a Polak-Ribière-based method. All algorithms are tested on benchmark problems, following the guidelines of Lutz Prechelt's Proben1 package. All software is written in Matlab and gathered in a toolbox.
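The core step the abstract describes, a Gauss-Newton direction for a Tikhonov-regularized nonlinear least squares problem, solved over a sequence of regularization parameters, can be sketched as follows. This is a minimal one-parameter illustration, not the paper's Matlab toolbox: the exponential model, the data, and the damping schedule are invented for illustration, and the scalar normal equation is solved directly rather than by the truncated conjugate gradient iteration with automatic differentiation that the paper uses.

```python
import math

def gauss_newton_tikhonov(xs, ys, a0=0.0, lambdas=(1e-1, 1e-3, 1e-6), iters=25):
    """Fit the model m(x; a) = exp(a*x) by solving a sequence of
    Tikhonov-regularized nonlinear least squares problems
        min_a  ||m(x; a) - y||^2 + lam * a^2
    with a Gauss-Newton step on each regularized objective."""
    a = a0
    for lam in lambdas:                 # decreasing regularization sequence
        for _ in range(iters):
            r = [math.exp(a * x) - y for x, y in zip(xs, ys)]    # residuals
            J = [x * math.exp(a * x) for x in xs]                # Jacobian column dm/da
            g = sum(Ji * ri for Ji, ri in zip(J, r)) + lam * a   # gradient J'r + lam*a
            H = sum(Ji * Ji for Ji in J) + lam                   # GN Hessian J'J + lam
            a -= g / H  # scalar normal-equation solve (the paper uses truncated CG)
    return a

# Noiseless synthetic data generated with a = 0.5; as the regularization
# parameter is driven toward zero, the recovered parameter approaches 0.5.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.exp(0.5 * x) for x in xs]
a_hat = gauss_newton_tikhonov(xs, ys)
```

In the large-scale network setting, the system (JᵀJ + λI)δ = −(Jᵀr + λw) is too large to form explicitly, which is why the paper computes Jacobian-vector products by automatic differentiation inside a conjugate gradient solver instead of assembling J.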
Place, publisher, year, edition, pages
1998. Vol. 10, no. 1, pp. 49-69.
Identifiers
URN: urn:nbn:se:umu:diva-21927
ISSN: 1055-6788
OAI: oai:DiVA.org:umu-21927
DiVA: diva2:212182