ITEM METADATA RECORD
Title: Fast learning algorithms for feedforward neural networks
Authors: Jiang, MH ×
Gielen, Georges
Zhang, B
Luo, ZS #
Issue Date: Jan-2003
Publisher: Kluwer Academic Publishers
Series Title: Applied Intelligence, vol. 18, issue 1, pages 37-54
Abstract: In order to improve the training speed of multilayer feedforward neural networks (MLFNN), we propose and explore two new fast backpropagation (BP) algorithms, obtained (1) by changing the error functions, using the exponent attenuation (or bell impulse) function and the Fourier kernel function as alternatives, and (2) by introducing a hybrid conjugate-gradient algorithm with global optimization of the dynamic learning rate, to overcome the conventional BP learning problems of getting stuck in local minima and slow convergence. Our experimental results demonstrate the effectiveness of the modified error functions, since the training speed is faster than that of existing fast methods. In addition, our hybrid algorithm achieves a higher recognition rate than the Polak-Ribière conjugate-gradient and conventional BP algorithms, and requires less training time, has lower complexity, and shows stronger robustness than the Fletcher-Reeves conjugate-gradient and conventional BP algorithms on real speech data.
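Note: the record does not give the exact forms of the modified error functions, so the sketch below only illustrates the general idea of swapping the conventional squared-error loss for a bell-shaped attenuation loss inside a plain backpropagation loop. The one-hidden-layer network, the loss form L(e) = 1 - exp(-e^2 / (2*beta^2)), and all parameter values are hypothetical assumptions for illustration, not the authors' method.

    # Illustrative sketch (not the paper's formulation): one-hidden-layer BP where
    # the squared-error loss can be replaced by a hypothetical bell-impulse-style
    # attenuation loss L(e) = 1 - exp(-e^2 / (2*beta^2)). beta is an assumed parameter.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def train(X, T, hidden=8, lr=0.5, beta=1.0, epochs=2000, use_bell_loss=True):
        n_in, n_out = X.shape[1], T.shape[1]
        W1 = rng.normal(scale=0.5, size=(n_in, hidden))
        W2 = rng.normal(scale=0.5, size=(hidden, n_out))
        for _ in range(epochs):
            # forward pass
            H = sigmoid(X @ W1)
            Y = sigmoid(H @ W2)
            e = Y - T
            if use_bell_loss:
                # dL/de for L(e) = 1 - exp(-e^2 / (2*beta^2)); the gradient is
                # attenuated for large errors, reshaping the error surface.
                dL = (e / beta**2) * np.exp(-e**2 / (2.0 * beta**2))
            else:
                # conventional BP: squared-error loss, dL/de = e
                dL = e
            # backward pass through the sigmoid layers
            dY = dL * Y * (1.0 - Y)
            dH = (dY @ W2.T) * H * (1.0 - H)
            W2 -= lr * H.T @ dY
            W1 -= lr * X.T @ dH
        return W1, W2

    # toy usage on XOR, purely to show the training loop runs
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    W1, W2 = train(X, T)
    print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))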
URI: 
ISSN: 0924-669X
Publication status: published
KU Leuven publication type: IT
Appears in Collections:ESAT - MICAS, Microelectronics and Sensors
× corresponding author
# (joint) last author

Files in This Item:

There are no files associated with this item.

All items in Lirias are protected by copyright, with all rights reserved.
