
Proceedings Paper

Adaptive structure feed-forward neural networks using polynomial activation functions
Author(s): Liying Ma; Khashayar Khorasani

Paper Abstract

In cascade-correlation (CC) and constructive one-hidden-layer networks, structural-level adaptation is achieved by incorporating new hidden units with identical activation functions, one at a time, into the actively evolving network. Functional-level adaptation has received little attention, since selecting the activation functions considerably enlarges the search space and also requires a systematic and rigorous search algorithm. In this paper, we present a new strategy, applicable to both fixed-structure and constructive network training, that uses activation functions with hierarchically increasing degrees of nonlinearity as the constructive learning of a one-hidden-layer feed-forward neural network (FNN) progresses. Specifically, the orthonormal Hermite polynomials are used as the activation functions of the hidden units; these polynomials have several properties that are beneficial to network training. Simulation results on several noisy regression problems reveal that our scheme produces FNNs that generalize considerably better than one-hidden-layer constructive FNNs with identical sigmoidal activation functions, particularly on more complicated problems.
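
The abstract does not reproduce the paper's exact definitions, but the sketch below illustrates the kind of hidden-unit activations it describes, assuming the standard orthonormal Hermite functions h_n(x) = (2^n n! sqrt(pi))^(-1/2) e^(-x^2/2) H_n(x), where H_n are the physicists' Hermite polynomials. The function name hermite_activations and the demo values are illustrative, not from the paper.

```python
import math
import numpy as np

def hermite_activations(x, max_degree):
    """Evaluate orthonormal Hermite activation functions h_0 ... h_max_degree at x.

    Uses the physicists' Hermite recurrence H_{n+1}(x) = 2x H_n(x) - 2n H_{n-1}(x)
    with normalization h_n(x) = (2^n n! sqrt(pi))**(-1/2) * exp(-x**2 / 2) * H_n(x),
    so the h_n are mutually orthonormal over the real line and decay at infinity.
    """
    x = np.asarray(x, dtype=float)
    gauss = np.exp(-x ** 2 / 2.0)
    H_prev = np.ones_like(x)                                      # H_0(x) = 1
    H_curr = 2.0 * x                                              # H_1(x) = 2x
    acts = [gauss * H_prev / np.pi ** 0.25]                       # h_0
    if max_degree >= 1:
        acts.append(gauss * H_curr / np.sqrt(2.0 * np.sqrt(np.pi)))  # h_1
    for n in range(1, max_degree):
        H_next = 2.0 * x * H_curr - 2.0 * n * H_prev              # H_{n+1}
        norm = 1.0 / np.sqrt(2.0 ** (n + 1) * math.factorial(n + 1) * np.sqrt(np.pi))
        acts.append(gauss * norm * H_next)                        # h_{n+1}
        H_prev, H_curr = H_curr, H_next
    return np.stack(acts, axis=-1)        # shape (..., max_degree + 1)

# Illustration: in a constructive scheme of this kind, the k-th hidden unit added
# during training would use h_k as its activation, so each new unit contributes a
# higher degree of nonlinearity than the ones before it.
x = np.linspace(-4.0, 4.0, 5)
print(hermite_activations(x, max_degree=3).shape)   # (5, 4): columns h_0(x) .. h_3(x)
```

Because each h_n is bounded and orthogonal to the lower-order functions, adding a unit introduces a progressively more oscillatory basis function without duplicating the responses of earlier units, which is consistent with the hierarchical degrees of nonlinearity the abstract refers to.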

Paper Details

Date Published: 30 March 2000
PDF: 10 pages
Proc. SPIE 4055, Applications and Science of Computational Intelligence III, (30 March 2000); doi: 10.1117/12.380560
Author Affiliations:
Liying Ma, Concordia Univ. (Canada)
Khashayar Khorasani, Concordia Univ. (Canada)


Published in SPIE Proceedings Vol. 4055:
Applications and Science of Computational Intelligence III
Kevin L. Priddy; Paul E. Keller; David B. Fogel, Editor(s)

© SPIE.