
Proceedings Paper

Chebyshev polynomials-based (CPB) unified model neural networks for function approximation
Author(s): Tsu-Tian Lee; Jin-Tsong Jeng

Paper Abstract

In this paper, we propose an approximate transformable technique, comprising a direct transformation and an indirect transformation, to obtain CPB unified model neural networks for feedforward/recurrent neural networks via Chebyshev polynomial approximation. Based on this technique, we derive the relationship between single-layer neural networks and multilayer perceptron neural networks. We show that the CPB unified model neural networks can be represented as functional-link networks based on Chebyshev polynomials, and that these networks use the recursive least squares method with a forgetting factor as the learning algorithm. The CPB unified model neural networks not only retain the universal approximation capability but also learn faster than conventional feedforward/recurrent neural networks. Computer simulations show that the proposed method achieves universal approximation on several function approximation tasks with a considerable reduction in learning time.
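The abstract's central idea, expanding the input with Chebyshev polynomials so the model becomes linear in its parameters, then fitting it with recursive least squares (RLS) using a forgetting factor, can be sketched as follows. This is an illustrative reconstruction based only on the abstract, not the authors' code; the polynomial order, forgetting factor value, and test function are arbitrary choices for demonstration.

```python
import numpy as np

def chebyshev_features(x, order):
    """Chebyshev polynomials of the first kind on [-1, 1] via the
    recurrence T0 = 1, T1 = x, T_{n+1} = 2 x T_n - T_{n-1}."""
    T = [np.ones_like(x), x]
    for _ in range(2, order + 1):
        T.append(2.0 * x * T[-1] - T[-2])
    return np.stack(T[: order + 1], axis=-1)  # shape (N, order + 1)

def rls_fit(Phi, y, lam=0.99, delta=1e3):
    """Recursive least squares with forgetting factor lam.
    Phi: (N, n) feature matrix, y: (N,) targets."""
    n = Phi.shape[1]
    w = np.zeros(n)          # weight estimate
    P = delta * np.eye(n)    # inverse-correlation matrix, large initial value
    for phi, t in zip(Phi, y):
        k = P @ phi / (lam + phi @ P @ phi)   # gain vector
        w = w + k * (t - phi @ w)             # update on prediction error
        P = (P - np.outer(k, phi @ P)) / lam  # discount old data by 1/lam
    return w

# Approximate sin(pi * x) on [-1, 1] with an order-7 Chebyshev expansion.
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(np.pi * x)
Phi = chebyshev_features(x, order=7)
w = rls_fit(Phi, y)
err = np.max(np.abs(Phi @ w - y))  # small max error on this smooth target
```

Because the Chebyshev expansion makes the model linear in `w`, each RLS step is a closed-form update rather than a gradient descent iteration, which is the source of the faster learning the abstract claims relative to backpropagation-trained networks.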

Paper Details

Date Published: 4 April 1997
PDF: 10 pages
Proc. SPIE 3077, Applications and Science of Artificial Neural Networks III, (4 April 1997); doi: 10.1117/12.271500
Author Affiliations:
Tsu-Tian Lee, National Taiwan Institute of Technology (Taiwan)
Jin-Tsong Jeng, National Taiwan Institute of Technology (Taiwan)


Published in SPIE Proceedings Vol. 3077:
Applications and Science of Artificial Neural Networks III
Steven K. Rogers, Editor(s)

© SPIE.