
Proceedings Paper

Symmetric table addition methods for neural network approximations
Author(s): Nihal Koc-Sahan; Jason Schlessman; Michael J. Schulte

Paper Abstract

Symmetric table addition methods (STAMs) approximate functions by performing parallel table lookups, followed by multioperand addition. STAMs require significantly less memory than direct table lookups and are faster than piecewise linear approximations. This paper investigates the application of STAMs to the sigmoid function and its derivative, which are commonly used in artificial neural networks. Compared to direct table lookups, STAMs require between 23 and 41 times less memory for sigmoid and between 24 and 46 times less memory for sigmoid's derivative, when the input operand size is 16 bits and the output precision is 12 bits.
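To make the abstract's idea concrete, here is a minimal sketch of a STAM-style approximation of the sigmoid: the input is split into three bit fields, two tables are looked up in parallel, and their outputs are added. The field widths (4+4+4 bits), table names, and centering constants below are illustrative assumptions, not the parameters used in the paper, and the sketch omits the symmetry folding that gives STAMs part of their memory savings.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dsigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)

# Hypothetical split: a 12-bit fractional input x in [0, 1) is divided into
# three 4-bit fields x0 (high), x1 (middle), x2 (low).
N0, N1, N2 = 4, 4, 4
ULP = 2.0 ** -(N0 + N1 + N2)

# Midpoints of the intervals spanned by the middle and low fields,
# used to center each table entry on its interval.
mid1 = (2 ** N1 - 1) / 2 * 2.0 ** -(N0 + N1)
mid2 = (2 ** N2 - 1) / 2 * ULP

# Table a0, indexed by (x0, x1): the function value at the center
# of the x2 interval.
a0 = [[sigmoid((i0 * 2 ** N1 + i1) * 2.0 ** -(N0 + N1) + mid2)
       for i1 in range(2 ** N1)] for i0 in range(2 ** N0)]

# Table a1, indexed by (x0, x2): a first-order correction, using the
# slope evaluated at the center of the x1 interval. In a true STAM,
# the symmetry of (x2 - mid2) about zero lets this table be halved.
a1 = [[dsigmoid(i0 * 2.0 ** -N0 + mid1) * (i2 * ULP - mid2)
       for i2 in range(2 ** N2)] for i0 in range(2 ** N0)]

def stam_sigmoid(i):
    """Approximate sigmoid(i * ULP): two parallel lookups, one addition."""
    i0 = i >> (N1 + N2)
    i1 = (i >> N2) & (2 ** N1 - 1)
    i2 = i & (2 ** N2 - 1)
    return a0[i0][i1] + a1[i0][i2]

# Exhaustive check over all 4096 inputs.
max_err = max(abs(stam_sigmoid(i) - sigmoid(i * ULP))
              for i in range(2 ** (N0 + N1 + N2)))
```

Even this unfolded toy version stores 2 × 256 = 512 entries instead of the 4096 a direct table lookup would need, an 8× saving; the paper's larger 23-46× figures come from 16-bit inputs and the symmetric table folding omitted here.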

Paper Details

Date Published: 20 November 2001
PDF: 8 pages
Proc. SPIE 4474, Advanced Signal Processing Algorithms, Architectures, and Implementations XI, (20 November 2001); doi: 10.1117/12.448641
Author Affiliations:
Nihal Koc-Sahan, Lehigh Univ. (United States)
Jason Schlessman, Lehigh Univ. (United States)
Michael J. Schulte, Lehigh Univ. (United States)


Published in SPIE Proceedings Vol. 4474:
Advanced Signal Processing Algorithms, Architectures, and Implementations XI
Franklin T. Luk, Editor(s)

© SPIE.