
Proceedings Paper

Fast algorithm for a neocognitron neural network with back-propagation
Author(s): Kent Pu Qing; Robert W. Means

Paper Abstract

The neocognitron is a neural network that consists of many layers of partially connected cells. A new neocognitron architecture, the multilayer neocognitron with backpropagation learning (MNEOBP), is proposed, and the original neocognitron trained by backpropagation is shown to be a special case of it. The MNEOBP has several advantages: (1) its algorithm is four times faster than the earlier algorithm, since the number of cells calculated is reduced by a factor of four. (2) Changing the mask (kernel) size during the learning process can speed up training by almost a factor of three. (3) The MNEOBP architecture can be implemented with a new digital neural network VLSI chip set called the Vision Processor (ViP). The ViP exploits the convolutional structure of the network and can process a single 32 × 32 input layer in only 25.6 µs with an 8 × 8 receptive-field kernel.
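The per-layer computation the ViP accelerates is a 2D convolution of an input plane with a receptive-field kernel. A minimal sketch of that operation at the dimensions quoted in the abstract (32 × 32 input, 8 × 8 kernel) follows; the function name and the simple rectifying nonlinearity are illustrative assumptions, not details from the paper.

```python
import numpy as np

def layer_response(input_plane, kernel):
    """Illustrative neocognitron-style layer: slide the kernel over the
    input plane (valid convolution) and rectify the result.
    The rectification choice is an assumption for illustration."""
    ih, iw = input_plane.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.empty((oh, ow))
    for r in range(oh):
        for c in range(ow):
            # Inner product of the kernel with one receptive-field window
            out[r, c] = np.sum(input_plane[r:r + kh, c:c + kw] * kernel)
    return np.maximum(out, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 32))   # single 32 x 32 input layer
k = rng.standard_normal((8, 8))     # 8 x 8 receptive-field kernel
y = layer_response(x, k)
print(y.shape)  # (25, 25): one response per valid kernel position
```

Each output cell depends only on an 8 × 8 window of the input, which is the partial connectivity the abstract describes and the structure a convolution-oriented chip set can exploit.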

Paper Details

Date Published: 1 October 1991
PDF: 10 pages
Proc. SPIE 1569, Stochastic and Neural Methods in Signal Processing, Image Processing, and Computer Vision, (1 October 1991); doi: 10.1117/12.48371
Author Affiliations:
Kent Pu Qing, HNC, Inc. (United States)
Robert W. Means, HNC, Inc. (United States)


Published in SPIE Proceedings Vol. 1569:
Stochastic and Neural Methods in Signal Processing, Image Processing, and Computer Vision
Su-Shing Chen, Editor(s)

© SPIE.