Proceedings Paper

Multithresholding: a neural approach
Author(s): Wai Pan Cheung; C. K. Lee; K. C. Li

Paper Abstract

In this paper we present a neural computation model for histogram-based multithresholding. An image-dependent optimal thresholding vector is determined, with the number of elements in the vector characterized by the histogram. Since our model is a parallel implementation of maximum-interclass-variance thresholding, convergence is much faster than with a sequential search. Together with a real-time histogram builder, real-time adaptive image segmentation can be achieved. The multithresholding criterion is derived from maximizing the interclass variance; hence the average of the centers of gravity of the pixel values of two neighboring classes should equal the interclass threshold value. The learning (weight-matrix evolution) procedure of the neural model is developed from this condition and is a form of unsupervised competitive learning. We use a three-layer neural network with binary weight synapses. The number of neurons in the first layer equals the number of gray levels in the image, and complex-valued inputs are used so that the arguments of the second-layer outputs represent the centers of gravity of the classes. The third-layer neurons receive the argument outputs of the second layer and indicate when the optimum condition has been reached.
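The optimality condition stated above (each threshold equals the average of the centers of gravity of its two neighboring classes) can be illustrated without the neural architecture. The sketch below is not the authors' model; it is a minimal conventional iteration, assumed to operate on a NumPy gray-level histogram, that repeatedly enforces the same fixed-point condition until the thresholds stop moving. All names (`class_means`, `multithreshold`) are illustrative.

```python
# Minimal sketch (not the paper's neural implementation): iterate each
# threshold toward the mean of the centers of gravity of its two
# neighbouring classes, the condition derived in the abstract.
import numpy as np

def class_means(hist, thresholds):
    """Center of gravity (mean gray level) of each class defined by the thresholds."""
    levels = np.arange(len(hist))
    bounds = [0, *thresholds, len(hist)]
    means = []
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        h = hist[lo:hi]
        w = h.sum()
        # Fall back to the class midpoint if the class is empty.
        means.append(levels[lo:hi] @ h / w if w > 0 else (lo + hi - 1) / 2.0)
    return means

def multithreshold(hist, num_thresholds, max_iter=100):
    """Iterate t_k <- (mu_k + mu_{k+1}) / 2 until the thresholds converge."""
    # Start with evenly spaced thresholds over the gray-level range.
    t = list(np.linspace(0, len(hist), num_thresholds + 2, dtype=int)[1:-1])
    for _ in range(max_iter):
        mu = class_means(hist, t)
        new_t = [int(round((mu[k] + mu[k + 1]) / 2.0)) for k in range(num_thresholds)]
        if new_t == t:  # fixed point: optimality condition satisfied
            break
        t = new_t
    return t

# Example: a synthetic bimodal histogram over 256 gray levels, one threshold.
hist = np.zeros(256)
hist[40:80] = 100   # dark peak
hist[160:220] = 80  # bright peak
print(multithreshold(hist, num_thresholds=1))  # threshold falls between the peaks
```

At convergence each returned threshold sits midway between the means of the classes it separates, which is the same condition the paper's third-layer neurons are described as detecting; the neural model differs in reaching this point through parallel competitive learning rather than sequential iteration.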

Paper Details

Date Published: 1 May 1994
PDF: 10 pages
Proc. SPIE 2180, Nonlinear Image Processing V, (1 May 1994); doi: 10.1117/12.172569
Author Affiliations:
Wai Pan Cheung, Hong Kong Polytechnic (Hong Kong)
C. K. Lee, Hong Kong Polytechnic (Hong Kong)
K. C. Li, Hong Kong Polytechnic (Hong Kong)


Published in SPIE Proceedings Vol. 2180:
Nonlinear Image Processing V
Edward R. Dougherty; Jaakko Astola; Harold G. Longbotham, Editor(s)
