
Proceedings Paper

New neural-networks-based 3D object recognition system
Author(s): Purang Abolmaesumi; M. Jahed

Paper Abstract

Three-dimensional object recognition has long been one of the challenging problems in computer vision. Ullman and Basri (1991) proposed that this task can be accomplished using a database of 2-D views of the objects. The main problem with their system is that corresponding points must be known in order to interpolate between views; moreover, their system requires a supervisor to decide to which class a presented view belongs. In this paper, we propose a new momentum-Fourier descriptor that is invariant to scale, translation, and rotation. This descriptor provides the input feature vectors to our proposed system. Using the Dystal network, we show that objects can be classified with over 95% precision. We have used this system to classify objects such as the cube, cone, sphere, torus, and cylinder. Because of the nature of the Dystal network, the system reaches its stable point after a single presentation of each view. The system can also group similar views into a single class (e.g., for the cube, the system generated 9 different classes from 50 different input views), which can be used to select an optimal database of training views. The system is also robust to noise and to deformed views.
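The paper's exact momentum-Fourier formulation is not given in the abstract, but the invariance properties it claims can be illustrated with a standard Fourier descriptor of a closed 2-D contour: dropping the DC coefficient removes translation, taking coefficient magnitudes removes rotation (and starting-point shift), and normalizing by the first harmonic removes scale. A minimal sketch, assuming the contour is given as an (N, 2) array of boundary points (the function name and parameters here are illustrative, not the authors'):

```python
import numpy as np

def fourier_descriptor(contour, n_coeffs=8):
    """Scale-, translation-, and rotation-invariant Fourier descriptor
    of a closed 2-D contour given as an (N, 2) array of (x, y) points.
    This is a generic sketch, not the paper's momentum-Fourier descriptor.
    """
    # Treat the boundary as a complex periodic signal z(t) = x(t) + i*y(t).
    z = contour[:, 0] + 1j * contour[:, 1]
    F = np.fft.fft(z)
    F[0] = 0.0                  # discard DC term -> translation invariance
    mags = np.abs(F)            # magnitudes -> rotation / start-point invariance
    mags = mags / mags[1]       # normalize by first harmonic -> scale invariance
    return mags[1:1 + n_coeffs]
```

A descriptor computed this way is unchanged when the contour is scaled, rotated, and translated, which is the property the abstract requires of the feature vectors fed to the classifier.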

Paper Details

Date Published: 26 September 1997
PDF: 10 pages
Proc. SPIE 3208, Intelligent Robots and Computer Vision XVI: Algorithms, Techniques, Active Vision, and Materials Handling, (26 September 1997); doi: 10.1117/12.290302
Author Affiliations:
Purang Abolmaesumi, Sharif Univ. of Technology (Iran)
M. Jahed, Sharif Univ. of Technology (Iran)


Published in SPIE Proceedings Vol. 3208:
Intelligent Robots and Computer Vision XVI: Algorithms, Techniques, Active Vision, and Materials Handling
David P. Casasent, Editor(s)

© SPIE.