
Proceedings Paper

Exploring point-cloud features from partial body views for gender classification
Author(s): Aaron Fouts; Ryan McCoppin; Mateen Rizki; Louis Tamburino; Olga Mendoza-Schrock

Paper Abstract

In this paper we extend a previous exploration of histogram features extracted from 3D point cloud images of human subjects for gender discrimination. Feature extraction used a collection of concentric cylinders to define volumes for counting 3D points. The histogram features are characterized by a rotational axis and a selected set of volumes derived from the concentric cylinders. The point cloud images are drawn from the CAESAR anthropometric database provided by the Air Force Research Laboratory (AFRL) Human Effectiveness Directorate and SAE International. This database contains approximately 4400 high-resolution LIDAR whole-body scans of carefully posed human subjects. Success in our previous investigation was based on extracting features from full-body coverage, which required integration of multiple camera images. With full-body coverage, the central vertical body axis and orientation are readily obtainable; however, this is not the case with a single camera view providing less than one-half body coverage. Assuming that the subjects are upright, we need to determine or estimate the position of the vertical axis and the orientation of the body about this axis relative to the camera. In past experiments, the vertical axis was located through the center of mass of torso points projected onto the ground plane, and the body orientation was derived using principal component analysis. In a natural extension of our previous work to partial body views, the absence of rotational invariance about the cylindrical axis greatly increases the difficulty of gender classification. Even the problem of estimating the axis is no longer simple. We describe some simple feasibility experiments that use partial image histograms; here, the cylindrical axis is assumed to be known. We also discuss experiments with full-body images that explore the sensitivity of classification accuracy to displacements of the cylindrical axis. Our initial results provide the basis for further investigation of more complex partial-body viewing problems and new methods for estimating the two position coordinates of the axis location and the unknown body orientation angle.
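As a reading aid only, the sketch below illustrates the kind of pipeline the abstract describes: locating the vertical axis through the centroid of ground-plane-projected torso points, estimating body orientation with principal component analysis, and counting points in concentric cylindrical volumes to form a histogram feature. It is not the authors' implementation; the torso height band, the shell radii, the bin counts, and the normalization are assumptions made for illustration.

```python
import numpy as np

def estimate_axis_and_orientation(points, torso_z_range=(0.8, 1.4)):
    """Estimate the vertical body axis and orientation from a 3D point cloud.

    points: (N, 3) array of (x, y, z) samples with z vertical.
    torso_z_range: assumed height band (metres) used to pick torso points.
    """
    # Select torso points and project them onto the ground plane (x, y).
    torso = points[(points[:, 2] >= torso_z_range[0]) &
                   (points[:, 2] <= torso_z_range[1])]
    xy = torso[:, :2]

    # The vertical axis passes through the centre of mass of the projection.
    axis_xy = xy.mean(axis=0)

    # Body orientation about the axis from the major principal component of
    # the projected points (principal component analysis on a 2x2 covariance).
    centered = xy - axis_xy
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
    major = eigvecs[:, np.argmax(eigvals)]
    orientation = np.arctan2(major[1], major[0])
    return axis_xy, orientation

def cylinder_histogram(points, axis_xy,
                       radii=(0.05, 0.10, 0.15, 0.20, 0.30),
                       n_height_bins=10):
    """Count points in concentric cylindrical shells around the vertical axis.

    Returns a flattened histogram with one count per (shell, height-bin) volume.
    """
    r = np.linalg.norm(points[:, :2] - axis_xy, axis=1)
    z = points[:, 2]
    r_edges = np.concatenate(([0.0], radii))
    z_edges = np.linspace(z.min(), z.max(), n_height_bins + 1)
    hist, _, _ = np.histogram2d(r, z, bins=[r_edges, z_edges])
    # Normalise by the total point count so the feature is less sensitive
    # to scan density.
    return (hist / max(len(points), 1)).ravel()

# Example usage on a synthetic cloud:
# cloud = np.random.rand(5000, 3) * [0.5, 0.5, 1.8]
# axis_xy, theta = estimate_axis_and_orientation(cloud)
# features = cylinder_histogram(cloud, axis_xy)
```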

Paper Details

Date Published: 8 May 2012
PDF: 9 pages
Proc. SPIE 8402, Evolutionary and Bio-Inspired Computation: Theory and Applications VI, 84020L (8 May 2012); doi: 10.1117/12.921880
Author Affiliations:
Aaron Fouts, Wright State Univ. (United States)
Ryan McCoppin, Wright State Univ. (United States)
Mateen Rizki, Wright State Univ. (United States)
Louis Tamburino, Wright State Univ. (United States)
Olga Mendoza-Schrock, Air Force Research Lab. (United States)


Published in SPIE Proceedings Vol. 8402:
Evolutionary and Bio-Inspired Computation: Theory and Applications VI
Olga Mendoza-Schrock; Mateen M. Rizki; Todd V. Rovito, Editor(s)

© SPIE