
Proceedings Paper

Error minimizing algorithms for nearest neighbor classifiers
Author(s): Reid B. Porter; Don Hush; G. Beate Zimmer

Paper Abstract

Stack Filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems, and investigated their relationship to other types of discrete classifiers such as Decision Trees. In this paper we focus on a continuous-domain version of Stack Filter Classifiers which we call Ordered Hypothesis Machines (OHM), and investigate their relationship to Nearest Neighbor classifiers. We show that OHM classifiers provide a novel framework in which to train Nearest Neighbor type classifiers by minimizing empirical-error-based loss functions. We use the framework to investigate a new cost-sensitive loss function that allows us to train a Nearest Neighbor type classifier for low false alarm rate applications. We report results on both synthetic data and real-world image data.
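The abstract's idea of training a Nearest Neighbor type classifier under a cost-sensitive, empirical-error-based loss can be illustrated with a minimal sketch. This is not the paper's OHM formulation; the function names, the toy data, and the cost values are illustrative assumptions, showing only how weighting false alarms more heavily than misses biases evaluation toward low false alarm rates.

```python
# Hypothetical sketch: 1-NN prediction plus a cost-sensitive empirical loss.
# Not the authors' OHM method; names, data, and costs are assumptions.
import math

def nn_predict(train, x):
    # 1-NN: return the label of the closest training point (Euclidean distance).
    return min(train, key=lambda p: math.dist(p[0], x))[1]

def cost_sensitive_loss(train, test, false_alarm_cost=5.0, miss_cost=1.0):
    # Weighted empirical error: a false alarm (predict 1 when truth is 0)
    # costs more than a miss, so minimizing this loss favors classifiers
    # with low false alarm rates.
    loss = 0.0
    for x, y in test:
        y_hat = nn_predict(train, x)
        if y_hat == 1 and y == 0:
            loss += false_alarm_cost
        elif y_hat == 0 and y == 1:
            loss += miss_cost
    return loss / len(test)

# Tiny synthetic example (illustrative only).
train = [((0.0, 0.0), 0), ((0.2, 0.1), 0), ((1.0, 1.0), 1), ((0.9, 1.1), 1)]
test = [((0.1, 0.0), 0), ((1.1, 0.9), 1), ((0.6, 0.6), 1)]
print(cost_sensitive_loss(train, test))  # → 0.0 (all test points classified correctly)
```

In a model-selection loop, one would compare candidate prototype sets or parameters by this weighted loss rather than plain error rate, which is the general effect of a cost-sensitive criterion.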

Paper Details

Date Published: 3 February 2011
PDF: 10 pages
Proc. SPIE 7870, Image Processing: Algorithms and Systems IX, 787005 (3 February 2011); doi: 10.1117/12.877299
Author Affiliations:
Reid B. Porter, Los Alamos National Lab. (United States)
Don Hush, Los Alamos National Lab. (United States)
G. Beate Zimmer, Texas A&M Univ. (United States)

Published in SPIE Proceedings Vol. 7870:
Image Processing: Algorithms and Systems IX
Editors: Jaakko T. Astola; Karen O. Egiazarian

© SPIE.