
Proceedings Paper

Context-based pixelization model for the artificial retina using saliency map and skin color detection algorithm
Author(s): S. M. Jin; I. B. Lee; J. M. Han; J. M. Seo; K. S. Park

Paper Abstract

A key problem of artificial visual prostheses is low resolution due to the limited number of electrodes. Various methods, such as edge detection and contrast enhancement, have been studied as solutions to the low-resolution problem, and they have been applied to face or object recognition in close-up images. In this paper, we propose a region-of-interest detection method using a context-based model, which is suitable for real-world situations. The visually salient region is detected by combining a saliency map with color information. To evaluate the proposed model, gaze was recorded with an eye tracker while subjects viewed the original image and two types of 10 × 10 pixelized images, produced by the conventional method and the saliency-based method, respectively. The gaze for each pixelized image was compared with the gaze for the original image. The experiment showed that gaze under the proposed context-based model correlates much more strongly with gaze on the original image than gaze under the conventional model does.
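The pipeline the abstract describes — combine a saliency map with skin-color evidence, pick a region of interest, then downsample it to a 10 × 10 grid matching the electrode array — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the RGB skin thresholds, the global-deviation "saliency" stand-in (real saliency maps use Itti-style center-surround features), the `skin_weight` and `top_frac` parameters, and the demo image are all assumptions made for the sketch.

```python
import numpy as np

def skin_mask(img):
    # Crude RGB skin-color rule (illustrative thresholds, not the paper's detector)
    r, g, b = (img[..., c].astype(int) for c in range(3))
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & ((r - g) > 15)

def saliency_map(img):
    # Toy "saliency": deviation of intensity from the global mean, normalized
    # to [0, 1]; a stand-in for a proper center-surround saliency map
    gray = img.mean(axis=2)
    s = np.abs(gray - gray.mean())
    return s / (s.max() + 1e-9)

def salient_roi(img, skin_weight=0.5, top_frac=0.1):
    # Combine saliency with skin-color evidence, then take the bounding box
    # of the top fraction of the combined map as the region of interest
    combined = saliency_map(img) + skin_weight * skin_mask(img)
    thresh = np.quantile(combined, 1.0 - top_frac)
    ys, xs = np.nonzero(combined >= thresh)
    return ys.min(), ys.max() + 1, xs.min(), xs.max() + 1

def pixelize(img, roi, n=10):
    # Block-average the ROI down to an n x n grayscale grid, mimicking
    # the stimulation pattern of an n x n electrode array
    y0, y1, x0, x1 = roi
    gray = img[y0:y1, x0:x1].mean(axis=2)
    return np.array([[blk.mean() for blk in np.array_split(row, n, axis=1)]
                     for row in np.array_split(gray, n, axis=0)])

# Demo on a synthetic scene: dark noise background with a skin-toned patch
rng = np.random.default_rng(0)
img = rng.integers(0, 60, (120, 160, 3), dtype=np.uint8)
img[40:80, 60:110] = [200, 120, 90]   # bright, skin-like salient region
grid = pixelize(img, salient_roi(img))
print(grid.shape)  # (10, 10)
```

Under these assumptions, the skin-toned patch dominates both the saliency and skin terms, so the ROI collapses onto it and the 10 × 10 grid represents only that region rather than the whole frame.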

Paper Details

Date Published: 13 February 2008
PDF: 8 pages
Proc. SPIE 6806, Human Vision and Electronic Imaging XIII, 68060L (13 February 2008); doi: 10.1117/12.766312
Author Affiliations:
S. M. Jin, Seoul National Univ. (South Korea)
I. B. Lee, Seoul National Univ. (South Korea)
J. M. Han, Seoul National Univ. (South Korea)
J. M. Seo, Dongguk Univ. College of Medicine (South Korea)
K. S. Park, Seoul National Univ. (South Korea)

Published in SPIE Proceedings Vol. 6806:
Human Vision and Electronic Imaging XIII
Bernice E. Rogowitz; Thrasyvoulos N. Pappas, Editor(s)

© SPIE