
Proceedings Paper

Part-based deep representation for product tagging and search
Author(s): Keqing Chen

Paper Abstract

Despite previous studies, tagging and indexing product images remains challenging due to the large intra-class variation among products. Traditional methods extract quantized hand-crafted features, such as SIFT, to represent product images, but these features are not discriminative enough to handle such variation. For a more discriminative image representation, this paper first presents a novel deep convolutional neural network (DCNN) architecture pre-trained on a large-scale general image dataset. Compared with traditional features, our DCNN representation offers greater discriminative power with fewer dimensions. Moreover, we incorporate a part-based model into the framework to overcome the negative effects of poor alignment and cluttered backgrounds, further enhancing the descriptive ability of the deep representation. Finally, we collect and contribute a well-labeled shoe image database, TBShoes, on which we apply the part-based deep representation to product image tagging and search, respectively. The experimental results highlight the advantages of the proposed part-based deep representation.
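The abstract does not specify the network architecture or how part regions are defined, but the general idea of a part-based representation — extract a feature vector per part region and concatenate them — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the part boxes are hypothetical, and a simple channel-wise mean pooling stands in for the pre-trained DCNN feature extractor.

```python
import numpy as np

def extract_part_features(image, parts, feature_fn):
    """Concatenate per-part features into one representation (illustrative sketch).

    image: H x W x C array; parts: list of (y0, y1, x0, x1) boxes.
    feature_fn stands in for the pre-trained DCNN described in the paper.
    """
    feats = [feature_fn(image[y0:y1, x0:x1]) for (y0, y1, x0, x1) in parts]
    return np.concatenate(feats)

# Placeholder "DCNN": channel-wise mean pooling of the crop (assumption,
# used only so the sketch is self-contained and runnable).
def mean_pool(crop):
    return crop.reshape(-1, crop.shape[-1]).mean(axis=0)

# Hypothetical shoe image split into three part regions (e.g. toe / body / heel).
img = np.random.rand(64, 128, 3)
parts = [(0, 64, 0, 42), (0, 64, 42, 85), (0, 64, 85, 128)]
rep = extract_part_features(img, parts, mean_pool)
print(rep.shape)  # (9,) — 3 parts x 3 channels each
```

Pooling each part separately, rather than the whole image, is what localizes the representation: clutter or misalignment in one region only perturbs that part's slice of the final vector.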

Paper Details

Date Published: 19 June 2017
PDF: 8 pages
Proc. SPIE 10443, Second International Workshop on Pattern Recognition, 104431D (19 June 2017); doi: 10.1117/12.2280300
Keqing Chen, Tsinghua Univ. (China)


Published in SPIE Proceedings Vol. 10443:
Second International Workshop on Pattern Recognition
Xudong Jiang; Masayuki Arai; Guojian Chen, Editor(s)

© SPIE.