
Proceedings Paper

Deep convolutional network based on rank learning for OCT retinal images quality assessment
Author(s): Jia Yang Wang; Lei Zhang; Min Zhang; Jun Feng; Yi Lv

Paper Abstract

Measuring the visual quality of optical coherence tomography (OCT) images is important for subsequent disease diagnosis. This paper presents a novel OCT image quality assessment method that introduces the concept of pairwise learning from learning to rank (LTR) to extract image features sensitive to OCT quality levels. First, a simple multi-input Ranking-based OCT Features Extraction (ROFE) Network is constructed using the residual structure. Second, the ROFE Network is trained on pairs of images. Third, the trained ROFE Network is used to extract ranking-sensitive features from OCT images. Finally, a support vector regression (SVR) model maps these features to objective quality scores. To verify the effectiveness of the proposed method, 608 OCT images with subjective perceptual quality scores were collected and a series of experiments was carried out. The experimental results show that the scores produced by the proposed method correlate strongly with the subjective quality evaluations.
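Read as a pipeline, the method is a shared-weight (Siamese-style) ranking branch trained on image pairs, whose learned features are then regressed to quality scores with SVR. The sketch below is a minimal PyTorch illustration of that idea, not the authors' implementation; the layer sizes, feature dimension, margin ranking loss, and training details are assumptions, since the abstract does not specify them.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic 3x3 conv residual block (assumed structure)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(x + self.conv2(self.relu(self.conv1(x))))

class ROFENet(nn.Module):
    """Shared-weight branch: OCT image -> ranking-sensitive feature vector."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, 32, 7, stride=2, padding=3), nn.ReLU(inplace=True))
        self.blocks = nn.Sequential(ResidualBlock(32), ResidualBlock(32))
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(32, feat_dim)
        self.score = nn.Linear(feat_dim, 1)  # scalar used only by the ranking loss

    def features(self, x):
        h = self.pool(self.blocks(self.stem(x))).flatten(1)
        return self.fc(h)

    def forward(self, x):
        return self.score(self.features(x))

# Pairwise training step: push score(better image) above score(worse image).
net = ROFENet()
opt = torch.optim.Adam(net.parameters(), lr=1e-4)
rank_loss = nn.MarginRankingLoss(margin=1.0)

better = torch.randn(8, 1, 224, 224)  # stand-ins for higher-quality OCT images
worse = torch.randn(8, 1, 224, 224)   # stand-ins for lower-quality OCT images
target = torch.ones(8, 1)             # +1: first input should rank higher
loss = rank_loss(net(better), net(worse), target)
opt.zero_grad(); loss.backward(); opt.step()

# After training, freeze the branch and regress subjective scores with SVR.
import numpy as np
from sklearn.svm import SVR

with torch.no_grad():
    feats = net.features(better).numpy()  # (N, feat_dim) ranking-sensitive features
mos = np.random.rand(8)                   # stand-in for subjective quality scores
svr = SVR(kernel="rbf").fit(feats, mos)
objective_scores = svr.predict(feats)     # final objective quality scores

The margin ranking loss is only one plausible choice for the pairwise objective; a cross-entropy over score differences (RankNet-style) would fit the same description in the abstract.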

Paper Details

Date Published: 15 March 2019
PDF: 6 pages
Proc. SPIE 10953, Medical Imaging 2019: Biomedical Applications in Molecular, Structural, and Functional Imaging, 1095309 (15 March 2019); doi: 10.1117/12.2513689
Author Affiliations:
Jia Yang Wang, Northwest Univ. (China)
Lei Zhang, Northwest Univ. (China)
Min Zhang, Northwest Univ. (China)
Jun Feng, Northwest Univ. (China)
Yi Lv, Xi'an Jiaotong Univ. (China)


Published in SPIE Proceedings Vol. 10953:
Medical Imaging 2019: Biomedical Applications in Molecular, Structural, and Functional Imaging
Barjor Gimi; Andrzej Krol, Editor(s)

© SPIE.