
Proceedings Paper

A dropout distribution model on deep networks
Author(s): Fengqi Li; Helin Yang

Paper Abstract

Dropout has proven effective at controlling overfitting and improving the generalization of deep networks. However, conventional dropout trains the parameters of every layer with a single constant rate, which reduces classification accuracy and efficiency. To address this problem, this paper proposes a dropout rate distribution model derived from the relationship between the dropout rate and the layers of a deep network. First, we give a formal description of the dropout rate that reveals its relationship to the layers of the deep network. Second, we propose a distribution model for determining the dropout rate used when training each layer. Experiments on the MNIST and CIFAR-10 datasets evaluate the proposed model against networks trained with constant dropout rates. The results demonstrate that our model outperforms conventional dropout in both classification accuracy and efficiency.
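The core idea of the abstract, replacing a single constant dropout rate with a per-layer schedule, can be sketched as follows. The paper's actual distribution model is defined in the full text; the linear schedule, the rate bounds `p_min`/`p_max`, and the helper names below are illustrative assumptions only:

```python
import numpy as np

def layer_dropout_rates(num_layers, p_min=0.1, p_max=0.5):
    """Hypothetical per-layer schedule: the dropout rate varies with depth.
    A linear ramp is assumed here; the paper's model may assign rates differently."""
    return np.linspace(p_min, p_max, num_layers)

def dropout_forward(x, p, rng):
    """Standard inverted dropout: drop each unit with probability p and
    rescale survivors by 1/(1-p) so the expected activation is unchanged."""
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

# Apply a different dropout rate at each of three hidden layers (train time).
rng = np.random.default_rng(0)
activations = np.ones((4, 8))          # stand-in for a layer's activations
for p in layer_dropout_rates(num_layers=3):
    activations = dropout_forward(activations, p, rng)
```

With a constant rate, every layer would use the same `p`; the scheduled version lets shallow and deep layers regularize to different degrees, which is the comparison the experiments evaluate.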

Paper Details

Date Published: 29 August 2016
PDF: 7 pages
Proc. SPIE 10033, Eighth International Conference on Digital Image Processing (ICDIP 2016), 1003360 (29 August 2016); doi: 10.1117/12.2243971
Author Affiliations:
Fengqi Li, Dalian Univ. of Technology (China)
Helin Yang, Dalian Univ. of Technology (China)


Published in SPIE Proceedings Vol. 10033:
Eighth International Conference on Digital Image Processing (ICDIP 2016)
Charles M. Falco; Xudong Jiang, Editor(s)

© SPIE.