
Proceedings Paper

Convergence rates of finite difference stochastic approximation algorithms part II: implementation via common random numbers
Author(s): Liyi Dai

Paper Abstract

Stochastic optimization is a fundamental problem that finds applications in many areas, including the biological and cognitive sciences. The classical stochastic approximation algorithm for iterative stochastic optimization requires gradient information about the sample objective function, which is typically difficult to obtain in practice. Recently there has been renewed interest in derivative-free approaches to stochastic optimization. In this paper, we examine the rates of convergence of the Kiefer-Wolfowitz algorithm and the mirror descent algorithm when the gradient is approximated by finite differences generated through common random numbers. It is shown that the convergence of these algorithms can be accelerated by controlling the implementation of the finite differences. In particular, in terms of the iteration number n, the rate can be increased to n^{-2/5} in general and, for a broad class of problems in Monte Carlo optimization, to n^{-1/2}, the best possible rate for stochastic approximation.
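
To make the setting concrete, the following minimal sketch (not taken from the paper) illustrates a two-sided Kiefer-Wolfowitz finite-difference step in which both perturbed evaluations reuse the same random sample, i.e. common random numbers. The noisy objective f(x, xi), the step-size sequence a_n, and the difference-width sequence c_n below are hypothetical illustrative choices, not the paper's.

    # Illustrative sketch of Kiefer-Wolfowitz with common random numbers (CRN).
    # The objective, step sizes, and difference widths are assumptions for demo only.
    import numpy as np

    rng = np.random.default_rng(0)

    def f(x, xi):
        # Hypothetical noisy sample objective: quadratic in x plus noise driven by xi.
        return np.sum((x - 1.0) ** 2) + xi @ x

    def kw_step(x, n, a0=0.5, c0=1.0):
        a_n = a0 / n            # step size a_n (illustrative choice)
        c_n = c0 / n ** 0.2     # difference width c_n (illustrative decay, not the paper's)
        xi = rng.standard_normal(x.size)   # one common random sample reused below
        g = np.empty_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = c_n
            # Same xi in both evaluations: this is the common-random-numbers coupling.
            g[i] = (f(x + e, xi) - f(x - e, xi)) / (2.0 * c_n)
        return x - a_n * g

    x = np.zeros(3)
    for n in range(1, 2001):
        x = kw_step(x, n)
    print(x)   # should approach (1, 1, 1), the minimizer of the noise-free objective

The key point of the coupling is that both function evaluations in each finite difference are driven by the identical random sample, so much of the simulation noise cancels in the difference; how the widths c_n are then scheduled is what the paper analyzes to obtain the faster rates.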

Paper Details

Date Published: 19 May 2016
PDF: 19 pages
Proc. SPIE 9871, Sensing and Analysis Technologies for Biomedical and Cognitive Applications 2016, 98710J (19 May 2016); doi: 10.1117/12.2228251
Author Affiliations:
Liyi Dai, U.S. Army Research Office (United States)


Published in SPIE Proceedings Vol. 9871:
Sensing and Analysis Technologies for Biomedical and Cognitive Applications 2016
Liyi Dai; Yufeng Zheng; Henry Chu; Anke D. Meyer-Bäse, Editor(s)

© SPIE.