Volume 37, Issue 1
Descent Direction Stochastic Approximation Algorithm with Adaptive Step Sizes

Zorana Lužanin, Irena Stojkovska & Milena Kresoja

J. Comp. Math., 37 (2019), pp. 76-94.

Published online: 2018-08

  • Abstract

A stochastic approximation (SA) algorithm with new adaptive step sizes for solving unconstrained minimization problems in a noisy environment is proposed. The new adaptive step-size scheme uses order statistics of a fixed number of previous noisy function values as a criterion for accepting good steps and rejecting bad ones. The scheme allows the algorithm to take larger steps, avoiding steps proportional to $1/k$ when larger steps are expected to improve performance. An algorithm with the new adaptive scheme is defined for a general descent direction, and its almost sure convergence is established. The performance of the new algorithm is tested on a set of standard test problems and compared with relevant algorithms. Numerical results support the theoretical expectations and verify the efficiency of the algorithm regardless of the chosen search direction and noise level. Numerical results on problems arising in machine learning are also presented: a linear regression problem is considered using a real data set. The results suggest that the proposed algorithm shows promise.
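The step-size idea described in the abstract can be sketched informally. The sketch below is not the paper's algorithm: the quadratic test objective, the negative-gradient descent direction, the window length `m`, the fixed larger step, and the choice of the median as the order statistic are all illustrative assumptions. Only the overall mechanism comes from the abstract: accept a larger step when the newest noisy function value compares favorably against the ordered recent noisy values, and otherwise fall back to a classical step proportional to $1/k$.

```python
import numpy as np

def noisy_f(x, rng, sigma=0.1):
    # Noisy evaluation of a simple quadratic test objective (illustrative only).
    return float(np.sum(x**2) + rng.normal(0.0, sigma))

def adaptive_sa(x0, grad, m=5, step_big=0.5, iters=200, seed=0):
    """Rough sketch of an SA iteration with an adaptive step-size rule.

    The acceptance test (newest noisy value vs. the median of the last m
    noisy values) is an assumed stand-in; the paper's actual rule, based on
    order statistics of previous noisy function values, differs in detail.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    history = [noisy_f(x, rng)]        # past noisy function values
    for k in range(1, iters + 1):
        d = -grad(x)                   # a general descent direction
        recent = sorted(history[-m:])  # ordered recent noisy values
        threshold = recent[len(recent) // 2]  # median as the order statistic
        fx = noisy_f(x, rng)
        # "Good" step: keep the larger step; otherwise fall back to the
        # classical O(1/k) step that underpins the convergence guarantee.
        step = step_big if fx <= threshold else 1.0 / k
        x = x + step * d
        history.append(fx)
    return x

x_final = adaptive_sa(np.array([2.0, -1.5]), grad=lambda x: 2 * x)
```

In this toy setting the iterates contract toward the minimizer at the origin whichever branch fires, which is the point of the scheme: larger steps are used opportunistically while the $1/k$ fallback preserves the standard SA convergence behavior.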

  • AMS Subject Headings

90C15, 62L20

  • Copyright

COPYRIGHT: © Global Science Press

  • Email address

zorana@dmi.uns.ac.rs (Zorana Lužanin)

irenatra@pmf.ukim.mk (Irena Stojkovska)

milena.kresoja@dmi.uns.ac.rs (Milena Kresoja)

  • BibTeX
  • RIS
  • TXT
@Article{JCM-37-76,
  author  = {Lužanin, Zorana and Stojkovska, Irena and Kresoja, Milena},
  title   = {Descent Direction Stochastic Approximation Algorithm with Adaptive Step Sizes},
  journal = {Journal of Computational Mathematics},
  year    = {2018},
  volume  = {37},
  number  = {1},
  pages   = {76--94},
  issn    = {1991-7139},
  doi     = {https://doi.org/10.4208/jcm.1710-m2017-0021},
  url     = {http://global-sci.org/intro/article_detail/jcm/12650.html}
}
TY  - JOUR
T1  - Descent Direction Stochastic Approximation Algorithm with Adaptive Step Sizes
AU  - Lužanin, Zorana
AU  - Stojkovska, Irena
AU  - Kresoja, Milena
JO  - Journal of Computational Mathematics
VL  - 37
IS  - 1
SP  - 76
EP  - 94
PY  - 2018
DA  - 2018/08
SN  - 1991-7139
DO  - https://doi.org/10.4208/jcm.1710-m2017-0021
UR  - https://global-sci.org/intro/article_detail/jcm/12650.html
KW  - Unconstrained optimization
KW  - Stochastic optimization
KW  - Stochastic approximation
KW  - Noisy function
KW  - Adaptive step size
KW  - Descent direction
KW  - Linear regression model
ER  - 

Zorana Lužanin, Irena Stojkovska & Milena Kresoja. (2019). Descent Direction Stochastic Approximation Algorithm with Adaptive Step Sizes. Journal of Computational Mathematics. 37 (1). 76-94. doi:10.4208/jcm.1710-m2017-0021