Research Article

On the Convergence of Stochastic Aggregated Gradient Method

Volume: 15, Number: 1, June 30, 2023

Abstract

The problem of minimizing the sum of a large number of convex functions arises in many applications. Methods such as incremental gradient, stochastic gradient, and aggregated gradient are popular choices for solving such problems because they do not require a full gradient evaluation at every iteration. In this paper, we analyze a generalization of the stochastic aggregated gradient method via an alternative technique based on the convergence of iterative linear systems. The technique provides a short proof of the $O(\kappa^{-1})$ linear convergence rate in the quadratic case. We observe that the technique is rather restrictive in the general case and yields only weaker results.
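
As a concrete illustration of the aggregated-gradient idea the abstract refers to, the following is a minimal Python sketch of SAG-style updates on a toy finite sum of one-dimensional quadratics. The data, step size, and iteration count are illustrative assumptions for this sketch, not the setting analyzed in the paper.

```python
import random

# Sketch of the stochastic aggregated gradient (SAG) idea for minimizing
# f(x) = (1/n) * sum_i 0.5 * c_i * (x - t_i)^2, a finite sum of 1-D
# quadratics whose minimizer is the c-weighted mean of the t_i.
# All data below are made up for illustration.

random.seed(0)
n = 20
c = [random.uniform(0.5, 2.0) for _ in range(n)]   # component curvatures
t = [random.uniform(-1.0, 1.0) for _ in range(n)]  # component minimizers
x_star = sum(ci * ti for ci, ti in zip(c, t)) / sum(c)

x = 0.0
memory = [0.0] * n          # last gradient evaluated for each component
g_sum = 0.0                 # running sum of the stored gradients
step = 1.0 / (16 * max(c))  # conservative step size, as in SAG analyses

for _ in range(20000):
    i = random.randrange(n)       # sample one component uniformly at random
    g_new = c[i] * (x - t[i])     # fresh gradient of the i-th component only
    g_sum += g_new - memory[i]    # refresh the aggregated gradient sum
    memory[i] = g_new
    x -= step * g_sum / n         # step along the average of stored gradients
```

Only one component gradient is computed per iteration; the aggregated sum of stored gradients serves as a cheap surrogate for the full gradient, which is what gives the method its linear convergence on strongly convex quadratics.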


Details

Primary Language

English

Subjects

Mathematical Sciences

Journal Section

Research Article

Publication Date

June 30, 2023

Submission Date

December 16, 2021

Acceptance Date

January 26, 2023

Published in Issue

Year 2023 Volume: 15 Number: 1

APA
Oztoprak Topkaya, F. (2023). On the Convergence of Stochastic Aggregated Gradient Method. Turkish Journal of Mathematics and Computer Science, 15(1), 89-95. https://doi.org/10.47000/tjmcs.1037384