Research Article
On the Convergence of Stochastic Aggregated Gradient Method

Year 2023, Volume 15, Issue 1, 89-95, 30.06.2023
https://doi.org/10.47000/tjmcs.1037384

Abstract

The minimization problem of the sum of a large set of convex functions arises in various applications. Methods such as incremental gradient, stochastic gradient, and aggregated gradient are popular choices for solving such problems because they do not require a full gradient evaluation at every iteration. In this paper, we analyze a generalization of the stochastic aggregated gradient method via an alternative technique based on the convergence of iterative linear systems. The technique provides a short proof of the $O(\kappa^{-1})$ linear convergence rate in the quadratic case. We observe that the technique is rather restrictive in the general case and can provide only weaker results.
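As a rough, self-contained illustration of the kind of method discussed above (not the analysis carried out in the paper), the following NumPy sketch runs the stochastic average gradient (SAG) iteration of Schmidt, Le Roux, and Bach on a synthetic least-squares finite sum. The problem data, the conservative 1/(16L) step size, and the error monitoring are assumptions made only for this example.

```python
import numpy as np

# Minimal sketch of the stochastic average gradient (SAG) iteration
# (Schmidt, Le Roux, Bach, 2017) on a synthetic quadratic finite sum
#   f(x) = (1/n) * sum_i f_i(x),   f_i(x) = 0.5 * (a_i^T x - b_i)^2.
# Problem data and step size below are illustrative assumptions,
# not the setting analyzed in the paper.

rng = np.random.default_rng(0)
n, d = 50, 10                      # number of component functions, dimension
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = 0.5*(a_i^T x - b_i)^2."""
    return (A[i] @ x - b[i]) * A[i]

x = np.zeros(d)
y = np.zeros((n, d))               # table of stored (aggregated) component gradients
g_sum = y.sum(axis=0)              # running sum of the stored gradients
L = np.max(np.sum(A**2, axis=1))   # per-component gradient Lipschitz constant
alpha = 1.0 / (16 * L)             # conservative step size (assumption)

x_star = np.linalg.lstsq(A, b, rcond=None)[0]   # minimizer, for monitoring only

for k in range(20000):
    i = rng.integers(n)            # sample one component uniformly at random
    g_new = grad_i(x, i)
    g_sum += g_new - y[i]          # refresh only the i-th stored gradient
    y[i] = g_new
    x = x - alpha * g_sum / n      # step along the average of stored gradients
    if k % 5000 == 0:
        print(k, np.linalg.norm(x - x_star))
```

On a strongly convex instance such as this one, the printed distance to the minimizer decays roughly geometrically, which is the linear-convergence behaviour the abstract refers to in the quadratic case.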

References

  • Bertsekas, D.P., Incremental Gradient, Subgradient, and Proximal Methods for Convex Optimization: A Survey, In: Optimization for Machine Learning, MIT Press, 2012.
  • Bottou, L., Curtis, F.E., Nocedal, J., Optimization methods for large-scale machine learning, SIAM Review, 60(2018), 223–311.
  • Gurbuzbalaban, M., Ozdaglar, A., Parrilo, P.A., On the convergence rate of incremental aggregated gradient algorithms, SIAM Journal on Optimization, 27(2017), 1035–1048.
  • Hartfiel, D.J., Nonhomogeneous Matrix Products, World Scientific, 2002.
  • Schmidt, M., Le Roux, N., Bach, F., Minimizing finite sums with the stochastic average gradient, Mathematical Programming, 162(2017), 83–112.
  • Silvester, J.R., Determinants of block matrices, The Mathematical Gazette, 84(2000), 460–467.
  • Strang, G., Introduction to Applied Mathematics, Wellesley-Cambridge Press, 1986.
  • Van Loan, C.F., Golub, G., Matrix Computations, The Johns Hopkins University Press, 1996.

Details

Primary Language: English
Subjects: Mathematical Sciences
Journal Section: Articles
Authors: Figen Oztoprak Topkaya (ORCID: 0000-0003-3300-2750)
Publication Date: June 30, 2023
Published in Issue: Year 2023, Volume 15, Issue 1

Cite

APA Oztoprak Topkaya, F. (2023). On the Convergence of Stochastic Aggregated Gradient Method. Turkish Journal of Mathematics and Computer Science, 15(1), 89-95. https://doi.org/10.47000/tjmcs.1037384
AMA Oztoprak Topkaya F. On the Convergence of Stochastic Aggregated Gradient Method. TJMCS. June 2023;15(1):89-95. doi:10.47000/tjmcs.1037384
Chicago Oztoprak Topkaya, Figen. “On the Convergence of Stochastic Aggregated Gradient Method”. Turkish Journal of Mathematics and Computer Science 15, no. 1 (June 2023): 89-95. https://doi.org/10.47000/tjmcs.1037384.
EndNote Oztoprak Topkaya F (June 1, 2023) On the Convergence of Stochastic Aggregated Gradient Method. Turkish Journal of Mathematics and Computer Science 15 1 89–95.
IEEE F. Oztoprak Topkaya, “On the Convergence of Stochastic Aggregated Gradient Method”, TJMCS, vol. 15, no. 1, pp. 89–95, 2023, doi: 10.47000/tjmcs.1037384.
ISNAD Oztoprak Topkaya, Figen. “On the Convergence of Stochastic Aggregated Gradient Method”. Turkish Journal of Mathematics and Computer Science 15/1 (June 2023), 89-95. https://doi.org/10.47000/tjmcs.1037384.
JAMA Oztoprak Topkaya F. On the Convergence of Stochastic Aggregated Gradient Method. TJMCS. 2023;15:89–95.
MLA Oztoprak Topkaya, Figen. “On the Convergence of Stochastic Aggregated Gradient Method”. Turkish Journal of Mathematics and Computer Science, vol. 15, no. 1, 2023, pp. 89-95, doi:10.47000/tjmcs.1037384.
Vancouver Oztoprak Topkaya F. On the Convergence of Stochastic Aggregated Gradient Method. TJMCS. 2023;15(1):89-95.