IAD Index of Academic Documents
  • Turkish Journal of Mathematics and Computer Science
  • Volume: 15, Issue: 1

On the Convergence of Stochastic Aggregated Gradient Method

Author: Figen Oztoprak Topkaya
Pages: 89-95
DOI: 10.47000/tjmcs.1037384
Views: 50 | Downloads: 26
Publication Date: 2023-06-30
Article Type: Research Paper
Abstract: The problem of minimizing the sum of a large set of convex functions arises in various applications. Methods such as incremental gradient, stochastic gradient, and aggregated gradient are popular choices for solving these problems, as they do not require a full gradient evaluation at every iteration. In this paper, we analyze a generalization of the stochastic aggregated gradient method via an alternative technique based on the convergence of iterative linear systems. The technique provides a short proof of the $O(\kappa^{-1})$ linear convergence rate in the quadratic case. We observe that the technique is rather restrictive in the general case, where it can provide only weaker results.
Keywords: Unconstrained optimization, stochastic gradient, incremental methods
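
The aggregated-gradient idea mentioned in the abstract can be illustrated with a minimal sketch: each iteration refreshes the stored gradient of one randomly chosen component and steps along the average of all stored gradients, avoiding a full gradient evaluation. This is not the paper's implementation; the function name, step size, and the toy 1-D quadratic data below are illustrative assumptions.

```python
import random

def sag(a, b, x0=0.0, step=0.05, iters=3000, seed=0):
    """Stochastic aggregated (average) gradient sketch for minimizing
    f(x) = (1/n) * sum_i (a[i]/2) * (x - b[i])**2  (a sum of 1-D quadratics).

    Each iteration recomputes the gradient of ONE random component and
    steps along the average of all currently stored component gradients.
    """
    rng = random.Random(seed)
    n = len(a)
    x = x0
    g = [a[i] * (x - b[i]) for i in range(n)]  # table of stored component gradients
    g_sum = sum(g)                             # running sum of the table
    for _ in range(iters):
        i = rng.randrange(n)
        fresh = a[i] * (x - b[i])              # fresh gradient of component i only
        g_sum += fresh - g[i]                  # update the running sum in O(1)
        g[i] = fresh
        x -= step * g_sum / n                  # step along the aggregated gradient
    return x

# Illustrative data; the averaged quadratic has exact minimizer
# x* = sum(a_i * b_i) / sum(a_i).
a = [1.0, 2.0, 3.0]
b = [0.0, 1.0, 2.0]
x_star = sum(ai * bi for ai, bi in zip(a, b)) / sum(a)
x_hat = sag(a, b)
```

On this well-conditioned example the iterate approaches the minimizer at a linear rate; the $O(\kappa^{-1})$ rate discussed in the abstract concerns the analogous behavior of the method in the quadratic case.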


* Information about journals, articles, conferences, books, preprints, etc. may have changed; it is therefore advisable to consult the official page of the source. The information here is shared for informational purposes only, and IAD is not responsible for incorrect or missing information.


İzmir Academy Association
Copyright © 2023-2026