IAD Index of Academic Documents
  • Online Journal of Mathematics, Science and Technology Education
  • Vol. 6, Issue 2

A Mathematically Modified Adam Algorithm for Improved Convergence in Deep Neural Networks

Authors: Mark Laisin, Bright O Osu, Prisca Udodiri Duruojinkeya, Chigozie Chibuisi
Pages: 40-64
DOI: https://doi.org/10.5281/zenodo.16275796
Publication Date: 2025-12-30
Article Type: Research Paper
Abstract: This study introduces the Adaptive Moment Gradient Thresholding (AMGT) algorithm, a modified version of the Adam optimizer, aimed at enhancing convergence stability in deep neural networks. By leveraging optimization theory and addressing the limitations of Adam, AMGT was designed to tackle non-convexity, constrained environments, and gradient-based learning instability.
Keywords: Adaptive Optimization, Gradient Descent, Deep Neural Networks, Convergence Analysis, Momentum Thresholding, Non-Convex Optimization, Step Size Scheduling, Stochastic Optimization
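The abstract describes AMGT as an Adam variant with gradient thresholding, but the paper's exact update rule is not reproduced on this page. The sketch below is purely illustrative: it shows the standard Adam update with a hypothetical gradient-clipping threshold (`clip`) bolted on, which is an assumption about what "gradient thresholding" might mean and is not the authors' published method.

```python
import math

def amgt_like_step(theta, grad, m, v, t, lr=0.001, beta1=0.9,
                   beta2=0.999, eps=1e-8, clip=1.0):
    """One Adam-style update with a hypothetical gradient-threshold step.

    The 'clip' parameter and its placement are illustrative assumptions,
    not the AMGT algorithm from the paper.
    """
    # Hypothetical thresholding: clamp the raw gradient magnitude.
    g = max(-clip, min(clip, grad))
    m = beta1 * m + (1 - beta1) * g        # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
    m_hat = m / (1 - beta1 ** t)           # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta**2 starting from theta = 3.0.
theta, m, v = 3.0, 0.0, 0.0
for t in range(1, 5001):
    grad = 2.0 * theta
    theta, m, v = amgt_like_step(theta, grad, m, v, t, lr=0.01)
print(theta)  # converges close to the minimum at 0
```

On this convex toy problem the clipped update behaves much like plain Adam; the abstract's claims about non-convex and constrained settings would require the paper's actual formulation to reproduce.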

ORIGINAL ARTICLE URL

* The journal, article, conference, book, or preprint information may have changed since it was indexed, so please consult the official page of the source for the most current details. The information here is shared for informational purposes only; IAD is not responsible for incorrect or missing information.


Index of Academic Documents
İzmir Academy Association
Copyright © 2023-2026