- Online Journal of Mathematics, Science and Technology Education
- Vol:6 Issue:2
A Mathematically Modified Adam Algorithm for Improved Convergence in Deep Neural Networks
Authors: Mark Laisin, Bright O. Osu, Prisca Udodiri Duruojinkeya, Chigozie Chibuisi
Pages: 40-64
DOI: https://doi.org/10.5281/zenodo.16275796
Publication Date: 2025-12-30
Article Type: Research Paper
Abstract: This study introduces the Adaptive Moment Gradient Thresholding (AMGT) algorithm, a modified version of the Adam optimizer, aimed at enhancing convergence stability in deep neural networks. By leveraging optimization theory and addressing the limitations of Adam, AMGT is designed to tackle non-convexity, constrained environments, and instability in gradient-based learning.

Keywords: Adaptive Optimization, Gradient Descent, Deep Neural Networks, Convergence Analysis, Momentum Thresholding, Non-Convex Optimization, Step Size Scheduling, Stochastic Optimization
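Since the abstract describes AMGT only at a high level, the following is a minimal illustrative sketch of what an Adam-style update with momentum thresholding could look like. The standard Adam moment updates are reproduced faithfully; the threshold parameter `tau` and the choice to clip the bias-corrected first moment are assumptions for illustration, not the paper's exact AMGT rule.

```python
import numpy as np

def amgt_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-8, tau=1.0):
    """One hypothetical AMGT-style step: standard Adam, with the
    bias-corrected first moment clipped elementwise to [-tau, tau].
    (tau and the clipping placement are illustrative assumptions.)"""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    m_hat = np.clip(m_hat, -tau, tau)           # momentum thresholding (assumed)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Usage: minimize f(x) = x^2, whose gradient is 2x.
theta, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 501):
    theta, m, v = amgt_step(theta, 2 * theta, m, v, t, lr=0.1)
```

On this toy quadratic the clipped update behaves like Adam once the gradient magnitude falls below `tau`, while large early gradients are damped, which is the kind of stabilizing effect the abstract attributes to thresholding.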
