Authors
Listed:
- Elnady, Sroor M.
- El-Beltagy, Mohamed
- Radwan, Ahmed G.
- Fouda, Mohammed E.
Abstract
Fractional Gradient Descent (FGD) methods extend classical optimization algorithms by incorporating fractional calculus, yielding notable improvements in convergence speed, stability, and accuracy. However, recent studies indicate that engineering challenges, such as tensor-based differentiation in deep neural networks, remain partially unresolved, prompting further investigation into the scalability and computational feasibility of FGD. This paper provides a comprehensive review of recent advances in FGD techniques, focusing on their approximation methods and convergence properties. These methods are systematically categorized by the strategies they use to overcome convergence challenges inherent in fractional-order calculations, such as non-locality and long-memory effects. Key techniques examined include modified fractional-order gradients designed to avoid singularities and ensure convergence to the true extremum. Adaptive step-size strategies and variable fractional-order schemes are analyzed, balancing rapid convergence with precise parameter estimation. In addition, truncation methods are explored as a means of mitigating the oscillatory behavior associated with fractional derivatives. By synthesizing convergence analyses from multiple studies, the paper offers insights into the theoretical foundations of these methods, including proofs of linear convergence. Ultimately, it highlights the effectiveness of various FGD approaches in accelerating convergence and enhancing stability, while also acknowledging significant gaps in practical implementations for large-scale engineering tasks, including deep learning. This review serves as a resource for researchers and practitioners in selecting appropriate FGD techniques for different optimization problems.
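To make the idea concrete, the following is a minimal, hypothetical Python sketch (not taken from the paper) of one commonly cited FGD variant: the Caputo-type fractional gradient is approximated by its leading series term, and the lower integration terminal is moved to the previous iterate, which is one way to realize the "modified fractional-order gradients" that avoid singularities and converge to the true extremum mentioned in the abstract. All function names, parameter values, and the toy objective are illustrative assumptions.

import math

def frac_gd(grad, x0, alpha=0.9, mu=0.1, iters=50, eps=1e-12):
    """Illustrative fractional gradient descent sketch (not the paper's algorithm).

    Uses a leading-term Caputo approximation with the lower terminal moved to
    the previous iterate, a common device for avoiding the singularity at the
    starting point and converging to the true extremum rather than a shifted one.
    """
    x_prev, x = x0, x0 + mu  # two points are needed to form |x_k - x_{k-1}|
    for _ in range(iters):
        # Modified fractional "gradient": f'(x_k) * |x_k - x_{k-1}|^(1 - alpha) / Gamma(2 - alpha)
        # eps avoids a zero step when consecutive iterates coincide.
        g = grad(x) * (abs(x - x_prev) + eps) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
        x_prev, x = x, x - mu * g
    return x

# Toy quadratic f(x) = (x - 3)^2 with gradient 2(x - 3); minimum at x = 3.
print(frac_gd(lambda x: 2.0 * (x - 3.0), x0=0.0))

On the toy quadratic the iterate moves toward the true minimizer at x = 3, and setting alpha = 1 recovers ordinary gradient descent, since the exponent on the step length becomes zero and Gamma(1) = 1.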
Suggested Citation
Elnady, Sroor M. & El-Beltagy, Mohamed & Radwan, Ahmed G. & Fouda, Mohammed E., 2025.
"A comprehensive survey of fractional gradient descent methods and their convergence analysis,"
Chaos, Solitons & Fractals, Elsevier, vol. 194(C).
Handle:
RePEc:eee:chsofr:v:194:y:2025:i:c:s0960077925001675
DOI: 10.1016/j.chaos.2025.116154