Research Article

Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms

Volume: 10 Number: 1 June 30, 2024

Abstract

This work investigates the integration of multiplicative calculus into gradient descent algorithms, including the Adaptive Gradient algorithm (AdaGrad), Root Mean Squared Propagation (RMSProp), Nesterov Accelerated Gradient (NAG), and Momentum, to optimize exponential-quadratic-logarithmic composite functions under a positivity constraint. The research, conducted across five scenarios within the Constrained and Unconstrained Testing Environment (CUTEst), compares these multiplicative methods with their classical counterparts in unconstrained settings and under a variety of constraint types, including bound and quadratic constraints. The results demonstrate a significant advantage for the multiplicative-based algorithms, especially in the unconstrained and bound-constrained scenarios, and highlight their potential for complex optimization tasks. Statistical analysis supports the observed performance advantages, indicating significant opportunities for optimization strategies in positive domains.
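The core idea can be illustrated with a minimal sketch. In multiplicative calculus, the multiplicative derivative of a positive function is f*(x) = exp(f'(x)/f(x)), and a common multiplicative gradient descent formulation updates the iterate by a multiplicative factor, x ← x · (f*(x))^(−η), which keeps x strictly positive. The step rule, the learning rate, and the test function below are illustrative assumptions, not the paper's exact experimental setup:

```python
import math

def mult_gradient_descent(f, df, x0, lr=0.1, steps=500):
    """Multiplicative gradient descent on a positive domain.

    Uses the multiplicative derivative f*(x) = exp(f'(x) / f(x)),
    so the update x <- x * f*(x)**(-lr) stays strictly positive
    whenever x0 > 0. (Illustrative formulation; not the paper's
    exact algorithm.)
    """
    x = x0
    for _ in range(steps):
        ratio = df(x) / f(x)          # log-derivative f'(x)/f(x)
        x *= math.exp(-lr * ratio)    # multiplicative update step
    return x

# Illustrative exponential-quadratic-logarithmic composite:
# f(x) = exp((ln x)^2), minimized at x = 1 on (0, inf).
f = lambda x: math.exp(math.log(x) ** 2)
df = lambda x: f(x) * 2.0 * math.log(x) / x

x_star = mult_gradient_descent(f, df, x0=3.0)
print(x_star)  # converges toward the minimizer x = 1
```

Because every update multiplies the current iterate by a positive factor, the positivity constraint is enforced by construction rather than by projection, which is the structural advantage the multiplicative variants exploit.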

References

  1. Ruder, S. (2016). An overview of gradient descent optimization algorithms. 1–14, [Online]. Available: http://arxiv.org/abs/1609.04747.
  2. Baldi, P. (1995). Gradient Descent Learning Algorithm Overview: A General Dynamical Systems Perspective. IEEE Trans. Neural Networks, 6(1), 182–195.
  3. Zou, F., Shen, L., Jie, Z., Zhang, W. and Liu, W. (2019). A sufficient condition for convergences of adam and rmsprop. Proc. IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit., 2019(1), 11119–11127.
  4. Wilson, A. C., Roelofs, R., Stern, M., Srebro, N. and Recht, B. (2017). The marginal value of adaptive gradient methods in machine learning. Adv. Neural Inf. Process. Syst., 2017, 4149–4159.
  5. Qian, N. (1999). On the momentum term in gradient descent learning algorithms. Neural Networks, 12(1), 145–151.
  6. Gould, N. I. M., Orban, D. and Toint, P. L. (2015). CUTEst: a Constrained and Unconstrained Testing Environment with safe threads for mathematical optimization. Comput. Optim. Appl., 60(3), 545–557.
  7. Zhang, J. (2019). Gradient Descent based Optimization Algorithms for Deep Learning Models Training. [Online]. Available: http://arxiv.org/abs/1903.03614.
  8. Polyak, B. T. (1964). Some methods of speeding up the convergence of iteration methods. USSR Comput. Math. Math. Phys., 4(5), 1–17.

Details

Primary Language

English

Subjects

Performance Evaluation

Journal Section

Research Article

Early Pub Date

June 28, 2024

Publication Date

June 30, 2024

Submission Date

April 12, 2024

Acceptance Date

June 16, 2024

Published in Issue

Year 2024 Volume: 10 Number: 1

APA
Kıymık, E., & Öztürk, A. E. (2024). Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences, 10(1), 209-227. https://doi.org/10.29132/ijpas.1467644
AMA
1.Kıymık E, Öztürk AE. Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences. 2024;10(1):209-227. doi:10.29132/ijpas.1467644
Chicago
Kıymık, Erkan, and Ali Emre Öztürk. 2024. “Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms”. International Journal of Pure and Applied Sciences 10 (1): 209-27. https://doi.org/10.29132/ijpas.1467644.
EndNote
Kıymık E, Öztürk AE (June 1, 2024) Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences 10 1 209–227.
IEEE
[1]E. Kıymık and A. E. Öztürk, “Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms”, International Journal of Pure and Applied Sciences, vol. 10, no. 1, pp. 209–227, June 2024, doi: 10.29132/ijpas.1467644.
ISNAD
Kıymık, Erkan - Öztürk, Ali Emre. “Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms”. International Journal of Pure and Applied Sciences 10/1 (June 1, 2024): 209-227. https://doi.org/10.29132/ijpas.1467644.
JAMA
1.Kıymık E, Öztürk AE. Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences. 2024;10:209–227.
MLA
Kıymık, Erkan, and Ali Emre Öztürk. “Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms”. International Journal of Pure and Applied Sciences, vol. 10, no. 1, June 2024, pp. 209-27, doi:10.29132/ijpas.1467644.
Vancouver
1.Erkan Kıymık, Ali Emre Öztürk. Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences. 2024 Jun. 1;10(1):209-27. doi:10.29132/ijpas.1467644