Research Article

Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms

Year 2024, 209–227, 30.06.2024
https://doi.org/10.29132/ijpas.1467644

Abstract

This work investigates the integration of multiplicative calculus into gradient descent algorithms, including the Adaptive Gradient algorithm (AdaGrad), Root Mean Squared Propagation (RMSProp), Nesterov Accelerated Gradient (NAG), and Momentum, to optimize exponential-quadratic-logarithmic composite functions under a positivity constraint. The study, conducted across five scenarios within the Constrained and Unconstrained Testing Environment (CUTEst), compares these multiplicative methods with their classical counterparts in a variety of constrained settings (bounded, quadratic, and others) as well as unconstrained settings. The results demonstrate the significant superiority of the multiplicative-based algorithms, especially in unconstrained and bound-constrained scenarios, and highlight their potential for complex optimization tasks. Statistical analysis supports the observed performance advantages, indicating significant opportunities for optimization strategies in positive domains.
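The core idea behind the multiplicative variants can be sketched as follows. In multiplicative calculus the derivative of a positive function is f*(x) = exp(f'(x)/f(x)), and the descent step multiplies (rather than shifts) the iterate, which keeps it in the positive domain. The sketch below is a minimal illustration of this plain multiplicative gradient descent step, not the authors' implementation; the test function, learning rate, and iteration count are chosen only for the example.

```python
import math

def multiplicative_gradient_descent(f, df, x0, lr=0.1, steps=200):
    """Minimize a positive-valued f over a positive domain using the
    multiplicative derivative f*(x) = exp(f'(x) / f(x)).

    Update rule: x <- x * f*(x)**(-lr), which keeps x > 0 whenever x0 > 0.
    """
    x = x0
    for _ in range(steps):
        mult_grad = math.exp(df(x) / f(x))  # multiplicative derivative at x
        x *= mult_grad ** (-lr)             # multiplicative (geometric) step
    return x

# Illustrative exponential-quadratic-logarithmic composite:
# f(x) = exp((ln x - 1)^2), minimized at x = e.
f = lambda x: math.exp((math.log(x) - 1.0) ** 2)
df = lambda x: f(x) * 2.0 * (math.log(x) - 1.0) / x

x_min = multiplicative_gradient_descent(f, df, x0=1.0, lr=0.1, steps=200)
```

Note that in log-coordinates u = ln x the update reduces to an ordinary gradient step on ln f, which is why these methods suit functions with multiplicative (exponential-logarithmic) structure.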

References

  • Ruder, S. (2016). An overview of gradient descent optimization algorithms. 1–14, [Online]. Available: http://arxiv.org/abs/1609.04747.
  • Baldi, P. (1995). Gradient Descent Learning Algorithm Overview: A General Dynamical Systems Perspective. IEEE Trans. Neural Networks, 6(1), 182–195.
  • Zou, F., Shen, L., Jie, Z., Zhang, W. and Liu, W. (2019). A sufficient condition for convergences of Adam and RMSProp. Proc. IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit., 2019(1), 11119–11127.
  • Wilson, A. C., Roelofs, R., Stern, M., Srebro, N. and Recht, B. (2017). The marginal value of adaptive gradient methods in machine learning. Adv. Neural Inf. Process. Syst., 2017, 4149–4159.
  • Qian, N. (1999). On the momentum term in gradient descent learning algorithms. Neural Networks, 12(1), 145–151.
  • Gould, N. I. M., Orban, D. and Toint, P. L. (2015). CUTEst: a Constrained and Unconstrained Testing Environment with safe threads for mathematical optimization. Comput. Optim. Appl., 60(3), 545–557.
  • Zhang, J. (2019). Gradient Descent based Optimization Algorithms for Deep Learning Models Training. [Online]. Available: http://arxiv.org/abs/1903.03614.
  • Polyak, B. T. (1964). Some methods of speeding up the convergence of iteration methods. USSR Comput. Math. Math. Phys., 4(5), 1–17.
  • Duchi, J. C., Bartlett, P. L. and Wainwright, M. J. (2012). Randomized smoothing for (parallel) stochastic optimization. Proc. IEEE Conf. Decis. Control, 12, 5442–5444.
  • Bashirov, A. E., Kurpinar, E. M. and Özyapici, A. (2008). Multiplicative calculus and its applications. J. Math. Anal. Appl., 337(1), 36–48.
  • Georgiev, S. G. and Zennir, K. (2022). Multiplicative Differential Calculus, 1st ed., vol. 1, no. 1. Taylor & Francis.
  • Gürefe, Y. and Misirli, E. (2011). Product Calculi And Its Applications. J. Phys. A Math. Theor., 44(8), 1–22.
  • Uzer, A. (2010). Multiplicative type complex calculus as an alternative to the classical calculus. Comput. Math. with Appl., 60(10), 2725–2737.
  • Stanley, D. (1999). A multiplicative calculus. Primus, 9(4), 310–326.
  • Özyapici, A. and Misirli, E. (2009). Multiplicative Calculus And Its Applications. Ege University.
  • Özyapıcı, A., Riza, M., Bilgehan, B. and Bashirov, A. E. (2014). On multiplicative and Volterra minimization methods. Numer. Algorithms, 67(3), 623–636.
  • Filip, D. A. and Piatecki, C. (2014). A non-Newtonian examination of the theory of exogenous economic growth. Working paper. [Online]. Available: https://hal.archives-ouvertes.fr/hal-00945781.
  • Florack, L. and Van Assen, H. (2012). Multiplicative calculus in biomedical image analysis. J. Math. Imaging Vis., 42(1), 64–75.
  • Riza, M. and Aktöre, H. (2015). The Runge-Kutta method in geometric multiplicative calculus. LMS J. Comput. Math., 18(1), 539–554.
  • Riza, M. and Eminağa, B. (2014). Bigeometric Calculus and Runge-Kutta Method. 1–19, [Online]. Available: http://arxiv.org/abs/1402.2877.
  • Cubillos, M. (2018). Modelling wave propagation without sampling restrictions using the multiplicative calculus I: Theoretical considerations. 1–18, [Online]. Available: http://arxiv.org/abs/1801.03402.
  • Boyd, S. and Vandenberghe, L. (2013). Convex Optimization. Cambridge University Press.
There are 22 references in total.

Details

Primary Language: English
Subjects: Performance Evaluation
Section: Articles
Authors

Erkan Kıymık 0000-0002-6383-1878

Ali Emre Öztürk 0000-0001-5904-9931

Early View Date: 28 June 2024
Publication Date: 30 June 2024
Submission Date: 12 April 2024
Acceptance Date: 16 June 2024
Published Issue: Year 2024

How to Cite

APA Kıymık, E., & Öztürk, A. E. (2024). Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences, 10(1), 209-227. https://doi.org/10.29132/ijpas.1467644
AMA Kıymık E, Öztürk AE. Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences. June 2024;10(1):209-227. doi:10.29132/ijpas.1467644
Chicago Kıymık, Erkan, and Ali Emre Öztürk. “Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms”. International Journal of Pure and Applied Sciences 10, no. 1 (June 2024): 209-27. https://doi.org/10.29132/ijpas.1467644.
EndNote Kıymık E, Öztürk AE (01 June 2024) Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences 10 1 209–227.
IEEE E. Kıymık and A. E. Öztürk, “Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms”, International Journal of Pure and Applied Sciences, vol. 10, no. 1, pp. 209–227, 2024, doi: 10.29132/ijpas.1467644.
ISNAD Kıymık, Erkan - Öztürk, Ali Emre. “Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms”. International Journal of Pure and Applied Sciences 10/1 (June 2024), 209-227. https://doi.org/10.29132/ijpas.1467644.
JAMA Kıymık E, Öztürk AE. Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences. 2024;10:209–227.
MLA Kıymık, Erkan and Ali Emre Öztürk. “Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms”. International Journal of Pure and Applied Sciences, vol. 10, no. 1, 2024, pp. 209-27, doi:10.29132/ijpas.1467644.
Vancouver Kıymık E, Öztürk AE. Exponential-Quadratic-Logarithmic Composite Function Optimization In Positive Domains: Leveraging Multiplicative Calculus In Gradient Descent Algorithms. International Journal of Pure and Applied Sciences. 2024;10(1):209-27.
