Research Article

Training feed forward neural network with modified Fletcher-Reeves method

Year 2018, Volume: 1, Issue: 1, 14 - 22, 01.08.2018

Abstract

In this research, a modified Fletcher-Reeves (FR) conjugate gradient algorithm for training large-scale feed forward neural networks (FFNN) is presented. Under mild conditions, we establish that the proposed method satisfies the sufficient descent condition and is globally convergent under the Wolfe line search conditions. Experimental results provide evidence that the proposed method is preferable and superior to the classical methods.
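
The abstract does not reproduce the modified FR formula itself, so the sketch below only illustrates the ingredients it names: the classic Fletcher-Reeves update beta_k = ||g_k||^2 / ||g_{k-1}||^2 combined with a Wolfe line search, applied to training a small feed forward network. The network size, toy data, finite-difference gradient, and use of SciPy's strong-Wolfe line search are illustrative assumptions, not the authors' setup.

```python
# Minimal sketch: Fletcher-Reeves conjugate gradient training of a tiny FFNN
# with a Wolfe line search. The paper's *modified* FR coefficient is not shown;
# beta_fr below is the standard ||g_k||^2 / ||g_{k-1}||^2 rule.
import numpy as np
from scipy.optimize import line_search

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                 # toy inputs (assumed data)
y = (X[:, 0] * X[:, 1] > 0).astype(float)     # toy targets

n_in, n_hid = 2, 5                            # assumed 2-5-1 architecture
shapes = [(n_in, n_hid), (1, n_hid), (n_hid, 1), (1, 1)]
sizes = [int(np.prod(s)) for s in shapes]

def unpack(w):
    # split the flat parameter vector into W1, b1, W2, b2
    parts, i = [], 0
    for s, n in zip(shapes, sizes):
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def loss(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                  # hidden layer
    out = (h @ W2 + b2).ravel()               # linear output
    return 0.5 * np.mean((out - y) ** 2)      # mean squared error

def grad(w, eps=1e-6):
    # central-difference gradient keeps the sketch short; in practice the
    # gradient would come from backpropagation
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w); e[i] = eps
        g[i] = (loss(w + e) - loss(w - e)) / (2 * eps)
    return g

w = rng.normal(scale=0.1, size=sum(sizes))
g = grad(w)
d = -g                                        # first direction: steepest descent
for k in range(50):
    # step length satisfying the strong Wolfe conditions
    alpha = line_search(loss, grad, w, d, gfk=g)[0]
    if alpha is None:                         # line search failed: stop (or restart with -g)
        break
    w = w + alpha * d
    g_new = grad(w)
    beta_fr = (g_new @ g_new) / (g @ g)       # classic Fletcher-Reeves coefficient
    d = -g_new + beta_fr * d                  # new conjugate direction
    g = g_new
print("final loss:", loss(w))
```

In the paper's method, a modified beta would replace beta_fr; the overall loop (direction update, Wolfe step, restart on line-search failure) stays the same.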

References

  • [1] Bishop C. (1995) Neural Networks for Pattern Recognition. Oxford University Press.
  • [2] Haykin S. (1994) Neural Networks: A Comprehensive Foundation. Macmillan College Publishing Company, New York.
  • [3] Hmich A., Badri A. and Sahel A. (2011). Automatic speaker identification by using the neural network. In IEEE 2011 International Conference on Multimedia Computing and Systems (ICMCS).
  • [4] Rumelhart D. and McClelland J. (1986) Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Cambridge: Bradford Books, MIT Press.
  • [5] Livieris I. and Pintelas P. (2009) Performance evaluation of descent CG methods for neural network training. In E.A. Lipitakis, editor, 9th Hellenic European Research on Computer Mathematics and its Applications Conference (HERCMA 09).
  • [6] Rumelhart D., Hinton G. and Williams R. (1986) Learning internal representations by error propagation. In Rumelhart and McClelland, editors, Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Cambridge.
  • [7] Battiti R. (1992). First- and second-order methods for learning: between steepest descent and Newton's method. Neural Computation 4(2).
  • [8] Abbo K. and Mohammed H. (2014) Conjugate gradient algorithm based on Aitken's process for training neural networks. Raf. J. of Comp. and Math., Vol. 11, No. 1.
  • [9] Livieris I. and Pintelas P. (2008). "A survey on algorithms for training artificial neural networks", Technical Report No. TR08-01, Department of Mathematics, University of Patras.
  • [10] Livieris I. and Pintelas P. (2011). "An advanced conjugate gradient training algorithm based on a modified secant equation", Technical Report No. TR11-03.
  • [11] Livieris I., Sotiropoulos D., and Pintelas P. (2009). "On descent spectral CG algorithms for training recurrent neural networks", IEEE Computer Society, 13th Panhellenic Conference on Informatics, pages 65–69.
  • [12] Fletcher R. and Reeves C. (1964) Function minimization by conjugate gradients. Computer Journal, Vol. 7.
  • [13] Gilbert J.C. and Nocedal J. (1992) Global convergence properties of conjugate gradient methods for optimization.
  • [14] Powell M.J.D. (1977) Restart procedures for the conjugate gradient method. Math. Programming 12.
  • [15] Touati-Ahmed D. and Storey C. (1990) Efficient hybrid conjugate gradient techniques. J. of Optimization Theory and Applications 64.

Details

Primary Language English
Subjects Mathematics
Section Articles
Authors

Yoksal A. Laylani

Khalil K. Abbo

Hisham M. Khudhur

Publication Date August 1, 2018
Published in Issue Year 2018, Volume: 1, Issue: 1

Cite

APA Laylani, Y. A., Abbo, K. K., & Khudhur, H. M. (2018). Training feed forward neural network with modified Fletcher-Reeves method. Journal of Multidisciplinary Modeling and Optimization, 1(1), 14-22.
AMA Laylani YA, Abbo KK, Khudhur HM. Training feed forward neural network with modified Fletcher-Reeves method. jmmo. August 2018;1(1):14-22.
Chicago Laylani, Yoksal A., Khalil K. Abbo, and Hisham M. Khudhur. “Training Feed Forward Neural Network With Modified Fletcher-Reeves Method”. Journal of Multidisciplinary Modeling and Optimization 1, no. 1 (August 2018): 14-22.
EndNote Laylani YA, Abbo KK, Khudhur HM (01 August 2018) Training feed forward neural network with modified Fletcher-Reeves method. Journal of Multidisciplinary Modeling and Optimization 1 1 14–22.
IEEE Y. A. Laylani, K. K. Abbo, and H. M. Khudhur, “Training feed forward neural network with modified Fletcher-Reeves method”, jmmo, vol. 1, no. 1, pp. 14–22, 2018.
ISNAD Laylani, Yoksal A. et al. “Training Feed Forward Neural Network With Modified Fletcher-Reeves Method”. Journal of Multidisciplinary Modeling and Optimization 1/1 (August 2018), 14-22.
JAMA Laylani YA, Abbo KK, Khudhur HM. Training feed forward neural network with modified Fletcher-Reeves method. jmmo. 2018;1:14–22.
MLA Laylani, Yoksal A., et al. “Training Feed Forward Neural Network With Modified Fletcher-Reeves Method”. Journal of Multidisciplinary Modeling and Optimization, vol. 1, no. 1, 2018, pp. 14-22.
Vancouver Laylani YA, Abbo KK, Khudhur HM. Training feed forward neural network with modified Fletcher-Reeves method. jmmo. 2018;1(1):14-22.