Abstract – In this paper, the performance of learning algorithms employed in artificial neural networks (ANNs) is analyzed for classifying baseband signals subjected to additive white Gaussian noise (AWGN) and a frequency-selective Rayleigh fading channel. Higher-order cumulants of the received signals are used as inputs to the ANN classifier. Different learning algorithms are applied to find the optimal weight set, which directly determines the performance of the network. The Levenberg-Marquardt (LM) and scaled conjugate gradient (SCG) algorithms, two of the most widely used learning algorithms, are compared for training the ANNs. Computer simulation results demonstrate that the LM-ANN classifier achieves considerably higher classification accuracy than the SCG-ANN recognizer, even with a small number of training steps.
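The abstract does not include the authors' implementation; as a rough illustration of the cumulant-based feature extraction it describes, a minimal Python sketch is given below. The moment convention M_pq = E[x^(p-q)·conj(x)^q], the cumulant formulas (e.g. C40 = M40 − 3·M20²), and the QPSK/AWGN test signal are assumptions chosen for illustration, not details taken from the paper.

```python
# Minimal sketch (not the authors' code): higher-order cumulant features
# estimated from complex baseband samples x[n].
import numpy as np

def moment(x, p, q):
    """Sample estimate of the mixed moment M_pq = E[x^(p-q) * conj(x)^q]."""
    return np.mean(x**(p - q) * np.conj(x)**q)

def cumulant_features(x):
    """Return a small vector of cumulants (C20, C21, C40, C41, C42)
    of the kind commonly fed to an ANN modulation classifier."""
    m20, m21 = moment(x, 2, 0), moment(x, 2, 1)
    m40, m41, m42 = moment(x, 4, 0), moment(x, 4, 1), moment(x, 4, 2)
    c20, c21 = m20, m21
    c40 = m40 - 3 * m20**2
    c41 = m41 - 3 * m20 * m21
    c42 = m42 - np.abs(m20)**2 - 2 * m21**2
    return np.array([c20, c21, c40, c41, c42])

# Illustrative example: unit-power QPSK symbols in AWGN at about 10 dB SNR.
rng = np.random.default_rng(0)
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=4096) / np.sqrt(2)
noise = (rng.standard_normal(4096) + 1j * rng.standard_normal(4096)) * np.sqrt(0.1 / 2)
print(np.abs(cumulant_features(symbols + noise)))
```

In a setup like the one the paper describes, such feature vectors would be computed for each received signal realization and used as ANN inputs, with the network weights trained by LM or SCG.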
Section | Articles
---|---
Authors |
Publication Date | 7 November 2016
Published Issue | Year 2016, Volume: 5