Abstract – In this paper, the performance of learning algorithms employed in artificial neural networks (ANNs) has been analyzed for classifying baseband signals subjected to additive white Gaussian noise (AWGN) and a frequency-selective Rayleigh fading channel. The higher-order cumulants of the received signals have been utilized as features in the ANN classifier. Different learning algorithms have been used to find the optimal weight set, which directly affects the performance of artificial neural networks. The performances of the Levenberg-Marquardt (LM) and scaled conjugate gradient (SCG) algorithms, the most widely employed learning algorithms, have been compared for the training of artificial neural networks. Computer simulation results have demonstrated that the LM-ANN classifier can reach much better classification accuracy than the SCG-ANN recognizer, even with a small number of training steps.
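
The sketch below is a minimal illustration (not the authors' implementation) of the pipeline the abstract describes: higher-order cumulants computed from noisy baseband frames and fed to a small ANN classifier. The modulation set, SNR, frame length, and network size are assumptions chosen for illustration, and fading is omitted; scikit-learn's MLPClassifier does not provide Levenberg-Marquardt or scaled conjugate gradient solvers, so a generic solver stands in here just to show the feature-to-classifier flow.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def symbols(mod, n):
    """Draw n unit-power baseband symbols for the given modulation (assumed set)."""
    if mod == "BPSK":
        return rng.choice([-1.0, 1.0], n).astype(complex)
    if mod == "QPSK":
        return (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
    if mod == "16QAM":
        lv = np.array([-3, -1, 1, 3])
        return (rng.choice(lv, n) + 1j * rng.choice(lv, n)) / np.sqrt(10)
    raise ValueError(mod)

def cumulant_features(x):
    """Normalized higher-order cumulants commonly used for zero-mean baseband signals."""
    c21 = np.mean(np.abs(x) ** 2)
    c20 = np.mean(x ** 2)
    c40 = np.mean(x ** 4) - 3 * c20 ** 2
    c42 = np.mean(np.abs(x) ** 4) - np.abs(c20) ** 2 - 2 * c21 ** 2
    return [np.abs(c20) / c21, np.abs(c40) / c21 ** 2, np.abs(c42) / c21 ** 2]

def noisy_frame(mod, n=1024, snr_db=10):
    """AWGN-corrupted frame (Rayleigh fading omitted for brevity)."""
    s = symbols(mod, n)
    sigma = np.sqrt(10 ** (-snr_db / 10) / 2)
    return s + sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

mods = ["BPSK", "QPSK", "16QAM"]
X = np.array([cumulant_features(noisy_frame(m)) for m in mods for _ in range(300)])
y = np.repeat(np.arange(len(mods)), 300)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# Small ANN on the cumulant features; solver choice is a stand-in, not LM/SCG.
clf = MLPClassifier(hidden_layer_sizes=(10,), solver="lbfgs", max_iter=500).fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))
```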
| Journal Section | Articles |
|---|---|
| Authors | |
| Publication Date | November 7, 2016 |
| Published in Issue | Year 2016, Volume: 5 |