Conference Paper

An Adaptive Sigmoidal Activation Function for Training Feed Forward Neural Network Equalizer

Volume 14, December 31, 2021
  • Zohra Zerdoumi
  • Fadila Benmeddour
  • Latifa Abdou
  • Djamel Benatia

Abstract

Feedforward neural networks (FFNNs) have attracted great attention in the digital communication area. In particular, they are investigated as nonlinear equalizers at the receiver to mitigate channel distortions and additive noise. The major drawback of the FFNN is its extensive training. We present a new approach that enhances training efficiency by adapting the activation function. Adapting the activation function considerably increases the flexibility and the nonlinear approximation capability of the FFNN. Consequently, the learning process performs better and the final state is kept away from undesired saturation regions. The effectiveness of the proposed method is demonstrated on several challenging channel models; it performs well even on severe nonlinear channels that are hard to equalize. Performance is measured through convergence properties and the minimum bit error rate achieved. The proposed algorithm converges rapidly and reaches the minimum steady-state value. All simulations show that the proposed method significantly improves the training efficiency of the FFNN-based equalizer compared with standard training.
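The core idea described above — treating the sigmoid's slope (gain) as a trainable parameter updated alongside the weights — can be illustrated with a minimal NumPy sketch. This is an assumed toy setup (a single sigmoidal neuron fitting a steeper target sigmoid by gradient descent on both the weight `w` and the gain `a`), not the paper's actual equalizer structure or its exact update rule.

```python
import numpy as np

def sigmoid(x, a):
    # Sigmoid with an adaptable slope (gain) a: f(x) = 1 / (1 + exp(-a*x))
    return 1.0 / (1.0 + np.exp(-a * x))

# Toy data: the neuron must match a sigmoid generated with a steeper gain (2.5).
rng = np.random.default_rng(0)
x = rng.uniform(-3.0, 3.0, 200)
target = sigmoid(x, 2.5)

w, a, lr = 1.0, 1.0, 0.5  # weight, adaptive gain, learning rate (assumed values)

def mse(w, a):
    return np.mean((sigmoid(w * x, a) - target) ** 2)

loss_before = mse(w, a)
for _ in range(500):
    z = w * x
    y = sigmoid(z, a)
    g = y * (1.0 - y)                      # since d/du sigmoid(u,1) = y*(1-y)
    err = y - target
    w -= lr * np.mean(2.0 * err * a * x * g)   # dL/dw: chain rule through a*w*x
    a -= lr * np.mean(2.0 * err * z * g)       # dL/da: gain gets its own gradient
loss_after = mse(w, a)
```

Because the gain has its own gradient, the neuron can steepen or flatten its response as the data demand, instead of the weights alone having to push activations toward (or away from) the sigmoid's flat saturation regions.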


Details

Primary Language

English

Subjects

Engineering

Journal Section

Conference Paper

Authors

Zohra Zerdoumi
Algeria

Fadila Benmeddour
Algeria

Latifa Abdou
Algeria

Djamel Benatia
Algeria

Publication Date

December 31, 2021

Submission Date

March 10, 2021

Acceptance Date

May 31, 2021

Published in Issue

Year 2021, Volume 14

APA
Zerdoumi, Z., Benmeddour, F., Abdou, L., & Benatia, D. (2021). An Adaptive Sigmoidal Activation Function for Training Feed Forward Neural Network Equalizer. The Eurasia Proceedings of Science Technology Engineering and Mathematics, 14, 1-7. https://doi.org/10.55549/epstem.1050144