Research Article

AMI: An Auditory Machine Intelligence Algorithm for Predicting Sensory-Like Data

Year 2020, Volume: 5 Issue: 2, 71 - 89, 01.12.2020

Abstract

In this paper, we present the results of experiments with a new biologically constrained machine intelligence algorithm, auditory machine intelligence (AMI), which is based on neural processing in the auditory cortex. AMI is an online learning technique for predicting sensory time series data, i.e., data that arrives in streams or in sequential order. The algorithm is inspired in particular by the mismatch negativity effect, which provides important evidence that the brain learns the statistical structure of the world it senses. Through a number of experiments on popular benchmarks, we show how the algorithm may be applied to real-world problems. We also compare the results of these experiments with those of two widely used time series prediction techniques, and the comparisons are encouraging.
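The full AMI algorithm is described in the body of the paper; as a rough illustration of the mismatch-driven online prediction idea summarized in the abstract, the sketch below shows a minimal linear predictor whose weights are updated at each step of a data stream in proportion to the prediction error (the "mismatch"). All names (MismatchPredictor, window, learning_rate) and the LMS-style update rule are illustrative assumptions, not the authors' method.

```python
import numpy as np

class MismatchPredictor:
    """Hypothetical online predictor driven by a mismatch (prediction-error)
    signal. Illustrative only; not the authors' AMI implementation."""

    def __init__(self, window=3, learning_rate=0.05):
        self.window = window                 # number of past samples used
        self.weights = np.zeros(window)      # linear predictive weights
        self.lr = learning_rate

    def step(self, history, target):
        # Predict the next value from the most recent `window` samples,
        # then update the weights in proportion to the mismatch between
        # the prediction and the observed target (an LMS-style rule).
        x = np.asarray(history[-self.window:], dtype=float)
        prediction = float(self.weights @ x)
        mismatch = target - prediction       # the "deviant" error signal
        self.weights += self.lr * mismatch * x
        return prediction

# One-pass online usage on a streaming series: predict, observe, update.
series = np.sin(np.linspace(0, 8 * np.pi, 200))
model = MismatchPredictor(window=3, learning_rate=0.05)
abs_errors = []
for t in range(model.window, len(series)):
    pred = model.step(series[:t], series[t])
    abs_errors.append(abs(series[t] - pred))
print(f"Mean absolute one-step error: {np.mean(abs_errors):.4f}")
```

Because learning happens one sample at a time, such a predictor can run over streaming data without a separate training phase, which is the sense in which the abstract calls AMI an online learning technique.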

References

  • Ahmad, S., Lavin, A., Purdy, S., & Agha, Z. (2017). Unsupervised real-time anomaly detection for streaming data. Neurocomputing, 262, 134-147.
  • Cui, Y., Surpur, C., Ahmad, S., & Hawkins, J. (2016a). A comparative study of HTM and other neural network models for online sequence learning with streaming data. In 2016 International Joint Conference on Neural Networks (IJCNN) (pp. 1530-1538). IEEE.
  • Cui, Y., Ahmad, S., & Hawkins, J. (2016b). Continuous online sequence learning with an unsupervised neural network model. Neural Computation, 28(11), 2474-2504.
  • Cui, Y., Ahmad, S., & Hawkins, J. (2017). The HTM spatial pooler—A neocortical algorithm for online sparse distributed coding. Frontiers in computational neuroscience, 11, 111.
  • Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning. MIT Press.
  • Gers, F. A., Schmidhuber, J., & Cummins, F. (2000). Learning to forget: Continual prediction with LSTM. Neural Computation, 12(10), 2451-2471.
  • Graves, A., Mohamed, A. R., & Hinton, G. (2013). Speech recognition with deep recurrent neural networks. In 2013 IEEE international conference on acoustics, speech and signal processing (pp. 6645-6649). IEEE.
  • Hawkins, J., Ahmad, S., & Dubinsky, D. (2010). Hierarchical temporal memory including HTM cortical learning algorithms. Technical report, Numenta, Inc., Palo Alto. https://web.archive.org/web/20110714213347/http://www.numenta.com/htm-overview/education/HTM_CorticalLearningAlgorithms.pdf
  • Hawkins, J., Ahmad, S., Purdy, S., & Lavin, A. (2016). Biological and machine intelligence (BAMI). Initial online release 0.4.
  • Hinton, G. E. (2007). Learning multiple layers of representation. Trends in cognitive sciences, 11(10), 428-434.
  • Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural computation, 9(8), 1735-1780.
  • Ivakhnenko, A. G. (1968). The group method of data handling; a rival of the method of stochastic approximation. Soviet Automatic Control, 13, 43-55.
  • Lavin, A., & Ahmad, S. (2015). Evaluating Real-Time Anomaly Detection Algorithms--The Numenta Anomaly Benchmark. In 2015 IEEE 14th International Conference on Machine Learning and Applications (ICMLA) (pp. 38-44). IEEE.
  • Liang, N. Y., Huang, G. B., Saratchandran, P., & Sundararajan, N. (2006). A fast and accurate online sequential learning algorithm for feedforward networks. IEEE Transactions on neural networks, 17(6), 1411-1423.
  • Lieder, F., Daunizeau, J., Garrido, M. I., Friston, K. J., & Stephan, K. E. (2013a). Modelling trial-by-trial changes in the mismatch negativity. PLoS Computational Biology, 9(2).
  • Lieder, F., Stephan, K. E., Daunizeau, J., Garrido, M. I., & Friston, K. J. (2013b). A neurocomputational model of the mismatch negativity. PLoS computational biology, 9(11).
  • Moritz, S., Sardá, A., Bartz-Beielstein, T., Zaefferer, M., & Stork, J. (2015). Comparison of different methods for univariate time series imputation in R. arXiv preprint arXiv:1510.03924.
  • Näätänen, R., Gaillard, A. W., & Mäntysalo, S. (1978). Early selective-attention effect on evoked potential reinterpreted. Acta psychologica, 42(4), 313-329.
  • Näätänen, R., Paavilainen, P., Rinne, T., & Alho, K. (2007). The mismatch negativity (MMN) in basic research of central auditory processing: a review. Clinical Neurophysiology, 118(12), 2544-2590.
  • Osegi, E. N., & Anireh, V. I. (2016). Deviant Learning Algorithm: Learning Sparse Mismatch Representations through Time and Space. arXiv preprint arXiv:1609.01459.
  • Osegi, E. N., Anireh, V. I., & Onukwugha, C. G. (2018, June). pCWoT-MOBILE: a collaborative web based platform for real time control in the smart space. iSTEAMS SMART-MIINDs Conference, 13(3), 237-250.
  • Osegi, E. N., Taylor, O. E., Wokoma, B. A., & Idachaba, A. O. (2020, in-press). A smart grid technique for dynamic load prediction in Nigerian power distribution network. International Conference on Sustainable and Innovative Solutions for Current Challenges in Engineering & Technology (ICSISCET-2019), Gwalior, India.
  • Rodriguez-Cobo, L., Ruiz-Lombera, R., Conde, O. M., López-Higuera, J. M., Cobo, A., & Mirapeix, J. (2013). Feasibility study of Hierarchical Temporal Memories applied to welding diagnostics. Sensors and Actuators A: Physical, 204, 58-66.
  • Socher, R., Pennington, J., Huang, E. H., Ng, A. Y., & Manning, C. D. (2011a). Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the conference on empirical methods in natural language processing (pp. 151-161). Association for Computational Linguistics.
  • Socher, R., Huang, E. H., Pennin, J., Manning, C. D., & Ng, A. Y. (2011b). Dynamic pooling and unfolding recursive autoencoders for paraphrase detection. In Advances in neural information processing systems (pp. 801-809).
  • Sollini, J., Chapuis, G. A., Clopath, C., & Chadderton, P. (2018). ON-OFF receptive fields in auditory cortex diverge during development and contribute to directional sweep selectivity. Nature communications, 9(1), 1-12.
  • Struye, J., & Latré, S. (2019, in-press). Hierarchical temporal memory and recurrent neural networks for time series prediction: an empirical validation and reduction to multilayer perceptrons. Neurocomputing. https://doi.org/10.1016/j.neucom.2018.09.098
  • Sutskever, I., & Hinton, G. (2007). Learning multilevel distributed representations for high-dimensional sequences. In Artificial intelligence and statistics (pp. 548-555).
  • Takaura, K., & Fujii, N. (2016). Facilitative effect of repetitive presentation of one stimulus on cortical responses to other stimuli in macaque monkeys – a possible neural mechanism for mismatch negativity. European Journal of Neuroscience, 43(4), 516-528.
  • Wokoma, B. A., & Osegi, E. N. (2019, October). A Resonant Fault Current Limiting Prediction Technique based on Auditory Machine Intelligence. In 2019 2nd International Conference of the IEEE Nigeria Computer Chapter (NigeriaComputConf) (pp. 1-5). IEEE.
  • Wolpert, D. H., & Macready, W. G. (1997). No free lunch theorems for optimization. IEEE transactions on evolutionary computation, 1(1), 67-82.
  • Yöntem, M. K., Adem, K., İlhan, T., & Kılıçarslan, S. (2019). Divorce prediction using correlation based feature selection and artificial neural networks. Nevşehir Hacı Bektaş Veli Üniversitesi SBE Dergisi, 9(1), 259-273.


Details

Primary Language English
Subjects Computer Software
Journal Section PAPERS
Authors

Emmanuel Ndidi Osegi

Vincent Anireh (ORCID: 0000-0002-8864-0226)

Publication Date December 1, 2020
Submission Date March 23, 2020
Acceptance Date May 27, 2020
Published in Issue Year 2020 Volume: 5 Issue: 2

Cite

APA Osegi, E. N., & Anireh, V. (2020). AMI: An Auditory Machine Intelligence Algorithm for Predicting Sensory-Like Data. Computer Science, 5(2), 71-89.

The Creative Commons Attribution 4.0 International License is applied to all research papers published by JCS, and a Digital Object Identifier (DOI) is assigned to each published paper.