Self-Training the Neurochaos Learning Algorithm

Year 2026, Volume: 8 Issue: 1 , 16 - 23 , 28.03.2026
https://doi.org/10.51537/chaos.1857261
https://izlik.org/JA73JD34RZ

Abstract

In many practical applications, acquiring substantial quantities of labelled data is difficult and expensive, while unlabelled data is readily available. Conventional supervised learning methods frequently underperform in scenarios with limited labelled data or imbalanced datasets. This study introduces a hybrid semi-supervised learning (SSL) architecture that integrates Neurochaos Learning (NL) with a threshold-based Self-Training (ST) method to overcome this constraint. The NL architecture transforms input features into chaos-based firing-rate representations that capture nonlinear relationships within the data, while ST progressively enlarges the labelled set with high-confidence pseudo-labelled samples. The model's performance is assessed on ten benchmark datasets with five machine learning classifiers, treating 85% of the training data as unlabelled and using only 15% as labelled data. The proposed Self-Training Neurochaos Learning (NL+ST) architecture consistently attains a superior performance gain relative to standalone ST models, especially on small, nonlinear, and imbalanced datasets such as Wine (162.42%), Iris (121.34%), and Glass Identification (95.46%). The results indicate that combining chaos-based feature extraction with SSL improves generalisation, resilience, and classification accuracy in low-data contexts.
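The threshold-based self-training loop described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it uses scikit-learn's `LogisticRegression` on the raw Iris features (one of the paper's benchmark datasets), whereas the paper first transforms inputs into NL firing-rate features, which are not reproduced here. The function name `self_train`, the confidence threshold of 0.9, and the round limit are illustrative choices; the 15%/85% labelled/unlabelled split follows the paper's setup.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def self_train(clf, X_lab, y_lab, X_pool, threshold=0.9, max_rounds=10):
    """Threshold-based self-training: repeatedly fit on the labelled set,
    then move high-confidence pseudo-labelled samples out of the pool."""
    X_lab, y_lab, X_pool = X_lab.copy(), y_lab.copy(), X_pool.copy()
    for _ in range(max_rounds):
        clf.fit(X_lab, y_lab)
        if len(X_pool) == 0:
            break
        proba = clf.predict_proba(X_pool)
        confident = proba.max(axis=1) >= threshold  # keep only confident pseudo-labels
        if not confident.any():
            break
        pseudo = clf.classes_[proba[confident].argmax(axis=1)]
        X_lab = np.vstack([X_lab, X_pool[confident]])
        y_lab = np.concatenate([y_lab, pseudo])
        X_pool = X_pool[~confident]  # shrink the unlabelled pool
    clf.fit(X_lab, y_lab)
    return clf

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
# 15% labelled / 85% unlabelled split of the training data, as in the paper
X_lab, X_pool, y_lab, _ = train_test_split(
    X_train, y_train, train_size=0.15, stratify=y_train, random_state=0)
clf = self_train(LogisticRegression(max_iter=1000), X_lab, y_lab, X_pool)
accuracy = clf.score(X_test, y_test)
```

In the paper's NL+ST pipeline, `X_lab` and `X_pool` would hold the chaos-based firing-rate features rather than the raw attributes; the self-training loop itself is unchanged.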

References

  • Amini, M.-R., V. Feofanov, L. Pauletto, L. Hadjadj, E. Devijver, et al., 2025 Self-training: A survey. Neurocomputing 616: 128904.
  • Anusree, M. and P. N. Pramod, 2025 Understanding chaotic neural networks: A comprehensive review. Nonlinear Dynamics pp. 1–16.
  • Anusree, M., P. Reshmi, J. Valadi, P. P. Nair, and P. Suravajhala, 2024 Hypothetical protein classification using neurochaos learning architecture. In International Conference on Information and Communication Technology for Competitive Strategies, pp. 337–346, Springer.
  • AS, R. A., N. B. Harikrishnan, and N. Nagaraj, 2023 Analysis of logistic map based neurons in neurochaos learning architectures for data classification. Chaos, Solitons and Fractals 170: 113347.
  • AS, R. A. and N. Nagaraj, 2025 Random heterogeneous neurochaos learning architecture for data classification. Chaos Theory and Applications 7: 10–30.
  • Balakrishnan, H. N., A. Kathpalia, S. Saha, and N. Nagaraj, 2019 Chaosnet: A chaos based artificial neural network architecture for classification. Chaos: An Interdisciplinary Journal of Nonlinear Science 29.
  • Bennett, K. and A. Demiriz, 1998 Semi-supervised support vector machines. Advances in Neural Information Processing Systems 11.
  • Blum, A. and T. Mitchell, 1998 Combining labeled and unlabeled data with co-training. In Proceedings of the Eleventh Annual Conference on Computational Learning Theory, pp. 92–100, ACM.
  • Deng, L. and X. Li, 2013 Machine learning paradigms for speech recognition: An overview. IEEE Transactions on Audio, Speech, and Language Processing 21: 1060–1089.
  • Du, J., E. Grave, B. Gunel, V. Chaudhary, O. Celebi, et al., 2021 Self-training improves pre-training for natural language understanding. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 5408–5418.
  • Duarte, J. M. and L. Berton, 2023 A review of semi-supervised learning for text classification. Artificial Intelligence Review 56: 9401–9469.
  • Feofanov, V., E. Devijver, and M.-R. Amini, 2024 Multi-class probabilistic bounds for majority vote classifiers with partially labeled data. Journal of Machine Learning Research 25: 1–47.
  • Ghiasi, G., B. Zoph, E. D. Cubuk, Q. V. Le, and T.-Y. Lin, 2021 Multi-task self-training for learning general representations. In Proceedings of the IEEE/CVF International Conference on Computer Vision, pp. 8856–8865.
  • Harikrishnan, N. and N. Nagaraj, 2020 Neurochaos inspired hybrid machine learning architecture for classification. In 2020 International Conference on Signal Processing and Communications (SPCOM), pp. 1–5, IEEE.
  • Harikrishnan, N., S. Pranay, and N. Nagaraj, 2022 Classification of SARS-CoV-2 viral genome sequences using neurochaos learning. Medical & Biological Engineering & Computing 60: 2245–2255.
  • Harikrishnan, N. B. and N. Nagaraj, 2021 When noise meets chaos: Stochastic resonance in neurochaos learning. Neural Networks 143: 425–435.
  • Henry, A. and N. Nagaraj, 2025a Augmented regression models using neurochaos learning. Chaos, Solitons and Fractals 201: 117213.
  • Henry, A. and N. Nagaraj, 2025b Neurochaos learning for classification using composition of chaotic maps. Chaos Theory and Applications 7: 107–116.
  • Henry, A., R. Sundaravaradhan, and N. Nagaraj, 2025 Simplified neurochaos learning architectures for data classification. Chaos: An Interdisciplinary Journal of Nonlinear Science 35.
  • Jin, C., T. F. Ng, and H. Ibrahim, 2025 Advancements in semi-supervised deep learning for brain tumor segmentation in MRI: A literature review. AI 6: 153.
  • Kahn, J., A. Lee, and A. Hannun, 2020 Self-training for end-to-end speech recognition. In ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 7084–7088, IEEE.
  • Mallapragada, P. K., R. Jin, A. K. Jain, and Y. Liu, 2008 SemiBoost: Boosting for semi-supervised learning. IEEE Transactions on Pattern Analysis and Machine Intelligence 31: 2000–2014.
  • NB, H., A. Kathpalia, and N. Nagaraj, 2022 Causality preserving chaotic transformation and classification using neurochaos learning. Advances in Neural Information Processing Systems 35: 2046–2058.
  • Pant, K. K., N. Nagaraj, et al., 2025 Advancing forest fires classification using neurochaos learning. arXiv preprint arXiv:2510.26383.
  • Scudder, H., 1965 Adaptive communication receivers. IEEE Transactions on Information Theory 11: 167–174.
  • Sethi, D., N. Nagaraj, et al., 2023 Neurochaos feature transformation for machine learning. Integration 90: 157–162.
  • Silva, N. F. F. D., L. F. Coletta, and E. R. Hruschka, 2016 A survey and comparative study of tweet sentiment analysis via semi-supervised learning. ACM Computing Surveys (CSUR) 49: 1–26.
  • Sneha, K., A. Sudeesh, P. P. Nair, and P. Suravajhala, 2023 Biologically inspired chaosnet architecture for hypothetical protein classification. In 2023 Fifth International Conference on Electrical, Computer and Communication Technologies (ICECCT), pp. 1–6, IEEE.
  • Song, Z., X. Yang, Z. Xu, and I. King, 2022 Graph-based semi-supervised learning: A comprehensive review. IEEE Transactions on Neural Networks and Learning Systems 34: 8174–8194.
  • Tamposis, I. A., K. D. Tsirigos, M. C. Theodoropoulou, P. I. Kontou, and P. G. Bagos, 2019 Semi-supervised learning of hidden markov models for biological sequence analysis. Bioinformatics 35: 2208–2215.
  • Tur, G., D. Hakkani-Tür, and R. E. Schapire, 2005 Combining active and semi-supervised learning for spoken language understanding. Speech Communication 45: 171–186.
  • Van Engelen, J. E. and H. H. Hoos, 2020 A survey on semi-supervised learning. Machine Learning 109: 373–440.
  • Wang, Y., H. Chen, Q. Heng, W. Hou, Y. Fan, et al., 2022 FreeMatch: Self-adaptive thresholding for semi-supervised learning. arXiv preprint arXiv:2205.07246.
  • Zhang, B., Y. Wang, W. Hou, H. Wu, J. Wang, et al., 2021 FlexMatch: Boosting semi-supervised learning with curriculum pseudo labeling. Advances in Neural Information Processing Systems 34: 18408–18419.
  • Zhou, Z.-H. and M. Li, 2005 Tri-training: Exploiting unlabeled data using three classifiers. IEEE Transactions on Knowledge and Data Engineering 17: 1529–1541.
  • Zou, Y., Z. Yu, B. Kumar, and J. Wang, 2018 Unsupervised domain adaptation for semantic segmentation via class-balanced self-training. In Proceedings of the European Conference on Computer Vision (ECCV), pp. 289–305.

Details

Primary Language English
Subjects Applied Mathematics (Other)
Journal Section Research Article
Authors

Anusree M 0009-0000-9385-001X

Akhila Henry 0000-0002-0496-0362

Pramod Nair 0000-0001-6142-5326

Submission Date January 6, 2026
Acceptance Date March 13, 2026
Publication Date March 28, 2026
DOI https://doi.org/10.51537/chaos.1857261
IZ https://izlik.org/JA73JD34RZ
Published in Issue Year 2026 Volume: 8 Issue: 1

Cite

APA M, A., Henry, A., & Nair, P. (2026). Self-Training the Neurochaos Learning Algorithm. Chaos Theory and Applications, 8(1), 16-23. https://doi.org/10.51537/chaos.1857261

Chaos Theory and Applications in Applied Sciences and Engineering: An interdisciplinary journal of nonlinear science

Articles published in CHTA are licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.