Research Article

A SYSTEMATIC AND EFFICIENT INPUT SELECTION METHOD FOR ARTIFICIAL NEURAL NETWORKS USING MIXED-INTEGER NONLINEAR PROGRAMMING

Year 2022, Volume: 10 Issue: 3, 762 - 773, 01.09.2022
https://doi.org/10.36306/konjes.1077177

Abstract

The selection of input variables for empirical models has a vital effect on prediction performance, overfitting, and computational load. Various trial-and-error and sequential methods exist in the literature for input selection in artificial neural networks (ANNs); however, these methods are not considered automatic and systematic. This study proposes a novel and efficient mixed-integer nonlinear programming (MINLP) based approach that handles optimal input selection and ANN training simultaneously for classification problems. The formulation uses binary (0-1) variables to represent the presence of each input variable and trains the traditional continuous network weights at the same time. Three classification case studies on widely used data sets, evaluated with standard statistical measures, demonstrate the advantages of the approach. The first data set concerns the characterization of tumor type in breast cancer, the second concerns predicting the type of a biotechnological product from different features, and the last concerns heart failure prediction. Results show that better test performance can be achieved with optimally selected inputs, with reduced overfitting. The proposed approach delivers a significant advantage during the design and training of ANNs and is also applicable to other empirical models.
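The core idea described above can be sketched as a small superstructure: a binary gate vector multiplies the inputs before the first layer, and a cardinality penalty on the gates discourages unnecessary inputs. The minimal NumPy sketch below is an illustration of that objective for a fixed mask, not the authors' exact MINLP formulation; in the paper the gates are 0-1 decision variables optimized jointly with the continuous weights by an MINLP solver (e.g., DICOPT or BARON). All shapes, names, and the penalty weight `lam` are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def masked_ann_loss(x, y, s, W1, b1, W2, b2, lam=0.1):
    """Cross-entropy loss of a one-hidden-layer binary classifier whose
    inputs are gated by a selection vector s, plus a cardinality penalty.

    s[i] = 1 keeps input i, s[i] = 0 removes it. In the MINLP view, the
    entries of s are 0-1 decision variables optimized jointly with the
    continuous weights W1, b1, W2, b2."""
    h = sigmoid(W1 @ (s * x) + b1)        # hidden layer sees gated inputs
    p = sigmoid(W2 @ h + b2)              # scalar class probability
    ce = -(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))  # cross-entropy
    return float(ce + lam * s.sum())      # penalize the number of inputs

# Illustrative evaluation: full input set vs. a subset with one input dropped.
x = np.array([1.0, 2.0, 3.0])
W1 = 0.1 * np.ones((2, 3)); b1 = np.zeros(2)
W2 = np.array([0.1, -0.2]); b2 = 0.0
loss_full = masked_ann_loss(x, 1.0, np.ones(3), W1, b1, W2, b2)
loss_sub = masked_ann_loss(x, 1.0, np.array([1.0, 0.0, 1.0]), W1, b1, W2, b2)
```

An MINLP solver would search over all 0-1 combinations of `s` and the continuous weights at once, rather than evaluating masks one at a time as sequential selection methods do.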

Supporting Institution

TÜBİTAK

Project Number

118C245

Thanks

This publication has been produced benefiting from the 2232 International Fellowship for Outstanding Researchers Program of TUBITAK (Project No: 118C245). However, the entire responsibility of the publication belongs to the owner of the publication.

References

  • Agarap, Abien Fred M. 2018. “On Breast Cancer Detection: An Application of Machine Learning Algorithms on the Wisconsin Diagnostic Dataset.” In Proceedings of the 2nd International Conference on Machine Learning and Soft Computing, 5–9.
  • Aha, David W, and Richard L Bankert. 1996. “A Comparative Evaluation of Sequential Feature Selection Algorithms.” In Learning from Data, Springer, 199–206.
  • Akdag, Unal, M. Aydin Komur, and A. Feridun Ozguc. 2009. “Estimation of Heat Transfer in Oscillating Annular Flow Using Artifical Neural Networks.” Advances in Engineering Software 40(9): 864–70.
  • Alom, Md Zahangir et al. 2019. “A State-of-the-Art Survey on Deep Learning Theory and Architectures.” Electronics (Switzerland) 8(3): 292.
  • Azadeh, A., S. F. Ghaderi, and S. Sohrabkhani. 2008. “Annual Electricity Consumption Forecasting by Neural Network in High Energy Consuming Industrial Sectors.” Energy Conversion and Management 49(8): 2272–78.
  • Benbrahim, Houssam, Hanaâ Hachimi, and Aouatif Amine. 2019. “Comparative Study of Machine Learning Algorithms Using the Breast Cancer Dataset.” In International Conference on Advanced Intelligent Systems for Sustainable Development, Springer, 83–91.
  • Bredensteiner, Erin J., and Kristin P. Bennett. 1999. “Multicategory Classification by Support Vector Machines.” Computational Optimization and Applications 12(1–3): 53–79.
  • Castellano, Giovanna, and Anna Maria Fanelli. 2000. “Variable Selection Using Neural-Network Models.” Neurocomputing 31(1–4): 1–13.
  • Chicco, Davide, and Giuseppe Jurman. 2020. “Machine Learning Can Predict Survival of Patients with Heart Failure from Serum Creatinine and Ejection Fraction Alone.” BMC medical informatics and decision making 20(1): 1–16.
  • Diaz, G I, A Fokoue-Nkoutche, G Nannicini, and H Samulowitz. 2017. “An Effective Algorithm for Hyperparameter Optimization of Neural Networks.” IBM Journal of Research and Development 61(4/5): 9:1-9:11.
  • Duran, Marco A., and Ignacio E. Grossmann. 1986. “An Outer-Approximation Algorithm for a Class of Mixed-Integer Nonlinear Programs.” Mathematical Programming 36(3): 307–39.
  • Ferri, Francesc J, Pavel Pudil, Mohamad Hatef, and Josef Kittler. 1994. “Comparative Study of Techniques for Large-Scale Feature Selection.” In Machine Intelligence and Pattern Recognition, Elsevier, 403–13.
  • Feurer, Matthias, and Frank Hutter. 2019. “Hyperparameter Optimization.” In Automated Machine Learning, Springer, Cham, 3–33.
  • Hart, William E., Jean Paul Watson, and David L. Woodruff. 2011. “Pyomo: Modeling and Solving Mathematical Programs in Python.” Mathematical Programming Computation 3(3): 219–60.
  • Kocak, Habip, and Turgut Un. 2014. “Forecasting the Gold Returns with Artifical Neural Network and Time Series.” International Business Research 7(11).
  • Kocis, Gary R, and Ignacio E Grossmann. 1989. “Computational Experience with DICOPT Solving MINLP Problems in Process Systems Engineering.” Computers & Chemical Engineering 13(3): 307–15.
  • Kronqvist, Jan, David E Bernal, Andreas Lundell, and Ignacio E Grossmann. 2019. “A Review and Comparison of Solvers for Convex MINLP.” Optimization and Engineering 20(2): 397–455.
  • Lavanya, D, and Dr K Usha Rani. 2011. “Analysis of Feature Selection with Classification: Breast Cancer Datasets.” Indian Journal of Computer Science and Engineering (IJCSE) 2(5): 756–63.
  • Leahy, Paul, Ger Kiely, and Gearóid Corcoran. 2008. “Structural Optimisation and Input Selection of an Artificial Neural Network for River Level Prediction.” Journal of Hydrology 355(1–4): 192–201.
  • Ledesma, Sergio et al. 2008. “Feature Selection Using Artificial Neural Networks.” In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 351–59.
  • Manngård, Mikael, Jan Kronqvist, and Jari M Böling. 2018. “Structural Learning in Artificial Neural Networks Using Sparse Optimization.” Neurocomputing 272: 660–67.
  • Mutlu, Ali Yener, and Ozgun Yucel. 2018. “An Artificial Intelligence Based Approach to Predicting Syngas Composition for Downdraft Biomass Gasification.” Energy 165: 895–901.
  • Poernomo, Alvin, and Dae-Ki Kang. 2018. “Biased Dropout and Crossmap Dropout: Learning towards Effective Dropout Regularization in Convolutional Neural Network.” Neural Networks 104: 60–67. https://www.sciencedirect.com/science/article/pii/S0893608018301096.
  • Rückstieß, Thomas, Christian Osendorfer, and Patrick van der Smagt. 2011. “Sequential Feature Selection for Classification.” In Australasian Joint Conference on Artificial Intelligence, Springer, 132–41.
  • Sahinidis, Nikolaos V. 1996. “BARON: A General Purpose Global Optimization Software Package.” Journal of Global Optimization 8(2): 201–5.
  • Schittkowski, K. 2007. “Experimental Design Tools for Ordinary and Algebraic Differential Equations.” In Industrial and Engineering Chemistry Research, 9137–47.
  • Sildir, Hasan, Erdal Aydin, and Taskin Kavzoglu. 2020. “Design of Feedforward Neural Networks in the Classification of Hyperspectral Imagery Using Superstructural Optimization.” Remote Sensing 12(6). https://www.mdpi.com/2072-4292/12/6/956.
  • Stamoulis, Dimitrios, Ermao Cai, Da-Cheng Juan, and Diana Marculescu. 2018. “Hyperpower: Power-and Memory-Constrained Hyper-Parameter Optimization for Neural Networks.” In 2018 Design, Automation & Test in Europe Conference & Exhibition (DATE), IEEE, 19–24.
  • Verikas, A., and M. Bacauskiene. 2002. “Feature Selection with Neural Networks.” Pattern Recognition Letters 23(11): 1323–35.
  • Van De Wal, Marc, and Bram De Jager. 2001. “A Review of Methods for Input/Output Selection.” Automatica 37(4): 487–510.
  • Yetilmezsoy, Kaan, Bestamin Ozkaya, and Mehmet Cakmakci. 2011. “Artificial Intelligence-Based Prediction Models for Environmental Engineering.” Neural Network World 21(3): 193–218.
  • Zhong, Ping, and Masao Fukushima. 2007. “Regularized Nonsmooth Newton Method for Multi-Class Support Vector Machines.” In Optimization Methods and Software, 225–36.


There are 32 citations in total.

Details

Primary Language English
Subjects Engineering
Journal Section Research Article
Authors

Hasan Şıldır 0000-0003-1016-9865

Erdal Aydın 0000-0002-8498-4830

Project Number 118C245
Publication Date September 1, 2022
Submission Date February 22, 2022
Acceptance Date August 4, 2022
Published in Issue Year 2022 Volume: 10 Issue: 3

Cite

IEEE H. Şıldır and E. Aydın, “A SYSTEMATIC AND EFFICIENT INPUT SELECTION METHOD FOR ARTIFICIAL NEURAL NETWORKS USING MIXED-INTEGER NONLINEAR PROGRAMMING”, KONJES, vol. 10, no. 3, pp. 762–773, 2022, doi: 10.36306/konjes.1077177.