<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.4 20241031//EN"
        "https://jats.nlm.nih.gov/publishing/1.4/JATS-journalpublishing1-4.dtd">
<article article-type="research-article" dtd-version="1.4">
    <front>
        <journal-meta>
            <journal-id>saucis</journal-id>
            <journal-title-group>
                <journal-title>Sakarya University Journal of Computer and Information Sciences</journal-title>
            </journal-title-group>
            <issn pub-type="epub">2636-8129</issn>
            <publisher>
                <publisher-name>Sakarya University</publisher-name>
            </publisher>
        </journal-meta>
        <article-meta>
            <article-id pub-id-type="doi">10.35377/saucis...1661247</article-id>
            <article-categories>
                <subj-group xml:lang="en">
                    <subject>Computer Software</subject>
                </subj-group>
                <subj-group xml:lang="tr">
                    <subject>Bilgisayar Yazılımı</subject>
                </subj-group>
            </article-categories>
                                                                                                                                                        <title-group>
                <article-title>An Environmentally Sustainable Approach to Machine Learning Training and Development</article-title>
                                                                                                    </title-group>
            
            <contrib-group content-type="authors">
                <contrib contrib-type="author">
                    <contrib-id contrib-id-type="orcid">https://orcid.org/0000-0002-2632-4674</contrib-id>
                    <name>
                        <surname>Jegadeeswari</surname>
                        <given-names>K</given-names>
                    </name>
                </contrib>
                <contrib contrib-type="author">
                    <contrib-id contrib-id-type="orcid">https://orcid.org/0000-0002-3970-262X</contrib-id>
                    <name>
                        <surname>Rathipriya</surname>
                        <given-names>R</given-names>
                    </name>
                </contrib>
            </contrib-group>
                        
            <pub-date pub-type="pub" iso-8601-date="2025-09-30">
                <day>30</day>
                <month>09</month>
                <year>2025</year>
            </pub-date>
                                        <volume>8</volume>
                                        <issue>3</issue>
                                        <fpage>457</fpage>
                                        <lpage>469</lpage>
                        
            <history>
                <date date-type="received" iso-8601-date="2025-03-24">
                    <day>24</day>
                    <month>03</month>
                    <year>2025</year>
                </date>
                <date date-type="accepted" iso-8601-date="2025-06-23">
                    <day>23</day>
                    <month>06</month>
                    <year>2025</year>
                </date>
            </history>
                                        <permissions>
                    <copyright-statement>Copyright © 2018, Sakarya University Journal of Computer and Information Sciences</copyright-statement>
                    <copyright-year>2018</copyright-year>
                    <copyright-holder>Sakarya University Journal of Computer and Information Sciences</copyright-holder>
                </permissions>
            
            <abstract><p>Artificial intelligence has the potential to drive sustainability by minimizing the environmental impact of machine learning (ML) development. However, many ML techniques, particularly ensemble methods such as the Random Forest classifier, demand substantial computational resources during hyperparameter tuning. Key hyperparameters include the number of trees, the maximum depth of each tree, and the number of features considered at each split; all of these considerably affect both model performance and energy consumption. This paper proposes an eco-friendly multi-objective framework (EFMOF) that optimizes these hyperparameters for minimal environmental impact while retaining high model accuracy. By leveraging advanced hyperparameter optimization techniques such as Optuna, Hyperopt, and Grid Search, the framework explores the hyperparameter space effectively, with a focus on energy efficiency and carbon reduction. Incorporating sustainable AI into ML development therefore requires monitoring energy consumption and carbon emissions at every hyperparameter tuning step, ensuring that the resulting models perform well without excessive environmental cost. The experimental results show that the number of estimators is the most dominant hyperparameter, driving the highest energy consumption; the minimum samples per leaf and per split have a moderate effect, while the maximum depth has a minor impact.</p></abstract>
                                                            
            
            <kwd-group>
                <kwd>Multi-objective</kwd>
                <kwd>Ensemble Classification</kwd>
                <kwd>Hyperparameter Optimization</kwd>
                <kwd>Eco-friendly</kwd>
                <kwd>Sustainable ML</kwd>
            </kwd-group>
                            
        </article-meta>
    </front>
    <back>
                            <ref-list>
                                    <ref id="ref1">
                        <label>1</label>
                        <mixed-citation publication-type="journal">Sharma, P., &amp; Puri, S. “Random Forest-Based Prediction of Breast Cancer Survival: Cross-Validation and Hyperparameter Tuning,” In International Conference on Advances in Computing and Data Sciences, 138-145. 2020. DOI: 10.1007/978-981-15-0277-0_13.</mixed-citation>
                    </ref>
                                    <ref id="ref2">
                        <label>2</label>
                        <mixed-citation publication-type="journal">Alghamdi, F., Alsuhaibani, R., &amp; Albattah, K. “Breast Cancer Diagnosis and Prediction Using Machine Learning and Data Mining Techniques: A Review,” IEEE Access, 9, 18152-18164. 2021, DOI: 10.1109/ACCESS.2021.3052953.</mixed-citation>
                    </ref>
                                    <ref id="ref3">
                        <label>3</label>
                        <mixed-citation publication-type="journal">Feurer, M., &amp; Hutter, F. “Hyperparameter Optimization in Machine Learning: A Comprehensive Survey,” Journal of Machine Learning Research, 20(1), 1-45, 2019. Available at: https://www.jmlr.org/papers/v20/18-444.html.</mixed-citation>
                    </ref>
                                    <ref id="ref4">
                        <label>4</label>
                        <mixed-citation publication-type="journal">Gamage, G., Samarakoon, S., &amp; Nguyen, N. T. “Energy-Efficient Machine Learning Models for Healthcare Applications.” IEEE Access, 9, 150357-150373, 2021. DOI: 10.1109/ACCESS.2021.3124182.</mixed-citation>
                    </ref>
                                    <ref id="ref5">
                        <label>5</label>
                        <mixed-citation publication-type="journal">Rolnick, D., Donti, P. L., Kaack, L. H., Kochanski, K., Lacoste, A., Sankaran, K., ... &amp; Bengio, Y. “Sustainable AI: Environmental Implications, Challenges, and Opportunities.” In Proceedings of the 2022 Conference on Fairness, Accountability, and Transparency, 145-156, 2022, DOI: 10.1145/3442188.3445934.</mixed-citation>
                    </ref>
                                    <ref id="ref6">
                        <label>6</label>
                        <mixed-citation publication-type="journal">Zheng, B., Yoon, S. W., &amp; Lam, S. S. “Breast cancer diagnosis based on feature extraction using a hybrid of K-means and support vector machine algorithms,” Expert Systems with Applications, 41(4), 1476-1482, 2021. https://doi.org/10.1016/j.eswa.2021.08.027</mixed-citation>
                    </ref>
                                    <ref id="ref7">
                        <label>7</label>
                        <mixed-citation publication-type="journal">Delen, D., Walker, G., &amp; Kadam, A. “Predicting breast cancer survivability: A comparison of three data mining methods,” Artificial Intelligence in Medicine, 34(2), 113-127, 2020. https://doi.org/10.1016/j.artmed.2020.08.003</mixed-citation>
                    </ref>
                                    <ref id="ref8">
                        <label>8</label>
                        <mixed-citation publication-type="journal">Zizaan, Asma, and Ali Idri. “Evaluating and Comparing Bagging and Boosting of Hybrid Learning for Breast Cancer Screening.” Scientific African, vol. 23, Mar. 2024, doi:10.1016/j.sciaf.2023.e01989.</mixed-citation>
                    </ref>
                                    <ref id="ref9">
                        <label>9</label>
                        <mixed-citation publication-type="journal">Jegadeeswari, K., and R. Rathipriya. “Optimized Stacking Ensemble Classifier for Early Cancer Detection Using Biomarker Data,” Advance Sustainable Science Engineering and Technology, 6(4), 02404017, 2024.</mixed-citation>
                    </ref>
                                    <ref id="ref10">
                        <label>10</label>
                        <mixed-citation publication-type="journal">Akiba, T., Sano, S., Yanase, T., Ohta, T., &amp; Koyama, M. “Optuna: A next-generation hyperparameter optimization framework,” Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery &amp; Data Mining, 2623-2631, 2019. https://doi.org/10.1145/3292500.3330701</mixed-citation>
                    </ref>
                                    <ref id="ref11">
                        <label>11</label>
                        <mixed-citation publication-type="journal">Bergstra, J., Yamins, D., &amp; Cox, D. D. “Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures,” Proceedings of the 30th International Conference on Machine Learning, 28, 115-123, 2013.</mixed-citation>
                    </ref>
                                    <ref id="ref12">
                        <label>12</label>
                        <mixed-citation publication-type="journal">Henderson, P., Hu, J., Romoff, J., Brunskill, E., Jurafsky, D., &amp; Pineau, J. “Towards the systematic reporting of the energy and carbon footprints of machine learning,” Journal of Machine Learning Research, 21(1), 1-43, 2020.</mixed-citation>
                    </ref>
                                    <ref id="ref13">
                        <label>13</label>
                        <mixed-citation publication-type="journal">Liu, Z., Cheng, S., Zhou, H., &amp; You, Y. “Hanayo: Harnessing Wave-like Pipeline Parallelism for Enhanced Large Model Training Efficiency”, 2023. https://doi.org/10.1145/3581784.3607073</mixed-citation>
                    </ref>
                                    <ref id="ref14">
                        <label>14</label>
                        <mixed-citation publication-type="journal">Strubell, E., Ganesh, A., &amp; McCallum, A. “Energy and policy considerations for deep learning in NLP,” Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 3645-3650, 2019. https://doi.org/10.18653/v1/P19-1355</mixed-citation>
                    </ref>
                                    <ref id="ref15">
                        <label>15</label>
                        <mixed-citation publication-type="journal">Lottick, K., Sakaguchi, K., Schwartz, R., &amp; Smith, N. A. “Energy and policy considerations for deep learning in NLP,” Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 364-367, 2020. https://doi.org/10.18653/v1/2020.acl-main.34</mixed-citation>
                    </ref>
                                    <ref id="ref16">
                        <label>16</label>
                        <mixed-citation publication-type="journal">Schwartz, R., Dodge, J., Smith, N. A., &amp; Etzioni, O. “Green AI,” Communications of the ACM, 63(12), 54-63, 2020. https://doi.org/10.1145/3381831</mixed-citation>
                    </ref>
                                    <ref id="ref17">
                        <label>17</label>
                        <mixed-citation publication-type="journal">Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguia, L. M., Rothchild, D., Socher, R., &amp; Dean, J. “Carbon Emissions and large neural network training”. arXiv preprint arXiv:2104.10350, 2021.</mixed-citation>
                    </ref>
                                    <ref id="ref18">
                        <label>18</label>
                        <mixed-citation publication-type="journal">Lo, F., Bitz, C. M., and Hess, J. J. “Development of a Random Forest Model for Forecasting Allergenic Pollen in North America”. Sci. Total Environ. 773, 145590, 2021. doi: 10.1016/j.scitotenv.2021.145590</mixed-citation>
                    </ref>
                                    <ref id="ref19">
                        <label>19</label>
                        <mixed-citation publication-type="journal">Akiba, T., Sano, S., Yanase, T., Ohta, T., &amp; Koyama, M. “Optuna: A next-generation hyperparameter optimization framework.” Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery &amp; Data Mining, 2623–2631, 2019.</mixed-citation>
                    </ref>
                                    <ref id="ref20">
                        <label>20</label>
                        <mixed-citation publication-type="journal">Li, L., Jamieson, K., DeSalvo, G., Rostamizadeh, A., &amp; Talwalkar, A. “Hyperband: Bandit-based configuration evaluation for hyperparameter optimization.” International Conference on Learning Representations, 2017.</mixed-citation>
                    </ref>
                                    <ref id="ref21">
                        <label>21</label>
                        <mixed-citation publication-type="journal">Saranya, G., &amp; Pravin, A. “GridSearch based optimum feature selection by tuning hyperparameters for heart disease diagnosis in machine learning.” The Open Biomedical Engineering Journal, 17(1), 2023a. https://doi.org/10.2174/18741207-v17-e230510-2022-ht28-4371-8</mixed-citation>
                    </ref>
                                    <ref id="ref22">
                        <label>22</label>
                        <mixed-citation publication-type="journal">Zhu, N.; Zhu, C.; Zhou, L.; Zhu, Y.; Zhang, X. “Optimization of the Random Forest Hyperparameters for Power Industrial Control Systems Intrusion Detection Using an Improved GridSearch Algorithm.” Appl. Sci. 2022, 12, 10456. https://doi.org/10.3390/app122010456</mixed-citation>
                    </ref>
                                    <ref id="ref23">
                        <label>23</label>
                        <mixed-citation publication-type="journal">K. Jegadeeswari, R. Rathipriya, “Green AI Practices in Multi-objective Hyperparameter Optimization for Sustainable Machine Learning,” International Journal of Information Technology and Computer Science (IJITCS), 17(2), 1-9, 2025. DOI: 10.5815/ijitcs.2025.02.01.</mixed-citation>
                    </ref>
                                    <ref id="ref24">
                        <label>24</label>
                        <mixed-citation publication-type="journal">Jegadeeswari, K., &amp; Rathipriya, R. “Minimizing the carbon footprint of machine learning techniques through sustainable AI training methods.” In Sustainable information security in the age of AI and green computing. IGI Global, 2025.</mixed-citation>
                    </ref>
                                    <ref id="ref25">
                        <label>25</label>
                        <mixed-citation publication-type="journal">Dodge, J., Prewitt, T., Combes, R T D., Odmark, E., Schwartz, R., Strubell, E., Luccioni, A S., Smith, N A., DeCario, N., &amp; Buchanan, W. “Measuring the Carbon Intensity of AI in Cloud Instances”, 2022.</mixed-citation>
                    </ref>
                                    <ref id="ref26">
                        <label>26</label>
                        <mixed-citation publication-type="journal">K. Jegadeeswari, R. Rathipriya and J. Renugadevi, “Fusion Learning of Regression Models for Missing Data Imputation in Breast Cancer Dataset,” 2023 International Conference on Artificial Intelligence for Innovations in Healthcare Industries (ICAIIHI), Raipur, India, 2023, pp. 1-14. DOI: 10.1109/ICAIIHI57871.2023.10489656.</mixed-citation>
                    </ref>
                                    <ref id="ref27">
                        <label>27</label>
                        <mixed-citation publication-type="journal">K. Jegadeeswari, R. Ragunath, R. Rathipriya, “A Prediction Model with Multi-Pattern Missing Data Imputation for Medical Dataset,” Advanced Network Technologies and Intelligent Computing (ANTIC 2022), CCIS 1798, Springer Nature, Singapore, 2023. https://doi.org/10.1007/978-3-031-28183-9_38.</mixed-citation>
                    </ref>
                            </ref-list>
                    </back>
    </article>
