Research Article

Predicting CPU Performance Score with Regression Analysis

Year 2025, Volume: 16, Issue: 1, pp. 1–11
https://doi.org/10.24012/dumf.1493049

Abstract

The purpose of this research is to use regression analysis to predict a CPU's performance score from its features. CPU performance is a key criterion when choosing a computer, alongside system configuration and design. Support Vector Regression (SVR), Random Forest Regression (RFR), Multiple Linear Regression (MLR), Gradient Boosting Regression (GBR), and Neural Network Regression (NNR) are used to estimate the CPU's performance score. To test the algorithms, the data set was randomly split into 70 percent training data and 30 percent test data. NNR achieved the highest coefficient of determination (R²) at 0.976, followed by GBR at 0.958; the MLR, RFR, and SVR algorithms achieved R² scores of 0.952, 0.934, and 0.865, respectively.
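The evaluation protocol described above (a random 70/30 train/test split, the same five regressor families, and R² comparison) can be sketched with scikit-learn. This is a minimal illustration only: the paper's actual CPU dataset, feature set, and hyperparameters are not reproduced here, so the features, target function, and model settings below are assumptions standing in for them.

```python
# Sketch of the paper's protocol: five regressors, random 70/30 split, R^2.
# The real CPU spec/benchmark data is not available here, so synthetic
# stand-in features are used (cores, base clock GHz, cache MB, TDP W);
# all hyperparameters are illustrative assumptions, not the paper's.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform([2, 1.0, 2, 15], [64, 5.0, 64, 280], size=(500, 4))
# Synthetic "performance score": nonlinear in the features, plus noise.
y = 900 * X[:, 0] * X[:, 1] + 40 * X[:, 2] + rng.normal(0, 500, 500)

# Random 70/30 split, as in the paper.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

models = {
    "MLR": LinearRegression(),
    "SVR": make_pipeline(StandardScaler(), SVR(C=100.0)),
    "RFR": RandomForestRegressor(n_estimators=200, random_state=42),
    "GBR": GradientBoostingRegressor(random_state=42),
    "NNR": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(64, 64),
                                      max_iter=2000, random_state=42)),
}

# Fit each model on the training set and score it on the held-out test set.
scores = {name: r2_score(y_test, m.fit(X_train, y_train).predict(X_test))
          for name, m in models.items()}
for name, r2 in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: R^2 = {r2:.3f}")
```

On the synthetic data the ranking will not match the paper's (that depends on the real dataset); the point is only the shape of the comparison: one split, one metric, five fitted models.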



Details

Primary Language: English
Subjects: Machine Learning (Other), Data Management and Data Science (Other)
Section: Articles
Authors

Güney Kaya 0009-0007-7687-7350

Emre Şen 0009-0006-8640-8374

Osman Altay 0000-0003-3989-2432

Early View Date: March 26, 2025
Publication Date:
Submission Date: May 31, 2024
Acceptance Date: January 10, 2025
Published Issue: Year 2025, Volume: 16, Issue: 1

Cite

IEEE G. Kaya, E. Şen, and O. Altay, "Predicting CPU Performance Score with Regression Analysis", DÜMF MD, vol. 16, no. 1, pp. 1–11, 2025, doi: 10.24012/dumf.1493049.
All articles published by DUJE are licensed under the Creative Commons Attribution 4.0 International License. This permits anyone to copy, redistribute, remix, transmit, and adapt the work, provided the original work and source are appropriately credited.