Research Article

A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD)

Year 2022, Volume: 10, Issue: 2, pp. 49–56, 30.06.2022
https://doi.org/10.18100/ijamec.1080843

Abstract

The main motivation of this study is to prevent and optimize deviations in linear connections through calculations that take the previous and next steps into account. This serves the more stable detection, and therefore segmentation, of object edge/corner regions in the image-processing and artificial-intelligence quality control systems developed by the authors at Alpplas Industrial Investments Inc. The dataset used here was originally obtained from the edge approximations of plastic panels manufactured by Alpplas Inc., extracted from images taken by the AlpVision Quality Control Machine patented alongside this research. The data consist entirely of the pixel values of the edge points. Dispersed numeric data sets have highly variable values, create high complexity, and require demanding correlation computations. In this study, dispersed numeric data are optimized by fitting them to linearity. The LFLD (Linear Fitting on Locally Deflection) algorithm was developed to solve this linear fitting problem: dispersed numeric data can be regularized and rendered linear (curved-line smoothing, or line fitting) within desired tolerance values. The LFLD algorithm organizes complex data into a regular linear line (a fit) according to the desired tolerance values.
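The abstract describes fitting scattered edge points to a line within a user-chosen tolerance. The details of the authors' LFLD algorithm are in the full paper; the following is only a minimal illustrative sketch of the general idea, assuming an ordinary least-squares fit and a simple "snap points that deviate beyond the tolerance onto the fitted line" rule. All function and parameter names here are illustrative, not taken from the paper.

```python
def fit_line(points):
    """Ordinary least-squares slope and intercept for a list of (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def linearize(points, tolerance):
    """Keep points within `tolerance` of the fitted line; snap the rest onto it."""
    slope, intercept = fit_line(points)
    out = []
    for x, y in points:
        expected = slope * x + intercept
        # A point deviating more than the tolerance is replaced by its
        # value on the fitted line; points within tolerance are kept as-is.
        out.append((x, y if abs(y - expected) <= tolerance else expected))
    return out
```

With two symmetric outliers at x = 2, the fit stays y = x and both outliers are snapped back onto the line, while the in-tolerance points pass through unchanged.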

References

  • [1] E. Alpaydın, “Adaptive Computation and Machine Learning”. Introduction to Machine Learning, MIT Press, Cambridge, MA, 3rd ed., 2014, pp. 79.
  • [2] Y. W. Chang et al., “Training and testing low-degree polynomial data mappings via linear svm”, J. Mach. Learn. Res., 11, pp. 1471–1490, 2010.
  • [3] W. S. Cleveland, “Robust locally weighted regression and smoothing scatterplots”, Journal of the American Statistical Association, 74(368), pp. 829–836, 1979.
  • [4] W. S. Cleveland and S. J. Devlin, “Locally weighted regression: An approach to regression analysis by local fitting”, Journal of the American Statistical Association, 83(403), pp. 596–610, 1988.
  • [5] D. Freedman, “Statistical Models: Theory and Practice”, Cambridge University Press, August 2005.
  • [6] H. L. Seal, “Studies in the history of probability and statistics. XV: The historical development of the Gauss linear model”, Biometrika, 54(1/2), pp. 1–24, 1967.
  • [7] R. Caruana and V. R. de Sa, “Benefitting from the Variables that Variable Selection Discards”, J. Mach. Learn. Res., 3, pp. 1245–1264, 2003.
  • [8] V. N. Vapnik, “Conditions for Consistency of Empirical Risk Minimization Principle”. Statistical Learning Theory, Wiley Interscience Publication, 1998, pp. 82.
  • [9] C. Chatfield, “Non-linear and non-stationary time series analysis”. M. B. Priestley, Academic Press, London, 1989, pp. 237.
  • [10] H. J. Seltman, “Simple Linear Regression”. Experimental Design and Analysis, Carnegie Mellon University, 2013, pp. 227.
  • [11] Y. Dodge, “Simple Linear Regression”. The Concise Encyclopedia of Statistics, Springer New York, 2008, pp. 491–497.
  • [12] D. M. Lane, “Introduction to Linear Regression”. Introduction to Statistics, Rice University, 2008, pp. 462.
  • [13] K. H. Zou, K. Tuncali, and S. G. Silverman, “Correlation and simple linear regression”, Radiology, 227(3), pp. 617–622, 2003.
  • [14] N. Altman and M. Krzywinski, “Simple linear regression”, Nat Methods, 12(11), pp. 999–1000, 2015.
  • [15] I. Sereda et al., “Segmentation by Neural Networks: Errors and Correction”, Computing Research Repository (CoRR), 2018.
  • [16] W. S. Cleveland and S. J. Devlin, “Locally Weighted Regression: An Approach to Regression Analysis by Local Fitting”, Journal of the American Statistical Association, 83(403), pp. 596–610, 1988.
  • [17] J. Canny, “A Computational Approach to Edge Detection”, IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-8(6), pp. 679–698, 1986.
  • [18] Kaggle, Linear Regression: Signal and Noise, USA [Online]. Available: https://www.kaggle.com/competitions/inclass-signal-and-noise/data, Accessed on: May 23, 2022.
There are 18 references in total.

Details

Primary Language: English
Subjects: Engineering
Section: Research Article
Authors

Mahmut Sami Yasak 0000-0003-4444-161X

Muhammed Said Bilgehan 0000-0002-8706-1943

Publication Date: June 30, 2022
Published Issue: Year 2022, Volume: 10, Issue: 2

Cite

APA Yasak, M. S., & Bilgehan, M. S. (2022). A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD). International Journal of Applied Mathematics Electronics and Computers, 10(2), 49-56. https://doi.org/10.18100/ijamec.1080843
AMA Yasak MS, Bilgehan MS. A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD). International Journal of Applied Mathematics Electronics and Computers. June 2022;10(2):49-56. doi:10.18100/ijamec.1080843
Chicago Yasak, Mahmut Sami, and Muhammed Said Bilgehan. “A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD)”. International Journal of Applied Mathematics Electronics and Computers 10, no. 2 (June 2022): 49-56. https://doi.org/10.18100/ijamec.1080843.
EndNote Yasak MS, Bilgehan MS (01 June 2022) A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD). International Journal of Applied Mathematics Electronics and Computers 10 2 49–56.
IEEE M. S. Yasak and M. S. Bilgehan, “A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD)”, International Journal of Applied Mathematics Electronics and Computers, vol. 10, no. 2, pp. 49–56, 2022, doi: 10.18100/ijamec.1080843.
ISNAD Yasak, Mahmut Sami - Bilgehan, Muhammed Said. “A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD)”. International Journal of Applied Mathematics Electronics and Computers 10/2 (June 2022), 49-56. https://doi.org/10.18100/ijamec.1080843.
JAMA Yasak MS, Bilgehan MS. A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD). International Journal of Applied Mathematics Electronics and Computers. 2022;10:49–56.
MLA Yasak, Mahmut Sami and Muhammed Said Bilgehan. “A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD)”. International Journal of Applied Mathematics Electronics and Computers, vol. 10, no. 2, 2022, pp. 49-56, doi:10.18100/ijamec.1080843.
Vancouver Yasak MS, Bilgehan MS. A Line Fitting Algorithm: Linear Fitting on Locally Deflection (LFLD). International Journal of Applied Mathematics Electronics and Computers. 2022;10(2):49-56.