Research Article

Application and Comparative Analysis of Manifold Learning Techniques

Year 2025, Volume: 25 Issue: 6, 1348 - 1358

Abstract

In this paper, the manifold learning techniques Isomap, Locally Linear Embedding (LLE), Hessian LLE (HLLE), t-distributed Stochastic Neighbor Embedding (t-SNE), and Uniform Manifold Approximation and Projection (UMAP) are applied to data sets on manifolds with convex and non-convex parameter spaces. In addition, an analysis of the performance of these techniques in terms of numerical quantities and visualizations of the embedding results is presented. The manifolds with convex and non-convex parameter spaces used in this study are the so-called Swiss Roll and the Swiss Roll with a rectangular hole.
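The setup described above can be reproduced in outline with the scikit-learn and umap-learn libraries cited in the references (Pedregosa et al., 2011; McInnes et al., 2018). The following minimal sketch, under assumed settings, samples the Swiss Roll, cuts a rectangular hole in its parameter domain, embeds both data sets with the five techniques, and scores each embedding with the rank-based trustworthiness criterion (Venna and Kaski, 2006). The hole boundaries, neighborhood sizes, and other parameters are illustrative choices rather than the values used in the paper, and trustworthiness is only one of the quality measures discussed in the cited literature.

```python
# Minimal sketch (assumed parameters): apply Isomap, LLE, HLLE, t-SNE and UMAP
# to the Swiss Roll and to a Swiss Roll with a rectangular hole, and report the
# trustworthiness of each 2-D embedding. Requires scikit-learn and umap-learn.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap, LocallyLinearEmbedding, TSNE, trustworthiness
import umap  # umap-learn package (McInnes et al., 2018)

# Swiss Roll: the underlying parameter space is a (convex) rectangle.
X, t = make_swiss_roll(n_samples=2000, noise=0.05, random_state=0)

# Swiss Roll with a rectangular hole: the parameter space becomes non-convex.
# The hole is cut in the (t, height) parameter domain; its bounds are arbitrary here.
height = X[:, 1]
in_hole = (t > 8) & (t < 11) & (height > 7) & (height < 14)
X_hole = X[~in_hole]

embedders = {
    "Isomap": Isomap(n_neighbors=10, n_components=2),
    "LLE":    LocallyLinearEmbedding(n_neighbors=10, n_components=2, method="standard"),
    "HLLE":   LocallyLinearEmbedding(n_neighbors=10, n_components=2, method="hessian",
                                     eigen_solver="dense"),
    "t-SNE":  TSNE(n_components=2, perplexity=30, init="pca", random_state=0),
    "UMAP":   umap.UMAP(n_neighbors=15, min_dist=0.1, random_state=0),
}

for label, data in [("Swiss Roll", X), ("Swiss Roll with hole", X_hole)]:
    for name, model in embedders.items():
        Y = model.fit_transform(data)
        # Trustworthiness in [0, 1]: how faithfully small neighborhoods of the
        # high-dimensional data are preserved in the embedding.
        score = trustworthiness(data, Y, n_neighbors=10)
        print(f"{label:22s} {name:6s} trustworthiness = {score:.3f}")
```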

References

  • Amir, E. A. D., Davis, K. L., Tadmor, M. D., Simonds, E. F., Levine, J. H., Bendall, S. C., Shenfeld, D. K., Krishnaswamy, S., Nolan, G. P. and Pe'er, D., 2013. viSNE enables visualization of high dimensional single-cell data and reveals phenotypic heterogeneity of leukemia. Nature biotechnology, 31(6), 545-552.
  • Becht, E., McInnes, L., Healy, J., Dutertre, C. A., Kwok, I. W., Ng, L. G., Ginhoux, F. and Newell, E. W., 2019. Dimensionality reduction for visualizing single-cell data using UMAP. Nature biotechnology, 37(1), 38-44.
  • Belkin, M. and Niyogi, P., 2003. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6), 1373-1396.
  • Bode, S., He, A. H., Soon, C. S., Trampel, R., Turner, R. and Haynes, J. D., 2011. Tracking the unconscious generation of free decisions using ultra-high field fMRI. PloS one, 6(6), e21612.
  • Cayton, L., 2005. Algorithms for manifold learning. Technical Report CS2008-0923, University of California, San Diego.
  • Charte, D., Charte, F., and Herrera, F., 2021. Reducing data complexity using autoencoders with class-informed loss functions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(12), 9549-9560.
  • De Ridder, D. and Duin, R. P., 2002. Locally linear embedding for classification. Pattern Recognition Group, Dept. of Imaging Science and Technology, Delft University of Technology, Delft, The Netherlands, Tech. Rep. PH-2002-01, 1-12.
  • De Silva, V. and Tenenbaum, J. B., 2004. Sparse multidimensional scaling using landmark points (Vol. 120). Technical report, Stanford University.
  • Diaz-Papkovich, A., Anderson-Trocmé, L., Ben-Eghan, C., and Gravel, S., 2019. UMAP reveals cryptic population structure and phenotype heterogeneity in large genomic cohorts. PLoS genetics, 15(11), e1008432.
  • Dijkstra, E. W., 2022. A note on two problems in connexion with graphs. In Edsger Wybe Dijkstra: his life, work, and legacy (pp. 287-290).
  • Donoho, D. L. and Grimes, C., 2003. Hessian eigenmaps: Locally linear embedding techniques for high dimensional data. Proceedings of the National Academy of Sciences (PNAS), 100, 5591-5596.
  • Floyd, R. W., 1962. Algorithm 97: Shortest path. Communications of the ACM, 5(6), 345.
  • Jolliffe, I. T. and Cadima, J., 2016. Principal component analysis: a review and recent developments. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2065), 20150202. https://doi.org/10.1098/rsta.2015.0202
  • Kaski, S., Nikkilä, J., Oja, M., Venna, J., Törönen, P. and Castrén, E., 2003. Trustworthiness and metrics in visualizing similarity of gene expression. BMC bioinformatics, 4, 1-13.
  • Kobak, D. and Berens, P., 2019. The art of using t-SNE for single-cell transcriptomics. Nature Communications 10, 5416. https://doi.org/10.1038/s41467-019-13056-x.
  • Kobak, D. and Linderman, G. C., 2021. Initialization is critical for preserving global data structure in both t-SNE and UMAP. Nature biotechnology, 39(2), 156-157.
  • Kraemer, D., Reichstein, M., and Mahecha, M. D., 2018. dimRed and coRanking—Unifying dimensionality reduction in R. The R Journal, 10(1), 342–358. https://doi.org/10.32614/RJ-2018-038.
  • Kullback, S. and Leibler, R. A., 1951. On information and sufficiency. The annals of mathematical statistics, 22(1), 79-86.
  • Lee, J., 2010. Introduction to topological manifolds (Vol. 202). Springer Science & Business Media.
  • Lee, J. A. and Verleysen, M., 2007. Nonlinear dimensionality reduction, Springer.
  • Lee, J. A. and Verleysen, M., 2009. Quality assessment of dimensionality reduction: Rank-based criteria. Neurocomputing, 72(7-9), 1431-1443.
  • Lueks, W., Mokbel, B., Biehl, M. and Hammer, B., 2011. How to Evaluate Dimensionality Reduction? Improving the Co-ranking Matrix. arXiv:1110.3917.
  • McInnes, L., Healy, J. and Melville, J., 2018. UMAP: Uniform manifold approximation and projection for dimension reduction. arXiv preprint arXiv:1802.03426.
  • Mikolov, T., Chen, K., Corrado, G. and Dean, J., 2013. Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781.
  • Moon, K. R., Van Dijk, D., Wang, Z., Gigante, S., Burkhardt, D. B., Chen, W. S., Yim, K., van den Elzen, A., Hirn, M. J., Coifman, R. R., Ivanova, N. B., Wolf, G., and Krishnaswamy, S., 2019. Visualizing structure and transitions in high-dimensional biological data. Nature biotechnology, 37(12), 1482-1492.
  • Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M. and Duchesnay, É., 2011. Scikit-learn: Machine learning in Python. Journal of Machine Learning Research, 12, 2825-2830.
  • Roweis, S. T. and Saul, L. K., 2000. Nonlinear dimensionality reduction by locally linear embedding. Science, 290, 2323-2326.
  • Tenenbaum, J. B., De Silva, V. and Langford, J. C., 2000. A global geometric framework for nonlinear dimensionality reduction. Science, 290, 2319-2323.
  • Van Der Maaten, L. and Hinton, G., 2008. Visualizing data using t-SNE. Journal of Machine Learning Research, 9, 2579-2605.
  • Venna, J. and Kaski, S., 2006. Local multidimensional scaling. Neural Networks, 19(6-7), 889-899.
  • Venna, J., Peltonen, J., Nybo, K., Aidos, H. and Kaski, S., 2010. Information retrieval perspective to nonlinear dimensionality reduction for data visualization. Journal of Machine Learning Research, 11(2).
  • Zhang, Z., 2003. Learning metrics via discriminant kernels and multidimensional scaling: Toward expected Euclidean representation. In ICML, 2, 872-879.



Details

Primary Language Turkish
Subjects Statistical Data Science, Pure Mathematics (Other)
Journal Section Articles
Authors

Hatice Çoban Eroğlu 0000-0001-7418-7785

Early Pub Date November 13, 2025
Publication Date November 14, 2025
Submission Date December 2, 2024
Acceptance Date July 18, 2025
Published in Issue Year 2025 Volume: 25 Issue: 6

Cite

APA Çoban Eroğlu, H. (2025). Manifold Öğrenme Tekniklerinin Uygulaması ve Karşılaştırmalı Analizi. Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilimleri Dergisi, 25(6), 1348-1358.
AMA Çoban Eroğlu H. Manifold Öğrenme Tekniklerinin Uygulaması ve Karşılaştırmalı Analizi. Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilimleri Dergisi. November 2025;25(6):1348-1358.
Chicago Çoban Eroğlu, Hatice. “Manifold Öğrenme Tekniklerinin Uygulaması Ve Karşılaştırmalı Analizi”. Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilimleri Dergisi 25, no. 6 (November 2025): 1348-58.
EndNote Çoban Eroğlu H (November 1, 2025) Manifold Öğrenme Tekniklerinin Uygulaması ve Karşılaştırmalı Analizi. Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilimleri Dergisi 25 6 1348–1358.
IEEE H. Çoban Eroğlu, “Manifold Öğrenme Tekniklerinin Uygulaması ve Karşılaştırmalı Analizi”, Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilimleri Dergisi, vol. 25, no. 6, pp. 1348–1358, 2025.
ISNAD Çoban Eroğlu, Hatice. “Manifold Öğrenme Tekniklerinin Uygulaması Ve Karşılaştırmalı Analizi”. Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilimleri Dergisi 25/6 (November 2025), 1348-1358.
JAMA Çoban Eroğlu H. Manifold Öğrenme Tekniklerinin Uygulaması ve Karşılaştırmalı Analizi. Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilimleri Dergisi. 2025;25:1348–1358.
MLA Çoban Eroğlu, Hatice. “Manifold Öğrenme Tekniklerinin Uygulaması Ve Karşılaştırmalı Analizi”. Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilimleri Dergisi, vol. 25, no. 6, 2025, pp. 1348-58.
Vancouver Çoban Eroğlu H. Manifold Öğrenme Tekniklerinin Uygulaması ve Karşılaştırmalı Analizi. Afyon Kocatepe Üniversitesi Fen Ve Mühendislik Bilimleri Dergisi. 2025;25(6):1348-58.