Research Article

Generalization of manifold learning methods with advanced regression methods

Year 2022, Volume: 37 Issue: 1, 485 - 496, 10.11.2021
https://doi.org/10.17341/gazimmfd.704793

Abstract

Nonlinear dimensionality reduction methods, also known as manifold learning (ML) methods, have recently been the subject of intensive research. ML methods perform a graph-based transformation under the assumption that high-dimensional data in fact lie on a nonlinear manifold embedded in a lower-dimensional space. To map the data from the high-dimensional space to the lower-dimensional one, the goal is to preserve the neighborhood relations among the data points. Most ML methods, however, transform the entire training set into the subspace at once and produce neither a transformation matrix nor an embedding function with an explicit analytical form. As a result, test samples that arrive later cannot be mapped into the same subspace. To perform the mapping, the test data must be fed back into the manifold learning method together with the previous training data, and the learning process must be restarted from scratch. Since this would have to be repeated for every new test sample, the computational cost grows accordingly. Therefore, especially for classification tasks, manifold learning methods need general solutions that map newly arriving test data into the subspace. In this study, advanced regression methods are used to overcome this problem, known in the literature as the out-of-sample problem. By modeling the relevant manifold learning method with regression methods, embedding functions for the transformation are produced, and the performance of the developed models is analyzed in detail on the classification of hyperspectral data.
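The out-of-sample idea described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's actual method or data: it uses a toy 3-D curve, Laplacian eigenmaps as the manifold learning step, and kernel ridge regression (one of the regression families cited below) as the learned embedding function; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy high-dimensional data: points on a noisy 1-D curve embedded in 3-D.
t = np.sort(rng.uniform(0, 3 * np.pi, 300))
X = np.c_[np.cos(t), np.sin(t), t] + 0.01 * rng.standard_normal((300, 3))
X_train, X_test = X[:250], X[250:]

def laplacian_eigenmaps(X, n_components=2, k=10):
    """Graph-based embedding: kNN graph -> graph Laplacian -> eigenvectors."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    n = len(X)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]   # k nearest neighbors, skipping self
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)                  # symmetrize the adjacency matrix
    L = np.diag(W.sum(1)) - W               # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:n_components + 1]      # skip the constant eigenvector

# Step 1: embed the training set only; no explicit mapping function exists yet.
Y_train = laplacian_eigenmaps(X_train)

# Step 2: learn an explicit embedding function f(x) ~ y by kernel ridge
# regression, so the graph-based step never has to be re-run.
def rbf(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf(X_train, X_train)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(K)), Y_train)  # dual weights

# Step 3: map out-of-sample points directly through the regression model.
Y_test = rbf(X_test, X_train) @ alpha
print(Y_test.shape)  # (50, 2)
```

The same three-step pattern applies with any of the regression methods studied in the paper (SVR, Gaussian processes, relevance vector machines, extreme learning machines) in place of kernel ridge regression.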

Supporting Institution

İstanbul Teknik Üniversitesi, BAP Birimi

Project Number

MGA-2018-41152

References

  • 1. Lunga, D., Prasad, S., Crawford, M. M., and Ersoy, O., Manifold-learning-based feature extraction for classification of hyperspectral data: A review of advances in manifold learning, IEEE Signal Processing Magazine, 31 (1), 55–66, 2014.
  • 2. Camps-Valls, G., Tuia, D., Gomez-Chova, L., Jimenez, S., and Malo, J., Remote Sensing Image Processing, Morgan & Claypool Publishers series, 2012.
  • 3. Bellman, R., Adaptive Control Processes: A Guided Tour. Princeton, NJ: Princeton Univ. Press, 1961.
  • 4. Bishop, C. M., Neural networks for pattern recognition, Oxford University Press, 1995.
  • 5. Xiao, D. and Zhang, J., Feature selection: Evaluation, application, and small sample performance, IEEE Trans. Pattern Analysis and Machine Intelligence, 19 (2), 197–201, 2009.
  • 6. Jia, X., Kuo, B.-C., and Crawford, M. M., Feature Mining for Hyperspectral Image Classification, Proc. IEEE, 101 (3), 2013.
  • 7. Roweis, S. T., and Saul, L. K., Nonlinear dimensionality reduction by locally linear embedding, Science, 290 (5500), 2323–2326, 2000.
  • 8. Saul, L. K. and Roweis, S. T., Think globally, fit locally: Unsupervised learning of low dimensional manifolds, J. Mach. Learn. Res., 4, 119–155, 2003.
  • 9. Yan, S., Xu, D., Zhang, B., Zhang, H. J., Yang, Q., and S. Lin, Graph embedding and extensions: A general framework for dimensionality reduction, IEEE Trans. Pattern Anal. Mach. Intell., 29 (1), 40–51, 2007.
  • 10. Belkin, M. and Niyogi, P., Laplacian eigenmaps and spectral techniques for embedding and clustering, in Advances in Neural Information Processing Systems, MIT Press, 585–591, 2001.
  • 11. Tenenbaum, J., de Silva, V., and Langford, J. C., A global geometric framework for nonlinear dimensionality reduction, Science, 290 (5500), 2319–2323, 2000.
  • 12. Donoho D. and Grimes C., Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data, Proc Nat Acad Sci USA, 100 (10), 5591–5596, 2003.
  • 13. Zhang Z. and Zha H., Principal manifolds and nonlinear dimension reduction via local tangent space alignment, SIAM J Scientif Comput., 26 (1), 313–338, 2005.
  • 14. Zhu, F., Wang, Y., Xiang, S., Fan, B., and Pan, C., Structured Sparse Method for Hyperspectral Unmixing, ISPRS Journal of Photogrammetry and Remote Sensing, 88, 101-118, 2014.
  • 15. Liao, D., Qian, Y., Zhou, J., et al., A Manifold Alignment Approach for Hyperspectral Image Visualization with Natural Color, IEEE Transactions on Geoscience and Remote Sensing, 54 (6), 3151-3162, 2016.
  • 16. Yang, H. L. and Crawford, M. M., Domain Adaptation with Preservation of Manifold for Hyperspectral Image Classification, IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 9 (2), 543-555, 2016.
  • 17. Bengio, Y., Paiement, J., Vincent, P., Delalleau, O., Le Roux, N., and Ouimet, M., Out-of-sample extensions for LLE, Isomap, MDS, eigenmaps, and spectral clustering, in Advances in Neural Information Processing Systems, vol. 16, Cambridge, MA, MIT Press, 2004.
  • 18. Vural, E. and Guillemot, C., Out-of-Sample Generalizations for Supervised Manifold Learning for Classification, IEEE Transactions on Image Processing, 25 (3), 1410-1424, 2016.
  • 19. He, X. and Niyogi, P., Locality preserving projections, in Advances in Neural Information Processing System, vol. 16, Cambridge, MA, MIT Press, 2004.
  • 20. He, X., Cai, D., Yan, S., and Zhang, H., Neighborhood preserving embedding, in Proc. IEEE Int. Conf. Comput. Vis., 1208–1213, 2005.
  • 21. Taşkın, G. and Crawford M. M., An out-of-sample extension to manifold learning via meta-modelling, IEEE Transactions on Image Processing, 28 (10), 5227-5237, 2019.
  • 22. Cai, D., He, X., and Han, J., Spectral Regression for Dimensionality Reduction, University of Illinois at Urbana-Champaign, Tech. Rep., May 2007.
  • 23. Liu, B., Xia, S. X., Meng, F. R., and Zhou, Y., Extreme spectral regression for efficient regularized subspace learning, Neurocomputing, 149 (Part A), 171–179, 2015.
  • 24. Barkan, O., Weill J., and Averbuch, A., Gaussian process regression for out-of-sample extension, in IEEE Int. Workshop Mach. Learn. Signal Process. (MLSP), 1-6, 2016.
  • 25. Hang, R., Liu, Q., Song, H., Sun, Y., Zhu, F., and Pei, H., Graph regularized nonlinear ridge regression for remote sensing data analysis, IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 10 (1), 277–285, 2017.
  • 26. Tang, M., Nie, F., and Jain, R., A graph regularized dimension reduction method for out-of-sample data, Neurocomputing, 225, 58-63, 2017.
  • 27. Orsenigo, C. and Vercellis, C., Kernel ridge regression for out-of-sample mapping in supervised manifold learning, Expert Systems with Applications, 39 (9), 7757–7762, 2012. https://doi.org/10.1016/j.eswa.2012.01.060
  • 28. Liu, G., Lin, Z., and Yu, Y., Multi-output regression on the output manifold, Pattern Recognition, 42 (11), 2737–2743, 2009. https://doi.org/10.1016/j.patcog.2009.05.001
  • 29. Saunders, C., Gammerman, A., and Vovk, V., Ridge regression learning algorithm in dual variables, in Proceedings of the 15th Int. Conf. on Machine Learning, Morgan Kaufmann, 515–521, 1998.
  • 30. Smola, A. J. and Schölkopf, B., A tutorial on support vector regression, Statistics and Computing, 14 (3), 199–222, 2004.
  • 31. Williams, C. K., and Rasmussen, C. E., Gaussian processes for machine learning, MIT Press, 2006.
  • 32. Tipping, M. E., The Relevance Vector Machine, Advances in Neural Information Processing Systems 12, Cambridge, Mass: MIT Press, 2000.
  • 33. Huang, G.-B., and Chen, L., Enhanced random search based incremental extreme learning machine. Neurocomputing, 71 (16–18), 3460–3468, 2008.
  • 34. Van der Maaten, L. J. P., An Introduction to Dimensionality Reduction Using Matlab, Report MICC 07-07, July 2007.
There are 34 citations in total.

Details

Primary Language Turkish
Subjects Engineering
Journal Section Makaleler (Articles)
Authors

Gülşen Taşkın 0000-0002-2294-4462

Project Number MGA-2018-41152
Publication Date November 10, 2021
Submission Date March 19, 2020
Acceptance Date June 28, 2021
Published in Issue Year 2022 Volume: 37 Issue: 1

Cite

APA Taşkın, G. (2021). Manifold öğrenme yöntemlerinin ileri seviye regresyon yöntemleri ile genelleştirilmesi. Gazi Üniversitesi Mühendislik Mimarlık Fakültesi Dergisi, 37(1), 485-496. https://doi.org/10.17341/gazimmfd.704793
AMA Taşkın G. Manifold öğrenme yöntemlerinin ileri seviye regresyon yöntemleri ile genelleştirilmesi. GUMMFD. November 2021;37(1):485-496. doi:10.17341/gazimmfd.704793
Chicago Taşkın, Gülşen. “Manifold öğrenme yöntemlerinin İleri Seviye Regresyon yöntemleri İle genelleştirilmesi”. Gazi Üniversitesi Mühendislik Mimarlık Fakültesi Dergisi 37, no. 1 (November 2021): 485-96. https://doi.org/10.17341/gazimmfd.704793.
EndNote Taşkın G (November 1, 2021) Manifold öğrenme yöntemlerinin ileri seviye regresyon yöntemleri ile genelleştirilmesi. Gazi Üniversitesi Mühendislik Mimarlık Fakültesi Dergisi 37 1 485–496.
IEEE G. Taşkın, “Manifold öğrenme yöntemlerinin ileri seviye regresyon yöntemleri ile genelleştirilmesi”, GUMMFD, vol. 37, no. 1, pp. 485–496, 2021, doi: 10.17341/gazimmfd.704793.
ISNAD Taşkın, Gülşen. “Manifold öğrenme yöntemlerinin İleri Seviye Regresyon yöntemleri İle genelleştirilmesi”. Gazi Üniversitesi Mühendislik Mimarlık Fakültesi Dergisi 37/1 (November 2021), 485-496. https://doi.org/10.17341/gazimmfd.704793.
JAMA Taşkın G. Manifold öğrenme yöntemlerinin ileri seviye regresyon yöntemleri ile genelleştirilmesi. GUMMFD. 2021;37:485–496.
MLA Taşkın, Gülşen. “Manifold öğrenme yöntemlerinin İleri Seviye Regresyon yöntemleri İle genelleştirilmesi”. Gazi Üniversitesi Mühendislik Mimarlık Fakültesi Dergisi, vol. 37, no. 1, 2021, pp. 485-96, doi:10.17341/gazimmfd.704793.
Vancouver Taşkın G. Manifold öğrenme yöntemlerinin ileri seviye regresyon yöntemleri ile genelleştirilmesi. GUMMFD. 2021;37(1):485-96.