Review

An Overview of Graph Neural Networks

Year 2023, Volume: 13, Issue: 2, 39 - 56, 19.07.2023

Abstract

Graph neural networks (GNNs) are a deep learning method in the artificial neural network (ANN) family that extracts information from graphs. They were first used in 2008, and most of their development has taken place from 2014 onwards. Developed in response to the limitations of convolutional neural networks (CNNs), which deliver their expected performance on regular grid data such as images but not on irregular, graph-structured data, GNNs are used in many fields, including physics, chemistry, biology, and cyber security. This study introduces GNNs and their models and presents a comprehensive literature review. The design steps of a GNN model are outlined, the models developed in the literature are examined, and the strengths and weaknesses of GNN models relative to established CNN-based and recurrent neural network (RNN)-based models are shown.
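The central operation surveyed in this article, graph convolution, can be illustrated with a minimal sketch (not code from the article itself; it follows the propagation rule of Kipf and Welling, cited here as Referans27, and the toy graph and feature values are hypothetical): one layer multiplies node features by a symmetrically normalized adjacency matrix with self-loops.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN propagation step: H = ReLU(D^-1/2 (A + I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])            # adjacency with self-loops
    d = A_hat.sum(axis=1)                     # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ X @ W, 0.0)    # ReLU activation

# Toy path graph 0-1-2 with 2-dimensional node features (hypothetical data).
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3, 2)          # features for the 3 nodes
W = np.ones((2, 2))       # layer weight matrix
H = gcn_layer(A, X, W)    # each node's output mixes its neighbors' features
print(H.shape)            # (3, 2)
```

Each row of H is a node embedding that aggregates one hop of neighborhood information; stacking such layers, as the surveyed models do, widens the receptive field by one hop per layer.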

Acknowledgments

This study is part of the term project titled "Siber Güvenlikte Grafik Sinir Ağları" (Graph Neural Networks in Cyber Security), carried out within the Cyber Security Non-Thesis Master's Program of the Department of Computer Engineering at the Milli Savunma Üniversitesi Atatürk Stratejik Araştırmalar ve Lisansüstü Eğitim Enstitüsü (National Defence University, Atatürk Institute for Strategic Research and Graduate Education).

References

  • Referans1 D.H. Hubel, T.N. Wiesel, “Receptive fields, binocular interaction and functional architecture in the cat's visual cortex.” J. Physiol., vol. 160, no. 1, pp. 106–154, Jan. 1962.
  • Referans2 A. Sperduti, A. Starita, “Supervised neural networks for the classification of structures.” in IEEE Transactions on Neural Networks, vol. 8, no. 3, pp. 714-735, May 1997.
  • Referans3 F. Scarselli, M. Gori, A.C. Tsoi, M. Hagenbuchner, G. Monfardini, “The Graph Neural Network Model.” IEEE Transactions on Neural Networks, vol. 20, no. 1, pp. 61-80, 2009.
  • Referans4 M. Gori, G. Monfardini, F. Scarselli, “A New Model for Learning in Graph Domains.” Proceedings of the 2005 IEEE International Joint Conference on Neural Networks, vol. 2, pp. 729-734, 2005.
  • Referans5 M. Gori, M. Maggini, L. Sarti, “A recursive neural network model for processing directed acyclic graphs with labeled edges.” in Proceedings of the International Joint Conference on Neural Networks, Portland (USA), pp. 1351–1355, July 2003.
  • Referans6 Z. Chen, J. Xu, C. Alippi, S.X. Ding, Y. Shardt, T. Peng, C. Yang, “Graph neural network-based fault diagnosis: A Review.” ArXiv, 2021.
  • Referans7 J. Zhou et al., “Graph neural networks: A review of methods and applications.” AI Open, vol. 1, pp. 57-81, 2020.
  • Referans8 N.A. Asif, Y. Sarker, R.K. Chakrabortty, M.J. Ryan, H. Ahamed, D.K. Saha, F.R. Badal, S.K. Das, F. Ali, S.I. Moyeen, R. Islam, Z. Tasneem, “Graph Neural Network: A Comprehensive Review on Non-Euclidean Space.” April 27, 2021.
  • Referans9 M.M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst, “Geometric deep learning: going beyond Euclidean data.” IEEE Signal Processing Magazine, vol. 34, no. 4, pp. 18-42, 2017.
  • Referans10 Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, P.S. Yu, “A Comprehensive Survey on Graph Neural Networks.” IEEE Transactions on Neural Networks and Learning Systems, vol. 32, no. 1, pp. 4-24, 2021.
  • Referans11 M. Kampffmeyer, Y. Chen, X. Liang, H. Wang, Y. Zhang, E.R. Xing, “Rethinking Knowledge Graph Propagation for Zero-Shot Learning.” CVPR 2019
  • Referans12 Z. Hu, Y. Dong, K. Wang, Y. Sun, “Heterogeneous graph transformer.” In: Proceedings of WWW, 2020, pp. 2704–2710.
  • Referans13 C. Zhang, D. Song, C. Huang, A. Swami, N.V. Chawla, “Heterogeneous graph neural network.” In: Proceedings of KDD. 2019, pp. 793–803.
  • Referans14 M. Schlichtkrull, T.N. Kipf, P. Bloem, R. van den Berg, I. Titov, M. Welling, “Modeling relational data with graph convolutional networks.” In: Proceedings of ESWC. Springer, 2018, pp. 593–607.
  • Referans15 D. Beck, G. Haffari, T. Cohn, “Graph-to-sequence learning using gated graph neural network.” Proceedings of ACL, 2018, pp. 273–283.
  • Referans16 Y. Ma, S. Wang, C.C. Aggarwal, D. Yin, J. Tang, “Multi-dimensional graph convolutional networks.” In: Proceedings of SDM, 2019, pp. 657–665.
  • Referans17 Y. Li, R. Yu, C. Shahabi, Y. Liu, “Diffusion convolutional recurrent neural network: Data-driven traffic forecasting.” in Proceedings of the International Conference on Learning Representations (ICLR), 2018.
  • Referans18 B. Yu, H. Yin, Z. Zhu, “Spatiotemporal Graph Convolutional Networks: A Deep Learning Framework for Traffic Forecasting.” Proceedings of IJCAI, 2018, pp. 3634–3640.
  • Referans19 A. Jain, A.R. Zamir, S. Savarese, A. Saxena, “Structural-rnn: deep learning on spatio-temporal graphs.” In: Proceedings of CVPR, 2016, pp. 5308–5317.
  • Referans20 S. Yan, Y. Xiong, and D. Lin, “Spatial temporal graph convolutional networks for skeleton-based action recognition.” in Proc. 32nd AAAI Conf. Artif. Intell., 2018. pp. 3634–3640
  • Referans21 F. Manessi, A. Rozza, M. Manzo, “Dynamic graph convolutional networks.” Pattern Recognition, vol. 97, Jan. 2020.
  • Referans22 A. Pareja, G. Domeniconi, J. Chen, T. Ma, T. Suzumura, H. Kanezashi, T. Kaler, T. Schardl, “Evolvegcn: evolving graph convolutional networks for dynamic graphs.” Proceedings of AAAI 34, 2020, pp. 5363–5370.
  • Referans23 J. Skarding, B. Gabrys, K. Musial, “Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey.” in IEEE Access, vol. 9, pp. 79143-79168, 2021.
  • Referans24 A. Goldenberg, A.X. Zheng, S.E. Fienberg, E.M. Airoldi, “A Survey of Statistical Network Models.” Foundations and Trends® in Machine Learning, Vol. 2, no. 2, pp. 129–233, February 2010.
  • Referans25 D. Xu, C. Ruan, K. Motwani, E. Korpeoglu, S. Kumar, K. Achan, “Generative Graph Convolutional Network for Growing Graphs.” ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), May 2019, pp. 3167–3171.
  • Referans26 T.N. Kipf, M. Welling, “Variational graph auto-encoders.” In: NIPS Bayesian Deep Learning Workshop, 2016.
  • Referans27 T.N. Kipf, M. Welling, “Semi-supervised classification with graph convolutional networks.” In 5th International Conference on Learning Representations, ICLR 2017, Toulon, France, April 24-26, 2017, Conference Track Proceedings. OpenReview.net, 2017.
  • Referans28 M. Ou, P. Cui, J. Pei, Z. Zhang, W. Zhu, “Asymmetric transitivity preserving graph embedding.” in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016, pp. 1105–1114.
  • Referans29 X. Shi, Z. Chen, H. Wang, D.Y. Yeung, W.K. Wong, W.C. Woo, “Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting.” NeurIPS 2015.
  • Referans30 X. Shi, Z. Gao, L. Lausen, H. Wang, D.Y. Yeung, W.K. Wong, W.C. Woo, “Deep learning for precipitation nowcasting: A benchmark and a new model.” In Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems, 2017, pp. 5617–5627.
  • Referans31 Y. Wang, M. Long, J. Wang, Z. Gao, P.S. Yu, “PredRNN: Recurrent neural networks for predictive learning using spatiotemporal LSTMs.” Advances in Neural Information Processing Systems. 2017, pp. 880–889.
  • Referans32 Y. Wang, M. Long, J. Wang, Z. Gao, P.S. Yu, “PredRNN++: Towards a resolution of the deep-in-time dilemma in spatiotemporal predictive learning.” In Jennifer G. Dy and Andreas Krause, editors, Proceedings of the 35th International Conference on Machine Learning, ICML 2018, Stockholmsmässan, Stockholm, Sweden, July 10-15, 2018, vol. 80 of Proceedings of Machine Learning Research, pp. 5110–5119, PMLR, 2018.
  • Referans33 J. Xu, H. Xu, B. Ni, X. Yang, T. Darrell, “Video prediction via example guidance.” In Hal Daumé III and Aarti Singh, editors, Proceedings of the 37th International Conference on Machine Learning, vol. 119 of Proceedings of Machine Learning Research, PMLR, Jul 2020, pp. 10628–10637.
  • Referans34 O. Ronneberger, P. Fischer, T. Brox, “U-net: Convolutional networks for biomedical image segmentation.” CoRR, 2015.
  • Referans35 Y. Aoyagi, N. Murata, H. Sakaino, “Spatio-temporal Predictive Network For Videos With Physical Properties.” CVPR2021, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2021, pp. 2268-2278.
  • Referans36 Y. Seo, M. Defferrard, P. Vandergheynst, X. Bresson, “Structured Sequence Modeling with Graph Convolutional Recurrent Networks.” In Neural Information Processing, Lecture Notes in Computer Science, 2018, pp. 362–373.
  • Referans37 J. Deng, W. Dong, R. Socher, L. Li, K. Li, F. Li, “Imagenet: A large-scale hierarchical image database.” In 2009 IEEE Conference on Computer Vision and Pattern Recognition, 2009, pp. 248–255.
  • Referans38 M. Monfort, B. Zhou, S.A. Bargal, A. Andonian, T. Yan, K. Ramakrishnan, L.M. Brown, Q. Fan, D. Gutfreund, C. Vondrick, A. Oliva, “Moments in time dataset: one million videos for event understanding.” CoRR, 2018.
  • Referans39 B. Zhou, A. Lapedriza, A. Khosla, A. Oliva, A. Torralba, “Places: A 10 million image database for scene recognition.” IEEE Transactions on Pattern Analysis and Machine Intelligence. Vol. 40 no. 6, pp. 1452–1464, 2018.
  • Referans40 P. Goyal, S.R. Chhetri, N.S. Mehrabi, E. Ferrara, A. Canedo, “DynamicGEM: A library for dynamic graph embedding methods.” arXiv preprint arXiv:1811.10734, 2018.
  • Referans41 Y. Ma, Z. Guo, Z. Ren, E. Zhao, J. Tang, D. Yin, “Streaming Graph Neural Networks.” Arxiv, October 2018.
  • Referans42 S. Kumar, X. Zhang, J. Leskovec, “Predicting dynamic embedding trajectory in temporal interaction networks.” In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2019, pp. 1269–1278.
  • Referans43 R. Trivedi, H. Dai, Y. Wang, L. Song, “Know-evolve: Deep temporal reasoning for dynamic knowledge graphs.” In Proceedings of the 34th International Conference on Machine Learning, ICML 2017, Sydney, NSW, Australia, 6-11 August 2017, vol. 70 of Proceedings of Machine Learning Research, PMLR, 2017, pp. 3462–3471.
  • Referans44 A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A.N. Gomez, L. Kaiser, I. Polosukhin, “Attention is All you Need.” In Advances in Neural Information Processing Systems, vol. 30, pp. 5998–6008, 2017.
  • Referans45 D. Xu, C. Ruan, E. Korpeoglu, S. Kumar, K. Achan, “Inductive representation learning on temporal graphs.” ArXiv, 2020.
  • Referans46 Y. Feng, H. You, Z. Zhang, R. Ji, Y. Gao, “Hypergraph neural networks.” In: Proceedings of AAAI, vol. 33, pp. 3558–3565, 2019.
  • Referans47 T. Derr, Y. Ma, J. Tang, “Signed graph convolutional networks.” in 2018 IEEE International Conference on Data Mining (ICDM), IEEE, 2018, pp. 929–934.
  • Referans48 S. Pan, R. Hu, G. Long, J. Jiang, L. Yao, C. Zhang, “Adversarially regularized graph autoencoder for graph embedding.” in Proceedings of IJCAI, 2018, pp. 2609–2615.
  • Referans49 C. Wang, S. Pan, G. Long, X. Zhu, J. Jiang, “Mgae: marginalized graph autoencoder for graph clustering.” In: Proceedings of CIKM. 2017. pp. 889–898.
  • Referans50 J. Park, M. Lee, H.J. Chang, K. Lee, J.Y. Choi, “Symmetric graph convolutional autoencoder for unsupervised graph representation learning.” In: Proceedings of ICCV, 2019. pp. 6519–6528.
  • Referans51 G. Cui, J. Zhou, C. Yang, Z. Liu, “Adaptive graph encoder for attributed graph embedding." In: Proceedings of KDD, 2020. pp. 976–985.
  • Referans52 P. Velickovic, W. Fedus, W.L. Hamilton, P. Lio, Y. Bengio, R.D. Hjelm, “Deep Graph Infomax.” ICLR 2019.
  • Referans53 F.Y. Sun, J. Hoffmann, V. Verma, J. Tang, “Infograph: unsupervised and semisupervised graph-level representation learning via mutual information maximization.” Proceedings of ICLR. 2020.
  • Referans54 K. Hassani, A.H. Khasahmadi, “Contrastive multi-view representation learning on graphs.” In: Proceedings of ICML, pp. 4116–4126. 2020.
  • Referans55 K. Xu, C. Li, Y. Tian, T. Sonobe, K. Kawarabayashi, S. Jegelka, “Representation Learning on Graphs with Jumping Knowledge Networks.” In: Proceedings of ICML, 2018, pp. 5449–5458.
  • Referans56 J.G. Zilly, R.K. Srivastava, J. Koutnik, J. Schmidhuber, “Recurrent highway networks.” In: Proceedings of ICML, 2017, pp. 4189–4198.
  • Referans57 A. Rahimi, T. Cohn, and T. Baldwin, “Semi-supervised User Geolocation via Graph Convolutional Networks.” In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, vol 1, pp. 2009–2019, 2018.
  • Referans58 T. Pham, T. Tran, D. Phung, S. Venkatesh, “Column networks for collective classification.” In: Proceedings of AAAI, 2017. pp. 2485–2491.
  • Referans59 G. Li, M. Muller, A. Thabet, B. Ghanem, “DeepGCNs: Can GCNs Go as Deep as CNNs?” In: Proceedings of ICCV, 2019, pp. 9267–9276.
  • Referans60 K. He, X. Zhang, S. Ren, J. Sun, “Deep Residual Learning for Image Recognition.” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 770-778.
  • Referans61 K. He, X. Zhang, S. Ren, J. Sun, “Identity Mappings in Deep Residual Networks.” in Computer Vision – ECCV 2016, pp. 630-645.
  • Referans62 G. Huang, Z. Liu, L. Van Der Maaten, K.Q. Weinberger, “Densely connected convolutional networks.” In: Proceedings of CVPR, 2017, pp. 4700–4708.
  • Referans63 Y. Pei, T. Huang, W.V. Ipenburg, M. Pechenizkiy, “ResGCN: Attention-based Deep Residual Modeling for Anomaly Detection on Attributed Networks.” Mach Learn vol. 111, pp. 519–541, 2022.
  • Referans64 R.N. Abirami, P.M.D.R. Vincent, K. Srinivasan, U. Tariq, C.Y. Chang, “Deep CNN and Deep GAN in Computational Visual Perception-Driven Image Analysis.” Complexity, vol. 2021, Article ID 5541134, 30 pages, 2021.
  • Referans65 W. Maass, T. Natschlager, H. Markram, “Real-time computing without stable states: A new framework for neural computation based on perturbations.” Neural Computation, vol. 14, no. 11, pp. 2531–2560, 2002.
  • Referans66 H. Jaeger, “The ‘Echo State’ Approach to Analysing and Training Recurrent Neural Networks - With an Erratum Note.” German National Research Center for Information Technology GMD Technical Report, vol. 148, p. 13, 2001.
  • Referans67 C. Gallicchio, A. Micheli, “Graph Echo State Networks.” The 2010 International Joint Conference on Neural Networks (IJCNN), 2010, pp. 1-8.
  • Referans68 H. Dai, Z. Kozareva, B. Dai, A. Smola, L. Song, “Learning Steady-States of Iterative Algorithms Over Graphs.” 35th International Conference on Machine Learning, PMLR 80, 2018, pp. 1106-1114.
  • Referans69 M. Tiezzi, G.M. Dimitri, M. Bianchini, M. Maggini, F. Scarselli, “Inductive–transductive learning with graph neural networks.” In: IAPR Workshop on Artificial Neural Networks in Pattern Recognition. Springer. pp. 201–212. 2018.
  • Referans70 K. Cho, B. Van Merrienboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, Y. Bengio, “Learning phrase representations using RNN encoder-decoder for statistical machine translation.” Proceedings of EMNLP, 2014, pp. 1724–1734.
  • Referans71 Y. Gao, D. Glowacka, “Deep Gate Recurrent Neural Network.” JMLR: Workshop and Conference Proceedings, 2016, pp.350–365.
  • Referans72 J. Cheng, L. Dong, M. Lapata, “Long short-term memory-networks for machine reading.” In: Proceedings of EMNLP, 2016, pp. 551–561.
  • Referans73 Y. Li, D. Tarlow, M. Brockschmidt, R. Zemel, “Gated graph sequence neural networks.” in Proceedings of the 4th International Conference on Learning Representations (ICLR), 2016.
  • Referans74 D. Beck, G. Haffari, T. Cohn, “Graph-to-sequence learning using gated graph neural network.” Proceedings of ACL, 2018, pp. 273–283.
  • Referans75 J. Guo, “Backpropagation Through Time.” 2013.
  • Referans76 V. Ranganathan, S. Natarajan, “A New Backpropagation Algorithm without Gradient Descent.” Arxiv. 2018.
  • Referans77 S. Ruder, “An overview of gradient descent optimization algorithms.” Arxiv. 2016.
  • Referans78 K.S. Tai, R. Socher, C.D. Manning, “Improved semantic representations from tree-structured long short-term memory networks.” In: Proceedings of ACL-IJCNLP, 2015, pp. 1556–1566.
  • Referans79 R. Socher, A. Perelygin, J.Y. Wu, J. Chuang, C.D. Manning, A.Y. Ng, C. Potts, “Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank.” Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, 2013, pp. 1631–1642.
  • Referans80 M. Marelli, L. Bentivogli, M. Baroni, R. Bernardi, S. Menini, R. Zamparelli, “SemEval-2014 Task 1: Evaluation of Compositional Distributional Semantic Models on Full Sentences through Semantic Relatedness and Textual Entailment.” Proceedings of the 8th International Workshop on Semantic Evaluation (SemEval 2014), 2014, pp. 1–8.
  • Referans81 N. Peng, H. Poon, C. Quirk, K. Toutanova, W.T. Yih, “Crosssentence N-ary relation extraction with graph LSTMs.” Trans. Assoc. Comput. Linguistics, vol. 5, pp. 101–115, Dec. 2017.
  • Referans82 X. Liang, X. Shen, J. Feng, L. Lin, S. Yan, “Semantic object parsing with graph lstm.” In: Proceedings of ECCV, 2016, pp. 125–143.
  • Referans83 V. Zayats, M. Ostendorf, “Conversation modeling on reddit using a graph-structured LSTM.” Trans. Assoc. Comput. Linguistics, TACL. vol. 6, pp. 121–132, 2018.
  • Referans84 Y. Zhang, Q. Liu, L. Song, “Sentence-state LSTM for text representation.” In: Proceedings of ACL. Vol. 1, pp. 317–327, 2018.
  • Referans85 Z. Huang, X. Xu, K. Yu, “Bidirectional LSTM-CRF Models for Sequence Tagging.” ArXiv, 2015.
  • Referans86 Y. Shulman, “SimPool: Towards Topology Based Graph Pooling with Structural Similarity Features.” ArXiv. 2020.
  • Referans87 S. Mallat, “A Wavelet Tour of Signal Processing.” Elsevier. 1999.
  • Referans88 J. Bruna, W. Zaremba, A. Szlam, Y. LeCun, “Spectral networks and locally connected networks on graphs.” in 2nd Int. Conf. Learn. Repres., Banff, AB, Canada 2014.
  • Referans89 M. Henaff, J. Bruna, Y. Lecun, “Deep Convolutional Networks on Graph-Structured Data.” ArXiv. 2015.
  • Referans90 D.K. Hammond, P. Vandergheynst, R. Gribonval, “Wavelets on graphs via spectral graph theory.” Appl. Comput. Harmon. Anal. Vol. 30, pp. 129–150, 2011.
  • Referans91 H.P. Maretic, P. Frossard, “Graph Laplacian mixture model.” Arxiv, 2020.
  • Referans92 M. Defferrard, X. Bresson, P. Vandergheynst, “Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering.” NeurIPS, 2016, pp. 3844–3852.
  • Referans93 B. Li, S. Tang, H. Yu, “Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units.” Communications in Computational Physics, vol. 27, no. 2, pp. 379-411, 2020.
  • Referans94 S. Tang, B. Li, H. Yu, “ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units using Chebyshev Approximations.” ArXiv, 2019.
  • Referans95 R. Li, S. Wang, F. Zhu, J. Huang, “Adaptive graph convolutional neural networks.” In Thirty-Second AAAI Conference on Artificial Intelligence, 2018, pp. 3546-3553.
  • Referans96 Y. Wang, Y. Sun, Z. Liu, S.E. Sarma, M.M. Bronstein, J.M. Solomon, “Dynamic Graph CNN for Learning on Point Clouds.” ACM Transactions on Graphics, vol. 38, Article 146, pp. 1–12, 2019.
  • Referans97 R. Li, S. Wang, F. Zhu, J. Huang, “Adaptive graph convolutional neural networks.” In Thirty-Second AAAI Conference on Artificial Intelligence, 2018, pp. 3546-3553.
  • Referans98 C. Zhuang, Q. Ma, “Dual graph convolutional networks for graph-based semi-supervised classification.” in Proc. Web Conf., 2018, pp. 499–508.
  • Referans99 B. Xu, H. Shen, Q. Cao, Y. Qiu, X. Cheng, “Graph wavelet neural network.” In: Proceedings of ICLR. 2019.
  • Referans100 D.K. Duvenaud, D. Maclaurin, J. Aguileraiparraguirre, R. Gomezbombarelli, T.D. Hirzel, A. Aspuruguzik, R.P. Adams, “Convolutional networks on graphs for learning molecular fingerprints.” In: Proceedings of NIPS, Neural Inf. Process. Syst., 2015, pp. 2224–2232.
  • Referans101 J. Atwood, D. Towsley, “Diffusion-Convolutional Neural Networks.” 29th Conference on Neural Information Processing Systems (NIPS 2016), 2016, pp. 1993–2001.
  • Referans102 M. Niepert, M. Ahmed, K. Kutzkov, “Learning convolutional neural networks for graphs.” in Proc. Int. Conf. Mach. Learn., 2016, pp. 2014–2023.
  • Referans103 H. Gao, Z. Wang, S. Ji, “Large-scale learnable graph convolutional networks.” in Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, ACM, 2018, pp. 1416–1424.
  • Referans104 W.L. Hamilton, R. Ying, J. Leskovec, “Inductive representation learning on large graphs.” In Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS'17), 2017, pp. 1025–1035.
  • Referans105 J. Chen, J. Zhu, L. Song, “Stochastic Training of Graph Convolutional Networks with Variance Reduction.” Proceedings of the 35th International Conference on Machine Learning, Stockholm, Sweden, PMLR 80, 2018.
  • Referans106 R. Ying, R. He, K. Chen, P. Eksombatchai, W.L. Hamilton, J. Leskovec, “Graph convolutional neural networks for web-scale recommender systems.” in Proc. 24th ACM SIGKDD Int. Conf. Knowl. Discovery Data Mining, 2018, pp. 974–983.
  • Referans107 J. Chen, T. Ma, C. Xiao, “FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling.” ICLR, 2018.
  • Referans108 W. Huang, T. Zhang, Y. Rong, J. Huang, “Adaptive sampling towards fast graph representation learning.” Proceedings of NeurIPS, 2018, pp. 4558–4567.
  • Referans109 D. Zou, Z. Hu, Y. Wang, S. Jiang, Y. Sun, Q. Gu, “Layer-dependent importance sampling for training deep and large graph convolutional networks.” In: Proceedings of NeurIPS, 2019, pp. 11249–11259.
  • Referans110 W.L. Chiang, X. Liu, S. Si, Y. Li, S. Bengio, C.J. Hsieh, “Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks.” In The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD ’19), 2019, pp. 257–266.
  • Referans111 H. Zeng, H. Zhou, A. Srivastava, R. Kannan, V.K. Prasanna, “Graphsaint: graph sampling based inductive learning method” In: Proceedings of ICLR. 2020.
  • Referans112 O. Vinyals, S. Bengio, M. Kudlur, “Order Matters: Sequence to Sequence for Sets.” ArXiv. 2015.
  • Referans113 M. Zhang, Z. Cui, M. Neumann, Y. Chen, “An End-to-End Deep Learning Architecture for Graph Classification.” AAAI'18/IAAI'18/EAAI'18: Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence, 2018, no. 544, pp. 4438–4445.
  • Referans114 I.S. Dhillon, Y. Guan, B. Kulis, “Weighted graph cuts without eigenvectors: a multilevel approach.” IEEE TPAMI, vol. 29, pp. 1944–1957, 2007.
  • Referans115 M. Simonovsky, N. Komodakis, “Dynamic edge-conditioned filters in convolutional neural networks on graphs.” In: Proceedings of CVPR, 2017. pp. 3693–3702.
  • Referans116 Z. Ying, J. You, C. Morris, X. Ren, W. Hamilton, J. Leskovec, “Hierarchical graph representation learning with differentiable pooling.” Advances in neural information processing systems, NeurIPS, 2018, pp. 4805-4815.
  • Referans117 H. Gao, S. Ji, “Graph u-nets.” In: Proceedings of ICML, 2019, pp. 2083–2092.
  • Referans118 Y. Ma, S. Wang, C.C. Aggarwal, J. Tang, “Graph Convolutional Networks with EigenPooling.” In: Proceedings of KDD, 2019, pp. 723–731.
  • Referans119 J. Lee, I. Lee, J. Kang, “Self-attention graph pooling.” ICML, 2019, pp. 3734–3743.
  • Referans120 L. Waikhom, R. Patgiri, “Graph Neural Networks: Methods, Applications, and Opportunities.” Arxiv, 2021.
  • Referans121 C. Liu, Y. Zhan, C. Li, B. Du, J. Wu, W. Hu, T. Liu, D. Tao, “Graph Pooling for Graph Neural Networks: Progress, Challenges, and Opportunities.” ArXiv, 2022, abs/2204.07321.
  • Referans122 S. Georgousis, M. P. Kenning and X. Xie, "Graph Deep Learning: State of the Art and Challenges," in IEEE Access, vol. 9, pp. 22106-22140, 2021, doi: 10.1109/ACCESS.2021.3055280.

An Overview of Graph Neural Networks

Yıl 2023, Cilt: 13 Sayı: 2, 39 - 56, 19.07.2023

Öz

Graph neural networks (GNN) are a deep learning method that belongs to the artificial neural network (ANN) family and performs information extraction from graphs. It was first used in 2008 and its development started in 2014 and onwards. GNNs, which were developed in response to the failure of convolutional neural networks (CNN) to meet the expected performance on visuals, are used in many fields such as physics, chemistry, biology and cyber security. In this study, GNNs and their models are basically explained and a comprehensive literature review is carried out. In the study, the developed models are examined by referring to the design steps of a GNN model, and the strengths and weaknesses of GNN models against CNN and recurrent neural networks (RNN) effective models are shown.

Kaynakça

  • Referans1 D.H. Hubel, T.N. Wiesel, “Receptive fields, binocular interaction and functional architecture in the cat's visual cortex.” J Physiol. 1962 Jan; vol. 160, no. 1, pp. 106–154.2, 1962.
  • Referans2 A. Sperduti, A. Starita, “Supervised neural networks for the classification of structures.” in IEEE Transactions on Neural Networks, vol. 8, no. 3, pp. 714-735, May 1997
  • Referans3 F. Scarselli, M. Gori, A.C Tsoi, M. Hagenbuchner, G. Monfardini, “The Graph Neural Network Model.” IEEE Transactions on Neural Networks, vol. 20, no. 1, pp. 61-80. 2019.
  • Referans4 M. Gori, G. Monfardini, F.A. Scarsell, “New Model for Learning in Graph Domains.” Proceedings. 2005 IEEE International Joint Conference on Neural Networks, vol. 2, pp. 729-734, 2005.
  • Referans5 M. Gori, M. Maggini, L. Sarti, “A recursive neural network model for processing directed acyclic graphs with labeled edges.” in Proceedings of the International Joint Conference on Neural Networkss, Portland (USA), pp. 1351–1355, July 2003.
  • Referans6 Z. Chen, J. Xu, C. Alippi, S.X. Ding, Y. Shardt, T. Peng, C. Yang, C. “Graph neural network-based fault diagnosis: a Review.” ArXiv. 2021
  • Referans7 J. Zhou, et. Al., “Graph neural networks: A review of methods and applications.” Al Open. Vol. 1, p.57-81, 2020.
  • Referans8 N.A. Asıf, Y. Sarker, R.K. Chakrabbortty, M.J. Ryan, H. Ahamed, D.K., Saha, F.R. Badal, S.K. Das, F. Alı, S.I Moyeen, R. Islam, Z. Tasneemi “Graph Neural Network: A Comprehensive Review on Non-Euclidean Space.” April 27, 2021.
  • Referans9 M.M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst, “Geometric deep learning: going beyond Euclidean data.” IEEE SIG PROC MAG, 2017
  • Referans10 Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, P.S. Yu, “A Comprehensive Survey on Graph Neural Networks.” Journal Of Latex Class Fıles, Vol. 32, No. 1, pp. 4-24, August 2019
  • Referans11 M. Kampffmeyer, Y. Chen, X. Liang, H. Wang, Y. Zhang, E.R. Xing, “Rethinking Knowledge Graph Propagation for Zero-Shot Learning.” CVPR 2019
  • Referans12 Z. Hu, Y. Dong, K. Wang, Y. Sun, Y. “Heterogeneous graph transformer.” In: Proceedings of WWW, 2020, pp. 2704–2710
  • Referans13 C. Zhang, D. Song, C. Huang, A. Swami, N.V. Chawla, “Heterogeneous graph neural network.” In: Proceedings of KDD. 2019, pp. 793–803.
  • Referans14 M. Schlichtkrull, T.N. Kipf, P. Bloem, R. van den Berg, I. Titov, M. Welling, “Modeling relational data with graph convolutional networks.” In: Proceedings of ESWC. Springer, 2018, pp. 593–607.
  • Referans15 D. Beck, G. Haffari, T. Cohn, “Graph-to-sequence learning using gated graph neural network.” Proceedings of ACL, 2018, pp. 273–283.
  • Referans16 Y. Ma, S. Wang, C.C. Aggarwal, D. Yin, J. Tang, J., “Multi-dimensional graph convolutional networks.” In: Proceedings of SDM, 2019, pp. 657–665.
  • Referans17 Y. Li, R. Yu, C. Shahabi, Y. Liu, “Diffusion convolutional recurrent neural network: Data-driven traffic forecasting.” in Proceedings of the 7th International Conference on Learning Representations, 2018.
  • Referans18 B. Yu, H. Yin, Z. Zhu, “Spatiotemporal Graph Convolutional Networks: A Deep Learning Framework for Traffic Forecasting. Proceedings of IJCAI. 2018, pp. 3634–3640.
  • Referans19 A. Jain, A.R. Zamir, S. Savarese, A. Saxena, “Structural-rnn: deep learning on spatio-temporal graphs.” In: Proceedings of CVPR, 2016, pp. 5308–5317.
  • Referans20 S. Yan, Y. Xiong, and D. Lin, “Spatial temporal graph convolutional networks for skeleton-based action recognition.” in Proc. 32nd AAAI Conf. Artif. Intell., 2018. pp. 3634–3640
  • Referans21 F. Manessi, A. Rozza, M. Manzo, “Dynamic graph convolutional networks. Pattern Recogn.” Pattern Recognition Vol. 97 Is. C, Jan 2020.
  • Referans22 A. Pareja, G. Domeniconi, J. Chen, T. Ma, T. Suzumura, H. Kanezashi, T. Kaler, T. Schardl, “Evolvegcn: evolving graph convolutional networks for dynamic graphs.” Proceedings of AAAI 34, 2020, pp. 5363–5370.
  • Referans23 J. Skarding, B. Gabrys, K. Musial, “Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey.” in IEEE Access, vol. 9, pp. 79143-79168, 2021.
  • Referans24 A. Goldenberg, A.X. Zheng, S.E. Fienberg, E.M. Airoldi, “A Survey of Statistical Network Models.” Foundations and Trends® in Machine Learning, Vol. 2, no. 2, pp. 129–233, February 2010.
  • Referans25 D. Xu, C. Ruan, K. Motwani, E. Korpeoglu, S. Kumar, K. Achan, “Generative Graph Convolutional Network for Growing Graphs.” ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), May 2019, pp. 3167– 3171,
  • Referans26 T.N. Kipf, M. Welling, “Variational graph auto-encoders.” In: NIPS Bayesian Deep Learning Workshop.CoRR, 2016.
  • Referans27 T.N. Kipf, M. Welling, “Semi-supervised classification with graph convolutional networks.” In 5th International Conference on Learning Representations, ICLR 2017, Toulon, France, April 24-26, 2017, Conference Track Proceedings. OpenReview.net, 2017.
  • Referans28 M. Ou, P. Cui, J. Pei, Z. Zhang, W. Zhu, “Asymmetric transitivity preserving graph embedding.” in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016, pp. 1105–1114.
  • Referans29 X. Shi, Z, Chen, H. Wang, D.Y. Yeung, W.K. Wong, W.C. Woo, “Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting.” NeurIPS 2015.
  • Referans30 X. Shi, Z. Gao, L. Lausen, H. Wang, D.Y. Yeung, E.K. Wong, W.C. Woo, “Deep learning for precipitation nowcasting: A benchmark and A new model.” In Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing, 2017, pp. 5617–5627.
  • Referans31 Y. Wang, M. Long, J. Wang, Z. Gao, P.S. Yu, “PredRNN: Recurrent neural networks for predictive learning using spatiotemporal LSTMs.” Advances in Neural Information Processing Systems. 2017, pp. 880–889.
  • Referans32 Y. Wang, M. Long, J. Wang, Z. Gao, P.S. Yu, “Predrnn++: Towards A resolution of the deep-in-time dilemma in spatiotemporal predictive learning.” In Jennifer G. Dy and Andreas Krause, editors, Proceedings of the 35th International Conference on Machine Learning. ICML 2018, Stockholmsmassan, Stockholm, Swe- ¨ den, July 10-15, 2018, vol. 80 of Proceedings of Machine Learning Research, pages 5110–5119. PMLR, 2018.
  • Referans33 J. Xu, H. Xu, B. Ni, X. Yang, T. Darrell, “Video prediction via example guidance.” In Hal Daume III and Aarti Singh, editors, Proceedings of the 37th International Conference on Machine Learnin., volume 119 of Proceedings of Machine Learning Research, PMLR, Jul 2020, pp. 10628–10637.
  • Referans34 O. Ronneberger, P. Fischer, T. Brox, “U-net: Convolutional networks for biomedical image segmentation.” CoRR, 2015.
  • Referans35 Y. Aoyagi, N. Murata, H. Sakaino, “Spatio-temporal Predictive Network For Videos With Physical Properties.” CVPR2021, 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2021, pp. 2268-2278.
  • Referans36 Y. Seo, M. Defferrard, P. Vandergheynst, X. Bresson, “Structured Sequence Modeling with Graph Convolutional Recurrent Networks.” In Neural Information Processing, Lecture Notes in Computer Science, 2018, pp. 362–373.
  • Referans37 J. Deng, W. Dong, R. Socher, L. Li, K. Li, F. Li, “Imagenet: A large-scale hierarchical image database.” In 2009 IEEE Conference on Computer Vision and Pattern Recognition, 2009, pp. 248–255.
  • Referans38 M. Monfort, B. Zhou, S.A. Bargal, A. Andonian, T. Yan, K. Ramakrishnan, L.M. Brown, Q. Fan, D. Gutfreund, C. Vondrick, A. Oliva, “Moments in time dataset: one million videos for event understanding.” CoRR, 2018.
  • Referans39 B. Zhou, A. Lapedriza, A. Khosla, A. Oliva, A. Torralba, “Places: A 10 million image database for scene recognition.” IEEE Transactions on Pattern Analysis and Machine Intelligence. Vol. 40 no. 6, pp. 1452–1464, 2018.
  • Referans40 P. Goyal, S.R. Chhetri, N.S. Mehrabi, E. Ferrara, A. Canedo, “DynamicGEM: A library for dynamic graph embedding methods.” arXiv preprint arXiv:1811.10734, 2018.
  • Referans41 Y. Ma, Z. Guo, Z. Ren, E. Zhao, J. Tang, D. Yin, “Streaming Graph Neural Networks.” ArXiv, October 2018.
  • Referans42 S. Kumar, X. Zhang, J. Leskovec, “Predicting dynamic embedding trajectory in temporal interaction networks.” In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2019, pp. 1269–1278.
  • Referans43 R. Trivedi, H. Dai, Y. Wang, L. Song, “Know-evolve: Deep temporal reasoning for dynamic knowledge graphs.” In Proceedings of the 34th International Conference on Machine Learning, ICML 2017, Sydney, NSW, Australia, 6-11 August 2017, volume 70 of Proceedings of Machine Learning Research, PMLR, 2017, pp. 3462–3471.
  • Referans44 A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A.N. Gomez, L. Kaiser, I. Polosukhin, “Attention is All you Need.” In, Advances in Neural Information Processing Systems vol. 30, pp. 5998–6008, 2017.
  • Referans45 D. Xu, C. Ruan, E. Korpeoglu, S. Kumar, K. Achan, “Inductive representation learning on temporal graphs.” ArXiv, 2020.
  • Referans46 Y. Feng, H. You, Z. Zhang, R. Ji, Y. Gao, “Hypergraph neural networks.” In: Proceedings of AAAI, vol. 33, pp. 3558–3565. 2018.
  • Referans47 T. Derr, Y. Ma, J. Tang, “Signed graph convolutional networks.” in 2018 IEEE International Conference on Data Mining (ICDM), IEEE, 2018, pp. 929–934.
  • Referans48 S. Pan, R. Hu, G. Long, J. Jiang, L. Yao, C. Zhang, “Adversarially regularized graph autoencoder for graph embedding.” in Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, IJCAI 2018, pp. 2609–2615.
  • Referans49 C. Wang, S. Pan, G. Long, X. Zhu, J. Jiang, “Mgae: marginalized graph autoencoder for graph clustering.” In: Proceedings of CIKM. 2017. pp. 889–898.
  • Referans50 J. Park, M. Lee, H.J. Chang, K. Lee, J.Y. Choi, “Symmetric graph convolutional autoencoder for unsupervised graph representation learning.” In: Proceedings of ICCV, 2019. pp. 6519–6528.
  • Referans51 G. Cui, J. Zhou, C. Yang, Z. Liu, “Adaptive graph encoder for attributed graph embedding." In: Proceedings of KDD, 2020. pp. 976–985.
  • Referans52 P. Velickovic, W. Fedus, W.L. Hamilton, P. Lio, Y. Bengio, R.D. Hjelm, “Deep Graph Infomax.” ICLR 2019.
  • Referans53 F.Y. Sun, J. Hoffmann, V. Verma, J. Tang, “Infograph: unsupervised and semisupervised graph-level representation learning via mutual information maximization.” Proceedings of ICLR. 2020.
  • Referans54 K. Hassani, A.H. Khasahmadi, “Contrastive multi-view representation learning on graphs.” In: Proceedings of ICML, pp. 4116–4126. 2020.
  • Referans55 K. Xu, C. Li, Y. Tian, T. Sonobe, K. Kawarabayashi, S. Jegelka, “Representation Learning on Graphs with Jumping Knowledge Networks.” In: Proceedings of ICML, 2018, pp. 5449–5458.
  • Referans56 J.G. Zilly, R.K. Srivastava, J. Koutnik, J. Schmidhuber, “Recurrent highway networks.” In: Proceedings of ICML. 2016, pp. 4189–4198.
  • Referans57 A. Rahimi, T. Cohn, and T. Baldwin, “Semi-supervised User Geolocation via Graph Convolutional Networks.” In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, vol 1, pp. 2009–2019, 2018.
  • Referans58 T. Pham, T. Tran, D. Phung, S. Venkatesh, “Column networks for collective classification.” In: Proceedings of AAAI, 2017. pp. 2485–2491.
  • Referans59 G. Li, M. Muller, A. Thabet, B. Ghanem, “DeepGCNs: Can GCNs Go as Deep as CNNs?” In: Proceedings of ICCV, 2019, pp. 9267–9276.
  • Referans60 K. He, X. Zhang, S. Ren, J. Sun, “Deep Residual Learning for Image Recognition.” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 770-778.
  • Referans61 K. He, X. Zhang, S. Ren, J. Sun, “Identity Mappings in Deep Residual Networks.” European Conference on Computer Vision, ECCV 2016: Computer Vision – ECCV 2016, pp 630-645.
  • Referans62 G. Huang, Z. Liu, L. Van Der Maaten, K.Q. Weinberger, “Densely connected convolutional networks.” In: Proceedings of CVPR, 2017, pp. 4700–4708.
  • Referans63 Y. Pei, T. Huang, W.V. Ipenburg, M. Pechenizkiy, “ResGCN: Attention-based Deep Residual Modeling for Anomaly Detection on Attributed Networks.” Mach Learn vol. 111, pp. 519–541, 2022.
  • Referans64 R.N. Abirami, P.M.D.R. Vincent, K. Srinivasan, U. Tariq, C.Y. Chang, “Deep CNN and Deep GAN in Computational Visual Perception-Driven Image Analysis.” Complexity, vol. 2021, Article ID 5541134, 30 pages, 2021.
  • Referans65 W. Maass, T. Natschlager, H. Markram, “Real-time computing without stable states: A new framework for neural computation based on perturbations.” Neural Computation, vol. 14, no. 11, pp. 2531–2560, 2002.
  • Referans66 H. Jaeger, “The ‘Echo State’ Approach to Analysing and Training Recurrent Neural Networks - With an Erratum Note.” German National Research Center for Information Technology GMD Technical Report, vol. 148, p. 13, 2001.
  • Referans67 C. Gallicchio, A. Micheli, “Graph Echo State Networks.” The 2010 International Joint Conference on Neural Networks (IJCNN), 2010, pp. 1-8.
  • Referans68 H. Dai, Z. Kozareva, B. Dai, A. Smola, L. Song, “Learning Steady-States of Iterative Algorithms Over Graphs.” 35th International Conference on Machine Learning, PMLR 80, 2018, pp. 1106-1114.
  • Referans69 M. Tiezzi, G.M. Dimitri, M. Bianchini, M. Maggini, F. Scarselli, “Inductive–transductive learning with graph neural networks.” In: IAPR Workshop on Artificial Neural Networks in Pattern Recognition. Springer. pp. 201–212. 2018.
  • Referans70 K. Cho, B. Van Merrienboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, Y. Bengio, “Learning phrase representations using RNN encoder-decoder for statistical machine translation.” Proceedings of EMNLP, 2014, pp. 1724–1734.
  • Referans71 Y. Gao, D. Glowacka, “Deep Gate Recurrent Neural Network.” JMLR: Workshop and Conference Proceedings, 2016, pp. 350–365.
  • Referans72 J. Cheng, L. Dong, M. Lapata, “Long short-term memory-networks for machine reading.” In: Proceedings of EMNLP, 2016, pp. 551–561.
  • Referans73 Y. Li, D. Tarlow, M. Brockschmidt, R. Zemel, “Gated graph sequence neural networks.” in Proceedings of the 5th International Conference on Learning Representations, 2016, pp. 273-283.
  • Referans74 D. Beck, G. Haffari, T. Cohn, “Graph-to-sequence learning using gated graph neural networks.” Proceedings of ACL, 2018, pp. 273–283.
  • Referans75 J. Guo, “BackPropagation Through Time.” 2013.
  • Referans76 V. Ranganathan, S. Natarajan, “A New Backpropagation Algorithm without Gradient Descent.” ArXiv, 2018.
  • Referans77 S. Ruder, “An overview of gradient descent optimization algorithms.” ArXiv, 2016.
  • Referans78 K.S. Tai, R. Socher, C.D. Manning, “Improved semantic representations from tree-structured long short-term memory networks.” In: Proceedings of IJCNLP, 2015, pp. 1556–1566.
  • Referans79 R. Socher, A. Perelygin, J.Y. Wu, J. Chuang, C.D. Manning, A.Y. Ng, C. Potts, “Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank.” Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, 2013, pp. 1631–1642.
  • Referans80 M. Marelli, L. Bentivogli, M. Baroni, R. Bernardi, S. Menini, R. Zamparelli, “SemEval-2014 Task 1: Evaluation of Compositional Distributional Semantic Models on Full Sentences through Semantic Relatedness and Textual Entailment.” Proceedings of the 8th International Workshop on Semantic Evaluation (SemEval 2014), 2014, pp. 1–8.
  • Referans81 N. Peng, H. Poon, C. Quirk, K. Toutanova, W.T. Yih, “Crosssentence N-ary relation extraction with graph LSTMs.” Trans. Assoc. Comput. Linguistics, vol. 5, pp. 101–115, Dec. 2017.
  • Referans82 X. Liang, X. Shen, J. Feng, L. Lin, S. Yan, “Semantic object parsing with graph lstm.” In: Proceedings of ECCV, 2016, pp. 125–143.
  • Referans83 V. Zayats, M. Ostendorf, “Conversation modeling on reddit using a graph-structured LSTM.” Trans. Assoc. Comput. Linguistics, TACL. vol. 6, pp. 121–132, 2018.
  • Referans84 Y. Zhang, Q. Liu, L. Song, “Sentence-state LSTM for text representation.” In: Proceedings of ACL. Vol. 1, pp. 317–327, 2018.
  • Referans85 Z. Huang, X. Xu, K. Yu, “Bidirectional LSTM-CRF Models for Sequence Tagging.” ArXiv, 2015.
  • Referans86 Y. Shulman, “SimPool: Towards Topology Based Graph Pooling with Structural Similarity Features.” ArXiv. 2020.
  • Referans87 S. Mallat, “A Wavelet Tour of Signal Processing.” Elsevier. 1999.
  • Referans88 J. Bruna, W. Zaremba, A. Szlam, Y. LeCun, “Spectral networks and locally connected networks on graphs.” in 2nd Int. Conf. Learn. Repres., Banff, AB, Canada 2014.
  • Referans89 M. Henaff, J. Bruna, Y. Lecun, “Deep Convolutional Networks on Graph-Structured Data.” ArXiv. 2015.
  • Referans90 D.K. Hammond, P. Vandergheynst, R. Gribonval, “Wavelets on graphs via spectral graph theory.” Appl. Comput. Harmon. Anal. Vol. 30, pp. 129–150, 2011.
  • Referans91 H.P. Maretic, P. Frossard, “Graph Laplacian mixture model.” ArXiv, 2020.
  • Referans92 M. Defferrard, X. Bresson, P. Vandergheynst, “Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering.” NeurIPS, 2016, pp. 3844–3852.
  • Referans93 B. Li, S. Tang, H. Yu, “Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units.” Communications in Computational Physics, vol. 27, no. 2, pp. 379–411, 2020.
  • Referans94 S. Tang, B. Li, H. Yu, “ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units using Chebyshev Approximations.” ArXiv, 2019.
  • Referans95 R. Li, S. Wang, F. Zhu, J. Huang, “Adaptive graph convolutional neural networks.” In Thirty-Second AAAI Conference on Artificial Intelligence, 2018, pp. 3546–3553.
  • Referans96 Y. Wang, Y. Sun, Z. Liu, E. Sarma, M.M. Bronstein, J.M. Solomon, “Dynamic Graph CNN for Learning on Point Clouds.” ACM Transactions on Graphics, vol. 38, no. 146, pp. 1–12, 2019.
  • Referans97 R. Li, S. Wang, F. Zhu, J. Huang, “Adaptive graph convolutional neural networks.” In Thirty-Second AAAI Conference on Artificial Intelligence, 2018, pp. 3546–3553.
  • Referans98 C. Zhuang, Q. Ma, “Dual graph convolutional networks for graph-based semi-supervised classification.” in Proc. Web Conf., 2018, pp. 499–508.
  • Referans99 B. Xu, H. Shen, Q. Cao, Y. Qiu, X. Cheng, “Graph wavelet neural network.” In: Proceedings of ICLR. 2019.
  • Referans100 D.K. Duvenaud, D. Maclaurin, J. Aguileraiparraguirre, R. Gomezbombarelli, T.D. Hirzel, A. Aspuruguzik, R.P. Adams, “Convolutional networks on graphs for learning molecular fingerprints.” In: Proceedings of NIPS, Neural Inf. Process. Syst., 2015, pp. 2224–2232.
  • Referans101 J. Atwood, D. Towsley, “Diffusion-Convolutional Neural Networks.” 29th Conference on Neural Information Processing Systems (NIPS 2016), 2016, pp. 1993–2001.
  • Referans102 M. Niepert, M. Ahmed, K. Kutzkov, “Learning convolutional neural networks for graphs.” in Proc. Int. Conf. Mach. Learn., 2016, pp. 2014– 2023.
  • Referans103 H. Gao, Z. Wang, S. Ji, “Large-scale learnable graph convolutional networks.” in Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, ACM, 2018, pp. 1416–1424.
  • Referans104 W.L. Hamilton, R. Ying, J. Leskovec, “Inductive representation learning on large graphs.” In Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS'17), 2017, pp. 1025–1035.
  • Referans105 J. Chen, J. Zhu, L. Song, “Stochastic Training of Graph Convolutional Networks with Variance Reduction.” Proceedings of the 35th International Conference on Machine Learning, Stockholm, Sweden, PMLR 80, 2018.
  • Referans106 R. Ying, R. He, K. Chen, P. Eksombatchai, W.L. Hamilton, J. Leskovec, “Graph convolutional neural networks for web-scale recommender systems.” in Proc. 24th ACM SIGKDD Int. Conf. Knowl. Discovery Data Mining, 2018, pp. 974–983.
  • Referans107 J. Chen, T. Ma, C. Xiao, “FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling.” ICLR, 2018.
  • Referans108 W. Huang, T. Zhang, Y. Rong, J. Huang, “Adaptive sampling towards fast graph representation learning.” Proceedings of NeurIPS, 2018, pp. 4558–4567.
  • Referans109 D. Zou, Z. Hu, Y. Wang, S. Jiang, Y. Sun, Q. Gu, “Layer-dependent importance sampling for training deep and large graph convolutional networks.” In: Proceedings of NeurIPS, 2019, pp. 11249–11259.
  • Referans110 W.L. Chiang, X. Liu, S. Si, Y. Li, S. Bengio, C.J. Hsieh, “Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks.” In The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD ’19), 2019, pp. 257–266.
  • Referans111 H. Zeng, H. Zhou, A. Srivastava, R. Kannan, V.K. Prasanna, “Graphsaint: graph sampling based inductive learning method” In: Proceedings of ICLR. 2020.
  • Referans112 O. Vinyals, S. Bengio, M. Kudlur, “Order Matters: Sequence to Sequence for Sets.” ArXiv. 2015.
  • Referans113 M. Zhang, Z. Cui, M. Neumann, Y. Chen, “An End-to-End Deep Learning Architecture for Graph Classification.” AAAI'18/IAAI'18/EAAI'18: Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence and Thirtieth Innovative Applications of Artificial Intelligence Conference and Eighth AAAI Symposium on Educational Advances in Artificial Intelligence, 2018, no. 544, pp. 4438–4445.
  • Referans114 I.S. Dhillon, Y. Guan, B. Kulis, “Weighted graph cuts without eigenvectors: a multilevel approach.” IEEE TPAMI, vol. 29, pp. 1944–1957, 2007.
  • Referans115 M. Simonovsky, N. Komodakis, “Dynamic edge-conditioned filters in convolutional neural networks on graphs.” In: Proceedings of CVPR, 2017. pp. 3693–3702.
  • Referans116 Z. Ying, J. You, C. Morris, X. Ren, W. Hamilton, J. Leskovec, “Hierarchical graph representation learning with differentiable pooling.” Advances in neural information processing systems, NeurIPS, 2018, pp. 4805-4815.
  • Referans117 H. Gao, S. Ji, “Graph u-nets.” In: Proceedings of ICML, 2019, pp. 2083–2092.
  • Referans118 Y. Ma, S. Wang, C.C. Aggarwal, J. Tang, “Graph Convolutional Networks with EigenPooling.” In: Proceedings of KDD, 2019, pp. 723–731.
  • Referans119 J. Lee, I. Lee, J. Kang, “Self-attention graph pooling.” ICML, 2019, pp. 3734–3743.
  • Referans120 L. Waikhom, R. Patgiri, “Graph Neural Networks: Methods, Applications, and Opportunities.” ArXiv, 2021.
  • Referans121 C. Liu, Y. Zhan, C. Li, B. Du, J. Wu, W. Hu, T. Liu, D. Tao, “Graph Pooling for Graph Neural Networks: Progress, Challenges, and Opportunities.” ArXiv, 2022, abs/2204.07321.
  • Referans122 S. Georgousis, M. P. Kenning and X. Xie, "Graph Deep Learning: State of the Art and Challenges," in IEEE Access, vol. 9, pp. 22106-22140, 2021, doi: 10.1109/ACCESS.2021.3055280.

Details

Primary Language: Turkish
Subjects: Engineering
Section: Academic and/or technological scientific article
Authors

Hamza Talha Gümüş 0000-0001-7360-8138

Can Eyüpoğlu

Early View Date: 17 July 2023
Publication Date: 19 July 2023
Submission Date: 14 March 2023
Published Issue: Year 2023, Volume: 13, Issue: 2

How to Cite

APA Gümüş, H. T., & Eyüpoğlu, C. (2023). Grafik Sinir Ağlarına Genel Bir Bakış. EMO Bilimsel Dergi, 13(2), 39-56.

EMO BİLİMSEL DERGİ
Scientific Peer-Reviewed Journal of Electrical, Electronics, Computer, Biomedical, and Control Engineering
TMMOB Chamber of Electrical Engineers (Elektrik Mühendisleri Odası)