<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.4 20241031//EN"
        "https://jats.nlm.nih.gov/publishing/1.4/JATS-journalpublishing1-4.dtd">
<article article-type="research-article" dtd-version="1.4">
            <front>

                <journal-meta>
                                                                <journal-id>jista</journal-id>
            <journal-title-group>
                                                                                    <journal-title>Journal of Intelligent Systems: Theory and Applications</journal-title>
            </journal-title-group>
                                        <issn pub-type="epub">2651-3927</issn>
                                                                                            <publisher>
                    <publisher-name>Özer UYGUN</publisher-name>
                </publisher>
                    </journal-meta>
                <article-meta>
                                        <article-id pub-id-type="doi">10.38016/jista.1691406</article-id>
                                                                <article-categories>
                                            <subj-group  xml:lang="en">
                                                            <subject>Artificial Intelligence (Other)</subject>
                                                    </subj-group>
                                            <subj-group  xml:lang="tr">
                                                            <subject>Yapay Zeka (Diğer)</subject>
                                                    </subj-group>
                                    </article-categories>
                                                                                                                                                        <title-group>
                                                                                                                        <trans-title-group xml:lang="en">
                                    <trans-title>Cognitive Mapping in Ottoman-Turkish Music: A Machine Learning Approach to Modal Transitions (16th–19th Centuries)</trans-title>
                                </trans-title-group>
                                                                                                                                                                                                <article-title>Osmanlı-Türk Musikisinde Bilişsel Geçki Haritalaması: 16.–19. Yüzyıllar Arasında Makamsal Geçişlerin Yapay Zekâ Temelli Analizi</article-title>
                                                                                                    </title-group>
            
                                                    <contrib-group content-type="authors">
                                                                        <contrib contrib-type="author">
                                                                    <contrib-id contrib-id-type="orcid">https://orcid.org/0000-0002-6316-0189</contrib-id>
                                                                <name>
                                    <surname>Eraslan</surname>
                                    <given-names>İsmail</given-names>
                                </name>
                                                                    <aff>Trabzon University</aff>
                                                            </contrib>
                                                                                </contrib-group>
                        
                                        <pub-date pub-type="pub" iso-8601-date="2026-03-30">
                    <day>30</day>
                    <month>03</month>
                    <year>2026</year>
                </pub-date>
                                        <volume>9</volume>
                                        <issue>2026</issue>
                                        <fpage>1</fpage>
                                        <lpage>9</lpage>
                        
                        <history>
                            <date date-type="received" iso-8601-date="2025-05-04">
                        <day>04</day>
                        <month>05</month>
                        <year>2025</year>
                    </date>
                                    <date date-type="accepted" iso-8601-date="2025-09-02">
                        <day>02</day>
                        <month>09</month>
                        <year>2025</year>
                    </date>
                            </history>
                                        <permissions>
                    <copyright-statement>Copyright © 2018, Journal of Intelligent Systems: Theory and Applications</copyright-statement>
                    <copyright-year>2018</copyright-year>
                    <copyright-holder>Journal of Intelligent Systems: Theory and Applications</copyright-holder>
                </permissions>
            
                                                                                                <trans-abstract xml:lang="en">
                            <p>Between the 16th and 19th centuries, Ottoman-Turkish music underwent profound transformations that positioned it not merely as a form of art but as a carrier of cultural memory, aesthetic intuition, and cognitive structures. This study analyzes modal transitions (geçki) through AI-assisted cognitive mapping, offering a new framework that bridges traditional musical aesthetics with contemporary technological epistemologies. A repertoire of approximately 150 classical works digitized through score transcriptions, meşk transmissions, and performance recordings was examined using LSTM and Transformer-based deep learning models. The analysis revealed not only the frequency, directionality, and central nodes of modal transitions but also their historical differentiations. Findings show that artificial intelligence should not be reduced to a statistical tool of accuracy; rather, it emerges as a cognitive partner capable of approximating cultural meaning and aesthetic intuition. The prominence of Rast and Hicaz as central hubs, the recurrent role of Uşşak as a target makam, and the visibility of historical variations in cognitive maps collectively demonstrate the dual success of this approach in both technical and cultural terms. The originality of this research lies in its conceptualization of music not simply as a numerical sequence of melodic shifts but as a cognitive-ecological system in which collective memory, cultural identity, and aesthetic orientation converge. Accordingly, this article is not only addressed to musicologists but also invites contributions from cognitive scientists, AI researchers, philosophers of aesthetics, and cultural memory scholars. It redefines the relationship between human and machine as a dialogical partnership in the co-production of meaning, rather than a one-sided process of mechanical computation. 
In this way, the historical modal transitions of Ottoman-Turkish music become an epistemological laboratory in the age of artificial intelligence, offering both a reinterpretation of the past and a philosophical horizon for the future.</p></trans-abstract>
                                                                                                                                    <abstract><p>Osmanlı-Türk musikisi, 16. ile 19. yüzyıllar arasında geçirdiği dönüşümlerle yalnızca bir sanat biçimi olarak değil, aynı zamanda kültürel hafızanın, estetik sezginin ve bilişsel örüntülerin taşıyıcısı olarak incelenmesi gereken bir fenomen haline gelmiştir. Bu çalışma, makamsal geçişlerin (geçki) yapay zekâ destekli bilişsel haritalama yöntemiyle çözümlemesini yaparak, geleneksel müzik estetiği ile çağdaş teknolojik epistemolojiler arasında yeni bir bağlam sunmaktadır. Araştırmada yaklaşık 150 klasik eser seçilmiş; bu eserler nota transkripsiyonları, meşk aktarımı ve icra kayıtlarından elde edilen veri setiyle dijitalleştirilmiştir. LSTM ve Transformer tabanlı derin öğrenme modelleri aracılığıyla makamlar arası geçişlerin sıklıkları, yönelimleri, yoğunluk merkezleri ve tarihsel farklılaşmaları ortaya konmuştur. Elde edilen bulgular, yapay zekânın yalnızca istatistiksel doğruluk sağlayan bir araç değil, aynı zamanda kültürel anlamı yeniden üreten ve estetik sezgiye yaklaşabilen bir bilişsel ortak olabileceğini göstermektedir. Rast ve Hicaz makamlarının merkezî düğüm rolü üstlenmesi, Uşşak’ın çoğunlukla hedef makam olarak belirginleşmesi ve geçkilerin tarihsel çeşitliliğinin bilişsel haritalarda görünür kılınması, bu yaklaşımın hem teknik hem kültürel bir başarı taşıdığını kanıtlamaktadır. Çalışmanın özgünlüğü, müziği yalnızca melodik geçişlerin matematiksel dizisi olarak değil; kolektif hafıza, kültürel aidiyet ve estetik yönelimlerin iç içe geçtiği bir bilişsel-ekolojik sistem olarak kavramsallaştırmasında yatmaktadır. Bu nedenle makale, yalnızca müzikoloji uzmanlarına değil, bilişsel bilimciler, yapay zekâ araştırmacıları, estetik felsefeciler ve kültürel bellek çalışmalarıyla ilgilenen disiplinler arası uzmanlara da hitap etmektedir. 
Araştırma, insan ile makine arasındaki ilişkiyi, mekanik bir işlemden öte, kültürel ve estetik anlamın ortaklaşa üretildiği bir diyalog olarak yeniden tanımlamaktadır. Böylece Osmanlı-Türk musikisinin tarihsel geçişleri, yapay zekâ çağında epistemolojik bir laboratuvar niteliği kazanmakta hem geçmişi yeniden anlamlandırmakta hem de geleceğe yönelik düşünsel ufuklar açmaktadır.</p></abstract>
                                                            
            
                                                                                        <kwd-group>
                                                    <kwd>Osmanlı Türk Musikisi</kwd>
                                                    <kwd>bilişsel haritalama</kwd>
                                                    <kwd>yapay zeka</kwd>
                                                    <kwd>makam geçişleri</kwd>
                                                    <kwd>derin öğrenme</kwd>
                                            </kwd-group>
                            
                                                <kwd-group xml:lang="en">
                                                    <kwd>Ottoman-Turkish music</kwd>
                                                    <kwd>cognitive mapping</kwd>
                                                    <kwd>artificial intelligence</kwd>
                                                    <kwd>modal transitions</kwd>
                                                    <kwd>deep learning</kwd>
                                            </kwd-group>
                                                                                                                                        </article-meta>
    </front>
    <back>
                            <ref-list>
                                    <ref id="ref1">
                        <label>1</label>
                        <mixed-citation publication-type="journal">Arslan, M., Kaya, E., 2024. Cultural context in AI-based music cognition: A case study on Ottoman-Turkish makam. Journal of Intelligent Systems: Theory and Applications, 7(1), 21-38.</mixed-citation>
                    </ref>
                                    <ref id="ref2">
                        <label>2</label>
                        <mixed-citation publication-type="book">Briot, J.-P., Hadjeres, G., Pachet, F.-D., 2020. Deep learning techniques for music generation. Springer.</mixed-citation>
                    </ref>
                                    <ref id="ref3">
                        <label>3</label>
                        <mixed-citation publication-type="journal">Chen, Y., Yılmaz, H., 2023. Modal transition modeling in Turkish makam music using deep neural networks. Journal of Intelligent Systems: Theory and Applications, 6(2), 77-94.</mixed-citation>
                    </ref>
                                    <ref id="ref4">
                        <label>4</label>
                        <mixed-citation publication-type="book">Clarke, E., 2005. Ways of listening: An ecological approach to the perception of musical meaning. Oxford University Press.</mixed-citation>
                    </ref>
                                    <ref id="ref5">
                        <label>5</label>
                        <mixed-citation publication-type="book">Creswell, J. W., 2014. Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). SAGE Publications.</mixed-citation>
                    </ref>
                                    <ref id="ref6">
                        <label>6</label>
                        <mixed-citation publication-type="book">Cook, N., 2013. Beyond the score: Music as performance. Oxford University Press.</mixed-citation>
                    </ref>
                                    <ref id="ref7">
                        <label>7</label>
                        <mixed-citation publication-type="book">Goehr, L., 2007. The imaginary museum of musical works: An essay in the philosophy of music. Oxford University Press.</mixed-citation>
                    </ref>
                                    <ref id="ref8">
                        <label>8</label>
                        <mixed-citation publication-type="journal">Herremans, D., Chuan, C.-H., Chew, E., 2017. A functional taxonomy of music generation systems. ACM Computing Surveys, 50(5), 1-30.</mixed-citation>
                    </ref>
                                    <ref id="ref9">
                        <label>9</label>
                        <mixed-citation publication-type="confproc">Huang, C. Z. A., Vaswani, A., Uszkoreit, J., Shazeer, N., Simon, I., Hawthorne, C., Eck, D., 2020. Music transformer: Generating music with long-term structure. Proceedings of the International Conference on Learning Representations (ICLR).</mixed-citation>
                    </ref>
                                    <ref id="ref10">
                        <label>10</label>
                        <mixed-citation publication-type="journal">Lee, S., Kim, J., 2024. Towards culturally sensitive AI in music: Bridging cognitive musicology and deep learning. Frontiers in Artificial Intelligence, 7, 112-128.</mixed-citation>
                    </ref>
                                    <ref id="ref11">
                        <label>11</label>
                        <mixed-citation publication-type="book">Levitin, D. J., 2019. The organized mind: Thinking straight in the age of information overload. Penguin Books.</mixed-citation>
                    </ref>
                                    <ref id="ref12">
                        <label>12</label>
                        <mixed-citation publication-type="book">London, J., 2012. Hearing in time: Psychological aspects of musical meter. Oxford University Press.</mixed-citation>
                    </ref>
                                    <ref id="ref13">
                        <label>13</label>
                        <mixed-citation publication-type="book">Nattiez, J.-J., 1990. Music and discourse: Toward a semiology of music. Princeton University Press.</mixed-citation>
                    </ref>
                                    <ref id="ref14">
                        <label>14</label>
                        <mixed-citation publication-type="book">Nettl, B., 2005. The study of ethnomusicology: Thirty-one issues and concepts. University of Illinois Press.</mixed-citation>
                    </ref>
                                    <ref id="ref15">
                        <label>15</label>
                        <mixed-citation publication-type="book">Patel, A. D., 2008. Music, language, and the brain. Oxford University Press.</mixed-citation>
                    </ref>
                                    <ref id="ref16">
                        <label>16</label>
                        <mixed-citation publication-type="report">Peeters, G., 2004. A large set of audio features for sound description (similarity and classification) in the CUIDADO project. Technical Report, IRCAM.</mixed-citation>
                    </ref>
                                    <ref id="ref17">
                        <label>17</label>
                        <mixed-citation publication-type="book">Rowe, R., 2001. Machine musicianship. MIT Press.</mixed-citation>
                    </ref>
                                    <ref id="ref18">
                        <label>18</label>
                        <mixed-citation publication-type="book">Seroussi, E., 2014. Tradition and transformation in Turkish music. Routledge.</mixed-citation>
                    </ref>
                                    <ref id="ref19">
                        <label>19</label>
                        <mixed-citation publication-type="journal">Serra, X., Gómez, E., Herrera, P., Pauws, S., 2011. Musical audio content description with the MPEG-7 standard. IEEE Transactions on Speech and Audio Processing, 11(6), 642-656.</mixed-citation>
                    </ref>
                                    <ref id="ref20">
                        <label>20</label>
                        <mixed-citation publication-type="journal">Sturm, B. L., 2016. The state of the art ten years after a state of the art: Future research in music information retrieval. Journal of New Music Research, 45(3), 183-210.</mixed-citation>
                    </ref>
                                    <ref id="ref21">
                        <label>21</label>
                        <mixed-citation publication-type="book">Thagard, P., 2005. Mind: Introduction to cognitive science (2nd ed.). MIT Press.</mixed-citation>
                    </ref>
                                    <ref id="ref22">
                        <label>22</label>
                        <mixed-citation publication-type="journal">Tzanetakis, G., Kapur, A., Schloss, W. A., Wright, M., 2018. Computational ethnomusicology: Analyzing diverse musical cultures with AI. Computer Music Journal, 42(2), 20-34.</mixed-citation>
                    </ref>
                                    <ref id="ref23">
                        <label>23</label>
                        <mixed-citation publication-type="journal">Zatorre, R., Chen, J. L., Penhune, V. B., 2007. When the brain plays music: Auditory–motor interactions in music perception and production. Nature Reviews Neuroscience, 8(7), 547-558.</mixed-citation>
                    </ref>
                                    <ref id="ref24">
                        <label>24</label>
                        <mixed-citation publication-type="journal">Zhang, L., 2023. Cognitive-inspired deep learning models for non-Western music analysis. Journal of Intelligent Systems: Theory and Applications, 6(1), 45-60.</mixed-citation>
                    </ref>
                            </ref-list>
                    </back>
    </article>
