Research Article

A Developed Graphical User Interface-Based on Different Generative Pre-trained Transformers Models

Year 2024, Volume: 11, Issue: 1, 18 - 32, 30.04.2024
https://doi.org/10.56941/odutip.1413597

Abstract

Objective: This article investigates the integration of advanced Generative Pre-trained Transformer (GPT) models into a user-friendly Graphical User Interface (GUI). The primary objective is to simplify access to complex Natural Language Processing (NLP) tasks for a diverse range of users, including those with limited technical backgrounds.
Method: The GUI was developed through a comprehensive, systematic process in three stages. Needs assessment: the requirements and expectations of potential users were gathered to ensure the GUI would address their needs effectively. Preliminary design and development: initial designs were developed into a functional GUI, with emphasis on integrating features that support various NLP tasks such as text summarization, translation, and question answering. Iterative refinement: continuous improvements were made based on user feedback, focusing on user experience, ease of navigation, and customization capabilities.
Results: The developed GUI successfully integrated GPT models, including GPT-4 Turbo and GPT-3.5, yielding an intuitive and adaptable interface. It performed a variety of NLP tasks efficiently, making these advanced language processing tools accessible to a broader audience. The design, with its emphasis on user-friendliness and adaptability, was noted in particular for catering to both technical and non-technical users.
Conclusion: The article illustrates the significant impact of combining advanced GPT models with a GUI to democratize the use of NLP tools. This integration not only makes complex language processing more accessible but also marks a pivotal step in the inclusive application of AI technology across domains. The successful implementation of the GUI highlights the potential of AI to enhance user interaction and to broaden the scope of technology use in everyday tasks.
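The article does not publish its source code. As a rough illustration only, a minimal version of such an integration might look like the sketch below, which pairs a Tkinter front end with the official openai Python client; the model identifiers, task prompts, and window layout are assumptions made for the example, not the authors' implementation.

    # Hypothetical sketch: a minimal Tkinter front end over the OpenAI chat API.
    # Models, prompts, and layout are illustrative; not the article's actual GUI.
    import tkinter as tk
    from tkinter import ttk

    from openai import OpenAI  # reads OPENAI_API_KEY from the environment

    client = OpenAI()

    # One system instruction per supported NLP task, selectable in the GUI.
    TASK_PROMPTS = {
        "Summarize": "Summarize the following text concisely.",
        "Translate to English": "Translate the following text into English.",
        "Answer a question": "Answer the following question clearly and briefly.",
    }

    def run_task() -> None:
        """Send the input text to the selected model and display the reply."""
        reply = client.chat.completions.create(
            model=model_var.get(),  # e.g. "gpt-4-turbo" or "gpt-3.5-turbo"
            messages=[
                {"role": "system", "content": TASK_PROMPTS[task_var.get()]},
                {"role": "user", "content": input_box.get("1.0", tk.END)},
            ],
        )
        output_box.delete("1.0", tk.END)
        output_box.insert(tk.END, reply.choices[0].message.content or "")

    root = tk.Tk()
    root.title("GPT GUI (sketch)")
    model_var = tk.StringVar(value="gpt-3.5-turbo")
    task_var = tk.StringVar(value="Summarize")
    ttk.Combobox(root, textvariable=model_var,
                 values=["gpt-4-turbo", "gpt-3.5-turbo"]).pack(fill="x")
    ttk.Combobox(root, textvariable=task_var,
                 values=list(TASK_PROMPTS)).pack(fill="x")
    input_box = tk.Text(root, height=8)
    input_box.pack(fill="both", expand=True)
    ttk.Button(root, text="Run", command=run_task).pack()
    output_box = tk.Text(root, height=8)
    output_box.pack(fill="both", expand=True)
    root.mainloop()

In a real application the API call would run on a worker thread so the window stays responsive while waiting for the model; it blocks here for brevity.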

Ethics Statement

Ethics committee approval is not required for this study.

Supporting Institution

This study received no financial support.

Details

Primary Language English
Subjects Clinical Medical Sciences (Other)
Section Original Article
Authors

Ekrem Küçük 0009-0008-4313-8559

İpek Balıkçı Çiçek 0000-0002-3805-9214

Zeynep Küçükakçalı 0000-0001-7956-9272

Cihan Yetiş 0000-0001-9703-946X

Cemil Çolak 0000-0001-5406-098X

Publication Date April 30, 2024
Submission Date January 2, 2024
Acceptance Date April 5, 2024
Published in Issue Year 2024, Volume 11, Issue 1

Cite

Vancouver Küçük E, Balıkçı Çiçek İ, Küçükakçalı Z, Yetiş C, Çolak C. A Developed Graphical User Interface-Based on Different Generative Pre-trained Transformers Models. ODU Tıp Derg. 2024;11(1):18-32.