Content Control in Job Postings on the İşin Olsun Platform
Abstract
Keywords
language models, BERT, transformers, natural language processing, artificial intelligence, deep learning
