Research Article

Mitigating Data Imbalance Problem in Transformer-Based Intent Detection

Abstract

There are two major problems when deploying a practical intent detection system for a new customer. First, the domain-specific data available from the customer may be limited and imbalanced. Second, even though different customers may share the same domain, their intent categories can differ from each other, which makes it difficult to combine the datasets collected for different customers into a single, larger one. In this paper, we use class weights in the loss computation to alleviate the data imbalance problem. The class weights are defined to be inversely proportional to the frequency of each class in the training set, so that rarely observed classes have more influence on the loss. We also employ a two-pass fine-tuning procedure to exploit the information in different in-domain datasets. Experimental results show that intent detection performance improves significantly when the weighted loss function is used together with the two-pass transfer learning procedure: the absolute improvement in detection accuracy is approximately 2% over a transformer-based baseline.
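The two techniques described in the abstract (inverse-frequency class weights in the loss, and two-pass fine-tuning) can be illustrated with a minimal PyTorch sketch. This is an illustrative reading of the abstract, not the paper's exact recipe: the pretrained checkpoint, the data loaders (in_domain_loader, customer_loader), the intent counts, the label list customer_train_labels, and the choice to re-initialize the classification head between passes are all assumptions.

    import torch
    from collections import Counter
    from transformers import AutoModelForSequenceClassification

    def inverse_frequency_weights(labels, num_classes):
        """Class weights inversely proportional to class frequency in the
        training set, so rarely observed intents weigh more in the loss."""
        counts = Counter(labels)
        total = len(labels)
        return torch.tensor(
            [total / (num_classes * counts.get(c, 1)) for c in range(num_classes)],
            dtype=torch.float,
        )

    def fine_tune(model, loader, loss_fn, epochs=3, lr=2e-5):
        """One fine-tuning pass over a single dataset."""
        optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
        model.train()
        for _ in range(epochs):
            for batch in loader:  # tokenized utterances + intent labels
                logits = model(batch["input_ids"],
                               attention_mask=batch["attention_mask"]).logits
                loss = loss_fn(logits, batch["labels"])
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
        return model

    num_in_domain_intents = 30  # assumed size of the pooled intent inventory
    num_target_intents = 12     # assumed size of the target customer's inventory

    # Pass 1: adapt the pretrained transformer to the domain on the pooled
    # in-domain datasets collected for other customers.
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-multilingual-cased", num_labels=num_in_domain_intents)
    model = fine_tune(model, in_domain_loader, torch.nn.CrossEntropyLoss())

    # Pass 2: replace the classification head to match the target customer's
    # intent set, then continue fine-tuning on the small, imbalanced customer
    # data with the frequency-weighted loss.
    model.classifier = torch.nn.Linear(model.config.hidden_size,
                                       num_target_intents)
    weights = inverse_frequency_weights(customer_train_labels,
                                        num_target_intents)
    model = fine_tune(model, customer_loader,
                      torch.nn.CrossEntropyLoss(weight=weights))

Re-initializing the head before the second pass is one plausible design choice here: since intent categories differ across customers, only the encoder's domain knowledge from the first pass can be carried over.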

Supporting Institution

TÜRKİYE BİLİMSEL VE TEKNOLOJİK ARAŞTIRMA KURUMU (TÜBİTAK)

Project Number

3189149

Thanks

This work was supported by the Scientific and Technological Research Council of Turkey (TÜBİTAK) under project number 3189149.

Details

Primary Language

English

Subjects

Engineering

Journal Section

Research Article

Publication Date

December 31, 2021

Submission Date

December 25, 2021

Acceptance Date

January 2, 2022

Published in Issue

Year 2021 Number: 32

APA
Büyük, O., Erden, M., & Arslan, L. (2021). Mitigating Data Imbalance Problem in Transformer-Based Intent Detection. Avrupa Bilim Ve Teknoloji Dergisi, 32, 445-450. https://doi.org/10.31590/ejosat.1044812