In this paper, a mode decomposition-based adaptive approach is proposed to provide high diagnostic performance for automated voice pathology detection systems. The aim of the study is to develop a reliable and effective system using adaptive cepstral-domain features derived from the empirical mode decomposition (EMD), ensemble empirical mode decomposition (EEMD), and complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN) methods. The descriptive feature sets are obtained by applying mel-frequency cepstral coefficients (MFCCs) and their derivatives, linear predictive coefficients (LPCs), and linear predictive cepstral coefficients (LPCCs) to each decomposition level. Class-balanced data are generated from the VOice ICar fEDerico II database samples using the synthetic minority oversampling technique (SMOTE). The ReliefF algorithm is used to select the most effective and distinctive features, and a support vector machine (SVM) classifier is applied to the selected features to identify pathological voices. In the pathology detection task, the EMD-based cepstral features combined with the SVM-cubic classifier achieve the highest performance, with 99.85% accuracy, 99.85% F1-score, and a 0.997 Matthews correlation coefficient (MCC). In pathology-type classification, the EEMD-based cepstral features combined with the SVM-quadratic classifier provide the highest performance, with 96.49% accuracy, 96.46% F1-score, and 0.949 MCC. The comprehensive results of this study reveal that mode decomposition-based approaches are more successful and effective than traditional methods for the detection and classification of pathological voices.
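To make the pipeline concrete, below is a minimal sketch of the workflow the abstract describes, assuming the PyEMD, librosa, imbalanced-learn, skrebate, and scikit-learn packages; the number of IMFs kept, the mean-pooled MFCC statistics, and all hyperparameter values are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of the detection pipeline described in the abstract.
# Assumptions (not from the paper): package choices, the number of IMFs
# kept, mean-pooled MFCC statistics, and all hyperparameter values.
import numpy as np
import librosa
from PyEMD import EMD                      # EEMD / CEEMDAN are also in PyEMD
from imblearn.over_sampling import SMOTE
from skrebate import ReliefF
from sklearn.svm import SVC

def emd_cepstral_features(signal, sr, n_imfs=4, n_mfcc=13):
    """Decompose a voice signal with EMD and pool MFCCs (and their
    first derivatives) over each intrinsic mode function (IMF)."""
    imfs = EMD()(signal)[:n_imfs]                        # decomposition levels
    feats = []
    for imf in imfs:
        mfcc = librosa.feature.mfcc(y=imf, sr=sr, n_mfcc=n_mfcc)
        delta = librosa.feature.delta(mfcc)              # MFCC derivatives
        feats.append(np.hstack([mfcc.mean(axis=1), delta.mean(axis=1)]))
    return np.hstack(feats)                              # one vector per sample

def train_detector(X_train, y_train, n_selected=30):
    """SMOTE balancing -> ReliefF selection -> cubic-kernel SVM."""
    X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_train, y_train)
    selector = ReliefF(n_features_to_select=n_selected, n_neighbors=10)
    selector.fit(X_bal, y_bal)
    X_sel = selector.transform(X_bal)
    clf = SVC(kernel="poly", degree=3)                   # "SVM-cubic"
    clf.fit(X_sel, y_bal)
    return selector, clf
```

Here "SVM-cubic" is read as a degree-3 polynomial kernel (and "SVM-quadratic" as degree 2); in practice SMOTE should be fit only on the training split so that oversampled copies do not leak into evaluation.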
Keywords: Voice pathology, SMOTE algorithm, mode decomposition, cepstral-domain coefficients, ReliefF algorithm, support vector machine
| Primary Language | English |
| --- | --- |
| Subjects | Biological Mathematics, Applied Mathematics (Other) |
| Journal Section | Research Articles |
| Authors | |
| Publication Date | December 30, 2024 |
| Submission Date | April 25, 2024 |
| Acceptance Date | December 18, 2024 |
| Published in Issue | Year 2024, Volume 4, Issue 4 |