This paper expounds a Bayesian paradigm for mixture models with finite and infinite components, under a generic prior and likelihood that may follow any distributional form of random noise. The mixture model carries three stylized properties: proportional allocation, sample-size allocation, and a latent (unobserved) allocation variable, which together yield a common probabilistic generalization. The Expectation-Maximization (EM) algorithm was adopted to estimate these stylized parameters. The Markov Chain Monte Carlo (MCMC) and Metropolis–Hastings sampling algorithms were adopted as alternatives when the EM algorithm is not analytically feasible: MCMC when the unobserved variable cannot be replaced by its imposed expectation (mean), and Metropolis–Hastings when exploration of the posterior distribution must be corrected through an acceptance-ratio quantity. Label switching, handled via a truncated or alternating prior distributional form, was imposed on the posterior distribution to preserve exchangeability and to tailor robust inference through the Maximum a Posteriori (MAP) index. A simulation study showed that the number of components grows large when all permutations of the subsamples must be considered.
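The EM scheme described above replaces the latent allocation variable by its posterior expectation (E-step) and then re-estimates the proportional-allocation and component parameters (M-step). The paper treats a generic prior and likelihood; the following is a minimal sketch assuming, for concreteness, a two-component univariate Gaussian mixture with a shared standard deviation, and all function names here are illustrative rather than taken from the paper.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Univariate Gaussian density (the assumed noise distribution)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_two_component(data, iters=50):
    """EM for a two-component Gaussian mixture (illustrative sketch)."""
    # Crude initial guesses for the mixing proportion, means, and shared std. dev.
    pi, mu1, mu2, sigma = 0.5, min(data), max(data), 1.0
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point;
        # this replaces the latent allocation variable by its expectation.
        r = [pi * normal_pdf(x, mu1, sigma)
             / (pi * normal_pdf(x, mu1, sigma) + (1 - pi) * normal_pdf(x, mu2, sigma))
             for x in data]
        # M-step: re-estimate the proportional allocation and component parameters.
        n1 = sum(r)
        pi = n1 / len(data)
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - n1)
        var = sum(ri * (x - mu1) ** 2 + (1 - ri) * (x - mu2) ** 2
                  for ri, x in zip(r, data)) / len(data)
        sigma = math.sqrt(max(var, 1e-9))
    return pi, mu1, mu2, sigma

# Synthetic data from two well-separated components.
random.seed(0)
data = ([random.gauss(-3, 1) for _ in range(200)]
        + [random.gauss(3, 1) for _ in range(200)])
pi, mu1, mu2, sigma = em_two_component(data)
```

When the posterior responsibilities cannot be written in closed form, the abstract's alternative applies: draw the allocations by MCMC, with a Metropolis–Hastings acceptance ratio correcting the exploration of the posterior.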
Keywords: Bayesian paradigm, expectation-maximization, MCMC, proportional allocation, Metropolis–Hastings
| Primary Language | English |
|---|---|
| Subjects | Statistical Analysis, Statistical Theory, Theory of Sampling |
| Journal Section | Research Article |
| Authors | |
| Project Number | None |
| Early Pub Date | December 30, 2023 |
| Publication Date | December 31, 2023 |
| Submission Date | September 21, 2023 |
| Published in Issue | Year 2023 Issue: 45 |
As of 2021, JNT is licensed under a Creative Commons Attribution-NonCommercial 4.0 International Licence (CC BY-NC).