In the past decade, mixed-effects models have received a great deal of attention in the applied and theoretical statistical literature. They are flexible tools for analyzing repeated measures, panel, cross-sectional, and hierarchical data. The complex nature of these models has motivated researchers to study several aspects of the associated inference problems, one of which is testing the significance of the random effects used to model unobserved heterogeneity in the population. A likelihood ratio test based on the normality of the error term and the random effects has been proposed, but this assumption does not necessarily hold in practice. In this paper, we propose an optimal test based on uniform local asymptotic normality to detect the possible presence of random effects in linear mixed models. We show that the proposed test is consistent and locally asymptotically optimal even when the traditional normality assumption does not hold, and that it is comparable to the classical likelihood ratio test when the standard assumptions are met. Finally, simulation studies and a real-data analysis are conducted to examine the empirical performance of the proposed procedure.
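For concreteness, the testing problem can be written in the standard random-intercept formulation; the model, symbols, and hypotheses below are a generic illustration of this class of problems rather than the exact setup of the paper.

```latex
% Linear mixed model with a single random intercept (illustrative sketch):
% y_{ij} is the j-th observation on the i-th subject or cluster.
\begin{align*}
  y_{ij} &= \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + b_i + \varepsilon_{ij},
           \qquad i = 1,\dots,n, \; j = 1,\dots,m_i, \\
  b_i    &\sim (0,\sigma_b^{2}), \qquad
  \varepsilon_{ij} \sim (0,\sigma_\varepsilon^{2}), \qquad
  b_i \perp \varepsilon_{ij}.
\end{align*}
% Testing for the presence of the random effect amounts to
%   H_0 : \sigma_b^{2} = 0 \quad \text{versus} \quad H_1 : \sigma_b^{2} > 0,
% where the null value lies on the boundary of the parameter space and the
% classical likelihood ratio test relies on normality of b_i and \varepsilon_{ij}.
```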
| Primary Language | English |
| --- | --- |
| Subjects | Statistics |
| Journal Section | Statistics |
| Authors | |
| Publication Date | August 6, 2021 |
| Published in Issue | Year 2021 |