- Title:
- A Two-Component Normal Mixture Alternative to the Fay-Herriot Model
- Authors:
-
Chakraborty, Adrijo
Datta, Gauri Sankar
Mandal, Abhyuday
- Links:
- https://bibliotekanauki.pl/articles/465632.pdf
- Publication date:
- 2016
- Publisher:
- Główny Urząd Statystyczny
- Keywords:
-
Hierarchical Bayes
heavy-tail distribution
non-informative priors
robustness to outliers
small area estimation
- Description:
- This article considers a robust hierarchical Bayesian approach to modeling the random effects of small area means when some of these effects take extreme values, resulting in outliers. In the presence of outliers, the standard Fay-Herriot model for area-level data, which assumes normally distributed random effects, may overestimate the random effects variance, providing less than ideal shrinkage towards the synthetic regression predictions and inhibiting the borrowing of information. Even a small number of substantial outliers among the random effects inflates the estimate of the random effects variance in the Fay-Herriot model, so that the regular Bayes estimator achieves little shrinkage towards the synthetic part of the model and little reduction in posterior variance for any of the small areas. While a scale mixture of normal distributions with a known mixing distribution for the random effects has been found to be effective in the presence of outliers, the solution depends on the choice of mixing distribution. As an alternative, a two-component normal mixture model is proposed, based on non-informative priors on the model variance parameters, regression coefficients and the mixing probability. Data analyses and simulation studies based on real, simulated and synthetic data show an advantage of the proposed method over the standard Bayesian Fay-Herriot solution derived under normality of random effects.
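The shrinkage mechanism described above can be illustrated with a minimal sketch. Assuming the hyperparameters are known and fixed (the paper instead places non-informative priors on them and computes the full hierarchical Bayes solution; the function and variable names below are illustrative, not the authors' code), the posterior mean of an area mean under a two-component normal mixture for the random effect is a weighted average of two component-specific shrinkage estimates, with weights given by the posterior probability that the area belongs to the outlier component:

```python
import numpy as np

def mixture_shrinkage(y, x_beta, D, p, s1, s2):
    """Sketch of area-mean estimation under a two-component normal mixture.

    Model (hyperparameters treated as known for illustration):
        y_i = x_i'beta + v_i + e_i,  e_i ~ N(0, D_i) with D_i known,
        v_i ~ (1-p) N(0, s1^2) + p N(0, s2^2)   (s2 > s1: outlier component).

    Returns the posterior mean of theta_i = x_i'beta + v_i and the
    posterior probability that area i came from the outlier component.
    """
    r = y - x_beta  # residual from the synthetic (regression) prediction
    dens, means = [], []
    for w, var_v in ((1.0 - p, s1 ** 2), (p, s2 ** 2)):
        var = var_v + D                      # marginal variance of r under this component
        dens.append(w * np.exp(-0.5 * r ** 2 / var) / np.sqrt(2 * np.pi * var))
        B = D / var                          # shrinkage factor towards the synthetic part
        means.append((1.0 - B) * r)          # component-specific posterior mean of v_i
    post_w = dens[1] / (dens[0] + dens[1])   # posterior prob. of the outlier component
    v_hat = (1.0 - post_w) * means[0] + post_w * means[1]
    return x_beta + v_hat, post_w
```

With a small mixing probability and a wide second component, a typical area is shrunk strongly towards its synthetic prediction, while an outlying area is assigned to the wide component and shrunk only slightly, so one extreme observation no longer inflates the shrinkage applied everywhere else.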
- Źródło:
-
Statistics in Transition new series; 2016, 17, 1; 67-90
1234-7655
- Appears in:
- Statistics in Transition new series
- Content provider:
- Biblioteka Nauki