
Article

Keywords:
$\phi$-divergence; empirical distributions; parameter estimation; hypothesis testing
Summary:
For data generated by stationary Markov chains, estimates of chain parameters minimizing $\phi$-divergences between theoretical and empirical distributions of states are considered. Consistency and asymptotic normality are established, and the asymptotic covariance matrices are evaluated. Testing of hypotheses about the stationary distributions, based on $\phi$-divergences between the estimated and empirical distributions, is considered as well. Asymptotic distributions of the $\phi$-divergence test statistics are found, which makes it possible to specify asymptotically $\alpha$-level tests.
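For orientation, a minimal sketch of the standard quantities involved (the notation $\hat p_n$, $p(\theta)$, $\Theta$ and the regularity conditions on $\phi$ are ours, not quoted from the paper): for a convex $\phi : (0,\infty) \to \mathbb{R}$ with $\phi(1) = 0$, the $\phi$-divergence between the empirical distribution of states $\hat p_n = (\hat p_{n1}, \dots, \hat p_{nk})$ and a parametric stationary distribution $p(\theta) = (p_1(\theta), \dots, p_k(\theta))$, $\theta \in \Theta$, is
$$ D_\phi\bigl(\hat p_n, p(\theta)\bigr) = \sum_{j=1}^{k} p_j(\theta)\, \phi\!\left(\frac{\hat p_{nj}}{p_j(\theta)}\right), $$
the minimum $\phi$-divergence estimator is $\hat\theta_n = \arg\min_{\theta \in \Theta} D_\phi\bigl(\hat p_n, p(\theta)\bigr)$, and test statistics of the type considered are of the form
$$ T_{\phi,n} = \frac{2n}{\phi''(1)}\, D_\phi\bigl(\hat p_n, p(\hat\theta_n)\bigr), $$
whose limit law under Markov dependence need not coincide with the chi-square limit familiar from i.i.d. sampling (cf. [6], [7], [11], [15]).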
References:
[1] Billingsley P.: Statistical methods in Markov chains. Ann. Math. Statist. 32 (1961), 12–40 DOI 10.1214/aoms/1177705136 | MR 0123420 | Zbl 0104.12802
[2] Birch M. W.: A new proof of the Pearson–Fisher Theorem. Ann. Math. Statist. 35 (1964), 817–824 DOI 10.1214/aoms/1177703581 | MR 0169324 | Zbl 0259.62017
[3] Bishop Y. M. M., Fienberg S. E., Holland P. W.: Discrete Multivariate Analysis. Theory and Practice. The MIT Press, Cambridge, Massachusetts 1975 MR 0381130 | Zbl 1131.62043
[4] Cressie N., Read T. R. C.: Multinomial goodness-of-fit tests. J. Royal Statist. Soc., Ser. B 46 (1984), 440–464 MR 0790631 | Zbl 0571.62017
[5] Drost F. C., Kallenberg W. C. M., Moore D. S., Oosterhoff J.: Power approximations to multinomial tests of fit. J. Amer. Statist. Assoc. 84 (1989), 130–141 DOI 10.1080/01621459.1989.10478748 | MR 0999671 | Zbl 0683.62027
[6] Gleser L. J., Moore D. S.: The effect of dependence on chi-squared and empiric distribution tests of fit. Ann. Statist. 11 (1983), 1100–1108 MR 0720256
[7] Gleser L. J., Moore D. S.: The effect of positive dependence on chi-squared tests for categorical data. J. Royal Statist. Soc., Ser. B 47 (1985), 459–465 MR 0844476
[8] Liese F., Vajda I.: Convex Statistical Distances. Teubner, Leipzig 1987 MR 0926905 | Zbl 0656.62004
[9] Menéndez M. L., Morales D., Pardo L., Vajda I.: Divergence-based estimation and testing of statistical models of classification. J. Multivariate Anal. 54 (1996), 329–354 DOI 10.1006/jmva.1995.1060 | MR 1345543
[10] Menéndez M. L., Morales D., Pardo L., Vajda I.: Testing in stationary models based on $f$-divergences of observed and theoretical frequencies. Kybernetika 33 (1997), 465–475 MR 1603997
[11] Moore D. S.: The effect of dependence on chi-squared tests of fit. Ann. Statist. 10 (1982), 1163–1171 DOI 10.1214/aos/1176345981 | MR 0673651 | Zbl 0507.62045
[12] Morales D., Pardo L., Vajda I.: Asymptotic divergence of estimates of discrete distributions. J. Statist. Plann. Inference 48 (1995), 347–369 DOI 10.1016/0378-3758(95)00013-Y | MR 1368984 | Zbl 0839.62004
[13] Read T. R. C., Cressie N. A. C.: Goodness-of-Fit Statistics for Discrete Multivariate Data. Springer, Berlin 1988 MR 0955054 | Zbl 0663.62065
[14] Salicrú M., Morales D., Menéndez M. L., Pardo L.: On the applications of divergence type measures in testing statistical hypotheses. J. Multivariate Anal. 51 (1994), 372–391 DOI 10.1006/jmva.1994.1068 | Zbl 0815.62003
[15] Tavaré S., Altham P. M. E.: Serial dependence of observations leading to contingency tables, and corrections to chi-squared statistics. Biometrika 70 (1983), 139–144 DOI 10.1093/biomet/70.1.139 | MR 0742983 | Zbl 0502.62043