Keywords:
divergences; sufficiency; Bayes sufficiency; deficiency
Summary:
The paper studies the relations between $\phi$-divergences and fundamental concepts of decision theory such as sufficiency, Bayes sufficiency, and LeCam's deficiency. A new and considerably simplified approach is given to the spectral representation of $\phi$-divergences, established under restrictive conditions by Österreicher and Feldman [28] and in full generality by Liese and Vajda [22], [23]. The simplification is achieved by a new integral representation of convex functions in terms of elementary convex functions which are strictly convex at one point only. Bayes sufficiency is characterized with the help of a binary model consisting of the joint distribution and the product of the marginal distributions of the observation and the parameter. LeCam's deficiency is expressed in terms of $\phi$-divergences where $\phi$ belongs to a class of convex functions whose curvature measures are finite and satisfy a normalization condition.
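To fix ideas, the following sketch states the objects involved in standard notation (the symbols below are illustrative assumptions, not quoted from the paper). For a convex function $\phi$ on $(0,\infty)$ with $\phi(1)=0$ and distributions $P$, $Q$ with densities $p$, $q$ relative to a dominating measure $\mu$, the $\phi$-divergence is
\[
D_\phi(P,Q)=\int q\,\phi\Big(\frac{p}{q}\Big)\,\mathrm{d}\mu .
\]
A spectral representation of the kind established in [28], [22], [23] writes $D_\phi$ as a mixture of elementary divergences. With the elementary convex functions
\[
\phi_\pi(t)=\min(\pi,\,1-\pi)-\min(\pi t,\,1-\pi),\qquad \pi\in(0,1),
\]
each $\phi_\pi$ is piecewise linear with a single kink at $t=(1-\pi)/\pi$, hence strictly convex at that one point only, and $D_{\phi_\pi}(P,Q)$ is the statistical information of the dichotomy $(P,Q)$ at prior $\pi$. The representation then takes the form
\[
D_\phi(P,Q)=\int_{(0,1)} D_{\phi_\pi}(P,Q)\,\mathrm{d}\gamma(\pi)
\]
with a mixing measure $\gamma$ determined by the curvature measure of $\phi$.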
References:
[1] S. M. Ali, S. D. Silvey: A general class of coefficients of divergence of one distribution from another. J. Roy. Statist. Soc. Ser. B 28 (1966), 131-140. MR 0196777 | Zbl 0203.19902
[2] S. Arimoto: Information-theoretical considerations on estimation problems. Inform. Control 19 (1971), 181-194. DOI 10.1016/S0019-9958(71)90065-9 | MR 0309224 | Zbl 0222.94022
[3] A. R. Barron, L. Györfi, E. C. van der Meulen: Distribution estimates consistent in total variation and two types of information divergence. IEEE Trans. Inform. Theory 38 (1992), 1437-1454. DOI 10.1109/18.149496
[4] A. Berlinet, I. Vajda, E. C. van der Meulen: About the asymptotic accuracy of Barron density estimates. IEEE Trans. Inform. Theory 44 (1998), 999-1009. DOI 10.1109/18.669143 | MR 1616679 | Zbl 0952.62029
[5] A. Bhattacharyya: On some analogues of the amount of information and their use in statistical estimation. Sankhya 8 (1946), 1-14. MR 0020242
[6] H. Chernoff: A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations. Ann. Math. Statist. 23 (1952), 493-507. DOI 10.1214/aoms/1177729330 | MR 0057518
[7] B. S. Clarke, A. R. Barron: Information-theoretic asymptotics of Bayes methods. IEEE Trans. Inform. Theory 36 (1990), 453-471. DOI 10.1109/18.54897 | MR 1053841 | Zbl 0709.62008
[8] I. Csiszár: Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publ. Math. Inst. Hungar. Acad. Sci. 8 (1963), 84-108. MR 0164374
[9] I. Csiszár: Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299-318. MR 0219345 | Zbl 0157.25802
[10] T. M. Cover, J. A. Thomas: Elements of Information Theory. Wiley, New York 1991. MR 1122806 | Zbl 1140.94001
[11] M. H. DeGroot: Optimal Statistical Decisions. McGraw-Hill, New York 1970. MR 0356303
[12] D. Feldman, F. Österreicher: A note on $f$-divergences. Studia Sci. Math. Hungar. 24 (1989), 191-200. MR 1051149 | Zbl 0725.62005
[13] A. Guntuboyina: Lower bounds for the minimax risk using $f$-divergences, and applications. IEEE Trans. Inform. Theory 57 (2011), 2386-2399. DOI 10.1109/TIT.2011.2110791 | MR 2809097
[14] C. Guttenbrunner: On applications of the representation of $f$-divergences as averaged minimal Bayesian risk. In: Trans. 11th Prague Conf. Inform. Theory, Statist. Dec. Funct., Random Processes A, 1992, pp. 449-456.
[15] L. Jager, J. A. Wellner: Goodness-of-fit tests via phi-divergences. Ann. Statist. 35 (2007), 2018-2053. DOI 10.1214/0009053607000000244 | MR 2363962 | Zbl 1126.62030
[16] T. Kailath: The divergence and Bhattacharyya distance measures in signal selection. IEEE Trans. Commun. Technol. 15 (1967), 52-60. DOI 10.1109/TCOM.1967.1089532
[17] S. Kakutani: On equivalence of infinite product measures. Ann. Math. 49 (1948), 214-224. DOI 10.2307/1969123 | MR 0023331 | Zbl 0030.02303
[18] S. Kullback, R. A. Leibler: On information and sufficiency. Ann. Math. Statist. 22 (1951), 79-86. DOI 10.1214/aoms/1177729694 | MR 0039968 | Zbl 0042.38403
[19] L. LeCam: Locally asymptotically normal families of distributions. Univ. Calif. Publ. Statist. 3 (1960), 37-98. MR 0126903
[20] L. LeCam: Asymptotic Methods in Statistical Decision Theory. Springer, New York 1986.
[21] F. Liese, I. Vajda: Convex Statistical Distances. Teubner, Leipzig 1987. MR 0926905 | Zbl 0656.62004
[22] F. Liese, I. Vajda: On divergences and informations in statistics and information theory. IEEE Trans. Inform. Theory 52 (2006), 4394-4412. DOI 10.1109/TIT.2006.881731 | MR 2300826
[23] F. Liese, I. Vajda: $f$-divergences: Sufficiency, deficiency and testing of hypotheses. In: Advances in Inequalities from Probability Theory and Statistics (N. S. Barnett and S. S. Dragomir, eds.), Nova Science Publishers, Inc., New York 2008, pp. 113-149. MR 2459971
[24] F. Liese, K. J. Miescke: Statistical Decision Theory: Estimation, Testing, and Selection. Springer, New York 2008. MR 2421720 | Zbl 1154.62008
[25] K. Matusita: Decision rules based on the distance, for problems of fit, two samples and estimation. Ann. Math. Statist. 26 (1955), 631-640. DOI 10.1214/aoms/1177728422 | MR 0073899 | Zbl 0065.12101
[26] D. Mussmann: Sufficiency and $f$-divergences. Studia Sci. Math. Hungar. 14 (1979), 37-41.
[27] X. Nguyen, M. J. Wainwright, M. I. Jordan: On surrogate loss functions and $f$-divergences. Ann. Statist. 37 (2009), 876-904. DOI 10.1214/08-AOS595 | MR 2502654 | Zbl 1162.62060
[28] F. Österreicher, D. Feldman: Divergenzen von Wahrscheinlichkeitsverteilungen - integralgeometrisch betrachtet. Acta Math. Acad. Sci. Hungar. 37 (1981), 329-337. DOI 10.1007/BF01895132 | MR 0619882 | Zbl 0477.60013
[29] F. Österreicher, I. Vajda: Statistical information and discrimination. IEEE Trans. Inform. Theory 39 (1993), 1036-1039. DOI 10.1109/18.256536 | MR 1237725 | Zbl 0792.62005
[30] J. Pfanzagl: A characterization of sufficiency by power functions. Metrika 21 (1974), 197-199. DOI 10.1007/BF01893900 | MR 0365797 | Zbl 0289.62009
[31] H. V. Poor: Robust decision design using a distance criterion. IEEE Trans. Inform. Theory 26 (1980), 578-587. MR 0583942 | Zbl 0445.62017
[32] T. R. C. Read, N. A. C. Cressie: Goodness-of-Fit Statistics for Discrete Multivariate Data. Springer, New York 1988. MR 0955054 | Zbl 0663.62065
[33] A. Rényi: On measures of entropy and information. In: Proc. 4th Berkeley Symp. on Math. Statist. and Probability, Univ. of California Press, Berkeley 1961, pp. 547-561. MR 0132570 | Zbl 0106.33001
[34] A. W. Roberts, D. E. Varberg: Convex Functions. Academic Press, New York 1973. MR 0442824 | Zbl 0289.26012
[35] M. J. Schervish: Theory of Statistics. Springer, New York 1995. MR 1354146 | Zbl 0834.62002
[36] C. E. Shannon: A mathematical theory of communication. Bell Syst. Tech. J. 27 (1948), 379-423, 623-656. DOI 10.1002/j.1538-7305.1948.tb01338.x | MR 0026286 | Zbl 1154.94303
[37] H. Strasser: Mathematical Theory of Statistics. De Gruyter, Berlin 1985. MR 0812467 | Zbl 0594.62017
[38] F. Topsøe: Information-theoretical optimization techniques. Kybernetika 15 (1979), 7-17. MR 0529888
[39] F. Topsøe: Some inequalities for information divergence and related measures of discrimination. IEEE Trans. Inform. Theory 46 (2000), 1602-1609. DOI 10.1109/18.850703 | MR 1768575
[40] E. Torgersen: Comparison of Statistical Experiments. Cambridge Univ. Press, Cambridge 1991. MR 1104437 | Zbl 1158.62006
[41] I. Vajda: On the $f$-divergence and singularity of probability measures. Periodica Math. Hungar. 2 (1972), 223-234. DOI 10.1007/BF02018663 | MR 0335163 | Zbl 0248.62001
[42] I. Vajda: Theory of Statistical Inference and Information. Kluwer Academic Publishers, Dordrecht - Boston - London 1989. Zbl 0711.62002
[43] I. Vajda: On metric divergences of probability measures. Kybernetika 45 (2009), 885-900. MR 2650071
[44] I. Vajda: On convergence of information contained in quantized observations. IEEE Trans. Inform. Theory 48 (2002), 2163-2172. DOI 10.1109/TIT.2002.800497 | MR 1930280 | Zbl 1062.94533
[45] I. Vincze: On the concept and measure of information contained in an observation. In: Contributions to Probability (J. Gani and V. K. Rohatgi, eds.), Academic Press, New York 1981, pp. 207-214. MR 0618690 | Zbl 0531.62002