
Article

Keywords:
total variation; Hellinger divergence; Le Cam divergence; information divergence; Jensen–Shannon divergence; metric divergences
Summary:
Standard properties of $\phi$-divergences of probability measures are widely applied in various areas of information processing. Among the desirable supplementary properties that facilitate the use of mathematical methods is the metricity of $\phi$-divergences, or the metricity of their powers. This paper extends the previously known family of $\phi$-divergences with these properties. The extension consists of a continuum of $\phi$-divergences which are squared metric distances; most of them are new, but the class also includes some classical cases such as the Le Cam squared distance. The paper also establishes basic properties of the $\phi$-divergences from the extended class, including their range of values and the upper and lower bounds attained under fixed total variation.
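For orientation, recall the standard definitions behind the summary (a sketch only; the normalization of the Le Cam divergence varies across the literature). For a convex generator $\phi$ on $(0,\infty)$ with $\phi(1)=0$, the $\phi$-divergence of probability measures $P$ and $Q$ with densities $p$ and $q$ relative to a dominating measure $\mu$ is
$$ D_\phi(P,Q) \;=\; \int q\,\phi\!\left(\frac{p}{q}\right)\mathrm{d}\mu. $$
In one common normalization, the Le Cam divergence corresponds to the generator $\phi(t)=(t-1)^2/(t+1)$, which gives
$$ \mathrm{LC}(P,Q) \;=\; \int \frac{(p-q)^2}{p+q}\,\mathrm{d}\mu, $$
and $\sqrt{\mathrm{LC}}$ is a metric on the set of probability measures, so that $\mathrm{LC}$ itself is a squared metric distance in the sense of the summary. Similarly, the total variation $V(P,Q)=\int|p-q|\,\mathrm{d}\mu$ is the $\phi$-divergence generated by $\phi(t)=|t-1|$ and is itself a metric.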