
Article

Title: Extensions of the parametric families of divergences used in statistical inference (English)
Author: Kůs, Václav
Author: Morales, Domingo
Author: Vajda, Igor
Language: English
Journal: Kybernetika
ISSN: 0023-5954
Volume: 44
Issue: 1
Year: 2008
Pages: 95-112
Summary lang: English
.
Category: math
.
Summary: We propose a simple method for constructing new families of $\phi$-divergences. This method, called convex standardization, is applicable to convex and concave functions $\psi(t)$ that are twice continuously differentiable in a neighborhood of $t=1$ with nonzero second derivative at the point $t=1$. Using this method we introduce several extensions of the LeCam, power, $\chi^a$ and Matusita divergences. The extended families are shown to connect these divergences smoothly with the Kullback divergence, or to connect various pairs of these particular divergences with one another. We also investigate the metric properties of divergences from these extended families. (English)
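The convex standardization described in the summary can be sketched as follows. The record does not reproduce the paper's exact definition, so the formula below is an assumed reconstruction: subtracting the tangent line of $\psi$ at $t=1$ and normalizing by $\psi''(1)$ produces a function satisfying the usual requirements on a $\phi$-divergence generator.

```latex
% Assumed form of the standardization (not stated in this record):
% given \psi convex or concave, twice continuously differentiable near
% t = 1 with \psi''(1) \neq 0, set
\[
  \phi(t) \;=\; \frac{\psi(t) - \psi(1) - \psi'(1)\,(t-1)}{\psi''(1)} .
\]
% Then \phi(1) = 0, \phi'(1) = 0 and \phi''(1) = 1. When \psi is
% concave (\psi''(1) < 0), the division flips the sign, so in both
% cases \phi is convex with minimum 0 at t = 1, as required of a
% \phi-divergence generator.
```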
Keyword: divergences
Keyword: metric divergences
Keyword: families of $f$-divergences
MSC: 62B05
MSC: 62B10
MSC: 62H30
idZBL: Zbl 1142.62002
idMR: MR2405058
.
Date available: 2009-09-24T20:32:34Z
Last updated: 2012-06-06
Stable URL: http://hdl.handle.net/10338.dmlcz/135836
.
Reference: [1] Beirlant J., Devroye L., Győrfi L., Vajda I.: Large deviations of divergence measures of partitions.J. Statist. Plann. Inference 93 (2001), 1–16 MR 1822385
Reference: [2] Csiszár I., Fischer J.: Informationsentfernungen im Raum der Wahrscheinlichkeitsverteilungen.Publ. Math. Inst. Hungar. Acad. Sci. 7 (1962), 159–180 MR 0191734
Reference: [3] Győrfi L., Vajda I.: Asymptotic distributions for goodness-of-fit statistics in a sequence of multinomial models.Statist. Probab. Lett. 56 (2002), 57–67 MR 1881531
Reference: [4] Hobza T., Molina I., Vajda I.: On convergence of Fisher’s information in continuous models with quantized observations.Test 4 (2005), 151–179
Reference: [5] Kafka P., Österreicher F., Vincze I.: On powers of Csiszár $f$-divergences defining a distance.Stud. Sci. Math. Hungar. 26 (1991), 415–422 MR 1197090
Reference: [6] Kullback S., Leibler R.: On information and sufficiency.Ann. Math. Statist. 22 (1951), 79–86 Zbl 0042.38403, MR 0039968
Reference: [7] Kullback S.: Information Theory and Statistics.Wiley, New York 1957
Reference: [8] Kůs V.: Blended $\phi $-divergences with examples.Kybernetika 39 (2003), 43–54 MR 1980123
Reference: [9] Le Cam L.: Asymptotic Methods in Statistical Decision Theory.Springer, New York 1986 Zbl 0605.62002, MR 0856411
Reference: [10] Liese F., Vajda I.: Convex Statistical Distances.Teubner, Leipzig 1987 Zbl 0656.62004, MR 0926905
Reference: [11] Liese F., Vajda I.: On divergences and informations in statistics and information theory.IEEE Trans. Inform. Theory 52 (2006), 4394–4412 MR 2300826
Reference: [12] Lindsay B. G.: Efficiency versus robustness: The case of minimum Hellinger distance and other methods.Ann. Statist. 22 (1994), 1081–1114 MR 1292557
Reference: [13] Morales D., Pardo L., Vajda I.: Some new statistics for testing hypotheses in parametric models.J. Multivariate Anal. 62 (1997), 137–168 Zbl 0877.62020, MR 1467878
Reference: [14] Morales D., Pardo L., Vajda I.: Limit laws for disparities of spacings.Nonparametric Statistics 15 (2003), 325–342 Zbl 1024.62020, MR 1987078
Reference: [15] Morales D., Pardo L., Vajda I.: On the optimal number of classes in the Pearson goodness-of-fit tests.Kybernetika 41 (2005), 677–698 MR 2193859
Reference: [16] Österreicher F.: On a class of perimeter-type distances of probability distributions.Kybernetika 32 (1996), 389–393 Zbl 0897.60015, MR 1420130
Reference: [17] Österreicher F., Vajda I.: A new class of metric divergences on probability spaces and its applicability in statistics.Ann. Inst. Statist. Math. 55 (2003), 639–653 Zbl 1052.62002, MR 2007803
Reference: [18] Pardo L.: Statistical Inference Based on Divergence Measures.Chapman&Hall, London 2006 Zbl 1118.62008, MR 2183173
Reference: [19] Read T. C. R., Cressie N. A.: Goodness-of-fit Statistics for Discrete Multivariate Data.Springer, Berlin 1988 Zbl 0663.62065, MR 0955054
Reference: [20] Vajda I.: $\chi ^{a}$-divergence and generalized Fisher information.In: Trans. 6th Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Academia, Prague 1973, pp. 872–886 Zbl 0297.62003, MR 0356302
Reference: [21] Vajda I.: Theory of Statistical Inference and Information.Kluwer, Boston 1989 Zbl 0711.62002
Reference: [22] Vajda I., Kůs V.: Relations Between Divergences, Total Variations and Euclidean Distances.Research Report No. 1853, Institute of Information Theory, Prague 1995
Reference: [23] Vajda I., van der Meulen E. C.: Optimization of Barron density estimates.IEEE Trans. Inform. Theory 47 (2001), 1867–1883 MR 1842524
.
